Soft robots that learn to follow the light

Soft robots made of simple, identical building blocks can learn to move toward a light source without a central controller or explicit communication, according to new research published in Advanced Functional Materials. The study shows how phototaxis, directed movement in response to light, can emerge from local sensing and learning alone, demonstrating that adaptive, goal-oriented behavior can arise in robotic systems without centralized control.
A collective discovery
The research was carried out by AMOLF researcher Mannus Schomaker (Soft Robotic Matter group) in collaboration with Eindhoven University of Technology. The team developed a modular soft robotic platform in which each unit contains its own motor, light sensor, and microcontroller. By assembling these identical units into different shapes, the researchers showed that the robots consistently move across a surface toward a light source, even though no individual unit has any knowledge of the robot's overall geometry.
Why phototaxis without a brain matters
In most robots, sensing and decision making are handled by a central computer. This makes them powerful but also vulnerable in unpredictable environments. In contrast, many natural systems rely on distributed intelligence, where global behavior emerges from local interactions. By focusing on phototaxis, a clear and measurable task, the AMOLF team investigated how complex behavior can arise without centralized control. The study directly contributes to research on emergent behavior by showing how coordinated motion can appear even though no individual module has an overview of the system.
What makes the findings special
A key result is that the robotic modules do not communicate with each other directly. Each unit only measures changes in light intensity at its own location and adjusts the timing of its movement based on a simple learning rule. Coordination emerges implicitly through the physical connections between modules and their interaction with the environment. The same rule works across different robot geometries, from small clusters to larger assemblies, without retuning. This combination of simplicity, generality, and robustness makes the findings distinctive.
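To give a flavor of how such a decentralized, communication-free rule could work, the sketch below implements a simple trial-and-error update: each unit perturbs the timing of its own actuation and keeps the change only if its locally measured light intensity improves. This is a minimal illustration under assumed parameters, not the authors' actual algorithm; all names, values, and the exact update rule are invented for the example.

```python
import random


class PhototacticModule:
    """Toy sketch of one decentralized unit.

    Assumption: each unit actuates once per cycle at some phase within
    that cycle, and adjusts this phase by local trial and error based
    only on its own light sensor reading.
    """

    def __init__(self, period=1.0, step_size=0.1):
        self.period = period                      # duration of one actuation cycle
        self.step_size = step_size                # size of each random timing perturbation
        self.phase = random.uniform(0.0, period)  # when in the cycle this unit actuates
        self._last_phase = self.phase
        self._last_intensity = None

    def update(self, intensity):
        """Call once per cycle with the locally measured light intensity."""
        if self._last_intensity is not None and intensity < self._last_intensity:
            # The previous timing change made the light reading worse: undo it.
            self.phase = self._last_phase
        # Remember the current state, then explore a new actuation timing.
        self._last_phase = self.phase
        self._last_intensity = intensity
        self.phase = (self.phase + random.uniform(-self.step_size, self.step_size)) % self.period
        return self.phase
```

In a full assembly, each module would run an update like this on its own microcontroller. No phase values are ever exchanged between units, so any coordination has to emerge through the shared body and its interaction with the light field, which is the kind of implicit communication the study describes.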
Robust behavior in changing environments
The researchers tested the system under a range of challenging conditions. When the direction of the light source was suddenly changed, the robots adapted and learned to move in the new direction. When obstacles were placed in their path, the assemblies adjusted their motion and continued progressing toward the light. In a striking demonstration, a larger robot was physically cut in half during an experiment. Both halves independently relearned how to perform phototaxis, highlighting the resilience of the decentralized approach.
Learn more
If you have questions about this research, please contact Bas Overvelde (email: b.overvelde@amolf.nl).
The paper, "Robust Phototaxis by Harnessing Implicit Communication in Modular Soft Robotic Systems" by H.A.H. Schomaker, S. Picella, A. Küng Garcia, L.C. van Laake, and J.T.B. Overvelde, was published in Advanced Functional Materials.
Read the full paper