Embodied computation for emergent goal-oriented behavior in soft robots
In living organisms, directed behavior arises from repeated rhythmic (oscillatory) motions whose sequence and timing are robustly coordinated. This coordination is typically partly, or even fully, distributed throughout the organism: animals employ central pattern generators within their nervous systems, plants utilize distributed mechanoreceptors, and fungi leverage expansive mycelial networks. Such decentralized orchestration offloads computation from a central brain to the body, allowing behaviors to emerge naturally through interactions between the body and its environment.
This thesis explores alternatives to centralized control inspired by decentralized systems in nature. It identifies the sequences and timing of oscillations that lead to directed locomotion in soft robots. We aim to embody directed behavior in the physical system so that purposeful actions emerge from local body-environment interactions and feedback. Through an exploratory study spanning design, simulation, and hardware, we demonstrate how soft robotic systems can leverage their mechanical intelligence through embodied computation to achieve complex autonomous behaviors without a centralized processor.
As a start, we draw inspiration from the physiology and decentralized nervous system of echinoderms (e.g., sea urchins, brittle stars, feather stars, and sea cucumbers) to examine how decentralized feedback can facilitate directed locomotion towards a light source (phototaxis) in limbed soft robots. We build a modular system where each limb is a self-contained module that stochastically optimizes its behavior with a feedback loop based on limited sensing, short-term memory, and computation. The inherent mechanical intelligence of the soft pneumatic actuators converts cyclic on-off inputs to a pump at a fixed frequency into complex bending and stepping motions. When multiple limbs are physically connected and each limb independently learns the phase of its oscillating motion, coordination between the limbs emerges.
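To make the per-limb strategy concrete, the sketch below shows one way such a feedback loop could look: each limb keeps a single phase parameter, remembers only the best light reading seen so far, and performs stochastic hill climbing on its phase. The class and function names, the Gaussian perturbation, and the random stand-in for the light sensor are illustrative assumptions, not the implementation used in the thesis.

```python
import random


class LimbModule:
    """Illustrative self-contained limb: fixed-frequency on-off pump commands,
    one learnable phase offset, and a stochastic hill-climbing update driven
    by a scalar light reading."""

    def __init__(self, period=1.0):
        self.period = period                  # fixed oscillation period of the pump
        self.stored_phase = random.random()   # short-term memory: best phase so far
        self.phase = self.stored_phase        # phase currently being evaluated
        self.best_light = float("-inf")       # best light intensity seen so far

    def pump_command(self, t):
        """Cyclic on-off input to the pump at time t, shifted by the limb's phase."""
        return ((t / self.period + self.phase) % 1.0) < 0.5

    def learn(self, light):
        """Keep the evaluated phase if the sensed light improved, otherwise revert,
        then propose a small random perturbation for the next evaluation window."""
        if light >= self.best_light:
            self.best_light = light
            self.stored_phase = self.phase
        self.phase = (self.stored_phase + random.gauss(0.0, 0.1)) % 1.0


# Several limbs learn independently; coordination can only arise through the
# shared body and environment, represented here by the light readings.
limbs = [LimbModule() for _ in range(5)]
for episode in range(100):
    # each limb would drive its pump via pump_command(t) during the episode;
    # the resulting light reading is replaced here by a random stand-in value
    light_readings = [random.random() for _ in limbs]
    for limb, light in zip(limbs, light_readings):
        limb.learn(light)
```

Because every limb evaluates only its own sensor and stores only one remembered phase, any coordination that appears in such a scheme must come from the shared body and environment rather than from explicit communication.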
We show that, similar to echinoderms such as sea stars, the interactions of the individual limbs with the environment guide the robot toward coordinated movement patterns without relying on comprehensive full-body representations or complex algorithms. The soft robot dynamically re-coordinates its movement in response to changing conditions (such as altered actuators or damage) without any central controller. Resilient, whole-body locomotion thus emerges from the interplay of many basic units, each with limited memory and no body awareness, demonstrating a route to adaptable, goal-directed movement sequencing in soft robotics through embodied computation.
To gain a better understanding of how this coordination emerges, we build a second modular system of self-contained units. The modules use the same strategy for sensing and processing, but their actuation is limited, making them immobile on their own. Instead, they expand and contract the connections to their physically attached neighbors on a two-dimensional plane. When the modules are interconnected in two-dimensional grid configurations, the system as a whole can break the symmetry of friction to achieve locomotion, much like earthworms that expand and contract their segments. By combining simulations and experiments, we show how this decentralized strategy follows locally optimal sequences solely through the implicit communication provided by the physical connections (as the system moves toward the light, all connected units sense an increase in light intensity).
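The following kinematic toy, an assumption-laden stand-in rather than the thesis model, illustrates the underlying principle of frictional symmetry breaking with the simplest possible body: a two-foot "inchworm" that anchors its rear foot while its single link extends and its front foot while the link contracts.

```python
# Minimal kinematic toy of friction-symmetry breaking (illustrative only):
# extension happens with the rear foot anchored, contraction with the front
# foot anchored, so each expand-contract cycle yields net forward motion.

def inchworm_cycle(rear, front, rest=1.0, stroke=0.5):
    front = rear + rest + stroke   # expansion: rear foot anchored, front slides
    rear = front - rest            # contraction: front foot anchored, rear slides
    return rear, front


rear, front = 0.0, 1.0
for cycle in range(3):
    rear, front = inchworm_cycle(rear, front)
    print(f"cycle {cycle}: rear = {rear:.1f}, front = {front:.1f}")
# Each cycle advances the body by `stroke`; with symmetric anchoring the
# expansion and contraction would cancel and the body would only oscillate.
```

In the modular system the anchoring pattern is not hard-coded as it is here; it is the sequence of expansions and contractions that the units discover through their local light feedback.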
The simulations also provide insight into how the sequences the system produces relate to its potential behaviors, and how both change with different configurations and in dynamic, unstructured environments. These results not only demonstrate that robust, directed locomotion in soft robots can emerge entirely from local environmental interactions but also reveal the profound link between the coordination strategy and the body morphology. They further illustrate the dynamic nature of the learning process as it adapts to changing, partially observable environments.
While the work above focuses on reducing the hardware and algorithmic complexity needed to coordinate sequences of movements starting from random behaviors, it still requires many electronic components to build each individual module. Therefore, we next aim to embody these sequences without relying on electronics by harnessing soft fluidic circuits with integrated magnetic components. By designing a fluidic relaxation oscillator that produces an oscillating output for a fixed input flow, we encode the rhythmic inflation-deflation cycles into a single component. We implement directional air-driven coupling between the relaxation oscillators to emulate biological central pattern generators, orchestrating the rhythmic motions without electronics. By altering the fluidic coupling between them, we demonstrate rapid and reversible reprogramming of the oscillation sequences and timings. Such physically embodied control paves the way for soft robotic systems equipped with decentralized locomotion primitives, eliminating dependence on complex electronics and centralized controllers.
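As a conceptual illustration, the sketch below abstracts the coupled relaxation oscillators as phase oscillators on a directed ring; it does not model the fluidic physics or the magnetic components. It only shows how a directional coupling locks the units into a fixed phase sequence and how inverting the coupling lag reprograms that sequence; the function name and all parameters are illustrative assumptions.

```python
import numpy as np

# Directed ring of phase oscillators as a stand-in for coupled relaxation
# oscillators: each unit advances at a common frequency and is pulled toward
# trailing its upstream neighbour by a fixed lag. The locked phase pattern
# sets the actuation sequence; changing the lag reprograms that sequence.

def locked_phases(n=3, omega=2 * np.pi * 0.5, gain=2.0, lag=2 * np.pi / 3,
                  dt=1e-3, t_end=20.0, seed=0):
    """Integrate the ring from random initial phases and return the phases
    of all units relative to unit 0 after they have locked."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0.0, 2.0 * np.pi, n)
    for _ in range(int(t_end / dt)):
        upstream = np.roll(phase, 1)  # unit i only "hears" unit i - 1
        phase += (omega + gain * np.sin(upstream - phase - lag)) * dt
    return np.round(np.mod(phase - phase[0], 2.0 * np.pi), 2)


print(locked_phases(lag=+2 * np.pi / 3))   # locks into the sequence 0 -> 1 -> 2
print(locked_phases(lag=-2 * np.pi / 3))   # reversed coupling: 0 -> 2 -> 1
```

In the fluidic implementation the analogous "lag" is set by how the oscillators are pneumatically connected, which is what makes the reprogramming rapid and reversible.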
Lastly, the approaches above start with predetermined morphologies, whereas natural organisms demonstrate how body morphology and embodied computation evolve synergistically over longer time scales. Inspired by this co-evolutionary principle, we simulate coupled oscillator networks, like those above, integrated within evolving soft robotic morphologies. We show that oscillator networks of minimal complexity (few oscillators and few connections), when co-designed with the body morphology, enable robots to transition spontaneously between distinct behaviors, such as climbing and running, in response to environmental feedback. These findings emphasize how thoughtful co-design of morphology and feedback can embed rich, context-specific behaviors into relatively simple physical structures.
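A minimal sketch of such a co-design loop is given below, assuming a genome that bundles morphology parameters with an oscillator-network description and a simple (1+λ) evolutionary strategy. The genome fields, mutation rates, and the placeholder fitness function are all illustrative and stand in for the soft-body simulation used in the thesis.

```python
import random

def random_genome(n_limbs=4):
    """Illustrative genome coupling body and controller parameters."""
    return {
        "limb_lengths": [random.uniform(0.5, 2.0) for _ in range(n_limbs)],
        "osc_phases": [random.uniform(0.0, 1.0) for _ in range(n_limbs)],
        "osc_couplings": [random.choice([-1, 0, 1]) for _ in range(n_limbs)],
    }

def mutate(genome, sigma=0.1):
    """Perturb morphology and oscillator network together."""
    child = {k: list(v) for k, v in genome.items()}
    i = random.randrange(len(child["limb_lengths"]))
    child["limb_lengths"][i] += random.gauss(0.0, sigma)            # body change
    j = random.randrange(len(child["osc_phases"]))
    child["osc_phases"][j] = (child["osc_phases"][j]
                              + random.gauss(0.0, sigma)) % 1.0     # timing change
    if random.random() < 0.2:                                       # rewire rarely
        k = random.randrange(len(child["osc_couplings"]))
        child["osc_couplings"][k] = random.choice([-1, 0, 1])
    return child

def evaluate(genome):
    """Placeholder fitness; the real pipeline scores simulated locomotion."""
    return -sum((length - 1.0) ** 2 for length in genome["limb_lengths"])

# (1+lambda) evolutionary strategy: keep the best of parent and offspring.
parent = random_genome()
for generation in range(50):
    offspring = [mutate(parent) for _ in range(8)]
    parent = max(offspring + [parent], key=evaluate)
print(evaluate(parent))
```

Because body and controller mutate within one genome, selection can exploit morphological changes and oscillator changes jointly, which is the essence of the co-design argument above.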
Collectively, this work contributes to the broader vision of autonomous soft robots with distributed intelligence, where sophisticated, goal-directed behaviors emerge from the continuous interplay of body and environment rather than from explicit centralized command. Through embodied computation, these results pave the way toward soft robots that harness their mechanical intelligence to complete tasks autonomously in real-world scenarios.