From virtual worlds to medical applications: microrobots learn to autonomously navigate blood vessels
Imagine a microrobot navigating the vessels of the human brain by learning from its own “dreams”. Researchers led by Daniel Ahmed, Professor of Acoustic Robotics for Life Sciences and Healthcare at ETH Zurich’s Department of Mechanical and Process Engineering (D-MAVT), have developed a new method that enables microrobots to learn autonomous navigation through simulation.
“Our goal is to create microrobots that can intelligently navigate the human body, adapting to its complexity in real time,” explains Ahmed, whose research project receives support from the European Research Council (ERC) through its SONOBOTS Starting Grant. The potential of microrobots in future medical applications is vast. They could one day deliver drugs to hard-to-reach tumors, clear blockages in blood vessels, or perform microsurgeries—all without invasive procedures.
However, controlling microrobots inside the human body presents significant challenges. At such small scales, traditional sensing and navigation technologies like GPS or LiDAR are impractical. Instead, microrobots must rely on indirect actuation methods such as magnetic fields, light, or ultrasound. Among these, ultrasound is a promising non-invasive approach, offering both deep tissue penetration and tunable propulsion. Yet precisely steering microrobots with ultrasound is exceptionally complex, particularly within the data-scarce, high-dimensional, and constantly changing biological environment of the body. Real-time control must account for fluid flow, tissue interactions, and the microrobots' variable responses to acoustic signals.
To overcome these challenges, Ahmed's team developed a new method that allows microrobots to "dream" in simulated, imagined environments. "We created virtual worlds that mimic the physics of real-world conditions, enabling the microrobots to learn how to move, avoid obstacles, and adapt to new environments," explains Daniel Ahmed. "They later apply these skills in real biomedical settings."
At the core of their research is model-based reinforcement learning (RL), a form of artificial intelligence that allows machines to learn optimal behaviors through trial and error. Unlike traditional RL, which demands vast datasets and long training times, this approach is sample-efficient and tailored to the constraints of microrobotic systems. The microrobots learn from recurrent imagined environments: essentially, simulations that mimic the physics of real-world conditions.
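The train-in-imagination idea can be sketched in a toy form. The snippet below is a minimal, hypothetical 1-D illustration, not the authors' actual method: it fits a linear world model from a handful of real transitions, then "dreams" by planning action sequences inside that learned model before executing them in the real environment. All names (`GAIN`, `real_step`, `plan`) are invented for this sketch.

```python
import random

# Toy 1-D "microrobot": the true (unknown) dynamics move the robot by
# GAIN * action per step. GAIN is what the world model must recover.
GAIN = 0.7

def real_step(x, a):
    return x + GAIN * a

# 1) Collect a small batch of real interaction data (sample-efficient).
data = [(x, a, real_step(x, a)) for x in (0.0, 1.0, 2.0) for a in (-1.0, 0.5, 1.0)]

# 2) Fit the world model: estimate the gain g from (x, a, x') transitions
# by least squares on x' - x = g * a.
num = sum(a * (x2 - x) for x, a, x2 in data)
den = sum(a * a for x, a, x2 in data)
g = num / den

def imagined_step(x, a):
    return x + g * a  # rollout inside the learned model, no real robot needed

# 3) "Dream": random-shooting planning picks the action sequence that best
# reaches the target purely in imagination.
def plan(x0, target, horizon=5, n_candidates=200, rng=random.Random(0)):
    best_seq, best_cost = None, float("inf")
    for _ in range(n_candidates):
        seq = [rng.uniform(-1.0, 1.0) for _ in range(horizon)]
        x = x0
        for a in seq:
            x = imagined_step(x, a)
        cost = abs(x - target)
        if cost < best_cost:
            best_seq, best_cost = seq, cost
    return best_seq

# 4) Execute the imagined plan in the real environment.
x, target = 0.0, 2.0
for a in plan(x, target):
    x = real_step(x, a)
print(abs(x - target))  # residual tracking error
```

The design point mirrors the article: steps 3 and 4 are cheap because they run against the learned model, and only a small amount of real interaction data (step 1) is ever needed.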
In experimental trials, microrobots pre-trained in simulation achieved a 90 percent success rate in navigating complex microfluidic channels after one hour of fine-tuning. Even in unfamiliar environments, they initially succeeded in 50 percent of tasks, improving to over 90 percent after 30 minutes of additional training. These capabilities were further validated in real-time experiments involving static and flowing conditions in vascular-like networks.
This research project “shows that ultrasound-driven microrobots can learn to adapt in real time,” says Mahmoud Medany, co-lead author of the study. “It’s a step toward enabling autonomous navigation in living systems.” By combining ultrasound propulsion with AI-driven control, Ahmed’s team is laying the foundation for a new generation of intelligent, non-invasive medical tools.
Reference
Medany M, Piglia L, Achenbach L, Karthik Mukkavilli S, Ahmed D. Model-based reinforcement learning for ultrasound-driven autonomous microrobots. Nature Machine Intelligence, 26 June 2025. DOI: 10.1038/s42256-025-01054-2