
What Google’s New AI Fruit Fly Can Teach Us about Real Behavior

To learn how to move, groom itself and flap its wings, a fruit fly AI devoured hours of video of real insects


The head of a real fruit fly (Drosophila melanogaster).

Luciano Richino/Alamy Stock Photo

The tiny fruit fly, one of the most popular model organisms in science, lives fast and dies at about 50 days old. But this brief life is anything but unremarkable. The fly fills its days with intricate routines and schemes—and, on occasion, romance. To better understand how such a minuscule brain can power these complex behaviors, scientists have already created a connectome, a virtual “map” of the connections among the fruit fly’s roughly 200,000 neurons.

And now they’ve built a body.

Researchers at the Howard Hughes Medical Institute’s (HHMI’s) Janelia Research Campus in Virginia and Google DeepMind recently designed a virtual fruit fly that looks and moves like the real thing, making it easier for scientists to observe this favorite research animal’s surprisingly nuanced habits and movements. They posted their paper on the project, which has not yet been peer-reviewed, to the preprint server bioRxiv in mid-March.


“Few organisms have been studied in as much detail along the complete scale of biology, from molecule to behavior, as the fruit fly Drosophila melanogaster,” says developmental biologist Ruth Lehmann. Lehmann, who directs the Whitehead Institute for Biomedical Research and was not involved in the new project, has studied fruit fly genetics and body development. The virtual insect “portrays realistic behavior of a fly walking, flying and even grooming,” she says. “This type of research tests the limits of our fundamental understanding of biology.”

Compared with a human brain or an artificial neural network, both of which have trillions of connections, the fly brain is teeny and simple. But that doesn’t mean it’s easy to understand what goes on inside it. The connectome’s network of neurons tells you “who’s talking to whom, not what messages are being sent” within the brain, says the preprint paper’s senior author Srinivas Turaga, a neuroscientist at HHMI’s Janelia Research Campus. The virtual fly project (which has not yet incorporated the digital connectome) instead focuses on behavior—the result, Turaga notes, of how a body translates the nervous system’s signals into action.

To build this virtual insect, researchers first used high-resolution microscopes to scan a real female fruit fly’s anatomy—its limbs, wings and joints. From these fine-scale measurements, the team assembled a three-dimensional model within a physics simulation program called MuJoCo, short for Multi-Joint Dynamics with Contact, which is maintained by Google DeepMind. To get the virtual fly to move like a real one, the simulated body needed to learn from the source. That’s where artificial intelligence came in—more specifically, a subset of AI called reinforcement learning.
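
MuJoCo itself is an open-source physics engine with a Python interface. The toy model below is not the authors’ fly body—the XML and its numbers are purely illustrative—but it shows the basic pattern MuJoCo supports: define a body with joints and actuators, then step the simulation forward and let the physics turn torques into motion.

```python
import mujoco

# Toy MuJoCo model: one capsule "limb segment" on a single hinge joint,
# standing in for the far more detailed fly body described above.
# This XML is hypothetical and is not the authors' fly model.
TOY_XML = """
<mujoco>
  <worldbody>
    <body name="segment" pos="0 0 0.1">
      <joint name="hinge" type="hinge" axis="0 1 0"/>
      <geom type="capsule" size="0.005" fromto="0 0 0 0.05 0 0"/>
    </body>
  </worldbody>
  <actuator>
    <motor joint="hinge" gear="1"/>
  </actuator>
</mujoco>
"""

model = mujoco.MjModel.from_xml_string(TOY_XML)
data = mujoco.MjData(model)

# Step the physics forward; MuJoCo integrates joint positions, velocities and
# contact forces, which is what lets a controller's torques turn into motion.
for _ in range(1000):
    data.ctrl[:] = 0.01  # a small constant torque on the single actuator
    mujoco.mj_step(model, data)

print(data.qpos)  # hinge angle after two simulated seconds (default dt = 0.002 s)
```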

Reinforcement learning allows a machine to improve its performance through trial and error: it observes an environment, watches a behavior, attempts that behavior itself and receives feedback on how well it did. The process then repeats until the machine gets the task right. (The same mechanism is used to train self-driving cars, for example.)
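
For a concrete picture of that loop, here is a deliberately tiny sketch in Python. It is not the paper’s training setup—the “environment” is just two actions with different average payoffs—but the cycle of act, receive feedback, adjust is the same in spirit.

```python
import random

# A toy reinforcement-learning loop (not the paper's algorithm): the "agent"
# learns, purely from reward feedback, which of two actions pays off more.
values = [0.0, 0.0]  # the agent's running estimate of each action's worth
counts = [0, 0]

def reward(action):
    # Hypothetical environment: action 1 pays more on average than action 0.
    return random.gauss(1.0 if action == 1 else 0.2, 0.1)

for step in range(1000):
    # Mostly exploit what looks best so far, occasionally explore.
    if random.random() < 0.1:
        action = random.randrange(2)
    else:
        action = values.index(max(values))
    r = reward(action)                                        # feedback from the environment
    counts[action] += 1
    values[action] += (r - values[action]) / counts[action]   # adjust the estimate

print(values)  # the estimate for action 1 should end up near 1.0
```

This epsilon-greedy bandit is about the simplest possible instance of learning from reward; the fly controller works with far richer observations and actions, but the feedback loop has the same shape.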

To give the fly AI data to watch and learn from, Turaga and his colleagues recorded the movements of roving fruit flies’ joints and bodies and then tracked this motion with machine-learning algorithms. Imagine a motion-capture system such as the kind used by movie animators but for insects instead of, say, actor Andy Serkis: The algorithms placed virtual dots on living flies’ major joints and other body parts. These dots were matched to the corresponding positions on the virtual fly’s body. By ingesting hours of video of real flies in motion (including 272 clips of flight trajectories alone), the AI learned how to move like a fly—without researchers explicitly instructing it to move its legs or flap its wings. When scientists typed a command such as “walk straight at 2 centimeters per second,” the virtual fly would figure out by itself how to position its joints or how forcefully to push its feet against the ground.
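
One hedged way to picture how those virtual dots can guide learning: reward the simulated fly for keeping its keypoints close to the keypoints tracked in video. The function below is an illustrative sketch, not the reward used in the paper; the array shapes and the scale constant are assumptions.

```python
import numpy as np

# Minimal sketch of a keypoint-tracking reward: the closer the simulated fly's
# keypoints stay to the keypoints tracked in video, the higher the reward.
def tracking_reward(sim_keypoints, ref_keypoints, scale=1.0):
    """Both arguments: arrays of shape (num_keypoints, 3), in the same units."""
    error = np.linalg.norm(sim_keypoints - ref_keypoints, axis=1)  # per-keypoint distance
    return float(np.exp(-scale * error.mean()))  # 1.0 = perfect match, falls toward 0

# Hypothetical example: a five-keypoint pose that is slightly off the reference.
ref = np.zeros((5, 3))
sim = ref + 0.01
print(tracking_reward(sim, ref))
```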

Once DeepMind’s AI learned the various ways by which a real fruit fly contorts its body and how physical forces act on each part, the resulting virtual organism could manipulate its body with more than 100 degrees of freedom (the number of independent ways its joints and body parts can move). Because each degree of freedom adds a parameter that a machine must track, “that’s beyond what is currently considered state-of-the-art in the vast majority of robotics,” says Zach Patterson, a postdoctoral associate at the Massachusetts Institute of Technology’s Computer Science & Artificial Intelligence Laboratory, who was not involved in the study. Most realistic humanoid robots—including virtual ones—have about 30 to 70 degrees of freedom.

The virtual fly’s creators compared the walking speeds, gait, body orientation, flying trajectories and wing-beat patterns of their AI fly with real ones. Each of the virtual fly’s movements almost perfectly matched those of the living animals. When asked to take flight, the simulated fly performed, step for step, the exact same series of movements as real insects. “Everything is fitting together correctly, which gives us some confidence in the accuracy of our modeling,” says Matt Botvinick, senior director of research at Google DeepMind. “And we can offer this to the [research] community as a tool through which they might discover new things that are useful.”

Simulating animals is not a new idea; one years-long effort, OpenWorm, is trying to replicate the nematode Caenorhabditis elegans on a computer. What’s particularly innovative about the DeepMind insect is that its AI continuously learns from demonstrations and videos of real flies, Patterson says, without human intervention and without anyone “actually having explicitly programmed that behavior in.” That’s “pretty far beyond what most people in robotics do,” he adds.

Developers at DeepMind, which is known for its work simulating realistic virtual environments and how objects move in them, are building detailed computer models of several lab animals. Work on their first attempt, a rat, is still ongoing. Because of the fly project’s success, they’re looking to expand their virtual zoo to one day include dogs, ostriches and zebrafish.

The virtual fruit fly’s ability to imitate specific behaviors by “watching” videos of real ones could also be used to study the real insects’ emergent behaviors, Turaga says. For example, the AI could process video of living flies that had specific genes or neurons switched on or off. An AI fly that learns to walk like an animal with a genetic mutation could help researchers quantify exactly how that mutation alters movement. “Someone can say, ‘We knocked this neuron out, and [the fly] doesn’t walk as well,’” he says. “Now with this, we can say it doesn’t walk well in this particular way.” Biochemists who disrupt, say, a fly’s dopamine signaling could likewise observe precise effects on movement and the specific joints involved.

The new features HHMI and DeepMind added to the virtual model have also led Patterson, who creates animal-inspired soft robots, to adapt the MuJoCo simulator for his own work. “We’re going to be using it to do stuff on a real robot,” Patterson says, adding that this will help him understand how a machine flies through air or swims through water. “If it does work well enough for trajectory generation, trajectory optimization and control purposes, there’s a good chance that this will see widespread use.”

Charlotte Hu is a science and technology journalist based in Brooklyn, N.Y. She's interested in stories at the intersection of science and society. Her work has appeared in Popular Science, GenomeWeb, Business Insider and Discover magazine.
