A neural network based on a nematode worm’s connectome can puppeteer a digital fruit fly’s body, a new preprint shows. The work comes just two weeks after Eon Systems, a neurotechnology company based in San Francisco, announced that it had “uploaded” a fly brain and released a video of that brain controlling a biomechanical fly model in a virtual world.
“We need to be really careful in interpreting this kind of work,” says Bing Wen Brunton, professor of biology at the University of Washington, who posted the new preprint on bioRxiv in response to Eon Systems’ announcement. Working with her team, Brunton coupled a biophysical model of a Drosophila body to a simulation of the Caenorhabditis elegans connectome and trained that “digital sphinx,” a term coined in the preprint, to walk using deep reinforcement learning—all with a “brain” that wasn’t a fly brain at all.
Brunton’s work points to an important control for other researchers looking to combine deep learning and connectomics to simulate fly behavior, says Benjamin Cowley, assistant professor at Cold Spring Harbor Laboratory, who was not involved in the preprint. It enables them to ask, “If I just created a randomly connected connectome, could it also do the same behaviors?” he says.
The problem with connectome models is that they do not capture the biophysical properties of neurons or the pools of neurotransmitters that modulate neural communication, and they exist without a body, says Srinivas Turaga, group leader at the Howard Hughes Medical Institute’s Janelia Research Campus, who was not involved in the work.
To circumvent these shortcomings, researchers are starting to use deep reinforcement learning to relate the connectomes to behavior. But using these techniques to model biological processes also has pitfalls, as illustrated by the model Brunton and her team created.
Deep reinforcement learning is a process of optimization, Brunton says, and biological systems don’t always work optimally. This approach can work “really well” in capturing fly behavior, she says, even when the model isn’t biologically realistic.
For lack of a better term, there’s so much BS out there.
—Bing Wen Brunton
Eon assembled its fly from three previously published datasets: a biophysical model of the fly body called NeuroMechFly, a fly brain connectome, and part of the fly visual system. The company then used deep reinforcement learning to stitch the pieces together, training the networks to emulate a walking fly. But the Eon video quickly received pushback, and Turaga says that any random network connected to the NeuroMechFly model in this way might generate a walking fly. Because Eon hasn’t published the specifics of how it built its fly, it’s unclear if the model is any more accurate than a random network, Turaga says.
Philip Shiu, head of engineering at Eon Systems, doesn’t entirely disagree. “I think it’s fair to say that this is not a full-blown copy of a fly. My personal preference might be to say maybe we ought to call this a digital twin or an embodied model,” he says. “Obviously, part of the intention of the company was to say: This is something that’s really exciting and cool and not science fiction as it has been in the past.”
Even a “small network of 300 neurons” contains enough information for deep learning to extract patterns that drive realistic behaviors, Cowley adds. “There’s enough randomness in this network that you can map it to fly legs and make them move in a reasonable way.”
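Cowley’s point—that an arbitrary wiring diagram contains enough structure to be mapped onto behavior—can be illustrated with a toy sketch. This is a hypothetical example, not the models used by Eon or Brunton: a frozen random recurrent network (standing in for a connectome) whose only trained parameters are a linear readout driving a single “leg angle,” optimized here by naive hill climbing rather than deep reinforcement learning.

```python
import math
import random

random.seed(0)

# Fixed random "connectome": recurrent weights that are never trained,
# standing in for an arbitrary wiring diagram (a toy, not a real fly brain).
N = 32
W = [[random.gauss(0, 1.0 / math.sqrt(N)) for _ in range(N)] for _ in range(N)]

def simulate(readout, steps=60):
    """Drive the frozen network rhythmically; read out one 'leg angle' per step."""
    x = [0.0] * N
    angles = []
    for t in range(steps):
        drive = math.sin(0.3 * t)  # simple rhythmic input, like a gait clock
        x = [math.tanh(sum(W[i][j] * x[j] for j in range(N)) + drive)
             for i in range(N)]
        # Linear readout (N weights + 1 bias) is the ONLY trained part.
        angles.append(sum(r * xi for r, xi in zip(readout, x)) + readout[N])
    return angles

def gait_error(readout):
    """Mean squared error against a target gait: a phase-shifted sine."""
    angles = simulate(readout)
    return sum((a - math.sin(0.3 * t + 1.0)) ** 2
               for t, a in enumerate(angles)) / len(angles)

# Train only the readout by hill climbing (a crude stand-in for
# deep reinforcement learning); the random recurrent wiring stays frozen.
readout = [0.0] * (N + 1)
err_before = gait_error(readout)
err = err_before
for _ in range(300):
    candidate = [w + random.gauss(0, 0.1) for w in readout]
    cand_err = gait_error(candidate)
    if cand_err < err:
        readout, err = candidate, cand_err
err_after = err
print(f"error before: {err_before:.3f}, after: {err_after:.3f}")
```

Even though the recurrent weights are pure noise, the optimizer finds a readout that tracks the target rhythm—which is exactly why, as Cowley and Turaga note, a walking virtual fly alone does not show that the underlying connectome is doing anything fly-like.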
Brunton was familiar with Eon’s work before it was released, she says; early last year, the company approached her and her colleague, John Tuthill, professor of neurobiology and biophysics at the University of Washington and an investigator on the preprint, in hopes of collaborating, though that never came to fruition. Brunton says she also saw Eon present a poster on its virtual fly at last year’s Society for Neuroscience conference.