Researchers have built an artificial intelligence system that detects a fruit fly’s courtship song the moment it begins and immediately shuts down the neurons that produce it.
The result turns fleeting animal interactions into direct tests of which brain cells cause them.
In a small courtship chamber, a male fruit fly began to extend his wing to sing, and the system cut the movement off before the note could unfold.
Working with recordings like these, Professor Azusa Kamikouchi at Nagoya University showed that the software could single out the courting male even when multiple flies moved in close contact.
Each time the wing started to rise, the program identified the behavior immediately and triggered a light pulse that silenced the targeted neurons without affecting nearby animals.
That capacity to link one individual’s action to instant, selective neural control set the stage for testing how brains drive social behavior in real time.
AI detects behavior fast
Instead of tracking individual legs or wings over time, the new AI system, called YORU, identifies an entire posture as a single behavior in one video frame.
After training on labeled examples, the software drew a box around the action and named it as it appeared.
Across flies, ants, and zebrafish, YORU hit 90% to 98% accuracy on several hard-to-detect social behaviors.
Such fast calls mattered because the system had to act before a brief wing flick or head turn ended.
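As a rough sketch of what single-frame, whole-posture detection looks like in code (not YORU's actual implementation), the behavior itself becomes the object an ordinary detector finds: one box, one label, one confidence score per frame. The `BehaviorEvent` fields, the `detector` callable, and the confidence cutoff below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class BehaviorEvent:
    label: str         # e.g. "wing_extension" (hypothetical class name)
    confidence: float  # detector score between 0 and 1
    box: tuple         # (x1, y1, x2, y2) pixel coordinates of the posture

def detect_behaviors(frame, detector, min_conf=0.5):
    """Run a single-frame, whole-posture detector and keep confident hits.

    `detector` is any callable that returns (label, confidence, box) tuples
    for one frame; it stands in for a trained object-detection model.
    """
    events = []
    for label, conf, box in detector(frame):
        if conf >= min_conf:
            events.append(BehaviorEvent(label, conf, box))
    return events
```

Because the whole posture is the detection target, losing sight of a single leg or wing tip in a crowded frame matters far less than it would for point-by-point tracking.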
Whole-posture detection works
When animals overlap, body-part tracking can lose a leg or swap identities, and the behavior label falls apart.
Traditional tools tracked key points frame by frame, but social contact hid those points and made the math unstable.
Even widely used marker-free tracking tools struggled in crowded arenas, where overlapping bodies hid limbs and confused which animal was which.
By treating the whole posture as the clue, YORU kept working in crowds, although longer sequences still posed challenges.
Brain control in real time
In a closed-loop setup, one that reacts immediately to detected behavior, speed mattered as much as accuracy.
From camera frame to trigger pulse, the full loop averaged about 31 milliseconds in tests, quick enough for many acts.
Compared with a popular pose tracker, the same setup ran about 30% faster, cutting the average delay down from roughly 47 milliseconds.
With delays that low, YORU could switch the light on while the behavior was still happening, not after the fact.
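A minimal sketch of such a closed loop, assuming a generic camera driver, detector, and light controller (none of these interfaces come from the study), shows why per-frame latency is the number that matters: everything between frame capture and the trigger pulse has to finish within a few tens of milliseconds.

```python
import time

def closed_loop_step(camera, detector, trigger_light, target="wing_extension"):
    """One pass of the loop: grab a frame, detect, fire the light if needed.

    `camera.read()`, `detector(frame)`, and `trigger_light()` are hypothetical
    stand-ins for the real camera, trained model, and light controller; the
    `target` class name is also assumed for illustration.
    """
    t0 = time.perf_counter()
    frame = camera.read()
    events = detector(frame)                    # e.g. detect_behaviors() from the earlier sketch
    if any(e.label == target for e in events):  # behavior seen in this frame
        trigger_light()                         # pulse the LED or projector
    return (time.perf_counter() - t0) * 1000.0  # loop latency in milliseconds
```

Logging the returned latency over many frames is how a setup like this would be checked against a budget such as the roughly 31 milliseconds reported in the study.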
Light silences specific neurons
To control neurons on cue, the researchers first engineered flies so chosen brain cells responded to green light.
The technique, called optogenetics, uses light to switch specific neurons on or off; a 2015 review traced how its light-sensitive proteins control neural signaling.
Once YORU detected wing extension, it sent a signal to a lamp, and the light silenced courtship neurons, lowering mating success.
“We can silence fly courtship neurons the instant YORU detects wing extension,” said Kamikouchi, the study’s senior author.
Targeting a fly in the crowd
Earlier brain-control setups lit an entire arena at once, so every animal received the same command together.
By feeding location data from each frame to a projector, YORU aimed light at one fly while others kept moving.
During a two-fly test, the moving light stayed on the intended target for 89.5% of the stimulation time.
That precision let researchers change one animal’s neural input during a social moment, without scrambling the rest of the group.
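One plausible way to steer light at a single animal, assuming the camera and projector have been calibrated against each other with a homography (a calibration step not detailed here), is to map each detection's box center from camera pixels into projector pixels on every frame.

```python
import numpy as np

def box_center(box):
    """Center of a detection box as a homogeneous (x, y, 1) coordinate."""
    x1, y1, x2, y2 = box
    return np.array([(x1 + x2) / 2.0, (y1 + y2) / 2.0, 1.0])

def aim_projector(box, cam_to_proj):
    """Map a detection's center from camera pixels to projector pixels.

    `cam_to_proj` is an assumed 3x3 camera-to-projector homography obtained
    during calibration; it is not taken from the paper.
    """
    p = cam_to_proj @ box_center(box)
    return p[:2] / p[2]  # projector (x, y) at which to draw the light spot
```

Updating that mapping every frame is what lets the spot follow one fly while its neighbors stay dark.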
Brain activity and behavior patterns
Beyond controlling behavior, the same software helped interpret brain activity by matching what an animal did to what its cortex showed.
Using calcium imaging, which tracks glowing signals that follow neuron activity, the team linked mouse running and grooming to distinct patterns.
Afterward, maps built from YORU labels matched maps built from human scoring, supporting the tool as a reliable readout.
Those links can help scientists decide which neural signals reflect real behavior, instead of treating every brain flicker as meaningful.
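As an illustration of that kind of readout, and assuming a per-frame calcium matrix and per-frame behavior labels shaped as described in the comments (both assumptions, not the study's actual data format), a behavior-conditioned activity map amounts to averaging each neuron's signal over the frames carrying each label.

```python
import numpy as np

def activity_by_behavior(calcium, labels):
    """Average each neuron's calcium signal within each behavior label.

    `calcium` is assumed to be a (n_frames, n_neurons) array of dF/F values
    and `labels` a per-frame sequence of behavior names (e.g. "running",
    "grooming") produced by frame-level detection.
    """
    labels = np.asarray(labels)
    return {
        behavior: calcium[labels == behavior].mean(axis=0)
        for behavior in np.unique(labels)
    }
```

Comparing maps built this way from automatic labels with maps built from human scoring is one straightforward check that the automatic readout is trustworthy.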
Limitations of the system
Some social acts look different only across several frames, so a single-frame detector can miss the start or end.
Without built-in identity tracking, YORU could spot a behavior but not always confirm which individual kept doing it later.
Hardware also set a limit, because projectors and controllers introduced extra delay that could let a fast animal escape illumination.
Better prediction and lower-latency gear could extend the approach, yet each lab will still need careful calibration.
Future research directions
Making the system usable mattered, because many biology labs lack staff who can code or tune models.
With a graphical interface, users trained new behavior detectors from a small set of labeled frames and clicked to run tests.
Because YORU treated behaviors as objects, it could plug into lights, cameras, and other gear already sitting on benches.
Wider access may speed up studies that connect circuits to social choices, though ethical rules will need to keep pace.
By pairing instant behavior detection with equally rapid neural control, the system allows scientists to test cause and effect at the exact moment an action unfolds.
Future work will focus on capturing longer, more complex behaviors and trimming hardware delays, helping ensure that individual animals can be targeted accurately even within larger, more dynamic groups.
The study is published in the journal Science Advances.