AI Decodes Fruit Fly Vision and Paves the Way for Human Insight

Summary: Researchers developed an AI model of the fruit fly brain to understand how vision controls behavior. By genetically silencing specific visual neurons and observing changes in behavior, they trained an AI to accurately predict neural activity and behavior.

Their findings reveal that multiple combinations of neurons, rather than individual types, process visual data in a complex “population code.” This breakthrough paves the way for future research into the human visual system and related disorders.

Key facts:

  • CSHL researchers created an AI model of the fruit fly brain to study vision-driven behavior.
  • AI predicts neural activity by analyzing changes in behavior after silencing specific visual neurons.
  • The research revealed a complex “population code” where multiple combinations of neurons process visual data.

Source: CSHL

We are told, “The eyes are the window to the soul.” Windows work in two ways. Our eyes are also our windows to the world. What we see and how we see it help determine how we move through the world. In other words, our vision helps guide our actions, including social behavior.

Now, a young Cold Spring Harbor Laboratory (CSHL) scientist has uncovered a major clue to how this works. He did it by building a special AI model of the brain of a common fruit fly.

CSHL Assistant Professor Benjamin Cowley and his team perfected their AI model using a technique they developed called “knockout training.” First, they recorded a male fruit fly’s courtship behavior: chasing a female and “singing” to her.

Next, they genetically silenced specific types of visual neurons in male flies and trained their AI to detect any resulting changes in behavior. By repeating this process across many different types of visual neurons, they got the AI to accurately predict how a real fruit fly would behave in response to the sight of a female.
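This record-and-silence loop can be sketched compactly. Below is a minimal, hypothetical Python version: a linear stand-in for the team’s deep network, where the names, shapes, and readout are all illustrative assumptions rather than the authors’ code. It fits one shared model jointly on an intact fly and on each silenced line, applying the matching knockout to the model itself:

```python
# A toy sketch of "knockout training": fit a single model across the intact
# condition and every genetic-silencing condition, zeroing the model's
# corresponding unit whenever the real cell type was silenced.
import numpy as np

rng = np.random.default_rng(0)

n_cell_types = 12   # visual neuron types silenced across experiments (assumed)
n_frames = 500      # frames of courtship video per experiment (assumed)

# Toy "ground truth": behavior is a weighted combination of cell-type activity.
true_w = rng.normal(size=n_cell_types)

def run_experiment(knockout=None):
    """Simulate one behavioral recording, optionally with one cell type silenced."""
    activity = rng.normal(size=(n_frames, n_cell_types))
    if knockout is not None:
        activity[:, knockout] = 0.0     # genetic silencing of one cell type
    behavior = activity @ true_w        # e.g., the male's turning toward the female
    return activity, behavior

# Stack the intact recording plus one recording per knocked-out cell type.
X_rows, y_rows = [], []
for ko in [None] + list(range(n_cell_types)):
    activity, behavior = run_experiment(knockout=ko)
    X_rows.append(activity)
    y_rows.append(behavior)

X = np.vstack(X_rows)
y = np.concatenate(y_rows)
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # fit one shared readout

print("recovered weights close to truth:", np.allclose(w_hat, true_w, atol=1e-6))
```

Because the intact and knocked-out recordings constrain the same weights, the single fitted model is forced to attribute behavior to the correct cell types.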

“We can actually predict neural activity computationally and ask how specific neurons contribute to behavior,” says Cowley. “This is something we couldn’t do before.”

With their new AI, Cowley’s team discovered that fruit fly brains use a “population code” to process visual data. Instead of one type of neuron linking each visual feature to a single action, as previously thought, many combinations of neurons were needed to shape behavior.
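A toy example makes the contrast concrete. In the hypothetical Python sketch below (purely illustrative, not the paper’s analysis), behavior is read out from all twelve simulated cell types at once, so zeroing any single type degrades the prediction only modestly, while zeroing several together degrades it sharply:

```python
# Illustration of a "population code": no single cell type carries the
# behavior, so single knockouts hurt little and combined knockouts hurt a lot.
import numpy as np

rng = np.random.default_rng(1)
n_types, n_frames = 12, 2000

activity = rng.normal(size=(n_frames, n_types))
w = rng.choice([-1.0, 1.0], size=n_types)   # behavior reads out ALL cell types
behavior = activity @ w

def r2_after_knockout(silenced):
    """Variance in behavior still explained after zeroing the given cell types."""
    a = activity.copy()
    a[:, list(silenced)] = 0.0
    resid = behavior - a @ w
    return 1.0 - resid.var() / behavior.var()

single = [r2_after_knockout([i]) for i in range(n_types)]
print("worst single knockout R^2: %.2f" % min(single))                # ~0.92
print("six-type knockout   R^2: %.2f" % r2_after_knockout(range(6)))  # ~0.50
```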

The diagram of these neural pathways looks like an incredibly complex subway map and will take years to unravel. Still, it gets us where we need to go. It allows Cowley’s AI to predict how a fruit fly will behave in real life when exposed to visual stimuli.

Does this mean that artificial intelligence can one day predict human behavior? Not so fast. Fruit fly brains contain about 100,000 neurons. The human brain has almost 100 billion.

“And that’s just the fruit fly. You can imagine what our visual system is like,” says Cowley, referring to the subway map.

Still, Cowley hopes his AI model will one day help us decode the computations that underlie the human visual system.

“It will be a decade of work. But if we figure it out, we’re ahead of the curve,” says Cowley. “By learning [fly] computations, we can create a better artificial visual system. More importantly, we will understand disorders of the visual system in much greater detail.”

How much better? You have to see it to believe it.

About this AI and neuroscience research news

Author: Sara Giarnieri
Source: CSHL
Contact: Sara Giarnieri – CSHL
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Mapping model units onto visual neurons reveals a population code for social behavior” by Benjamin Cowley et al. Nature


Abstract

Mapping model units onto visual neurons reveals a population code for social behavior

The rich variety of behavior observed in animals arises from the interplay between sensory processing and motor control. To understand these sensorimotor transformations, it is useful to construct models that predict not only neural responses to sensory input, but also how each neuron causally contributes to behavior.

Here, we demonstrate a novel modeling approach to identify one-to-one mappings between intrinsic units in a deep neural network and real neurons by predicting behavioral changes that arise from systematic perturbations of more than a dozen neuronal cell types.
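As a rough sketch of what identifying such a mapping could look like (an illustration under invented assumptions, not the authors’ procedure), one could summarize each real cell type’s measured silencing effect and each model unit’s predicted knockout effect as behavioral-change signatures, then solve the resulting one-to-one assignment problem:

```python
# Hypothetical unit-to-neuron matching: compare perturbation "signatures" and
# solve the assignment problem (Hungarian algorithm via SciPy).
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(2)
n = 8  # number of cell types / model units (assumed)

# One behavioral-change signature per real cell type; the model's units carry
# the same signatures in a hidden, permuted order, plus a little noise.
real_effects = rng.normal(size=(n, 5))
perm = rng.permutation(n)
unit_effects = real_effects[perm] + 0.05 * rng.normal(size=(n, 5))

# Cost of assigning unit j to cell type i = dissimilarity of their signatures.
cost = np.linalg.norm(real_effects[:, None, :] - unit_effects[None, :, :], axis=2)
rows, cols = linear_sum_assignment(cost)   # optimal one-to-one matching

print("recovered the hidden unit-to-neuron mapping:",
      np.array_equal(cols, np.argsort(perm)))
```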

A key component we introduce is “knockout training,” which involves perturbing the network during training to match the perturbations of real neurons during behavioral experiments. We apply this approach to model the sensorimotor transformations of Drosophila melanogaster males during complex, visually guided social behavior.

Visual projection neurons at the interface between the optic lobe and the central brain form a set of discrete channels, and previous work shows that each channel encodes a specific visual feature that drives a particular behavior.

Our model reaches a different conclusion: combinations of visual projection neurons, including neurons involved in nonsocial behavior, drive male-female interactions and constitute a rich population code for behavior.

Overall, our framework consolidates the behavioral effects produced by different neural perturbations into a single, unified model that provides a map from stimulus to neuronal cell type to behavior and allows for the future incorporation of brain wiring diagrams into the model.
