An AI tool called SLEAP labels the body parts of flies. Credit: Talmo Lab at the Salk Institute
AI-enabled micro-measurements of animals running, hunting, preening and playing are unlocking troves of new data that scientists now want to use to simulate animals and test theories about behavior and the brain.
Why it matters: A primary function of the brain is to produce behavior and help animals move through the world — but there are questions about how that happens, with ramifications for medicine and efforts to create artificial general intelligence (AGI).
- "The body gives us a missing link between what the brain evolved to do and how it operates in the real world," says Talmo Pereira, a fellow at the Salk Institute for Biological Sciences who uses machine learning to study animal behavior. "You don’t think without a body."
- A better understanding of the circuits of neurons involved in different behaviors could aid researchers trying to develop treatments and medicines for psychiatric and neurodegenerative diseases, whose early symptoms can manifest as subtle changes in behavior.
What's happening: AI methods are increasingly being used to help scientists measure the behaviors of animals, a laborious task that typically involves researchers watching animals and tracking and annotating their movements.
- The tools — SLEAP, DeepLabCut, and others — rely on deep neural networks and a computer vision technique called pose estimation, which identifies the joints of a body (the left knee, right shoulder or tip of the tail) in an image or video and outputs them as coordinates in space (a minimal sketch of that output follows this list).
- Pose estimation powered Microsoft's Xbox Kinect, which used data from infrared sensors to track players' movements and then mapped those actions onto their in-game characters.
- It has also been used for analyzing the performance of athletes, tracking the poses of dairy cows to monitor their health and other applications. But there are potentially more nefarious uses too: for example, surveilling people based on their gait.
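To make the coordinate output concrete, here is a minimal Python sketch of the kind of data pose-estimation tools produce and how researchers might turn it into a kinematic measurement. The array shapes, joint count and frame rate are illustrative assumptions, not the actual output format or API of SLEAP or DeepLabCut.

```python
import numpy as np

# Illustrative only: pose-estimation tools emit, per video frame, an (x, y)
# coordinate for each tracked body part. The random array below stands in for
# that output; shapes and names are assumptions, not any tool's actual API.
n_frames, n_joints = 1000, 13                       # e.g., 13 tracked points on a fly
rng = np.random.default_rng(0)
poses = rng.normal(size=(n_frames, n_joints, 2))    # (frame, joint, x/y) in pixels

fps = 300                                           # hypothetical high-speed camera rate
dt = 1.0 / fps

# From raw coordinates, derive the kinds of kinematic features researchers analyze:
# per-joint speed (pixels/second) between consecutive frames.
velocities = np.diff(poses, axis=0) / dt            # (n_frames - 1, n_joints, 2)
speeds = np.linalg.norm(velocities, axis=-1)        # (n_frames - 1, n_joints)

print("mean speed per joint (px/s):", speeds.mean(axis=0).round(1))
```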
Animal behavior scientists, or ethologists, use the AI tools to track the natural behaviors of single — and more recently multiple — animals. That information can be used to recreate the behavior in a lab, where researchers can simultaneously measure the activity of neurons in the brain, or silence them, and see the effect on behavior.
- The tools have yielded insights about how marmoset monkeys catch flying insects, the neural basis for different behaviors — fighting back or running away — mice exhibit when they are bullied, and more.
The AI tools are "really powerful because you are getting behavior quantification at the scale the brain works — at millisecond precision," says Cory Miller, a neurobiologist at the University of California San Diego, who studies the neural mechanisms of behaviors in marmoset monkeys.
- Pereira is using SLEAP, the tool he developed, to quantify the body language of museum-goers, to try to detect early changes in behavior related to ALS and to look at how genetic changes to plants affect their root systems.
Another tool, called MoSeq, finds smaller components of movement — what the tool's developer, Harvard University neurobiologist Bob Datta, calls "syllables." His research group has identified about 50 of these short units of behavior, along with the sequences in which they tend to occur, and uses them to identify and predict different behaviors (a toy illustration of the syllable-sequence idea follows the bullet below).
- They recently used MoSeq to study the impact of hormones on the behavior of laboratory mice. They found that the behavior of female mice, which are often excluded from scientific studies on the notion that their changing hormones influence their behavior, is more predictable than that of males.
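As a toy illustration of the syllable idea, the sketch below assumes behavior has already been labeled as a sequence of discrete syllables and simply tallies which ones tend to follow which. MoSeq itself discovers syllables by fitting a statistical model to pose data; the labels, sizes and random data here are invented for illustration.

```python
import numpy as np

# Toy illustration: behavior as a sequence of short, discrete movement units
# ("syllables"). We assume per-frame syllable labels are already given and
# summarize the sequences by counting which syllable follows which.
n_syllables = 50
rng = np.random.default_rng(1)
labels = rng.integers(0, n_syllables, size=10_000)   # hypothetical per-frame labels

# Count transitions between consecutive syllables...
counts = np.zeros((n_syllables, n_syllables))
for a, b in zip(labels[:-1], labels[1:]):
    counts[a, b] += 1

# ...and normalize each row into transition probabilities: P(next | current).
transition_probs = counts / counts.sum(axis=1, keepdims=True)

# The most likely successor of each syllable hints at stereotyped sequences.
print("most likely syllable after syllable 0:", transition_probs[0].argmax())
```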
Yes, but: The brain does not output coordinates, Pereira says. "It does not think in x, y, z changes in position of wrist."
- "We need a way to connect what we’re seeing — these traces of movement — to the system that actually gives rise to it, the brain," Pereira says.
What's next: An early-stage effort is underway to use behavioral data to create simulated bodies, or entire virtual animals.
- These "digital twins" — doubles of mice, rats and flies with fully modeled limbs, skeletons, and muscles — could be put in a video game and trained to behave like the real animal, Pereira says.
- The idea is to compare the behavior of the fake mouse, which is based on a model of the brain-body relationship, to the behavior of a real mouse. Based on the differences, the model would be updated until the fake mouse behaves like the real one (a simplified version of that loop is sketched after this list).
- "If you can generate behavior computationally, then the machine will give you insight about how the brain actually does it," says Datta, adding that the possibility is "really exciting."
Between the lines: Building these simulated animals will likely require combining the different AI approaches used to track behavior, as well as developing new tools, Datta says.
The big picture: There is an active debate about whether any artificial general intelligence will need to be embodied.
- One camp of prominent AI researchers recently advocated for an "embodied Turing test" to shift the focus away from AI mastering games and language to models that "interact with the sensorimotor world."
- "The body is always there, serving as the primary filter between the brain and the world," Pereira says. Its biomechanics, biophysics and physiology are "the key translator between what the brain actually wants to do and what actually happens in the world."
"behavior" - Google News
July 08, 2023 at 06:35PM
https://ift.tt/v6kpaJS
AI tools trace the body's link between the brain and behavior - Axios
"behavior" - Google News
https://ift.tt/rAXaxuZ
Bagikan Berita Ini
0 Response to "AI tools trace the body's link between the brain and behavior - Axios"
Post a Comment