Newswise — Neuroscience experts from across Georgia Tech have come together to form the Institute for Neuroscience, Neurotechnology, and Society (INNS), an interdisciplinary research institute launched in July. Faculty in INNS are helping to solve some of neuroscience’s most pressing problems, and many of these solutions have promising medical applications. One important aspect of studying the brain is understanding how the brain and the body work together. Meet the researchers who study brain-body interactions, from monitoring the neuron degradation that causes Alzheimer’s to enhancing mobility for stroke survivors. Their efforts could improve the health and quality of life for millions of Americans. 

MOTOR CONTROL: Using Imaging to Help Limb Loss and Stroke Survivors

Prosthetics are often one of the first medical interventions for upper limb loss — but many patients don’t wear them. The advanced prosthetics developed in research labs are rarely covered by health insurance, and the models readily available to patients are often uncomfortable or offer limited functionality. Lewis Wheaton, a professor in the School of Biological Sciences, is working to change that.

After a stroke or upper limb amputation, the brain must reconfigure neuromotor control. Wheaton’s research breaks down this problem into three areas: how neural networks organize to plan and execute complex behavior (like pouring a cup of coffee), how the brain responds after limb loss or stroke, and how researchers can use these brain changes to support motor learning through rehabilitation. Wheaton has been passionate about this work since graduate school.

“Ever since I saw my first patient in a clinical center, I realized how significantly these traumatic events can impact people’s lives — and also how many people desire to have some independence and sense of normalcy,” Wheaton said. “My heart was calling me to seek a career helping people experiencing this.”

Using imaging tools like fMRI and EEG, Wheaton’s research examines how the brain is affected after limb loss or stroke from a neural network perspective. With this imaging, his group can determine how brain activity is communicated across cortices when someone is relearning fundamental motor skills. From there, Wheaton determines whether neural activity can predict motor skill loss or acquisition, and how to change therapeutic approaches to maximize motor learning in individual participants.

Wheaton’s ongoing work also seeks to understand how we perceive disability in others. 

“If you see me reaching to grab a cup of coffee with my hands, you probably know exactly what my intentions are, but if you see me going to reach it with a prosthesis, what are you predicting about the outcome?” Wheaton said. “Imaging this can give us important data that we link back to behavior, perception, and basic understanding of actions in all people. This could help us discover how the neural networks are being modified through the process of learning and engagement.”

Wheaton’s research is primarily funded by the National Institutes of Health (NIH).

“Ever since I saw my first patient in a clinical center, I realized how significantly these traumatic events can impact people’s lives — and also how many people desire to have some independence and sense of normalcy. My heart was calling me to seek a career helping people experiencing this.” —Lewis Wheaton

MOBILITY: Using Robots to Rehabilitate Gait Impairment

Strokes can upend a person’s life in a moment. One of their most debilitating consequences is difficulty walking and balancing, but George W. Woodruff School of Mechanical Engineering Associate Professor Aaron Young is working to steady things. At his Exoskeleton and Prosthetic Intelligent Controls Lab, Young focuses on lower limb mobility for individuals with neurological disorders, including stroke and cerebral palsy.

“After a neurological event like a stroke, the central nervous system attempts to reorganize itself to compensate for those losses,” Young said. “Many times, this creates movement problems, such as gait issues involving speed and balance.”

Young places electrodes on the skin of both able-bodied and injured subjects to measure the electrical activity of their muscles, a process called electromyography (EMG). By combining these signals with a machine learning algorithm, Young can determine how to better control prosthetics and develop rehabilitation exoskeletons. These exoskeletons are external wearable devices that can help a person retrain their movement patterns. For example, people with gait impairments often hyperextend their knees, and an exoskeleton can guide the knee to prevent this.
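To make the idea concrete, here is a minimal sketch of this kind of pipeline (an illustration under invented assumptions, not Young’s actual system): it extracts classic time-domain features from synthetic EMG windows and uses a simple nearest-centroid rule to label two hypothetical gait phases. The signals, phase names, and amplitudes are all fabricated for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def emg_features(window):
    """Classic time-domain EMG features: mean absolute value and RMS."""
    return np.array([np.mean(np.abs(window)), np.sqrt(np.mean(window ** 2))])

def make_windows(amplitude, n, length=200):
    """Synthetic EMG: zero-mean noise whose envelope scales with muscle effort."""
    return [amplitude * rng.standard_normal(length) for _ in range(n)]

# Two hypothetical gait phases: "stance" (high activation) vs. "swing" (low).
train = {"stance": make_windows(1.0, 50), "swing": make_windows(0.2, 50)}
centroids = {phase: np.mean([emg_features(w) for w in ws], axis=0)
             for phase, ws in train.items()}

def classify(window):
    """Nearest-centroid decision an exoskeleton controller could act on."""
    feats = emg_features(window)
    return min(centroids, key=lambda p: np.linalg.norm(feats - centroids[p]))
```

A real system would use many more features, channels, and a stronger classifier, but the shape of the problem is the same: map a short window of muscle activity to a movement state the device can respond to.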

“It’s about training the nervous system to hopefully encourage healthier behavior that supports long-term clinical goals,” Young noted.

Young’s work is mostly funded by the NIH.

MENTAL HEALTH: Mapping the Brain to Solve Treatment-Resistant Depression

A road map of the brain could give us better directions for interpreting and treating mental health issues, and help us better understand how the brain interacts with the body in these conditions. For decades, treatment-resistant depression has frustrated patients and clinicians because — as the name suggests — it does not respond to standard medications or therapy. Part of the disorder’s challenge is that patients not only struggle with typical depressive symptoms such as apathy and anhedonia (the inability to experience pleasure from previously rewarding activities), but they can also feel a heaviness and lethargy in their body that makes accomplishing tasks challenging. Even activities that might be rewarding and improve their mood can seem all but impossible.

Chris Rozell is the founding director of the Institute for Neuroscience, Neurotechnology & Society.

Chris Rozell, a professor in the School of Electrical and Computer Engineering, approaches this problem from a neuroscience perspective: He records brain activity to see how neural circuits fire when performing tasks that people with depression often struggle with. 

“One of our particular interests is trying to understand not just the mood symptoms but how depression shows up in the body,” Rozell said. “We want to understand people’s willingness to make choices and the effort/reward tradeoff of those choices versus the pull of the depression.”

Rozell was an early pioneer in applying a neurotechnology lens to treating psychiatric disorders. Since 2014, he has worked with clinical collaborators to understand how implanted pacemakers for deep brain stimulation can rewire brain circuits that malfunction in depression.

“While many neurological disorders like Parkinson’s primarily affect one part of the brain, psychiatric conditions are often network disorders, where multiple parts of the brain are miscommunicating,” Rozell said. “It was pivotal when we showed you could use data about brain activity to understand the circuits and symptoms of treatment-resistant depression.”

The researchers in Rozell’s lab not only map the brain — they may be able to improve it. With a small current delivered by the pacemaker, patients with treatment-resistant depression may experience symptom relief. Rozell and his colleagues are developing ways to use data to guide clinical teams as they care for patients receiving these new therapies, which may have applications beyond depression.

Rozell’s research is mostly funded through NIH’s BRAIN Initiative.

“While many neurological disorders like Parkinson’s primarily affect one part of the brain, psychiatric conditions are often network disorders, where multiple parts of the brain are miscommunicating. It was pivotal when we showed you could use data about brain activity to understand the circuits and symptoms of treatment-resistant depression.” —Chris Rozell

MOVEMENT: Harnessing Machine Learning to Break Down the Basics of Movement

Grabbing a coffee cup may seem like a simple action, but many joints work together to make the movement happen fluidly. With diseases like Parkinson’s and Huntington’s, the brain regions that control movement start malfunctioning, and suddenly, the easy task of picking up a cup becomes herculean. To better understand these movement disorders, we need to understand the basics of movement, and that’s where Jeff Markowitz’s lab comes in.

“If we watch a person playing tennis, we might label their movement as a tennis swing, but a tennis swing involves a complicated set of movements we need to quantify,” said Markowitz, an assistant professor in the Wallace H. Coulter Department of Biomedical Engineering. To track these movements, Markowitz’s lab developed injectable nanoparticles that can light up inside a mouse model and help researchers see how each joint, ligament, and muscle moves independently. Mice move too fast for humans to note their every movement, but machine learning with high-resolution motion capture can label each action, enabling researchers to track them.

Eventually, Markowitz hopes to be able to use this motion capture method in mouse models the same way we can for humans. From there, the researchers could mimic, in a mouse, symptoms like the tremors Parkinson’s patients experience.

“To quantify movement, we need considerable precision in recording movement, which is difficult to achieve in tiny, fuzzy mice typically used in the lab,” Markowitz said. “We’re trying to combine our ability to manipulate genes and neural activity in mice with our new motion capture system, so we can come up with a foundational understanding of motor control in both health and disease.”

Markowitz’s work is funded through the McCamish Foundation, the David and Lucile Packard Foundation, the Burroughs Wellcome Fund, and the Sloan Foundation.

PERFORMANCE: Studying How Wearable Robots Can Improve Movement

What if a robot could make a 60-year-old’s body move like a 40-year-old’s? That’s one of the goals of Greg Sawicki’s lab, where they examine the relationship between robots and the human body.

“We’re interested in how we can use wearable robotics to not only restore movement for people but also maintain movement into older age,” said Sawicki, a professor in the School of Biological Sciences and the Woodruff School of Mechanical Engineering. His work even has applications for professional athletes or soldiers seeking to enhance their performance with robotics.

Of course, human movement changes when wearables are involved, so much of Sawicki’s work involves carefully measuring people’s physiology in real time. He does this with high-speed cameras, force sensors, electromyography, and bioimaging tools like ultrasound that can monitor muscles while a wearable is on a person. His research team’s current project measures how people maintain balance to determine the best way to recover from a slip or stumble. Once they accumulate a substantial volume of data, they can use machine learning tools to generate algorithms that emulate human responses. That code can be uploaded to a small, fast computer on a wearable robot, which can respond faster than muscles and may eventually keep people from stumbling as often.
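The measure-learn-deploy loop described above can be sketched in a few lines. This is a toy illustration under invented assumptions (a PD-like human balance response and a simple linear policy fit by least squares), not the lab’s actual controller; the state variables, gains, and noise levels are all made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical logged trials: each row is [trunk lean angle (rad),
# trunk angular velocity (rad/s)] measured at slip onset.
states = rng.uniform(-0.3, 0.3, size=(500, 2))

# Stand-in for the measured human response: corrective ankle torque that
# grows with lean and lean rate (a PD-like balance strategy), plus noise.
true_gains = np.array([80.0, 12.0])
torques = states @ true_gains + rng.normal(0.0, 0.5, size=500)

# "Learning" step: least-squares fit of a policy that emulates the human
# responses, simple enough to evaluate on a wearable's small computer.
gains, *_ = np.linalg.lstsq(states, torques, rcond=None)

def assistive_torque(lean, lean_rate):
    """Controller the exoskeleton could evaluate every millisecond."""
    return float(np.array([lean, lean_rate]) @ gains)
```

The fitted policy recovers the gains hidden in the data, which is the point of the approach: once the human response is captured in a model, the robot can reproduce it faster than muscles can.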

“As engineers, we think of the human body as a machine and apply engineering principles to understand movement, as if the body were a robot,” Sawicki noted. “But we need to understand how human physiology is different from engineered things, like the contrast between a muscle and a motor. By examining physiology through the lens of engineering, we can understand what’s special about the physiology.”

Sawicki’s work is predominantly funded through NIH’s National Institute on Aging.

“We’re interested in how we can use wearable robotics to not only restore movement for people but also maintain movement into older age.” —Greg Sawicki

PERCEPTION: Discovering How We Make Decisions Through Movement

Simon Sponberg holds a hawk moth.

When a person maneuvers through a crowd, they perceive the world quickly, making thousands of snap decisions. Simon Sponberg studies this perception, but through an unexpected lens: movement.

“I explore how animals move, what they need to do in their environment to get around, and how their perception enables that,” explained Sponberg, an associate professor in the Schools of Physics and Biological Sciences. “We work from the periphery of the motor system back to perception as an underappreciated path to understanding how our brains work.”

In particular, Sponberg studies insects to learn more about perception. In the lab, he places tiny electrodes on the heads of hawk moths, large flying insects with keen sensory abilities. The researchers present sensory stimuli like lights or smells to attract the moths and see how their brains process those stimuli to make rapid decisions in complex environments.

This research shows how moths make decisions, but it could also have applications for artificial intelligence, robotics, and the military. “How do I pick out what’s interesting, what’s a potential threat, and what I need to respond to very rapidly?” Sponberg said. “Insects are able to do this with extremely low computational power, and that has huge implications.”

Sponberg’s work is primarily funded by the Air Force Office of Scientific Research.

BEHAVIOR: Applying Machine Learning to Neuroscience

When neuroscientists collect data on how animals or people perform tasks, they end up with thousands of data points. Researchers like Anqi Wu help them make sense of this data. As a computational neuroscientist, Wu works at the intersection of machine learning and neuroscience.

“I develop data-driven statistical tools to understand neural signals and behavior as well,” said Wu, an assistant professor in the School of Computational Science and Engineering.

She collaborates with experimental scientists at Georgia Tech, Emory University, and the University of Washington to pull insights from their work. Many datasets represent a complex interplay between neural and behavioral signals — for example, how neuron activity unfolds as a mouse navigates a maze. Wu applies machine learning techniques to uncover meaningful insights from such data.

“You can’t just look at raw pixels of video recordings and expect to extract meaningful insights,” Wu said. “Neuroscientists transform these videos into lower-dimensional representations that make it easier to interpret animal behavior — like detecting whether the animal is turning left or right, sniffing, or running.”

The goal is to understand patterns in animal behavior. For example, behavioral data might show that a mouse is moving, then pausing, then stopping to interact with others. Beyond simply describing these actions, Wu develops advanced models to uncover the underlying goals driving these movement sequences — such as searching for water or returning home to rest.
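A minimal sketch of the two steps Wu describes, compressing high-dimensional pose data and then labeling behavioral states, might look like the following. The keypoints, bout structure, and speed rule are all fabricated for illustration; real pipelines use far richer models than this.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical tracked poses: 8 keypoints (x, y) per video frame, flattened
# to a 16-dimensional vector. Bouts of running alternate with pauses.
frames = []
pos = np.zeros(16)
for bout in range(6):
    step = 1.0 if bout % 2 == 0 else 0.02  # running bout vs. pausing bout
    for _ in range(50):
        pos = pos + step * rng.standard_normal(16)
        frames.append(pos.copy())
frames = np.array(frames)

# Lower-dimensional representation: project onto the top 2 principal
# components, the kind of compression applied before interpreting behavior.
centered = frames - frames.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
embedding = centered @ vt[:2].T

# Crude behavioral labeling: frame-to-frame speed in the embedding
# separates "moving" bouts from "pausing" bouts.
speed = np.linalg.norm(np.diff(embedding, axis=0), axis=1)
labels = np.where(speed > np.median(speed), "moving", "pausing")
```

From a label sequence like this, more advanced models can then ask the question Wu poses: what internal goal is driving the transitions between states.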

“I want to extract all the information from this complex, seemingly random behavior,” she said. “With this structured information, we can figure out what internal goal the animal is trying to achieve.”

Wu’s work is funded mostly by NIH and the National Science Foundation.