SAN DIEGO—Engineers at the University of California San Diego have developed a next-generation wearable device that enables people to control robots and other machines using everyday gestures. It combines stretchable electronics with artificial intelligence to overcome a long-standing challenge in wearable technology: reliable recognition of gesture signals in real-world environments.

Wearable gesture sensors work well when a user is sitting still, but their signals become distorted by motion noise once the user starts moving vigorously.

The new device is a soft electronic patch that is glued onto a cloth armband. It integrates motion and muscle sensors, a Bluetooth microcontroller and a stretchable battery into a compact, multilayered system.

The system was trained on a composite dataset of real gestures recorded under real-world conditions, from running and shaking to the movement of ocean waves. Signals from the arm are captured and processed by a customized deep-learning framework that strips away interference, interprets the gesture, and transmits a command to control a machine—such as a robotic arm—in real time.
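The denoise-then-classify pipeline described above can be illustrated with a minimal sketch. The function names (`denoise`, `classify_gesture`), the moving-average filter, and the nearest-template classifier are all illustrative stand-ins, not the team's actual deep-learning framework:

```python
import numpy as np

def denoise(signal, window=5):
    """Suppress high-frequency motion noise with a simple moving-average filter."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

def classify_gesture(signal, templates):
    """Return the label of the gesture template closest to the cleaned signal."""
    clean = denoise(signal)
    labels = list(templates)
    dists = [np.linalg.norm(clean - templates[k]) for k in labels]
    return labels[int(np.argmin(dists))]

# Two toy gesture templates: a slow wave and a sharp flick.
t = np.linspace(0, 1, 200)
templates = {
    "wave": np.sin(2 * np.pi * 2 * t),
    "flick": np.exp(-((t - 0.5) ** 2) / 0.005),
}

# A "wave" gesture corrupted by running-like vibration noise.
rng = np.random.default_rng(0)
noisy = templates["wave"] + 0.4 * rng.standard_normal(t.size)

print(classify_gesture(noisy, templates))  # prints "wave"
```

The real system replaces both steps with a learned model, which lets it adapt to disturbances (and users) that a fixed filter cannot anticipate.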

“This advancement brings us closer to intuitive and robust human-machine interfaces that can be deployed in daily life,” says Xiangjun Chen, Ph.D., a postdoctoral researcher working on the project. “By integrating AI to clean noisy sensor data in real time, the technology enables everyday gestures to reliably control machines even in highly dynamic environments.”

In testing, participants used the wearable device to control a robotic arm while running, while exposed to high-frequency vibrations, and under a combination of disturbances.

“This work establishes a new method for noise tolerance in wearable sensors,” says Chen. “It paves the way for next-generation wearable systems that are not only stretchable and wireless, but also capable of learning from complex environments and individual users.”

According to Chen, industrial workers and first responders could potentially use the technology for hands-free control of tools and robots in high-motion or hazardous environments. “It could even enable divers and remote operators to command underwater robots despite turbulent conditions,” he points out. “In consumer devices, the system could make gesture-based controls more reliable in everyday settings.”




