Apple’s next AirPods Pro upgrade is starting to make more sense. A recent theory links Apple’s $2 billion acquisition of the AI company Q.ai with long-running rumors about AirPods Pro models that include infrared cameras. Together, these moves point to a new way of interacting with devices without speaking out loud.
Apple bought the Israeli startup Q.ai in what is reportedly its most expensive acquisition since Beats. Q.ai focuses on machine learning that can read whispered or silent speech and improve audio understanding in noisy environments. It also analyzes microfacial movements: the tiny muscle changes around the mouth and face that occur even when a person does not speak aloud.
It is also worth noting what Apple has already put on record. In July 2025, Apple was granted a patent describing camera-based systems, similar to the Face ID dot projector, for proximity detection and 3D depth mapping. While the patent does not name AirPods directly, the technology fits neatly with the idea of tiny infrared cameras that can track facial movement at close range.
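To see why a dot-projector design suits close-range sensing, here is a toy sketch of the triangulation math behind structured-light depth mapping. This is purely illustrative and not drawn from Apple's patent; the focal length, baseline, and disparity values are invented for the example.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic structured-light / stereo triangulation: a projected dot
    shifts sideways in the camera image (disparity) in inverse proportion
    to the distance of the surface it lands on."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: 500 px focal length, 5 mm projector-camera baseline.
# A dot observed 50 px from its reference position implies a surface 5 cm away.
z = depth_from_disparity(focal_px=500, baseline_m=0.005, disparity_px=50)
print(round(z, 3))  # 0.05 (metres)
```

The inverse relationship is the key point: at the very short ranges an earbud would care about, even small dot shifts produce large, easily measurable disparities, which is what makes millimetre-scale baselines workable.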
IR-Equipped AirPods Pro
In earlier reports, supply chain analyst Ming-Chi Kuo said Apple plans to add infrared cameras to future AirPods Pro models. He explained that these sensors could support gesture control, spatial awareness, and better integration with Apple Vision Pro.
More recently, prototype collector Kosutami claimed each earbud will include a camera that can sense the space around the wearer, opening the door to hands-free controls beyond touch and voice.
The theory connecting these two threads focuses on silent speech. The idea is simple: infrared cameras on AirPods Pro would track microfacial movements, while Q.ai's software would translate those movements into text or commands. That combination would let users send messages, control apps, or talk to Siri without speaking aloud.
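The rumored pipeline can be sketched in a few lines. This is a conceptual toy, not Q.ai's actual approach (which is not public): the feature vectors, the tiny vocabulary, and the nearest-neighbour matching are all invented stand-ins for what would really be learned sequence models over IR-tracked landmarks.

```python
import math

# Invented vocabulary: each "word" is a template of mouth-landmark
# displacement features extracted from IR frames. A real system would
# use a trained model over landmark sequences, not a lookup table.
TEMPLATES = {
    "hey siri": [0.9, 0.1, 0.4],
    "send":     [0.2, 0.8, 0.3],
    "stop":     [0.1, 0.2, 0.9],
}

def classify_movement(features: list[float]) -> str:
    """Map a micro-movement feature vector to the closest known command."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda word: dist(features, TEMPLATES[word]))

# A frame dominated by lip-opening motion resolves to the wake phrase.
print(classify_movement([0.85, 0.15, 0.35]))  # hey siri
```

Whatever the real model looks like, the structure is the same: camera frames in, a compact movement representation in the middle, and text or a command out the other end, with no audio required.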
Here’s what analysts on X claimed:
The cameras will pick up the user’s silent speech/whispers by analysing facial micro movements. This will let the user use voice-to-text in apps like iMessage without speaking out loud, or interact with Siri in a busy train without raising their voice. It will finally end the social stigma around saying “Hey Siri” or taking a phone call in public places.
The background of Q.ai’s founder adds weight to this idea. Aviad Maizels previously helped build PrimeSense, the company behind the core technology used in Face ID. That history suggests Apple already trusts this type of camera-based sensing and knows how to scale it across products.
AirPods Pro are not the only devices that stand to benefit. The same technology fits naturally with Vision Pro, rumored Apple Glasses, and other wearable hardware that relies on subtle, private input. Silent speech control would reduce the need for loud voice commands and constant hand gestures.
Pricing remains unclear. Some leaks suggest Apple will keep the AirPods Pro price unchanged, while others point to a higher-tier model that sits above the current lineup. Either way, the Q.ai acquisition now looks less mysterious. It signals Apple’s push toward quieter, more natural ways to interact with its devices.