As I’ve established previously, AI is coming for your wireless earbuds, whether you like it or not, and that trend may extend to AirPods. According to leaked code from an iPhone prototype running an early build of iOS 26, AirPods might get a hefty dose of Apple Intelligence as soon as spring 2026.
Per MacRumors, among those incoming features is Visual Look Up, an existing Apple Intelligence feature that surfaces information about whatever you point your camera at, or about photos already on your phone. That feature has traditionally lived in Safari and the Photos/Camera app, and it functions similarly to other computer vision-based lookup features seen on Pixel phones, as well as AI gadgets like Ray-Ban Meta AI glasses. Does this just mean AirPods will integrate better with Visual Look Up on your iPhone? Or is there a more pivotal (camera-based) AirPods upgrade at play here, too?
There is also a reference to Contextual Reminders, according to MacRumors, which is another Apple Intelligence feature that uses AI to surface reminders based on stuff like location data. Theoretically, you could use this feature to set reminders like “remind me to get milk when I’m at the store” or “remind me to drink water when I’m at the gym,” but hopefully you don’t need your phone to do the second one.
Maybe the most mysterious AI feature referenced in MacRumors’ findings is “ConversationBreakthroughVQA,” which may point to Intelligent Breakthrough, an Apple Intelligence feature that uses AI to decide when a notification is important enough to break through your Focus modes or Do Not Disturb. I’m not 100% sure on this one, though, since it doesn’t actually reference “Intelligent Breakthrough” word for word. Maybe it’s a similar feature that decides when to interrupt you with notification readouts while you’re translating or listening to music? Maybe it’s just Intelligent Breakthrough by a different internal name?
One thing that piques my interest is the fact that it features “VQA” at the end, which could be a reference to “visual question answering,” a technical term for computer vision that answers questions about your surroundings. Speaking of “visual,” two of the three features are also setting off alarm bells in the context of rumors that Apple is considering including cameras in future versions of its AirPods. Needless to say, both Visual Look Up and Conversation Breakthrough with “VQA” slapped on the end evoke computer vision and the cameras it requires. Maybe those camera pods really are around the corner?
It’s not all AI in the code reviewed by MacRumors. There’s also a reference to “precise outdoor location understanding,” which sounds like some kind of AirTag-level location finding. While certain models of AirPods (AirPods Pro 2 and above, as well as AirPods Max) have Precision Finding, the range is currently limited: about 30 to 60 feet if you have AirPods Pro 2, and 200 feet if you have AirPods Pro 3 with Apple’s U2 Ultra Wideband chip. That chip, for reference, is in the charging case, not the earbuds. It’s hard to say whether Apple plans on extending that range or just found a way to make tracking more precise within it, but it’s hard to argue with either, since both could save your ass.
Obviously, code references aren’t much to go on, so a lot of this is still speculation, but if those bits of code are right, we won’t have to wait very long to find out for sure what new features Apple has in store for AirPods.