Google is launching its own AI-powered glasses in collaboration with Samsung, Gentle Monster and Warby Parker, the company announced in a blog post on Monday, Dec. 8.

Google is working with the electronics company and glasses brands to design “stylish, lightweight glasses that you can wear comfortably all day,” the blog states. The companies will create two types of glasses: AI glasses designed for screen-free assistance and display AI glasses.

“For over a decade, we’ve been working on the concept of smart glasses. With Android XR, we’re taking a giant leap forward,” Shahram Izadi, vice president and general manager of Android XR at Google, said in a news release.

Here’s what to know about Google’s new AI glasses.

Google is developing different glasses

The AI glasses designed for screen-free assistance will use built-in speakers, microphones and cameras to “let you chat naturally with Gemini, take photos and get help,” according to Google. The display AI glasses will include an in-lens display that privately shows users helpful information, such as turn-by-turn navigation and translation captions, the tech giant said.

Google said the AI glasses designed for screen-free assistance will be released first and “arrive next year.”

Both glasses will be built on Android XR, Google’s operating system for extended reality headsets and glasses, according to the company.

An attendee tries out prototypes of the Android XR glasses, equipped with Gemini AI, during Google’s annual I/O developers conference in Mountain View, California, on May 20, 2025.

Google announces Android XR wired glasses

Google also announced this week the development of Android XR wired glasses, called Project Aura, which aim to deliver headset-like immersion while keeping users present in the real world, all in a portable form factor.

The glasses will have a 70-degree field of view and optical see-through technology, Google said, adding that the device layers digital content onto the user’s view of the physical world. With this, users can operate multiple windows and bring workspaces or entertainment with them without blocking their surroundings, according to the company.

For example, users could view a recipe while cooking or follow step-by-step guides for an appliance they’re fixing.

Google said it will share more about Project Aura in 2026.

Warby Parker, Google formalize partnership

In May, Warby Parker formalized a $150 million partnership with Google to develop AI glasses. Google said it would put $75 million toward development costs and invest an additional $75 million in the company if it meets certain milestones.

The deal rivals Meta’s partnership with Ray-Ban maker EssilorLuxottica. Meta began selling AI glasses with a built-in Meta AI assistant in 2023.

In September, Meta announced a new version of the glasses with a built-in display. The $799 glasses will let users see messages, photo previews and live captions much as they would on a smartphone, Meta said, adding that the display is built into one of the lenses.

Google first tried selling smart glasses with Google Glass in 2013, but the product was discontinued after consumers raised privacy concerns, USA TODAY previously reported.

Michelle Del Rey is a trending news reporter at USA TODAY. Reach her at mdelrey@usatoday.com.

This article originally appeared on USA TODAY: Google will launch AI glasses in 2026. Here’s what we know so far.