{"id":374887,"date":"2026-01-17T08:40:11","date_gmt":"2026-01-17T08:40:11","guid":{"rendered":"https:\/\/www.newsbeep.com\/uk\/374887\/"},"modified":"2026-01-17T08:40:11","modified_gmt":"2026-01-17T08:40:11","slug":"will-google-be-third-time-lucky-with-new-ai-powered-smart-glasses","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/uk\/374887\/","title":{"rendered":"Will Google be third time lucky with new, AI-powered smart glasses?"},"content":{"rendered":"<p>It has been over a decade since Google Glass smart glasses were announced in 2013, followed by their swift withdrawal \u2013 in part because of low adoption. Their subsequent (and lesser-known) second iteration was released in 2017 and aimed at the workplace. It was withdrawn in 2023. <\/p>\n<p>In December 2025, Google made a <a href=\"https:\/\/blog.google\/products\/android\/android-show-xr-edition-updates\/\" rel=\"nofollow noopener\" target=\"_blank\">new promise for smart glasses<\/a> \u2013 with two new products to be released in 2026. But why have Google smart glasses struggled where others are succeeding? And will Google see success the third time around?<\/p>\n<p>What is clear from developments in wearable tech over the last decade is that successful products are being built into things that people already like to wear: watches, rings, bracelets and glasses. <\/p>\n<p>These are the types of accessories that have emerged over centuries and are now accepted as normal in society.<\/p>\n<p>Some of the most recent academic research is taking this approach, building <a href=\"https:\/\/doi.org\/10.1145\/3706598.3713856\" rel=\"nofollow noopener\" target=\"_blank\">sensors into jewellery<\/a> that people would actually want to wear. 
Research has developed a scale to measure the social acceptability of wearable technology (the <a href=\"https:\/\/link.springer.com\/article\/10.1186\/s40691-019-0203-3\" rel=\"nofollow noopener\" target=\"_blank\">WEAR scale<\/a>, or Wearable Acceptability Range), which includes questions like: \u201cI think my peers would find this device acceptable to wear.\u201d<\/p>\n<p>Norene Kelly, from Iowa State University, and colleagues <a href=\"https:\/\/dl.acm.org\/doi\/10.1145\/2851581.2892331\" rel=\"nofollow noopener\" target=\"_blank\">showed that<\/a>, at its core, this scale measures two things: that the device helps people reach a goal (making it worth wearing), and that it does not create social anxiety about privacy or being seen as rude.<\/p>\n<p>This latter issue was highlighted most prominently by the term that emerged for Google Glass users: Glassholes. Although many studies have considered the potential benefits of smart glasses, <a href=\"https:\/\/doi.org\/10.1080\/10447318.2022.2111046\" rel=\"nofollow noopener\" target=\"_blank\">from mental health to use in surgery<\/a>, privacy concerns and other issues are ongoing for <a href=\"https:\/\/doi.org\/10.1007\/s43681-022-00155-7\" rel=\"nofollow noopener\" target=\"_blank\">newer smart glasses<\/a>. <\/p>\n<p>All that said, <a href=\"https:\/\/doi.org\/10.1080\/10447318.2017.1357902\" rel=\"nofollow noopener\" target=\"_blank\">\u201clook-and-feel\u201d<\/a> keeps coming up as the most common concern for potential buyers. The most successful products have been designed to be desirable accessories first and smart technologies second \u2013 typically, in fact, by designer brands. <\/p>\n<p>A fine spectacle<\/p>\n<p>After Google Glass, Snapchat released smart glasses called \u201cSpectacles\u201d, which had built-in cameras, focused on fashion and were more easily accepted into society. 
The most prominent smart glasses today were released by Meta (Facebook\u2019s parent company), in collaboration with designer brands like Ray-Ban and Oakley. Most of these products include front-facing cameras and conversational voice-agent support from Meta AI. <\/p>\n<p>So what do we expect to see from Google smart glasses in 2026? Google <a href=\"https:\/\/blog.google\/products\/android\/android-show-xr-edition-updates\/\" rel=\"nofollow noopener\" target=\"_blank\">has promised two products<\/a>: one that is audio only, and one that has \u201cscreens\u201d shown on the lenses (like Google Glass). <\/p>\n<p>            <img decoding=\"async\" alt=\"Google Glass\" src=\"https:\/\/www.newsbeep.com\/uk\/wp-content\/uploads\/2026\/01\/file-20260112-56-qhp37v.jpg\" class=\"native-lazy\" loading=\"lazy\"  \/><\/p>\n<p>              The original version of Google Glass was released in 2014.<br \/>\n              <a class=\"source\" href=\"https:\/\/www.shutterstock.com\/image-photo\/boston-ma-usa-may-1-2014-190185665?trackingId=22bb6229-1f43-4213-9588-ee7472fa9b9c&amp;listId=searchResults\" rel=\"nofollow noopener\" target=\"_blank\">Hattanas \/ Shutterstock<\/a><\/p>\n<p>The biggest assumption (based on the promo videos) is that these will mark a significant change in form factor, from the futuristic \u2013 if unfamiliar, even off-putting \u2013 design of Google Glass to something that looks more like ordinary glasses.<\/p>\n<p>Google\u2019s announcement also focused on the addition of AI (in fact, they announced them as \u201cAI Glasses\u201d rather than smart glasses). The two types of product (audio-only AI Glasses, and AI Glasses with projections in the field of view), however, are not especially novel, even when combined with AI. <\/p>\n<p>Meta\u2019s Ray-Ban products are available in both modes, and include voice interaction with their own AI. 
These have been more successful than the recent Humane AI Pin, for example, which included front-facing cameras, other sensors, and voice support from an AI agent. This was the closest thing we\u2019ve had so far to the Star Trek lapel communicators.<\/p>\n<p>Direction of travel<\/p>\n<p>Chances are, the main directions of innovation here are, first, reducing the bulkiness of smart glasses, which have necessarily been chunky in order to fit in their electronics while still looking normally proportioned. <\/p>\n<p>\u201cBuilding glasses you\u2019ll want to wear\u201d is how Google phrases it, so we may see innovation from the company that simply improves the aesthetics of smart glasses. They are also working with popular brand partners. Google also advertised the release of wired XR (extended reality) glasses, which are significantly smaller than the virtual reality headsets on the market. <\/p>\n<p>Second, we could expect more integration with other Google products and services \u2013 Google has many more commonly used products than Meta, including Google Search, Google Maps and Gmail. Their promotional material shows examples of Google Maps information appearing in view in the AI Glasses while the wearer walks through the streets. <\/p>\n<p>Finally, and perhaps the biggest area of opportunity, is innovation in the inclusion of additional sensors, perhaps integrating with Google\u2019s other wearable health products, where many of their current ventures sit, including their own <a href=\"https:\/\/theconversation.com\/smart-rings-ultra-precise-movement-tracking-takes-wearable-technology-to-the-next-level-225604\" rel=\"nofollow noopener\" target=\"_blank\">smart rings<\/a>. 
<\/p>\n<p>Much research has focused on what can be sensed from common touchpoints on the head, including heart rate, body temperature, galvanic skin response (skin moistness, which changes with stress, for example) and even brain activity through EEG. With current advances in consumer neurotechnology, we could easily see <a href=\"https:\/\/doi.org\/10.1038\/s41598-025-29893-4\" rel=\"nofollow noopener\" target=\"_blank\">smart glasses that use EEG<\/a> to track brain data in the next few years.<\/p>\n","protected":false},"excerpt":{"rendered":"It has been over a decade since Google Glass smart glasses were announced in 2013, followed by their&hellip;\n","protected":false},"author":2,"featured_media":374888,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[554,733,4308,86,56,54,55],"class_list":{"0":"post-374887","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-technology","12":"tag-uk","13":"tag-united-kingdom","14":"tag-unitedkingdom"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/374887","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/comments?post=374887"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/374887\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media\/374888"}],"wp:attachment":[{"href
":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media?parent=374887"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/categories?post=374887"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/tags?post=374887"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}