{"id":574621,"date":"2026-04-01T06:55:09","date_gmt":"2026-04-01T06:55:09","guid":{"rendered":"https:\/\/www.newsbeep.com\/ca\/574621\/"},"modified":"2026-04-01T06:55:09","modified_gmt":"2026-04-01T06:55:09","slug":"i-wore-metas-smartglasses-for-a-month-and-it-left-me-feeling-like-a-creep-ai-artificial-intelligence","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/ca\/574621\/","title":{"rendered":"I wore Meta\u2019s smartglasses for a month \u2013 and it left me feeling like a creep | AI (artificial intelligence)"},"content":{"rendered":"<p class=\"dcr-130mj7b\">Lately, I\u2019ve been hearing Judi Dench\u2019s voice in my head. She tells me tomorrow\u2019s forecast, when to turn right, that there\u2019s been another message in my group chat. Day or night, Dame Judi is eager to assist. When I ask the eight-time Academy Award nominee what I\u2019m looking at, she answers: a residential area, a person in a pub, daffodils. \u201cThey are a bright yellow colour and are often associated with spring.\u201d<\/p>\n<p class=\"dcr-130mj7b\">This isn\u2019t a delusion. This is, apparently, progress. I am test-driving Meta\u2019s smartglasses and Dench voices its integrated AI assistant: \u201cHere to chat, answer questions, create images and provide advice and inspiration,\u201d said \u201cJudi\u201d when I selected her over the actors John Cena and Kristen Bell. \u201cShall we begin?\u201d<\/p>\n<p class=\"dcr-130mj7b\">Over the next decade, predicts the Meta founder Mark Zuckerberg, smartglasses will gradually become \u201cthe main way we do computing\u201d, fulfilling many of the same functions as smartphones \u2013 taking photos, playing music, making calls, giving directions. 
For people who wear glasses, Zuckerberg <a href=\"https:\/\/www.theverge.com\/24253481\/meta-ceo-mark-zuckerberg-ar-glasses-orion-ray-bans-ai-decoder-interview\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">has suggested<\/a>, the upgrade is a no-brainer, bundling more features into an essential accessory. And for those of us who don\u2019t, it is only a matter of time. In 2025, Meta <a href=\"https:\/\/www.essilorluxottica.com\/en\/cap\/content\/283150\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">sold more than 7m pairs<\/a> globally.<\/p>\n<p class=\"dcr-130mj7b\">Are they really the future, bringing us the benefits of tech without the tyranny of screens, or will they trap us deeper in the digital world? To see for myself, I wore a pair for a month.<\/p>\n<p class=\"dcr-130mj7b\">The most common response to my new frames is: \u201cWhy?\u201d I don\u2019t usually wear glasses, and these clear-lens Wayfarers (part of Meta\u2019s collaboration with Ray-Ban) are on the heavy side. I look like the nerdy girl in a 90s romcom, or the old guy from Up, but the sunglasses would have made me even more conspicuous if worn indoors. The other question I get asked is: \u201cAre you filming me?\u201d In general, I find people do not like being around someone wearing Meta glasses, not least because sometimes the answer is: \u201cYes.\u201d<\/p>\n<p>Elle\u2019s friend Stevie photographed while Elle was wearing the smartglasses. Photograph: Elle Hunt<\/p>\n<p class=\"dcr-130mj7b\">When I arrive home, my boyfriend, Marco, clocks me wearing them, then instantly freezes, as though he has unexpectedly come face to face with a predator. (Marco is not his real name; I have violated his privacy enough.) 
Although they are still fringe tech, concerns about privacy are mounting, amid <a href=\"https:\/\/www.wired.com\/story\/the-rise-of-the-ray-ban-meta-creep\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">reports of users<\/a> covertly recording in public. Last month, journalists in Sweden found that moderators employed by Meta <a href=\"https:\/\/www.bbc.co.uk\/news\/articles\/c0q33nvj0qpo\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">review intimate footage<\/a> from the glasses, including of people using the toilet and having sex. (Meta responded that it took the protection of people\u2019s data very seriously and was constantly refining its efforts and tools in that area. It said that unless users choose to share media they\u2019ve captured with Meta or others, that media stays on the user\u2019s device.) That\u2019s not to mention the company\u2019s <a href=\"https:\/\/www.nytimes.com\/2026\/02\/13\/technology\/meta-facial-recognition-smart-glasses.html\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">reported plans<\/a> to implement facial recognition.<\/p>\n<p class=\"dcr-130mj7b\">Meta\u2019s newer Display glasses have tiny screens embedded in the lenses for displaying text messages, maps and the like, but they cost $799 (\u00a3600) and are not yet available outside the US. My entry-level Gen 1 Wayfarers, which retail for \u00a3299, aren\u2019t as functional as a smartphone.<\/p>\n<p class=\"dcr-130mj7b\">It\u2019s possible that smartglasses could bring about a paradigm shift, as an accessible first step towards <a href=\"https:\/\/www.theguardian.com\/technology\/2024\/mar\/31\/wearable-ai-smartphones-fashion-ai-pin-rabbit-r1-meta-smart-glasses-pendant-tab\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">wearable AI<\/a> and augmented reality. 
Meta is certainly betting big on them, expanding its partnership with the eyewear giant EssilorLuxottica (the parent company of Ray-Ban, Oakley and more than 150 other brands) and heavily investing in its Meta AI. The pitch is that smartglasses remove friction between our physical and digital worlds, enabling the wearer to be more present \u201cin the moment, with your head up and your hands free\u201d, as Alex Himel, Meta\u2019s vice-president of wearables, <a href=\"https:\/\/www.bbc.co.uk\/sounds\/play\/m002sr4h\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">recently told Radio 4<\/a>.<\/p>\n<p class=\"dcr-130mj7b\">A Meta spokesperson said in an emailed statement that the company foresaw smartglasses being increasingly used alongside smartphones, not replacing them. Eventually, they will have more specific, unique functions, the spokesperson said, but for now they aim to be less distracting and more convenient than taking your phone out of your pocket. Wearing a camera does encourage me to take more photos while out with friends, but, when I later download them on to my phone, most of them are unfocused and awkwardly framed. It\u2019s not easy, composing an image with your eyes.<\/p>\n<p class=\"dcr-130mj7b\">I get more use out of the glasses as headphones. They pipe audio directly into your ear without blocking ambient sound; only someone standing very close to you could tell that music was playing. It\u2019s nifty tech, but the trade-off is continual interruptions. Headphones signal to others that you are not listening; glasses don\u2019t. I am self-conscious while taking calls in public, apparently talking to myself.<\/p>\n<p class=\"dcr-130mj7b\">So far, so nonessential. Smartglasses\u2019 real promise lies in the integrated AI: what Meta is hailing as \u201cthe most natural and seamless way to access an AI assistant in your daily life\u201d. 
Judi can operate my phone via voice commands (\u201ccall Ian\u201d, \u201ctake a picture\u201d), answer questions and engage with what I can see \u2013 for example, by reading aloud printed text or identifying objects. Blind people and those with low vision have reported this as transformative. Using the <a href=\"https:\/\/www.bemyeyes.com\/be-my-eyes-smartglasses\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">Be My Eyes<\/a> feature, they can even connect with a sighted volunteer, who then \u201clooks through\u201d their glasses\u2019 camera and gives live feedback. It is easy to see smartglasses\u2019 potential as assistive technology; they are already being used to help <a href=\"https:\/\/www.theguardian.com\/society\/2026\/mar\/18\/ai-smart-glasses-1m-prize-technology-dementia\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">people with dementia<\/a>, <a href=\"https:\/\/eye-see-mag.com\/en\/high-tech\/les-lexilens-datol\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">dyslexia and other needs<\/a>. But, for now, they are not reliable or functional enough to offer consistent support or replace other aids.<\/p>\n<p class=\"dcr-130mj7b\">Even Zuckerberg, presenting the new Meta Display on stage last year, repeatedly tried and failed <a href=\"https:\/\/www.theguardian.com\/commentisfree\/2025\/sep\/27\/zuckerberg-ai-glasses-fail\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">to make a simple video call<\/a>. My experience is similarly haphazard. Judi frequently mishears me, cuts out midway through answering my questions or reads aloud only some new texts in a chain, meaning I don\u2019t feel confident giving her tasks without then checking them on my phone \u2013 defeating the point of \u201chands-free\u201d tech.<\/p>\n<p class=\"dcr-130mj7b\">The AI assistance is also limited. 
Judi can distinguish daffodils from daisies and confirm when I am in a pub (as opposed to outdoors), but she struggles with even slightly more involved requests. When I ask her to ID a friend\u2019s hat, for example, Judi says she can\u2019t yet help \u201cwith requests about product pricing or availability\u201d. At Tate Modern, when I ask her to tell me about the artwork I\u2019m looking at, Judi describes \u201ca bed with white bedding and a blue mat underneath it [with] various items scattered\u201d. It\u2019s not wrong, exactly, but it\u2019s not what I want to know about My Bed by Tracey Emin.<\/p>\n<p>The AI assistant\u2019s description of Tracey Emin\u2019s My Bed installation at Tate Modern was very sketchy.  Photograph: Tracey Emin<\/p>\n<p class=\"dcr-130mj7b\">Of all the glasses\u2019 selling points, I\u2019d been most enthusiastic about the real-time translation, imagining it smoothing my interactions abroad and allowing me to eavesdrop on Marco, a native Italian speaker. If Marco was initially wary of the glasses, after two weeks he is sick of the sight of them, but indulges my request to role-play tourist and local.<\/p>\n<p class=\"dcr-130mj7b\">I ask him for directions to \u201cthe nearest duomo\u201d, looking directly at him, as instructed, with the Meta AI app open on my phone. Marco rattles off a response with authentic speed and dismissiveness. An English translation appears on my phone\u2019s screen, after a slight lag, but it is incomprehensible. The interaction is significantly less fluid than typing into Google Translate and passing the phone back and forth. This is frequently the case in my month of Meta glasses: instead of a more seamless experience, they add another layer of tech, one that\u2019s more fiendish to navigate.<\/p>\n<p class=\"dcr-130mj7b\">There\u2019s a reason smartglasses are not all that popular, suggests Iain Rice, a professor of industrial AI at Birmingham City University. 
As the technology evolves and improves, they may become standard for certain activities, but at present they conspicuously lack clear, valuable \u201cuse cases\u201d. His take is that Meta wants to be seen as innovating alongside Google and Apple, but lacks the vision and business acumen to come up with genuinely essential tech. Rice points to the <a href=\"https:\/\/www.theguardian.com\/technology\/2022\/dec\/07\/metaverse-slow-death-facebook-losing-100bn-gamble-virtual-reality-mark-zuckerberg\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">metaverse, Zuckerberg\u2019s long-trailed digital world,<\/a> from which Meta began shifting resources away last month, after $80bn in investment. Meta\u2019s glasses could prove similarly misjudged, he thinks.<\/p>\n<p class=\"dcr-130mj7b\">Most people don\u2019t want to wear glasses if they don\u2019t need to, Rice points out \u2013 which is partly why Google Glass failed 10 years ago. \u201cMeta has spent time trying to make them well designed and trendy \u2026 but they didn\u2019t seem to have the market research at the forefront: do people actually want this? Will they buy into it? Or is the tech just not there?\u201d<\/p>\n<p class=\"dcr-130mj7b\">It\u2019s no surprise that take-up seems to have been highest among content creators, perpetually in need of more footage and fresh angles on their day-to-day lives. More specifically, Meta glasses have <a href=\"https:\/\/www.wired.com\/story\/the-rise-of-the-ray-ban-meta-creep\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">become associated<\/a> with pick-up artists, social-media pranksters and other public irritants. It has become increasingly common online to see footage shot with glasses, showing the wearer\u2019s point of view, although the person featured may not have consented to being recorded or even been aware.<\/p>\n<p>Elle Hunt testing the smartglasses in a cafe in Norwich.  
Photograph: Ali Smith\/The Guardian<\/p>\n<p class=\"dcr-130mj7b\">A blinking LED on the frame alerts others that recording is in progress (and was made more visible for the Gen 2 model), but workarounds are widely shared online. Even as it is, the light is easy to miss, especially in bright conditions. To test this, on the tube home one evening, I guiltily snap a picture of the people across from me. No one notices, absorbed in their own devices. They might have twigged had I angled my phone in that now familiar cagey way, but people don\u2019t yet think to check glasses for a light or a camera.<\/p>\n<p class=\"dcr-130mj7b\">The number of content creators, and worse, capitalising on this, plus the absence of other obvious uses, has earned Meta glasses the <a href=\"https:\/\/futurism.com\/future-society\/meta-ray-ban-smart-pervert-glasses\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">nickname \u201cpervert glasses\u201d<\/a>. Of course, not all users are ill-intentioned, but I can\u2019t deny: not only do I feel creepy wearing them, but they also lead me to think like a creep. When I see a stranger walking her lookalike dog, and when I run into an ex for the first time in years, both times I have the fleeting thought: I wish I\u2019d been recording.<\/p>\n<p class=\"dcr-130mj7b\">I\u2019m perturbed by how quickly my initial discomfort passes and the glasses become second nature. 
Just having a covert camera makes me want to use it, the possibilities of the tech overriding my better judgment and even basic decency.<\/p>\n<p class=\"dcr-130mj7b\">The Meta spokesperson told me that, as outlined in the terms of service, the user is responsible for using them lawfully and \u201cin a safe, respectful manner\u201d: \u201cAs with any recording device, people shouldn\u2019t use them for engaging in harmful activities like harassment, infringing on privacy rights or capturing sensitive information.\u201d<\/p>\n<p class=\"dcr-130mj7b\">The trouble is, as emerging tech, smartglasses complicate what we consider harassment, privacy and sensitive information. Some users, for example, argue that being able to film makes them feel safer, while others feel threatened by their presence. With no laws against recording in public places in the UK, whose rights take precedence: those who want to wear smartglasses or those who don\u2019t want to be perceived by them?<\/p>\n<p>Elle taking the glasses for a Sunday-morning jaunt to Ikea. Photograph: Elle Hunt<\/p>\n<p class=\"dcr-130mj7b\">This comes to a head for me on a solo Sunday-morning jaunt to Ikea. Realising I don\u2019t have to suffer alone, I use the glasses to place my first video call, to Marco. At first, it\u2019s all fun and games, showing him the Swedish snacks and soft toys. Then he tries to order me to fling some around and have a weird interaction with an employee. \u201cWhy?\u201d I ask, pointing out that neither of us is typically disruptive in public. Marco apologises, blaming force of habit: the first-person point of view reminds him of playing a video game.<\/p>\n<p class=\"dcr-130mj7b\">No other shoppers seem to notice me talking to myself, but the awareness that I am broadcasting without their knowledge is genuinely queasy-making. When a small child crosses my path, I instinctively, immediately whip my head towards the shelves with a sinking feeling. 
I cannot fathom what sort of person would be untroubled by doing this or, worse, feel entitled to do it. After that, I never make another video call and stop dismissing Meta glasses as a gimmick.<\/p>\n<p class=\"dcr-130mj7b\">Even in their present shonky state, Rice agrees, they are shaping up as a flashpoint in a bigger, existential discussion about just how much integration we want with tech. \u201cThey\u2019re pervading slowly into society \u2026 If you see a person wearing them and don\u2019t want to be recorded, unfortunately, the only way to make sure is by moving out of the way.\u201d<\/p>\n<p class=\"dcr-130mj7b\">Nonconsensual and covert filming are only the tip of the iceberg, he says. For Meta, smartglasses are a way to gather masses of data on individual users and whatever \u2013 or whomever \u2013 they\u2019re seeing. Some media and interactions captured on Meta glasses could be used to train its AI, <a href=\"https:\/\/www.express.co.uk\/life-style\/science-technology\/2140008\/oakey-meta-ai-glasses-review\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">the company admits<\/a>. \u201cIf you knew what was being done with all the imagery after you\u2019d recorded it, I don\u2019t think you\u2019d be doing it the way you are,\u201d Rice says.<\/p>\n<p class=\"dcr-130mj7b\">Meta could make changes to protect bystanders\u2019 privacy, he suggests, such as blurring and removing unapproved faces at the preprocessing stage. His concern is that, as the tech gets cheaper and better, all glasses will be made smartglasses, just as Bluetooth became the norm. \u201cI think they\u2019ve released a technology that the world wasn\u2019t ready for, and definitely hasn\u2019t regulated for, so there has to be some ownership.\u201d<\/p>\n<p class=\"dcr-130mj7b\">In the meantime, he says, we should feel confident pushing back: by asking friends wearing Meta glasses to take them off around us; and strangers we suspect of filming us to stop. 
What would Rice say to someone who was thinking about getting a pair? \u201cI\u2019d say: I don\u2019t think you should.\u201d<\/p>\n<p class=\"dcr-130mj7b\">After a month, I\u2019m glad to return the glasses to the office. I\u2019m sick of seeing friends\u2019 faces fall at the sight of me, of feeling trapped inside the computer, of hearing Judi say I\u2019ve got a new text. Before I wipe the device\u2019s memory, I alert Judi to my intentions and ask if she has any last words. \u201cThanks for letting me know,\u201d she says.<\/p>\n<p class=\"dcr-130mj7b\">Do you have an opinion on the issues raised in this article? If you would like to submit a response of up to 300 words by email to be considered for publication in our <a href=\"https:\/\/www.theguardian.com\/tone\/letters\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">letters<\/a> section, please <a href=\"mailto:guardian.letters@theguardian.com?body=Please%20include%20your%20name%E2%80%8B%E2%80%8B,%20full%20postal%20address%20and%20phone%20number%20with%20your%20letter%20below.%20Letters%20are%20usually%20published%20with%20the%20author%27s%20name%20and%20city\/town\/village.%20The%20rest%20of%20the%20information%20is%20for%20verification%20only%20and%20to%20contact%20you%20where%20necessary.\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">click here<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"Lately, I\u2019ve been hearing Judi Dench\u2019s voice in my head. 
She tells me tomorrow\u2019s forecast, when to turn&hellip;\n","protected":false},"author":2,"featured_media":574622,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[49,48,61],"class_list":{"0":"post-574621","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-technology","8":"tag-ca","9":"tag-canada","10":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/574621","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/comments?post=574621"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/574621\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media\/574622"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media?parent=574621"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/categories?post=574621"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/tags?post=574621"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}