{"id":312676,"date":"2026-02-27T08:04:10","date_gmt":"2026-02-27T08:04:10","guid":{"rendered":"https:\/\/www.newsbeep.com\/il\/312676\/"},"modified":"2026-02-27T08:04:10","modified_gmt":"2026-02-27T08:04:10","slug":"woolworths-ai-agent-rambled-about-its-mother-its-a-sign-of-deeper-problems-with-the-tech-rollout","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/il\/312676\/","title":{"rendered":"Woolworths\u2019 AI agent rambled about its \u2018mother\u2019. It\u2019s a sign of deeper problems with the tech rollout"},"content":{"rendered":"<p>Recently some Australian shoppers got more than they bargained for when they chatted with Woolworths\u2019 artificial intelligence (AI) assistant, Olive. <\/p>\n<p>Instead of sticking to groceries, recipes and basket suggestions, Olive <a href=\"https:\/\/www.reddit.com\/r\/woolworths\/comments\/1r2lzdt\/olive_ai_started_telling_me_about_its_mother_on\/\" rel=\"nofollow noopener\" target=\"_blank\">reportedly<\/a> produced strange, overly human-like responses. It talked about its \u201cmother\u201d and offered other personal-sounding details.  <\/p>\n<p><a href=\"https:\/\/www.theage.com.au\/business\/companies\/woolworths-forced-to-rein-in-chatbot-that-claimed-to-have-angry-mother-20260226-p5o5mh.html\" rel=\"nofollow noopener\" target=\"_blank\">Further testing<\/a> revealed pricing errors for basic items. And when I asked about the price of a specific product, Olive didn\u2019t provide a clear answer. Instead, it checked whether the item was in stock and explained pickup fees.<\/p>\n<p>So what exactly is going on here? And what lessons might these incidents hold for businesses and consumers alike? <\/p>\n<p>What actually happened?<\/p>\n<p>Olive is powered by a large language model (LLM). 
These models don\u2019t \u201cknow\u201d things <a href=\"https:\/\/theconversation.com\/ai-doesnt-really-learn-and-knowing-why-will-help-you-use-it-more-responsibly-250923\" rel=\"nofollow noopener\" target=\"_blank\">the way humans do<\/a>, nor do they have mothers. Using elaborate statistical analyses, they generate language that sounds plausible. <\/p>\n<p>Comments from a Woolworths spokesperson to <a href=\"https:\/\/www.afr.com\/technology\/woolworths-ai-assistant-goes-rogue-starts-talking-about-its-mother-20260226-p5o5vi\" rel=\"nofollow noopener\" target=\"_blank\">the Australian Financial Review<\/a> suggest that in Olive\u2019s case, the references to its supposed mother appear to have been pre-written scripts dating back several years. <\/p>\n<p>When users entered something that looked like a birthdate, the system likely triggered a matching \u201cfun fact\u201d from an old decision tree with pre-programmed responses.<\/p>\n<p>Woolworths <a href=\"https:\/\/www.afr.com\/technology\/woolworths-ai-assistant-goes-rogue-starts-talking-about-its-mother-20260226-p5o5vi\" rel=\"nofollow noopener\" target=\"_blank\">says<\/a> it has now removed this particular scripting \u201cas a result of customer feedback\u201d. <\/p>\n<p>The pricing errors point to a different problem. <\/p>\n<p>Because LLMs generate responses based on learned patterns rather than real-time data, they do not automatically know today\u2019s prices unless they are explicitly connected to a live database. <\/p>\n<p>If that grounding step is weak, the system can produce outdated prices. 
<\/p>\n<p>Not a new problem<\/p>\n<p>Woolworths is not the first company to discover, after the fact, that its customer-facing AI had unexpectedly \u201cmisbehaved\u201d.<\/p>\n<p>In 2022, <a href=\"https:\/\/www.cbc.ca\/news\/canada\/british-columbia\/air-canada-chatbot-lawsuit-1.7116416\" rel=\"nofollow noopener\" target=\"_blank\">Air Canada\u2019s chatbot incorrectly told a passenger<\/a>, Jake Moffatt, that he could purchase tickets at full price and later apply for a bereavement fare refund. No such policy existed. <\/p>\n<p>When Air Canada refused to honour the chatbot\u2019s advice, Moffatt sued the airline and won. <\/p>\n<p>Air Canada\u2019s defence was striking. It argued the chatbot was a separate legal entity, responsible for its own actions and therefore beyond the airline\u2019s liability. The tribunal rejected this outright. It ruled that a chatbot is part of a company\u2019s website, and that the company owns its outputs.<\/p>\n<p>In January 2024, UK parcel delivery firm <a href=\"https:\/\/www.bbc.co.uk\/news\/technology-68025677\" rel=\"nofollow noopener\" target=\"_blank\">DPD faced a different kind of embarrassment<\/a>. A frustrated customer who could not get help to locate a missing parcel asked DPD\u2019s chatbot to write a poem that criticised the company. It did. He then asked it to swear. It did that too. The exchange went viral on social media. DPD disabled the chatbot shortly after.<\/p>\n<p>Both cases point to the same underlying failure: companies launched customer-facing AI without adequate oversight and were caught off-guard by the consequences.<\/p>\n<p>What is Woolworths\u2019 responsibility?<\/p>\n<p>Woolworths operates the largest supermarket chain in Australia. It has promoted Olive as a trusted, convenient interface for its customers, who can reasonably expect that the information Olive provides is accurate. 
<\/p>\n<p>            <a href=\"https:\/\/images.theconversation.com\/files\/720865\/original\/file-20260227-77-fhm5sk.png?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip\" rel=\"nofollow noopener\" target=\"_blank\"><img decoding=\"async\" alt=\"A screenshot of the Woolworths chatbot.\" src=\"https:\/\/www.newsbeep.com\/il\/wp-content\/uploads\/2026\/02\/file-20260227-77-fhm5sk.png\" class=\"native-lazy\" loading=\"lazy\"  \/><\/a><\/p>\n<p>              Woolworths admits its AI assistant can make mistakes.<br \/>\n              <a class=\"source\" href=\"https:\/\/www.woolworths.com.au\/shop\/discover\/about-us\/contact-us?srsltid=AfmBOopPhTWOaZ3K2Y7S8_aSRP4us_DJUkq6qNcGftbjE6HB29fmx33Y\" rel=\"nofollow noopener\" target=\"_blank\">Woolworths<\/a><\/p>\n<p>Admitting that Olive may make mistakes, <a href=\"https:\/\/www.woolworths.com.au\/shop\/discover\/about-us\/contact-us?srsltid=AfmBOop4iSR-RNIRrcgUe6u9bS4jRQVgdEjuTrDquFX_OgRev1XMzqJ5\" rel=\"nofollow noopener\" target=\"_blank\">as Woolworths does<\/a> when a user opens the chatbot, does not sit easily with that expectation. <\/p>\n<p>There is also a broader ethical dimension. Woolworths serves customers who, in many cases, are making careful decisions about household budgets. <\/p>\n<p>The Australian Competition and Consumer Commission (ACCC) has already <a href=\"https:\/\/www.accc.gov.au\/media-release\/accc-takes-woolworths-and-coles-to-court-over-alleged-misleading-prices-dropped-and-down-down-claims\" rel=\"nofollow noopener\" target=\"_blank\">commenced proceedings against Woolworths<\/a> over allegedly misleading discount pricing practices. <\/p>\n<p>That context makes the Olive pricing errors harder to dismiss as an isolated technical glitch. <\/p>\n<p>Companies that deploy AI in customer-facing roles take on a duty of care to ensure those systems are accurate and honestly presented. 
That duty does not diminish because the technology is new.<\/p>\n<p>Why do companies keep making chatbots that pretend to be your friend?<\/p>\n<p>The logic behind Olive\u2019s programmed personality is not without basis.<\/p>\n<p><a href=\"https:\/\/doi.org\/10.1108\/JCM-06-2024-6922\" rel=\"nofollow noopener\" target=\"_blank\">Research<\/a> on human-computer interaction consistently finds that people respond positively to interfaces that feel conversational and warm. Human-like chatbots that have a name and personality tend to generate higher customer engagement, satisfaction, and trust. <\/p>\n<p>For companies, the commercial appeal is straightforward: a customer who feels at ease with a chatbot is more likely to complete a transaction and return. <\/p>\n<p>However, this comes with a significant risk. When an anthropomorphised chatbot fails to meet the expectations its personality has created, customers tend to be more dissatisfied than they would have been with a plainly mechanical system. <\/p>\n<p>This \u201c<a href=\"https:\/\/doi.org\/10.1007\/978-3-031-93736-1_24\" rel=\"nofollow noopener\" target=\"_blank\">expectation violation<\/a>\u201d means that the warmer the persona, the harder the fall. <\/p>\n<p>The larger stakes<\/p>\n<p>The Olive episode is a reminder that deploying AI in customer-facing roles is not a set-and-forget exercise. <\/p>\n<p>A chatbot that quotes wrong prices and rambles about its family is not a quirky inconvenience but a sign that something in the development and oversight process has broken down. <\/p>\n<p>For Woolworths, and for the many other companies now rushing to put AI in front of their customers, the lesson is clear: accountability cannot be outsourced to an algorithm. When a business puts a system in front of the public, it owns what that system says and does. <\/p>\n<p>There is a lesson for consumers, too. <\/p>\n<p>AI assistants may feel confident and conversational, but they are still tools, not authorities. 
If something seems unclear, inconsistent or too good to be true, it is worth double-checking. <\/p>\n<p>As AI becomes a routine part of everyday transactions, a small measure of healthy scepticism may prove just as important as technological innovation.<\/p>\n","protected":false},"excerpt":{"rendered":"Recently some Australian shoppers got more than they bargained for when they chatted with Woolworths\u2019 artificial intelligence (AI)&hellip;\n","protected":false},"author":2,"featured_media":312677,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[345,343,344,85,46,125],"class_list":{"0":"post-312676","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-il","12":"tag-israel","13":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/posts\/312676","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/comments?post=312676"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/posts\/312676\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/media\/312677"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/media?parent=312676"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/categories?post=312676"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/
il\/wp-json\/wp\/v2\/tags?post=312676"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}