{"id":515704,"date":"2026-04-06T11:02:14","date_gmt":"2026-04-06T11:02:14","guid":{"rendered":"https:\/\/www.newsbeep.com\/uk\/515704\/"},"modified":"2026-04-06T11:02:14","modified_gmt":"2026-04-06T11:02:14","slug":"microsofts-own-tos-calls-copilot-entertainment-only-amid-adoption-slump","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/uk\/515704\/","title":{"rendered":"Microsoft&#8217;s own ToS calls Copilot &#8216;entertainment only&#8217; amid adoption slump"},"content":{"rendered":"<p>In short:\u00a0Microsoft has spent billions building Copilot into every corner of its product lineup, pitching it as an indispensable AI co-worker. Its own Terms of Use tell a different story. A clause quietly buried in the document labels Copilot \u201cfor entertainment purposes only\u201d and warns users not to rely on it for important advice. The gap between the marketing and the fine print has drawn fresh scrutiny as adoption figures reveal that fewer than one in 30 eligible users is actually paying for the tool.<\/p>\n<p>Somewhere between Satya Nadella\u2019s earnings calls and the product pages promising to \u201ctransform the way you work,\u201d Microsoft inserted a sentence into Copilot\u2019s Terms of Use that reads rather differently from the rest of its AI pitch. Updated in October 2025 and surfacing widely in early April 2026, the clause appears under a section in bold capital letters labelled \u201cIMPORTANT DISCLOSURES &amp; WARNINGS.\u201d It says: \u201cCopilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don\u2019t rely on Copilot for important advice. Use Copilot at your own risk.\u201d<\/p>\n<p>The same document states that Microsoft makes no warranty or representation of any kind about Copilot, that users should not assume its outputs are free from copyright, trademark, or privacy rights infringement, and that users are solely responsible for any Copilot content they choose to share or publish. The terms apply to consumer Copilot products; the enterprise-facing Microsoft 365 Copilot is excluded from the clause.<\/p>\n<p>What Microsoft has been saying publicly<\/p>\n<p>The disclaimer sits in sharp contrast to years of aggressive promotion. Since integrating Copilot across Windows 11 and the Microsoft 365 suite in 2023, the company has positioned the tool as a productivity multiplier, its \u201cAI companion\u201d for workers in Word, Excel, PowerPoint, and Outlook. Nadella has described Copilot as \u201cbecoming a true daily habit\u201d and told investors that daily active users had grown nearly threefold year on year. The company spent approximately $80 billion on AI-related capital expenditure in fiscal year 2025, including a $13 billion investment in OpenAI whose models underpin Copilot\u2019s core capabilities.<\/p>\n<p>Microsoft 365 Copilot is priced at $30 per user per month as an enterprise add-on, with a business tier at $18 per user per month. Premium consumer tiers carry costs that reach into the tens of dollars monthly. 
\u201cEntertainment purposes only\u201d is not language typically associated with a product charging at those rates.<\/p>\n<p>The legal logic behind the clause<\/p>\n<p>Legal analysts who reviewed the language offered a measured interpretation. The most widely cited read is that the clause represents a lawyer\u2019s attempt to limit liability in circumstances where the product fails, an overcorrection that has become embarrassing because of how bluntly it contradicts the marketing. OpenAI, Google, and Anthropic all include similar advisories in their terms of service, acknowledging inaccuracy and placing responsibility for verifying outputs on users. None of them, however, uses the phrase \u201centertainment purposes only,\u201d which Android Authority noted is \u201cthe same disclaimer that a psychic uses to avoid getting sued.\u201d<\/p>\n<p>The broader legal context matters. Microsoft has faced litigation over Copilot\u2019s outputs before: a class-action suit in a US federal court in San Francisco challenged the legality of GitHub Copilot over alleged open-source licence violations, and a separate dispute in Australia concerned customers who were moved to more expensive plans with Copilot bundled in. The consumer Copilot ToS language, on this reading, is corporate defensiveness made explicit, an attempt to establish in writing that the product never warranted the reliance users might have placed on it.<\/p>\n<p>The adoption numbers that give context<\/p>\n<p>The disclaimer arrives at an awkward moment for Copilot\u2019s commercial trajectory. Data published in early 2026 showed that only 3.3% of Microsoft 365 and Office 365 users who have access to Copilot Chat actually pay for it. Of roughly 450 million Microsoft 365 seats, 15 million are paid Copilot subscribers, a conversion rate that reflects the difficulty of persuading existing users to pay a significant premium for AI they find unreliable.<\/p>\n<p>Research from Recon Analytics traced the problem in part to accuracy. Its tracking of Copilot\u2019s accuracy Net Promoter Score found it at -3.5 in July 2025, deteriorating to -24.1 by September 2025, and only partially recovering to -19.8 by January 2026. In surveys of lapsed Copilot users, 44.2% cited distrust of answers as the primary reason they had stopped using the tool. Separately, the US paid subscriber market share fell from 18.8% in July 2025 to 11.5% in January 2026, a 39% contraction in six months. When users are given a choice between Copilot, ChatGPT, and Gemini, just 8% of workers opt for Copilot.<\/p>\n<p>The hallucination record has not helped. In August 2024, Copilot falsely accused German court reporter Martin Bernklau of the crimes he had covered for years, describing him as a convicted child abuser and fraudster and providing his home address. Microsoft was forced to block queries about Bernklau after a data protection complaint. In January 2026, Copilot generated false claims about football-related violence, triggering further coverage of the tool\u2019s reliability problem. 
The \u201centertainment purposes only\u201d clause looks rather less like a legal technicality in that context, and rather more like an accurate description.<\/p>\n<p>Microsoft\u2019s pivot and what it means<\/p>\n<p>Nadella\u2019s response to Copilot\u2019s uneven performance has been to assume direct control over AI product development, reportedly delegating other responsibilities from September 2025 onward to focus personally on the roadmap. The company has also begun building its own models.\u00a0<a href=\"https:\/\/thenextweb.com\/news\/microsoft-mai-models-openai-independence\" rel=\"nofollow noopener\" target=\"_blank\">Microsoft\u2019s launch of MAI-Transcribe-1, MAI-Voice-1, and MAI-Image-2 in April 2026<\/a>, its first proprietary AI model releases since renegotiating its contract with OpenAI in September 2025, signals a strategic intent to reduce dependency on the models that currently sit under Copilot\u2019s hood.<\/p>\n<p>The irony is that Copilot\u2019s limitations are well understood inside Microsoft. The company\u2019s own leaked internal feedback, as reported by several outlets, described integrations that \u201cdon\u2019t really work.\u201d The ToS language is, in a sense, the legal department\u2019s way of saying what the product team has been grappling with in private.\u00a0<a href=\"https:\/\/thenextweb.com\/news\/why-2026-will-be-the-year-of-governed-cybersecurity-ai\" rel=\"nofollow noopener\" target=\"_blank\">The expectation that AI tools be trustworthy, verifiable, and fit for purpose<\/a>\u00a0has moved from aspiration to regulatory reality across multiple jurisdictions, making the gap between Copilot\u2019s marketing and its terms of service harder to sustain.<\/p>\n<p>None of this means Copilot is uniquely unreliable by the standards of the current generation of AI assistants.\u00a0<a href=\"https:\/\/thenextweb.com\/news\/chatgpts-ads-era-is-here\" rel=\"nofollow noopener\" target=\"_blank\">Its primary competitor, ChatGPT, has its own well-documented accuracy problems<\/a>\u00a0even as OpenAI pushes into commercialisation. The difference is that Microsoft bet earlier, bet louder, and bet more money on the proposition that AI assistants were ready to become essential workplace tools. 
The fine print in its own terms of service suggests the company is hedging on that bet while the marketing continues to double down on it.\u00a0<a href=\"https:\/\/thenextweb.com\/news\/anthropics-30b-raise-is-about-more-than-money\" rel=\"nofollow noopener\" target=\"_blank\">Competitors raising billions on promises of AI reliability<\/a>\u00a0will have noticed the opening.\u00a0<a href=\"https:\/\/thenextweb.com\/news\/a-2025-recap-for-tech-ai\" rel=\"nofollow noopener\" target=\"_blank\">The race that defined 2025<\/a>\u00a0is entering a phase where the gap between \u201cfor entertainment purposes only\u201d and genuinely trustworthy AI is the most valuable real estate in the industry.<\/p>\n","protected":false},"excerpt":{"rendered":"In short:\u00a0Microsoft has spent billions building Copilot into every corner of its product lineup, pitching it as an&hellip;\n","protected":false},"author":2,"featured_media":515705,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[554,733,4308,86,56,54,55],"class_list":{"0":"post-515704","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-technology","12":"tag-uk","13":"tag-united-kingdom","14":"tag-unitedkingdom"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/515704","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/comments?post=515704"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/515704\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media\/515705"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media?parent=515704"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/categories?post=515704"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/tags?post=515704"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}