{"id":359661,"date":"2026-04-02T03:41:08","date_gmt":"2026-04-02T03:41:08","guid":{"rendered":"https:\/\/www.newsbeep.com\/nz\/359661\/"},"modified":"2026-04-02T03:41:08","modified_gmt":"2026-04-02T03:41:08","slug":"microsoft-copilot-free-version-terms-raise-eyebrows-but-experts-say-theyre-industry-standard","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/nz\/359661\/","title":{"rendered":"Microsoft Copilot free version terms raise eyebrows, but experts say they\u2019re industry standard"},"content":{"rendered":"<p class=\"sYHrSxRJWo\" style=\"display:none\">Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don\u2019t rely on Copilot for important advice. Use Copilot at your own risk.<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">The terms also require users to indemnify Microsoft for any losses and expenses caused by Copilot use, and add: \u201cWe can\u2019t promise that any [of] Copilot\u2019s Responses won\u2019t infringe someone else\u2019s rights (like their copyrights, trademarks, or rights of privacy) or defame them. You are solely responsible if you choose to publish or share Copilot\u2019s Responses.\u201d<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">Reeve was one of many to highlight the updated wording of the tech giant\u2019s terms and conditions for its AI assistant, even though it dates from October last year. Over the past 24 hours, it\u2019s gone semi-viral on Reddit and elsewhere.<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">\u201cIt\u2019s freaking wild,\u201d an AI specialist at one of the Big Four consultancy firms told the Herald.<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">But other experts said the terms were standard for the free version of any AI chatbot. 
<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">Microsoft has different terms for its Copilot with commercial data protection, where the AI can be pointed only at your company\u2019s data and other trusted sources, and that data is not used to train Copilot.<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">A Microsoft spokeswoman confirmed the terms highlighted by Reeve (and online <a href=\"https:\/\/www.microsoft.com\/en-us\/microsoft-copilot\/for-individuals\/termsofuse\" target=\"_self\" rel=\"nofollow noopener\" title=\"https:\/\/www.microsoft.com\/en-us\/microsoft-copilot\/for-individuals\/termsofuse\">here<\/a>) were for the consumer version of Copilot.<\/p>\n<p>\u2018Standard approach\u2019<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">\u201cThis is a fairly standard approach,\u201d privacy expert Frith Tweedie said.<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">\u201cThese terms seem to apply to the free version. On that basis, I don\u2019t think they are unreasonable. <\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">\u201cMicrosoft is essentially pointing to the limitations of the tool, which are \u2013 or should be \u2013 well known, particularly in respect of hallucinations and other accuracy challenges. <\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">\u201cThe reference to Copilot being \u2018for entertainment purposes only\u2019 seems to be aimed squarely at individual users of the free version.\u201d <\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">The Simply Privacy principal forwarded terms from Claude maker Anthropic and ChatGPT maker OpenAI, which contain wording similar to Microsoft\u2019s various qualifiers around liability and the possibility that their output could be inaccurate.<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">\u201cIt blows my mind how underappreciated the accuracy issue tends to be. 
Particularly given how clearly it is addressed by the companies themselves,\u201d Tweedie said.<\/p>\n<p><img  alt=\" \u201cI think Microsoft also is trying to push Copilot onto individual users for productivity as well in their marketing, so I think it\u2019s mixed messaging\u201d- Victoria University senior lecturer in artificial intelligence Dr Andrew Lensen\" class=\"article-media__image responsively-lazy\" data-test-ui=\"article-media__image\"\/> \u201cI think Microsoft also is trying to push Copilot onto individual users for productivity as well in their marketing, so I think it\u2019s mixed messaging\u201d- Victoria University senior lecturer in artificial intelligence Dr Andrew Lensen<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">Victoria University AI expert Dr Andrew Lensen also said the terms were just reflecting the reality of the technology.<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">\u201cWe are seeing a lot of people take the advice from these AI language models as gospel, when they can be wrong, often subtly,\u201d he said.<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">While the terms were for the free version, Lensen added: \u201cI think Microsoft also is trying to push Copilot onto individual users for productivity as well in their marketing, so I think it\u2019s mixed messaging.\u201d<\/p>\n<p>Business protections<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">\u201cBusinesses get stronger privacy and security protections under M365 Copilot,\u201d Tweedie said.<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">\u201cBut the warning \u2018It can make mistakes, and it may not work as intended. Don\u2019t rely on Copilot for important advice. Use Copilot at your own risk\u2019 remains true for any generative AI chatbot, whether accessed under a subscription or otherwise. 
<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">\u201cThis appears to be an attempt by Microsoft to limit potential liability by pointing out the unreliability of Copilot outputs,\u201d said Tweedie, who worked as a lawyer at Bell Gully and intellectual property specialist James and Wells earlier in her career, and is currently an adviser to the Department of Internal Affairs\u2019 AI Advisory Panel.<\/p>\n<p><img  alt=\"\u201cIt blows my mind how underappreciated the accuracy issue tends to be. Particularly given how clearly it is addressed by the companies themselves,\u201d privacy expert Frith Tweedie.\" class=\"article-media__image responsively-lazy\" data-test-ui=\"article-media__image\"\/>\u201cIt blows my mind how underappreciated the accuracy issue tends to be. Particularly given how clearly it is addressed by the companies themselves,\u201d privacy expert Frith Tweedie.<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">\u201cBusinesses need to pay proper attention to the accuracy problems that are fundamental to LLMs, a risk that I often see downplayed.<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">\u201cTechniques like RAG can help a lot with this, but it\u2019s not typically a complete solution.\u201d<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">RAG (retrieval-augmented generation) is a framework in which an AI model retrieves information from trusted, approved sources and grounds its responses in that material, typically alongside other protections.<\/p>\n<p><img  alt=\"&quot;Anyone using them [AI assistants] in a business or who is generally concerned over issues such as confidentiality and privacy should steer clear of free versions&quot; - Lowndes Jordan partner Rick Shera&#10;\" class=\"article-media__image responsively-lazy\" data-test-ui=\"article-media__image\"\/>&#8220;Anyone using them [AI assistants] in a business or who is generally concerned over issues such as confidentiality and privacy should steer clear of free versions&#8221; &#8211; Lowndes Jordan partner Rick 
Shera<\/p>\n<p>\u2018Steer clear of free versions\u2019<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">\u201cHaving reviewed terms and conditions for various LLMs and the wrappers that sit over the top of them, anyone using them in a business or who is generally concerned over issues such as confidentiality and privacy should steer clear of free versions,\u201d Lowndes Jordan partner Rick Shera said. <\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">\u201cAssurances that are given for paid versions around security, privacy, confidentiality and non-use for LLM training purposes are a must have where inputting business or sensitive personal information, particularly given recent cases suggesting that LLM platforms may be forced by court order to disclose user prompts and legal privilege may be lost where there is no expectation of confidentiality, as there cannot be with most free versions.\u201d<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">There are wrinkles. An organisation can subscribe to the paid, data-protected version of an AI, but then have staff \u201cBYO\u201d their favourite chatbot to the office.<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">Others, like the Department of Corrections, have rules around the use of free AI chatbots, including prohibiting their use with sensitive data, only for some staff to <a href=\"https:\/\/www.nzherald.co.nz\/nz\/corrections-takes-action-against-staffs-unacceptable-use-of-artificial-intelligence\/ZXZHMCKB4JDEVMJXTWWT44ALBU\/\" target=\"_self\" rel=\"nofollow noopener\" title=\"https:\/\/www.nzherald.co.nz\/nz\/corrections-takes-action-against-staffs-unacceptable-use-of-artificial-intelligence\/ZXZHMCKB4JDEVMJXTWWT44ALBU\/\">ignore the guidelines<\/a>.<\/p>\n<p class=\"sYHrSxRJWo\" style=\"display:none\">Chris Keall is an Auckland-based member of the Herald\u2019s business team. 
He joined the Herald in 2018 and is the technology editor and a senior business writer.<\/p>\n","protected":false},"excerpt":{"rendered":"Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don\u2019t&hellip;\n","protected":false},"author":2,"featured_media":359662,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[365,363,364,1558,3013,9473,7707,5658,179374,4975,19995,305,2454,359,609,84887,111,139,69,2515,5659,21714,1177,145,39545,27300,1803,13020,189739],"class_list":{"0":"post-359661","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-been","12":"tag-but","13":"tag-chatbot","14":"tag-copilot","15":"tag-experts","16":"tag-eyebrows","17":"tag-free","18":"tag-gobsmacked","19":"tag-have","20":"tag-industry","21":"tag-internet","22":"tag-microsoft","23":"tag-microsofts","24":"tag-new-zealand","25":"tag-newzealand","26":"tag-nz","27":"tag-raise","28":"tag-say","29":"tag-sections","30":"tag-standard","31":"tag-technology","32":"tag-terms","33":"tag-theyre","34":"tag-update","35":"tag-version","36":"tag-wording"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/posts\/359661","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/comments?post=359661"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/posts\/359661\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/
www.newsbeep.com\/nz\/wp-json\/wp\/v2\/media\/359662"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/media?parent=359661"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/categories?post=359661"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/tags?post=359661"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}