Microsoft Copilot now boarding your health information • The Register

<p>Microsoft wants to store your healthcare data so that its AI "delivers personalized health insights that you can act on," but without the liability that comes with actual medical advice.</p>

<p>This biz has created a supposedly "separate, secure space within Copilot" to do so, under the name Copilot Health.</p>

<p>The company's <a href="https://microsoft.ai/news/introducing-copilot-health/" rel="nofollow noopener" target="_blank">announcement</a> buries the lede. At the end of its post comes the disclaimer: "Copilot Health is not intended to diagnose, treat, or prevent diseases or other conditions and is not a substitute for professional medical advice."</p>

<p>That's perhaps for the best in light of <a href="https://www.theregister.com/2026/02/09/ai_chatbots_medical_advice_sucks/" rel="nofollow noopener" target="_blank">a recent UK study</a> that found chatbots give poor medical advice.</p>

<p>Nonetheless, people commonly consult AI models for advice about their health. When OpenAI counted up potential customers, it found more than 40 million people worldwide asking ChatGPT for healthcare advice each day. Eager to tap into that market, OpenAI <a href="https://www.theregister.com/2026/01/05/chatgpt_playing_doctor_openai/" rel="nofollow noopener" target="_blank">announced ChatGPT Health</a> in January. Anthropic threw its hat into the ring a few days later <a href="https://www.theregister.com/2026/01/12/claude_anthropic_healthcare/" rel="nofollow noopener" target="_blank">with Claude for Healthcare</a>.</p>

<p>Microsoft's own <a href="https://www.microsoft.com/en-us/research/publication/how-people-use-copilot-for-health/" rel="nofollow noopener" target="_blank">research on how</a> Copilot is used indicates that almost one in five conversations involves assessment of a personal symptom or condition.</p>

<p>In a social media <a href="https://x.com/mustafasuleyman/status/2032092644483141928" rel="nofollow">post</a>, Mustafa Suleyman, CEO of Microsoft AI, said, "I think people are still underestimating how profound this transformation is going to be. Today we're announcing Copilot Health, enabling users to connect all their EHR records and wearable data in a secure, private health space that Copilot can analyze and reason about to provide personalized insights and proactive nudges."</p>

<p>These personalized insights and proactive nudges are not medical advice, though; they're intended to promote something more nebulous – wellness. Suleyman suggests that Copilot Health will help people come up with focused questions to present to actual doctors during medical appointments.</p>

<p>Copilot Health is described as a way to help people organize activity data from consumer wearable devices such as Apple Watch, Oura, Fitbit, and others – information that can then be combined into a profile alongside hospital health records and lab results.</p>

<p>Per Microsoft's disclaimer, this is not intended as medical advice. But it certainly sounds like that's the goal – Suleyman says that Microsoft wants "to make this service available to the billions of people around the world who struggle to access reliable medical advice."</p>

<p>But the distinction between regulated medical advice and best-effort AI emissions about health may become more difficult to discern, thanks to the US Food and Drug Administration's <a href="https://www.fda.gov/regulatory-information/search-fda-guidance-documents/general-wellness-policy-low-risk-devices" rel="nofollow noopener" target="_blank">relaxation of wearable rules</a> at the start of the year. As law firm Arnold &amp; Porter <a href="https://www.arnoldporter.com/en/perspectives/advisories/2026/01/fda-cuts-red-tape-on-clinical-decision-support-software" rel="nofollow noopener" target="_blank">noted</a> in January, "the revised policy concerning wearables likely means that more AI-enabled CDS [clinical decision support] can be made available as non-device CDS, i.e., without FDA review."</p>

<p>Copilot Health comes with assurances about security and privacy, an area where Microsoft's track record speaks for itself.</p>

<p>"Your Copilot Health conversations and data are isolated from general Copilot and kept under additional access, privacy, and safety controls," insist Microsoft's medical messengers Bay Gross, Peter Hames, Chris Kelly, Dominic King, and Harsha Nori.</p>

<p>"Data in Copilot Health is protected with industry leading safeguards, including encryption at rest and in transit, strict access controls, and the ability to manage and delete your information when you choose. You can disconnect your connectors to health data sources such as electronic health records or wearables instantaneously at any time. Your information in Copilot Health is not used for model training." ®</p>