{"id":236682,"date":"2026-01-09T21:13:07","date_gmt":"2026-01-09T21:13:07","guid":{"rendered":"https:\/\/www.newsbeep.com\/ie\/236682\/"},"modified":"2026-01-09T21:13:07","modified_gmt":"2026-01-09T21:13:07","slug":"is-giving-chatgpt-health-your-medical-records-a-good-idea","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/ie\/236682\/","title":{"rendered":"Is Giving ChatGPT Health Your Medical Records a Good Idea?"},"content":{"rendered":"<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color min-h-[6.375rem] lg:min-h-[4.75rem] dropcap text-left\" data-testid=\"paragraph-content\">Your AI doctor\u2019s office is expanding. On Jan. 7, <a href=\"https:\/\/openai.com\/index\/introducing-chatgpt-health\/\" rel=\"nofollow noopener\" target=\"_blank\">OpenAI announced that<\/a> over the coming weeks, it will roll out ChatGPT Health, a dedicated tab for health that allows users to upload their medical records and connect apps like Apple Health, the personalized health testing platform Function, and MyFitnessPal.<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\"><a href=\"https:\/\/cdn.openai.com\/pdf\/2cb29276-68cd-4ec6-a5f4-c01c5e7a36e9\/OpenAI-AI-as-a-Healthcare-Ally-Jan-2026.pdf\" rel=\"nofollow noopener\" target=\"_blank\">According to the company<\/a>, more than 40 million people ask ChatGPT a health care-related question every day, which amounts to more than 5% of all global messages on the platform\u2014so, from a business perspective, leaning into health makes sense. 
But what about from a patient standpoint?<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">\u201cI wasn\u2019t shocked to hear this news,\u201d says Dr. Danielle Bitterman, a radiation oncologist and clinical lead for data science and AI at Mass General Brigham Digital. \u201cI do think that this speaks to an unmet need that people have regarding their health care. It\u2019s difficult to get in to see a doctor, it&#8217;s nowadays hard to find medical information, and there is, unfortunately, some distrust in the medical system.\u201d<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">We asked experts whether turning over your health data to an AI tool is a good idea.<\/p>\n<p>What is ChatGPT Health?<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">The new feature will be a hub where people can upload their medical records, including lab results, visit summaries, and clinical history. That way, when you ask the bot questions, it will be \u201cgrounded in the information you\u2019ve connected,\u201d the company said in its announcement. 
OpenAI suggests asking questions like: \u201cHow\u2019s my cholesterol trending?\u201d \u201cCan you summarize my latest bloodwork before my appointment?\u201d \u201cGive me a summary of my overall health.\u201d Or: \u201cI have my annual physical tomorrow. What should I talk to my doctor about?\u201d<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Read More: <a href=\"https:\/\/time.com\/7321821\/chatgpt-ai-how-to-use-for-health-safely\/\" rel=\"nofollow noopener\" target=\"_blank\">9 Doctor-Approved Ways to Use ChatGPT for Health Advice<\/a><\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Users can also connect ChatGPT to Apple Health, so the AI tool has access to data like steps per day, sleep duration, and number of calories burned during a workout. Another new addition is the ability to sync with data from Function, a company that tests for more than 160 markers in blood, so that ChatGPT has access to lab results as well as clinicians\u2019 health suggestions. 
Users can also connect MyFitnessPal for nutrition advice and recipes, and Weight Watchers for meal ideas and recipes geared toward those on GLP-1 medications.<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">OpenAI, which has a licensing and technology agreement that allows the company to access TIME\u2019s archives, notes that Health is designed to support health care\u2014not replace it\u2014and is not intended to be used for diagnosis or treatment. The company says it spent two years working with more than 260 physicians across dozens of specialties to shape what the tool can do, as well as how it responds to users. That includes how urgently it encourages people to follow up with their provider, how it communicates clearly without oversimplifying, and how it prioritizes safety when people are in mental distress.<\/p>\n<p>Is it safe to upload your medical data?<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">OpenAI partnered with b.well, a data connectivity infrastructure company, to allow users to securely connect their medical records to the tool. The Health tab will have \u201cenhanced privacy,\u201d including a chat history and memory feature separate from those of other tabs, according to the announcement. OpenAI also said that \u201cconversations in Health are not used to train our foundation models,\u201d and Health information won\u2019t flow into non-Health chats. 
Plus, users can \u201cview or delete Health memories at any time.\u201d<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Still, some experts urge caution. \u201cThe most conservative approach is to assume that any information you upload into these tools, or any information that may be in applications you otherwise link to the tools, will no longer be private,\u201d Bitterman says.<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">No federal regulatory body governs the health information provided to AI chatbots, and ChatGPT provides technology services that are not within the scope of HIPAA. \u201cIt\u2019s a contractual agreement between the individual and OpenAI at that point,\u201d says Bradley Malin, a professor of biomedical informatics at Vanderbilt University Medical Center. \u201cIf you are providing data directly to a technology company that is not providing any health care services, then it is buyer beware.\u201d In the event that there was a data breach, ChatGPT users would have no specific rights under HIPAA, he adds, though it\u2019s possible the Federal Trade Commission could step in on your behalf, or that you could sue the company directly. 
As medical information and AI start to intersect, the implications so far are murky.\u00a0<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">\u201cWhen you go to your health care provider and you have an interaction with them, there&#8217;s a professional agreement that they&#8217;re going to maintain this information in a confidential manner, but that&#8217;s not the case here,\u201d Malin says. \u201cYou don&#8217;t know exactly what they are going to do with your data. They say that they\u2019re going to protect it, but what exactly does that mean?\u201d<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Read More: <a href=\"https:\/\/time.com\/7297703\/what-not-to-say-doctors-appointment\/\" rel=\"nofollow noopener\" target=\"_blank\">The 4 Words That Drive Your Doctor Up the Wall<\/a><\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">When asked for comment on Jan. 8, OpenAI directed TIME to <a href=\"https:\/\/x.com\/cryps1s\/status\/2009040709635199151\" rel=\"nofollow\">a post on X from chief information security officer Dane Stuckey<\/a>. \u201cConversations and files in ChatGPT are encrypted by default at rest and in transit as part of our core security architecture,\u201d he wrote. \u201cFor Health, we built on this foundation with additional, layered protections. 
This includes another layer of encryption\u2026enhanced isolation, and data segmentation.\u201d He added that the changes the company has made \u201cgive you maximum control over how your data is used and accessed.\u201d<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">The question every user has to grapple with is \u201cwhether you trust OpenAI to keep to their word,\u201d says Dr. Robert Wachter, chair of the department of medicine at the University of California, San Francisco, and author of A Giant Leap: How AI Is Transforming Healthcare and What That Means for Our Future.<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Does he trust it? \u201cI sort of do, in part because they have a really strong corporate interest in not screwing this up,\u201d he says. 
\u201cIf they want to get into sensitive topics like health, their brand is going to be dependent on you feeling comfortable doing this, and the first time there&#8217;s a data breach, it&#8217;s like, \u2018Take my data out of there\u2014I&#8217;m not sharing it with you anymore.\u2019\u201d<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Wachter says that if there was information in his records that could be detrimental if it leaked\u2014like a past history of drug use, for example\u2014he would be reluctant to upload it to ChatGPT. \u201cI\u2019d be a little careful,\u201d he says. \u201cEverybody\u2019s going to be different on that, and over time, as people get more comfortable, if you think what you&#8217;re getting out of it is useful, I think people will be quite willing to share information.\u201d<\/p>\n<p>The risk of bad information<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Beyond privacy concerns, there are known risks of using large-language-model-based chatbots for health information. Bitterman recently <a href=\"https:\/\/www.nature.com\/articles\/s41746-025-02008-z\" rel=\"nofollow noopener\" target=\"_blank\">co-authored a study<\/a> that found that models are designed to prioritize being helpful over medical accuracy\u2014and to always supply an answer, especially one that the user is likely to respond to. 
In one experiment, for example, models that were trained to know that acetaminophen and Tylenol are the same drug still produced inaccurate information when asked why one was safer than the other.<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">\u201cThe threshold of balancing being helpful versus being accurate is more on the helpfulness side,\u201d Bitterman says. \u201cBut in medicine we need to be more on the accurate side, even if it&#8217;s at the expense of being helpful.\u201d<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Plus, <a href=\"https:\/\/www.medrxiv.org\/content\/10.1101\/2025.02.28.25323115v1.full\" rel=\"nofollow noopener\" target=\"_blank\">multiple<\/a> <a href=\"https:\/\/jamanetwork.com\/journals\/jamanetworkopen\/fullarticle\/2822301?utm_source=chatgpt.com\" rel=\"nofollow noopener\" target=\"_blank\">studies<\/a> <a href=\"https:\/\/www.nature.com\/articles\/s41746-024-01079-8\" rel=\"nofollow noopener\" target=\"_blank\">suggest<\/a> that <a href=\"https:\/\/jamanetwork.com\/journals\/jamanetworkopen\/fullarticle\/2822296\" rel=\"nofollow noopener\" target=\"_blank\">if there\u2019s missing information<\/a> in your medical records, models are more likely to hallucinate, or produce incorrect or misleading results. 
<a href=\"https:\/\/www.nist.gov\/system\/files\/documents\/2025\/02\/20\/DataQuality-D3.pdf\" rel=\"nofollow noopener\" target=\"_blank\">According to a report on supporting AI in health care from the National Institute of Standards and Technology<\/a>, the quality and thoroughness of the health data a user gives a chatbot directly determines the quality of the results the chatbot generates; poor or incomplete data leads to inaccurate, unreliable results. A few common traits help increase data quality, the report notes: correct, factual information that\u2019s comprehensive, complete, and consistent, without any outdated or misleading insights.<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">In the U.S., \u201cwe get our health care from all different sites, and it&#8217;s fragmented over time, so most of our health care records are not complete,\u201d Bitterman says. That increases the likelihood of errors, she says, because the model guesses at what happened wherever there are gaps.<\/p>\n<p>The best way to use ChatGPT Health<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Overall, Wachter considers ChatGPT Health a step forward from the current iteration. 
People were already using the bot for health queries, and he believes that by providing it with more context via their medical records\u2014like a history of diabetes or blood clots\u2014they&#8217;ll receive more useful responses.<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">\u201cWhat you&#8217;ll get today, I think, is better than what you got before if all your background information is in there,\u201d he says. \u201cKnowing that context would be useful. But I think the tools themselves are going to have to get better over time and be a little bit more interactive than they are now.\u201d<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">When Dr. Adam Rodman watched the <a href=\"https:\/\/openai.com\/index\/introducing-chatgpt-health\/?video=1152278055\" rel=\"nofollow noopener\" target=\"_blank\">ChatGPT Health introductory video<\/a>, he was pleased with what he saw. \u201cI thought it was pretty good,\u201d says Rodman, a general internist at Beth Israel Deaconess Medical Center, where he leads the task force for integration of AI into the medical school curriculum, and an assistant professor at Harvard Medical School. \u201cIt really focused on using it to help understand your health better\u2014not using it as a replacement, but as a way to enhance.\u201d Since people were already using ChatGPT for things like analyzing lab results, the new feature will simply make doing so easier and more convenient, he says. 
\u201cI think this more reflects what health care looks like in 2026 rather than any sort of super novel feature,\u201d he says. \u201cThis is the reality of how health care is changing.\u201d<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Read More:\u00a0<a href=\"https:\/\/time.com\/7270606\/questions-to-ask-doctor-appointment\/\" rel=\"nofollow noopener\" target=\"_blank\">10 Questions You Should Always Ask at Doctors\u2019 Appointments<\/a><\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">When Rodman counsels his patients on how best to use AI tools, he tells them to avoid health management questions, like asking the bot to choose the best treatment program. \u201cDon\u2019t have it make autonomous medical decisions,\u201d he says. But it\u2019s fair game to ask if your doctor could be missing something, or to explore \u201clow-risk\u201d matters like diet and exercise plans or interpreting sleep data.<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">One of Bitterman\u2019s favorite uses is asking ChatGPT to help brainstorm questions ahead of a doctor appointment. 
Augmenting your existing care like that is a good idea, she says, with one clear bonus: \u201cYou don\u2019t necessarily need to upload your medical records.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"Your AI doctor\u2019s office is expanding. On Jan. 7, OpenAI announced that over the coming weeks, it will&hellip;\n","protected":false},"author":2,"featured_media":236683,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[34],"tags":[777,103,397,396,61,60],"class_list":{"0":"post-236682","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-healthcare","8":"tag-evergreen","9":"tag-health","10":"tag-health-care","11":"tag-healthcare","12":"tag-ie","13":"tag-ireland"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/posts\/236682","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/comments?post=236682"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/posts\/236682\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/media\/236683"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/media?parent=236682"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/categories?post=236682"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/tags?post=236682"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}