{"id":250452,"date":"2026-01-21T20:02:09","date_gmt":"2026-01-21T20:02:09","guid":{"rendered":"https:\/\/www.newsbeep.com\/il\/250452\/"},"modified":"2026-01-21T20:02:09","modified_gmt":"2026-01-21T20:02:09","slug":"it-was-notorious-for-getting-things-wrong-now-its-assisting-your-doctor","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/il\/250452\/","title":{"rendered":"It was notorious for getting things wrong. Now it\u2019s assisting your doctor."},"content":{"rendered":"<p class=\"slate-paragraph slate-graf\" data-word-count=\"21\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7nkam00153b7a136wihon@published\"><a href=\"https:\/\/slate.com\/theslatest?utm_source=slate&amp;utm_medium=article&amp;utm_campaign=article_plain_text_topper&amp;sailthru_source=Article-TopperText-CTA\" rel=\"nofollow noopener\" target=\"_blank\">Sign up for the Slatest<\/a> to get the most insightful analysis, criticism, and advice out there, delivered to your inbox daily.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"74\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7lq1f000tv2mc7vp73k8b@published\">Asking a general-use chatbot for health help used to seem like a shot in the dark\u2014just two years ago, a study found that ChatGPT could diagnose only <a href=\"https:\/\/thehill.com\/policy\/healthcare\/4387138-chatgpt-incorrectly-diagnosed-more-than-8-in-10-pediatric-case-studies-research-finds\/\" rel=\"nofollow noopener\" target=\"_blank\">2 in 10 pediatric cases<\/a> correctly. Among Google Gemini\u2019s early recommendations were <a href=\"https:\/\/www.bbc.com\/news\/articles\/cd11gzejgz4o\" rel=\"nofollow noopener\" target=\"_blank\">eating one small rock a day and using glue to help cheese stick to pizza<\/a>. 
Last year, a nutritionist <a href=\"https:\/\/www.nbcnews.com\/tech\/tech-news\/man-asked-chatgpt-cutting-salt-diet-was-hospitalized-hallucinations-rcna225055\" rel=\"nofollow noopener\" target=\"_blank\">ended up hospitalized<\/a> after taking ChatGPT\u2019s advice to replace salt in his diet with sodium bromide.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"94\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pieo001v3b7abkjlxqih@published\">Now A.I. companies have begun releasing health-specific chatbots for both consumers and health care professionals. This month, <a href=\"https:\/\/openai.com\/index\/introducing-chatgpt-health\/\" rel=\"nofollow noopener\" target=\"_blank\">OpenAI announced<\/a> ChatGPT Health, which allows regular people to connect their medical records and health data to A.I. for (theoretically) more accurate responses to their health queries. It also released ChatGPT for Healthcare, a service that is already in use by hospitals across the country. OpenAI isn\u2019t the only one\u2014Anthropic announced its own chatbot, Claude for Healthcare, designed to help doctors with day-to-day tasks like retrieving medical records and to help patients better communicate with their providers.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"122\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7piep001w3b7a03i5z96t@published\">So how could these chatbots be an improvement over regular old chatbots? \u201cWhen talking about something designed specifically for health care, it should be trained on health care data,\u201d says Torrey Creed, an associate professor of psychiatry researching A.I. at the University of Pennsylvania. This means that a chatbot shouldn\u2019t have the option to pull from unreliable sources like social media. The second difference, she says, is ensuring that users\u2019 private data isn\u2019t sold or used to train models. 
Chatbots created for the health care sector are required to be HIPAA compliant. Bots that prompt consumers to chat directly with them about symptoms are designed only to connect the dots; for those, protecting consumer data is a matter of having robust privacy settings.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"58\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pieq001x3b7ablamugrt@published\">I spoke to Raina Merchant, the executive director of the Center for Health Care Transformation and Innovation at UPenn, about what patients need to know as they navigate the changing A.I. medical landscape, and how doctors are already applying the tech. Merchant says A.I. has a lot of potential\u2014but that, for now, it should be used with caution.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"12\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pieq001y3b7ag5no3jua@published\">How is the health care system currently using these chatbots and A.I.?<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"66\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pieq001z3b7a645ooo49@published\">It\u2019s a really exciting area. At Penn, we have a program called <a href=\"https:\/\/www.pennmedicine.org\/news\/new-ai-tool-helps-doctors-to-sift-and-synthesize-patient-data\" rel=\"nofollow noopener\" target=\"_blank\">Chart Hero<\/a>, which can be thought of as a ChatGPT embedded into a patient\u2019s health record. It\u2019s an A.I. agent I can prompt with specific questions to help find information in a chart or make calculations for risk scores or guidance. 
Since it\u2019s all embedded, I don\u2019t have to go look at separate sources.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"37\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pieq00203b7asdxtpbgv@published\">Using it, I can spend more time really talking to patients and have more of that human connection\u2014because I\u2019m spending less time doing chart digging or synthesizing information from different areas. It\u2019s been a real game changer.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"63\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pieq00213b7alce8i13a@published\">There\u2019s a lot of work in the ambient space, where A.I. can listen after patients have consented and help generate notes. Then there\u2019s also a lot of work in messaging interfaces. We have a portal where patients can send questions at any time, using A.I., still with a human in the loop, to help answer those questions accurately.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"10\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pieq00223b7ai7mktyai@published\">What does having a human in the loop look like?<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"32\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pieq00233b7a84eoz9r2@published\">Many hospital chatbots are intentionally supervised by humans. What might feel automated is often supported by people behind the scenes. Having a human in the loop ensures there are some checks and balances.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"45\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pier00243b7aw7syfta6@published\">So a completely consumer-facing product like ChatGPT Health wouldn\u2019t have a human in the loop. You can just sit on the couch by yourself and have A.I. answer your health questions. 
What would you recommend that patients use ChatGPT Health for? What are the limitations?<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"48\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pier00253b7a4j30s3ru@published\">I think of A.I. chatbots as tools. They are not clinicians. Their goal is to make care easier to access and navigate. They are good at guidance, but not so much judgment. They can help you understand next steps, but I wouldn\u2019t use them for making medical decisions.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"43\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pier00263b7auofoklwu@published\">I really like the idea of using it to think through questions to ask your doctor. Going into a medical appointment, people can be emotional. Feeling that you\u2019re going in more prepared, having thought of all the questions, can be good.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"19\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pier00273b7almj4dze6@published\">Let\u2019s say I have a low-grade fever. Is it a good idea to ask ChatGPT Health what to do?<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"38\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pier00283b7anj7eozmn@published\">If you are at the point of making a decision, that\u2019s when I would engage a physician. 
I see real value in using the chatbot as a tool for understanding next steps but not for making a decision.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"11\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pier00293b7a53t9ctxm@published\">So how reliable are these new health chatbots at diagnosing conditions?<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"33\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pier002a3b7a1ieeighd@published\">They have a tremendous amount of information that can be useful for both patients and clinicians. What we don\u2019t know yet is when they hallucinate, or when they veer from guidelines or recommendations.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"11\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pier002b3b7as9wcw59i@published\">It won\u2019t be clear when the bot is making something up.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"44\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pier002c3b7a2v3ol2yp@published\">There are a couple of things that I tell patients: Check for consistency, go to trusted sources to validate information, and trust your instincts. If something sounds too good to be true, be hesitant about making any decisions based on the bot\u2019s information.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"9\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pier002d3b7aq8tb3cl5@published\">What sources should patients be using to verify A.I.?<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"48\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pier002e3b7am44q6q5i@published\">I rely on the big recognizable names, like information from the American Heart Association or other large medical associations that might have guidelines or recommendations. 
When it gets to the question \u201cShould I trust the chatbot?\u201d that\u2019s probably when it\u2019s valuable to work with your health care professional.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"10\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pies002f3b7ahctb3rck@published\">Is the data that patients put into health chatbots secure?<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"51\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pies002g3b7ag7mxjfn3@published\">My recommendation for any patient would be not to share personal details, like your name, address, medical record number, or prescription IDs, because it\u2019s not the environment we use for protecting patient information\u2014in the same way that I wouldn\u2019t enter my Social Security number into a random website or Google interface.<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"12\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pies002h3b7ac3lcr6rq@published\">Does this include health care chatbots provided through hospitals or health centers?<\/p>\n<p class=\"slate-paragraph slate-graf\" data-word-count=\"57\" data-uri=\"slate.com\/_components\/slate-paragraph\/instances\/cmko7pies002i3b7a2ifvkz1s@published\">If a hospital is providing a chatbot and [is very clear and transparent] about how the information is being used, and health information is protected, then I would feel comfortable entering my information there. 
But for something that didn\u2019t have transparency around who owns the data, how it\u2019s used, etc., I would not share my personal details.<\/p>\n","protected":false},"excerpt":{"rendered":"Sign up for the Slatest to get the most insightful analysis, criticism, and advice out there, delivered to&hellip;\n","protected":false},"author":2,"featured_media":250453,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[34],"tags":[343,163,521,85,46,2409],"class_list":{"0":"post-250452","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-healthcare","8":"tag-artificial-intelligence","9":"tag-health","10":"tag-healthcare","11":"tag-il","12":"tag-israel","13":"tag-medicine"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/posts\/250452","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/comments?post=250452"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/posts\/250452\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/media\/250453"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/il\/wp
-json\/wp\/v2\/media?parent=250452"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/categories?post=250452"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/tags?post=250452"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}