
OpenAI Launches ChatGPT Health, Which Ingests Your Entire Medical Records, But Warns Not to Use It for “Diagnosis or Treatment”


AI chatbots may be explosively popular, but they’re known to dispense some seriously wacky — and potentially dangerous — health advice, in a flood of easily accessible misinformation that has alarmed experts.

Their advent has turned countless users into armchair experts, who often end up relying on obsolete, misattributed, or completely made-up advice.

A recent investigation by The Guardian, for instance, found that Google’s AI Overviews, which accompany most search results pages, doled out plenty of inaccurate health information that could lead to grave health risks if followed.

But seemingly unperturbed by experts’ repeated warnings that AI’s health advice shouldn’t be trusted, OpenAI is doubling down by launching a new feature called ChatGPT Health, which will ingest your medical records to generate responses “more relevant and useful to you.”

Yet despite being “designed in close collaboration with physicians” and built on “strong privacy, security, and data controls,” the feature is “designed to support, not replace, medical care.” In fact, it’s shipping with a ludicrously self-defeating caveat: that the bespoke health feature is “not intended for diagnosis or treatment.”

“ChatGPT Health helps people take a more active role in understanding and managing their health and wellness — while supporting, not replacing, care from clinicians,” the company’s website reads.

In reality, users are all but certain to use it for exactly the kind of health advice that OpenAI warns against in the fine print, a scenario likely to bring fresh embarrassment for the company.

That would only compound existing problems. As Business Insider reports, ChatGPT is “making amateur lawyers and doctors out of everyone,” to the dismay of legal and medical professionals.

Miami-based medical malpractice attorney Jonathan Freidin told the publication that prospective clients are using chatbots like ChatGPT when filling out his firm’s client contact sheet.

“We’re seeing a lot more callers who feel like they have a case because ChatGPT or Gemini told them that the doctors or nurses fell below the standard of care in multiple different ways,” he said. “While that may be true, it doesn’t necessarily translate into a viable case.”
