
‘Clinical-grade AI’: a new buzzy AI word that means absolutely nothing


The author is a London-based reporter at The Verge covering all things AI and a Senior Tarbell Fellow. Previously, he wrote about health, science, and tech for Forbes.

Earlier this month, Lyra Health announced a “clinical-grade” AI chatbot to help users with “challenges” like burnout, sleep disruptions, and stress. There are eighteen mentions of “clinical” in its press release, including “clinically designed,” “clinically rigorous,” and “clinical training.” For most people, myself included, “clinical” suggests “medical.” The problem is, it doesn’t mean medical. In fact, “clinical-grade” doesn’t mean anything at all.

“Clinical-grade” is an example of marketing puffery designed to borrow authority from medicine without the strings of accountability or regulation. It sits alongside other buzzy marketing phrases like “medical-grade” or “pharmaceutical-grade” for things like steel, silicone, and supplements that imply quality; “prescription-strength” or “doctor-formulated” for creams and ointments denoting potency; and “hypoallergenic” and “non-comedogenic” suggesting outcomes — lower chances of allergic reactions and non-pore blocking, respectively — for which there are no standard definitions or testing procedures.

Lyra executives have confirmed as much, telling Stat News that they don’t think FDA regulation applies to their product. The medical language in the press release — which calls the chatbot “a clinically designed conversational AI guide” and “the first clinical-grade AI experience for mental health care” — is only there to help it stand out from competitors and to show how much care they took in developing it, they claim.

Lyra pitches its AI tool as an add-on to the mental healthcare already provided by its human staff, like therapists and physicians, letting users get round-the-clock support between sessions. According to Stat, the chatbot can draw on previous clinical conversations, surface resources like relaxation exercises, and even use unspecified therapeutic techniques.

The description raises the obvious question: what does “clinical-grade” even mean here? Despite leaning heavily on the term, Lyra doesn’t explicitly say. The company did not respond to The Verge’s requests for comment or for a specific definition of “clinical-grade AI.”

“There’s no specific regulatory meaning to the term ‘clinical-grade AI,’” says George Horvath, a physician and law professor at UC Law San Francisco. “I have not found any sort of FDA document that mentions that term. It’s certainly not in any statutes. It’s not in regulations.”

As with other buzzy marketing terms, it seems like it’s something the company coined or co-opted themselves. “It’s pretty clearly a term that’s coming out of industry,” Horvath says. “It doesn’t look to me as though there’s any single meaning ... Every company probably has its own definition for what they mean by that.”

Though “the term alone has little meaning,” Vaile Wright, a licensed psychologist and senior director of the American Psychological Association’s office of healthcare innovation, says it’s obvious why Lyra would want to lean on it. “I think this is a term that’s been coined by some of these companies as a marker of differentiation in a very crowded market, while also very intentionally not falling under the purview of the Food and Drug Administration.” The FDA oversees the quality, safety, and effectiveness of an array of food and medical products, like drugs and implants. Some mental health apps do fall under its remit, and to secure approval, developers must meet rigorous standards for safety, security, and efficacy through steps like clinical trials that prove the products do what they claim to do, and do so safely.

The FDA route is expensive and time-consuming for developers, Wright says, making this kind of “fuzzy language” a useful way of standing out from the crowd. It’s a challenge for consumers, she says, but it is allowed. The FDA’s regulatory pathway “was not developed for innovative technologies,” she says, making some of the language being used for marketing jarring. “You don’t really see it in mental health,” Wright says. “There’s nobody going around saying clinical-grade cognitive behavioral therapy, right? That’s just not how we talk about it.”
