
New England Journal of Medicine Retracts Paper Because Photo of Patient’s Insides Was Garbled by AI

Why This Matters

The retraction by the NEJM highlights the growing risks of AI-generated content in scientific publishing, emphasizing concerns over accuracy, trustworthiness, and integrity in medical research. This incident underscores the need for stricter oversight and clear policies regarding AI use in scholarly work, which is crucial for maintaining public and professional confidence in health data. As AI tools become more prevalent, the industry must adapt to ensure the reliability of published research and protect consumer trust.


Medical journals are being flooded with shoddy AI-generated work, a growing threat to the scientific community that could undermine the value and trustworthiness of potentially life-saving health research. Papers citing hallucinated journals and studies have quickly become a common fixture, raising major concerns among those tasked with weeding through a flood of new submissions.

In a high-profile new gaffe, the reputable New England Journal of Medicine (NEJM) was forced to retract a paper by two Beijing-based researchers about a man in China who developed “bronchial casts” in his lungs following a wildfire, after it was discovered that the authors had used an AI tool to manipulate a photograph in the piece.

The offending photo shows almost pitch-black, particle-filled bronchial tissues that were cryogenically removed from the patient’s lungs. As MedPage Today reported, an 87-year-old man had been brought to the emergency department at the Beijing Tsinghua Changgung Hospital after extensive fire smoke inhalation, requiring the removal of bronchial tissues that were entirely plugged with smoke particulate matter, an extremely dangerous obstruction of the airway. (MedPage later pointed out the retraction in an editor’s note.)

However, what appears to be a metric measuring tape above the tissues in the photo raises immediate red flags, with the numbers along the scale following a nonsensical sequence — a classic hallmark of the use of an unsophisticated AI image generator.

The authors said the slip-up was a careless accident.

In a retraction note, they wrote that “we were unaware of Journal policies on image manipulation and had altered our submission by using an artificial intelligence (AI) tool to move the ruler to the top of the image.”

“We therefore wish to retract our image and case report,” the note reads.

The blunder should give researchers pause. If simply moving a ruler results in this kind of AI-generated carnage, what other manipulations, whether intentional or unintentional, are falling through the cracks?

Some users on social media also questioned the validity of the rest of the offending image, pointing out that there were too many segments of the senior patient’s lungs in the photo, raising the possibility that the image had been manipulated by AI in other ways.
