
Experts Alarmed That AI Is Now Producing Functional Viruses


AI can now invent working biological viruses.

In real-world experiments, a team of Stanford researchers demonstrated that a virus with AI-written DNA could target and kill specific bacteria, the team announced in a study last week. The result opens up a world of possibilities in which artificial viruses could be used to cure diseases and fight infections.

But experts say it also opened a Pandora’s box. Bad actors could just as easily use AI to crank out novel bioweapons, keeping doctors and governments on the back foot given the outrageous pace at which these viruses can be designed, warn Tal Feldman, a Yale Law School student who formerly built AI models for the federal government, and Jonathan Feldman, a computer science and biology researcher at Georgia Tech (no word on whether the two are related).

“There is no sugarcoating the risks,” the pair warned in a piece for the Washington Post. “We’re nowhere near ready for a world in which artificial intelligence can create a working virus, but we need to be — because that’s the world we’re now living in.”

In the study, the Stanford researchers used an AI model called Evo to invent DNA for a bacteriophage, a virus that infects bacteria. Unlike a general-purpose large language model like ChatGPT, which is trained on written language, Evo was exclusively trained on millions of bacteriophage genomes.

They focused on an extensively studied phage called phiX174, which is known to infect strains of the bacteria E. coli. Using Evo, the team came up with 302 candidate genomes based on phiX174 and put them to the test by using the designs to chemically assemble new viruses.

Sixteen of them worked, infecting and killing the E. coli strains. Some of them were even deadlier than the natural form of the virus.

But “while the Stanford team played it safe, what’s to stop others from using open data on human pathogens to build their own models?” the two Feldmans warned. “If AI collapses the timeline for designing biological weapons, the United States will have to reduce the timeline for responding to them. We can’t stop novel AI-generated threats. The real challenge is to outpace them.”

That means using the same AI tech to design antibodies, antivirals, and vaccines. This work is already being done to some extent, but the vast amounts of data needed to accelerate such pioneering research “is siloed in private labs, locked up in proprietary datasets or missing entirely.”

“The federal government should make building these high-quality datasets a priority,” the duo opined.
