David Greene had never heard of NotebookLM, Google’s buzzy artificial intelligence tool that spins up podcasts on demand, until a former colleague emailed him to ask if he’d lent it his voice. “So... I’m probably the 148th person to ask this, but did you license your voice to Google?” the former co-worker asked in a fall 2024 email. “It sounds very much like you!”
Greene, a public radio veteran who has hosted NPR’s “Morning Edition” and KCRW’s political podcast “Left, Right & Center,” looked up the tool, listening to the two virtual co-hosts — one male and one female — engage in light banter.
“I was, like, completely freaked out,” Greene said. “It’s this eerie moment where you feel like you’re listening to yourself.”
Greene felt the male voice sounded just like him — from the cadence and intonation to the occasional “uhhs” and “likes” that Greene had worked over the years to minimize but never eliminated. He said he played it for his wife and her eyes popped.
As emails and texts rolled in from friends, family members and co-workers, asking if the AI podcast voice was his, Greene became convinced he’d been ripped off. Now he’s suing Google, alleging that it violated his rights by building a product that replicated his voice without payment or permission, giving users the power to make it say things Greene would never say.
Google told The Washington Post in a statement on Thursday that NotebookLM’s male podcast voice has nothing to do with Greene. Now a Santa Clara County, California, court may be asked to determine whether the resemblance is uncanny enough that ordinary people hearing the voice would assume it’s his — and if so, what to do about it.
The case is the latest to pit the rights of individual human creators against those of a booming AI industry that promises to transform the economy by allowing people to generate uncannily lifelike speech, prose, images and videos on demand. Behind the artificial voices in NotebookLM and similar tools are language models trained on vast libraries of writing and speech by real humans who were never told their words and voices would be used in that way — raising profound questions of copyright and ownership.
From political “voicefakes” to OpenAI touting a female voice for ChatGPT that resembled that of actress Scarlett Johansson, to deepfake scam ads that had a virtual Taylor Swift hawking Le Creuset cookware, the issues raised by Greene’s lawsuit are “going to come up a lot,” said James Grimmelmann, a professor of digital and information law at Cornell University.
A key question for the courts to decide, Grimmelmann said, will be just how closely an AI voice or likeness has to resemble the genuine article in order to count as infringing. Another will be whether Greene’s voice is famous enough for ordinary people to recognize it when they listen to NotebookLM and whether he’s harmed by the resemblance.
Those can be thorny questions when it comes to AI voices. Software tools exist that can compare people's voices, but they're more commonly used to find or rule out an exact match between two real human voices, rather than to compare a human voice with a synthetic one.