Science fiction’s superpower isn’t thinking up new technologies – it’s thinking up new social arrangements for technology. What the gadget does is nowhere near as important as who the gadget does it for and who it does it to. Your car can use a cutting-edge computer vision system to alert you when you’re drifting out of your lane – or it can use that same system to narc you out to your insurer so they can raise your premiums by $10 that month to punish you for inattentive driving. Same gadget, different social arrangement.
Here’s why that’s so important: tech hucksters want you to think there’s only one way to use the gadget (their way). Mark Zuckerberg wants you to believe that it’s unthinkable that you might socialize with your friends without letting him spy on you all from asshole to appetite. Conversing with friends without Creepy Zuck listening in? That’s like water that’s not wet!
But of course, it’s all up for grabs. There’s nothing inevitable about it. Zuck spies on you because he wants to, not because he has to. He could stop. We could make him stop. That’s what the best science fiction does: It makes us question the social arrangements of our technology, and inspires us to demand better ones.
This idea – that who a technology acts for (and upon) is more important than the technology’s operating characteristics – has a lot of explanatory power.
Take AI: There are a lot of people in my orbit who use AI tools and describe them in glowing terms, as something useful and even delightful. Then there are people I know and trust who describe AI as an immiserating, dehumanizing technology that they hate using. This holds even for people with similar levels of technological know-how who are using the very same tools.
But the mystery vanishes as soon as you learn about the social arrangements around each person’s AI use.
I recently installed some AI software on my laptop: an open source model called Whisper that can transcribe audio files. I installed it because I was writing an article and wanted to cite something I’d heard an expert say on a podcast. I couldn’t remember which expert, nor even which podcast. So I downloaded Whisper, threw 30 or 40 hours’ worth of podcasts I’d recently listened to at it, and then, a couple of hours later, searched the text until I found the episode, along with the timecode for the relevant passage. I was able to call up the audio, review it against the transcript, correct a few small errors, and paste the quote into my essay.
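For the curious, here’s roughly what that looks like in practice: a minimal sketch in Python using the open source openai-whisper package. The folder name and the choice of model size are my own illustrative stand-ins, not a canonical recipe.

import glob
import whisper  # pip install openai-whisper (it also needs ffmpeg installed)

# Load a small model; larger ones ("medium", "large") are more accurate,
# but slower on a modest laptop.
model = whisper.load_model("base")

# Transcribe every MP3 in a folder of podcast episodes ("podcasts/" is a
# stand-in for wherever your episodes live), writing a timestamped
# plain-text transcript next to each audio file.
for path in glob.glob("podcasts/*.mp3"):
    result = model.transcribe(path)
    with open(path + ".txt", "w") as out:
        for seg in result["segments"]:
            # Each segment carries a start time in seconds: the timecode
            # that leads you back to the spot in the original audio.
            out.write(f"[{seg['start']:8.1f}s] {seg['text'].strip()}\n")

After that, finding the half-remembered quote is an ordinary text search across the transcript files – grep or your editor’s find-in-files will do; no AI required.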
A year ago, I simply would have omitted the reference. There was no way I was ever going to re-listen to hours and hours of podcasts looking for this half-remembered passage. Thanks to a free AI model that ran on my modest laptop, in the background while I was doing other work, I was able to write a better essay. In that moment, I felt pretty good about this little AI model, especially since it’s an open source project that will endure long after the company that made it has run out of dumb money and been sold for parts. The ability to use your personal computer to turn arbitrary amounts of recorded speech into a pretty accurate transcript is now a permanent fact of computing, like the ability to use your PC to crop an image or make a sign advertising your garage sale.
That’s one social arrangement for AI. Here’s another: last May, the Chicago Sun-Times included a 64-page “Best of Summer” insert from Hearst’s King Features, containing lists of things to do this summer, including a summer reading list. Of the 15 books on that list, ten did not exist. They were AI “hallucinations” (jargon used by AI hucksters in place of the less sexy but more accurate term, “errors”).
This briefly lit up the internet, as well it should have, because it’s a pretty wild error to see in a major daily newspaper. Jason Koebler from 404 Media tracked down the list’s “author,” a freelancer called Marco Buscaglia, who confessed that he had used AI to write the story and professed his shame and embarrassment at his failure to fact-check the AI’s output.
Koebler followed up on this report with a deeper dive into the entire “Best of Summer” guide, reporting that Buscaglia’s byline appeared under the majority of the lists in the Hearst guide. In a discussion on the 404 Media podcast, Koebler offered perspective on this, describing the early days of his career when, as an intern at the Washington Monthly, he would be called upon to contribute to guides like Hearst’s “Best of Summer” package. In those days, three interns would be assigned to each of the lists, overseen by a professional journalist and backstopped by a fact-checking department.
Seen in this light, the story of the nonexistent books in the summer reading guide takes on an entirely different complexion. The “Best of Summer” guide contained ten lists, almost all written (or rather, “written”) by one person: Buscaglia, evidently without any fact-checking whatsoever (many of the other lists also contained egregious errors).
In other words: Hearst’s King Features, which published the “Best of Summer” guide, replaced 30 interns, 10 newsroom journalists, and an entire fact-checking department with one freelancer. No one has reported on how much Buscaglia got paid to write all those lists, but if it comes anywhere close to the total wages of all the people whose jobs he was doing, I’ll stick my tongue in a light socket.
In Buscaglia’s quotes to Koebler, it’s clear that this isn’t a person who is enjoying his AI experience. I, by contrast – another freelance writer – found my sole use of AI in a writing project absolutely delightful.
It’s not hard to understand the difference here, of course.
There’s a bit of automation-theory jargon that I absolutely adore: “centaurs” and “reverse-centaurs.” A centaur is a human being who is assisted by a machine that does some onerous task (like transcribing 40 hours of podcasts). A reverse-centaur is a machine that is assisted by a human being, who is expected to work at the machine’s pace. That would be Buscaglia, who was given an assignment to do the work of 50 or more people on a short timescale and a shoestring budget.
I don’t know if Hearst told him to use a chatbot to generate the “Best of Summer” lists, but it doesn’t matter. When you give a freelancer an assignment to turn around ten summer lists on a short timescale, everyone understands that his job isn’t to write those lists; it’s to supervise a chatbot.
But his job wasn’t even to supervise the chatbot adequately (single-handedly fact-checking ten lists of 15 items each is a long, labor-intensive process). Rather, it was to take the blame for the factual inaccuracies in those lists. He was, in Dan Davies’s phrase, “an accountability sink” (or, as Madeleine Clare Elish puts it, a “moral crumple zone”).
When I used Whisper to transcribe a folder full of MP3s, that was me being a centaur. When Buscaglia was assigned to oversee a chatbot’s error-strewn, 64-page collection of summer lists, on a short timescale and for low pay, with him and him alone bearing the blame for any errors that slipped through, that was him being a reverse-centaur.
AI hucksters, desperate to keep their stock bubble inflated, will tell you that there is only one way this technology can be used: to fire a whole ton of workers and make the survivors do their jobs at a frantic, Lucy-in-the-chocolate-factory cadence. While it’s true that this is the only way their companies could possibly be worth the hundreds of billions of dollars that have been pumped into them (so far), there’s no iron law that says investors in tech bubbles will always turn a profit (indeed, anyone who’s lived through this century knows that the opposite is far more likely).
The fact that the only way that AI investors can recoup their investment is by turning us all into reverse-centaurs is not our problem. We are under no obligation to arrange our affairs to ensure their solvency. In 1980, Margaret Thatcher told us, “There is no alternative.” In 1982, Bill Gibson refuted her thus: “The street finds its own uses for things.”
I know which prophet I’m gonna follow.
Cory Doctorow is the author of Walkaway, Little Brother, and Information Doesn’t Want to Be Free (among many others); he is the co-owner of Boing Boing, a special consultant to the Electronic Frontier Foundation, a visiting professor of Computer Science at the Open University and an MIT Media Lab Research Affiliate.