Indie writers can be under great pressure to write quickly, so it’s no surprise that a few have resorted to generative AI software such as ChatGPT. Other writers may look to AI as a quick fix when they hit a roadblock. Plus, publishing on a budget is tough, so AI-generated images and audio may seem like a good solution.

Big tech companies with loads of advertising cash would like you to think that generative AI is the inevitable future of writing. So would some influencers who are probably sponsored by those companies. But while generative AI might provide some short-term gains, in the long run it’s bad for all of us. What’s more, it comes with risks that could ruin your writing career. Given all of its problems, let’s look at nine reasons to dismiss the hype and say “no” to machine-generated stories and art.

1. AI Doesn’t Understand Stories

Large language models (LLMs) such as ChatGPT are just predictive software: they look at words and try to guess what a human would say next. Because they consumed a ridiculous number of works to do so, their output tends to be generic and lackluster. But more than that, LLMs are only correlating words; they have no understanding of the concepts behind the words. People call AI output “slop” for a reason.

The more text an LLM generates, the more its ideas wander aimlessly. AI can’t use its gut to judge what’s working, much less build a good plot. It has no idea what information readers need or which scenes can best represent the story. It can spit out random character thoughts, but it has no idea how to give your character emotional depth while carrying their arc forward. And if you generate a cover, the AI doesn’t understand your story or the best way to advertise it in today’s market. Even if the output looks good to you, it’s probably doing something wrong that you, as a human, would be getting right.
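If you’re curious what “guess what a human would say next” means in practice, here is a minimal sketch using a toy bigram model. This is my own illustrative example, not how any real LLM is built (real models use neural networks with billions of parameters), but the core idea of emitting the statistically likely continuation is the same:

```python
from collections import Counter, defaultdict

# A tiny corpus standing in for the enormous amount of text an LLM trains on.
corpus = (
    "the hero drew her sword . the hero charged the dragon . "
    "the dragon breathed fire . the hero won the day ."
).split()

# Count which word follows which -- a bigram model, the crudest
# possible version of "next-token prediction."
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def generate(word, length=6):
    """Repeatedly emit the most statistically likely next word."""
    out = [word]
    for _ in range(length):
        word = following[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)

print(generate("the"))
```

Notice that the model always drifts toward whatever continuation is most common in its training data. It has no notion of what a hero or a dragon is, only of which words tend to follow which, which is exactly why the output skews generic.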
Not everyone has the expertise to notice what the AI is messing up, but these problems will still impact the reading experience. Readers don’t want inconsistent characters and a plot that goes nowhere. They don’t want words that don’t quite fit and moments that never matter. You have more potential than the AI does. Don’t give up on yourself.

2. AI Outputs Are Stolen From People Like You

If you’ve ever thought generated output looked surprisingly emotional, uncanny, or just amazing, it’s because a person actually created that. In all likelihood, a person with professional skills they’d spent a lifetime refining. Like you, that person is probably trying to sell their work. But now they have to compete with software that has stolen the material they spent years of their life on so it can spit out hordes of soulless, uncredited copies.

Writers and artists of all stripes have never been paid well. We don’t usually have steady nine-to-five jobs, and aside from the rare person who becomes a household name, we’re always struggling to get by. Even some best-selling authors don’t make a comfortable living from their work.

The tech companies making these generative AI programs, on the other hand, are extremely wealthy. AI technologies are getting a trillion dollars of investment. That number is so mind-bogglingly large it’s hard to understand. There is not, and has never been, a trillionaire. The very richest people in the world might have a tenth of that, and these are people who could buy a new megayacht every single day.

Yet that mind-boggling investment money is not going to writers or artists to license their work. Tech companies have deliberately chosen to steal from us instead, even downloading pirated copies of our books from torrent sites. In effect, this is a huge heist in which rich tech CEOs are robbing poor artists en masse. If we want them to stop, we need to stand united against them.

3. People Don’t Want to Buy AI Writing

When someone spends their money on our writing, we have a duty to be fair to them. That means fulfilling the expectations we set before purchase so they don’t feel we tricked or cheated them. And customers of all stripes, readers in particular, have been very vocal that they feel cheated when they purchase AI outputs.

Just have a look at Amazon, the biggest online retailer in the US. If you submit a book to Amazon, they now ask you to tell them whether you used AI. Every time you edit an ebook and re-upload it, they ask again. Amazon is perfectly happy to use unethical software itself; they are asking because they know consumers hate it. Just look at the reviews: products suspected of being AI generated get tons of one-star ratings with complaints about AI use.

When people make purchases from creators, they want to know the creator put time, thought, and feeling into the work. Customers want to know that their money is supporting people doing genuine art. And last but not least, people just don’t think AI outputs are worth anything. Being charged money for something a machine spat out in a split second is a rip-off.

If people know you used AI as part of your process, or they can see you’re using an AI cover, many of them won’t want to buy your book. And if you’re thinking that they don’t need to know, consider the next section.

4. It Could Create a Scandal

This year, at least three indie authors (Lena McDonald, K. C. Crowne, and Rania Faris) have been caught publishing books with AI prompts left in them. Their readers were enraged, leaving angry reviews on Amazon, Goodreads, and social media, and subjecting the authors to online harassment. While authors can edit prompts out of ebooks quickly, print copies are slow and expensive to replace. Crowne’s book was even removed from the Kindle store altogether after the scandal.

You might think you’d never be so careless as to leave a prompt in, but mistakes happen.
They are particularly likely to happen if you’re in a big hurry or relying on the help of nonprofessionals. In other words, if you’re in exactly the position that makes AI most tempting.

Scandals like these can ruin a writing career. When we are building our careers, our biggest asset is the reputation our pen name carries. The more famous the pen name gets, and the more people associate it with good books, the better each of our books will sell. A big pen name means lucrative deals with publishers and studios. If a scandal tarnishes our pen name, we may have to start all over again.

5. People Will Buy Fewer Indie Books

Publishers often treat writers unfairly. Countless writers have gotten publishing contracts thinking it was their big break, only to learn later that their book wasn’t being marketed. Not to mention, some publishers are outright predatory, sneaking in contract language that ensures the author will never be paid.

The indie market has given authors more options. Now authors can more easily be their own publishers, and publishers are under more pressure to offer some value. Genres that publishers have scorned, such as LitRPG or queer romantasy, have gained enough steam among indie works to make publishers pay attention.

But the indie market depends on readers taking a chance on a book that hasn’t been vetted by a publisher. The more AI books flood the indie market, the more readers are driven away from all indie books. Just as an individual author’s reputation can be ruined, so can the reputation of indie books as a whole. It will be disastrous if readers start seeing all indie books as slop. But if too many indie authors don’t meet reader expectations, that’s exactly what will happen. There is already a wealth of bad actors pumping out slop books on Amazon. Indie authors have to set themselves apart from this.

6. AI Endangers Your Copyright

To start, you should know I am not a lawyer. If you want actual legal advice, please hire one.
That said, the courts have long held that only works specifically created by humans are entitled to copyright protection. If an animal spreads some paint over a canvas, no one owns that image. If a work is not copyrighted, anyone can legally use it, however they want, without permission. The more you use AI outputs, and the less they are modified, the harder it is to claim copyright over the work.

So let’s say you want a cover without paying a modest fee to a human artist (most artists are quite affordable). You use AI to generate a cover image. You love the image, most readers can’t tell it’s AI generated, and your book does great. Then someone decides to piggyback on your success by using your cover image for their own book. People looking for your book end up buying theirs instead. Their book is terrible and readers hate it, but those angry readers leave their bad reviews on your book. It’s a nightmare, but this other writer is legally entitled to use the same generated cover. If you had just paid an artist, that artist could have sold you an exclusive license to the art they made for you.

If you generate text, that text also has no copyright protection. Someone can copy it and sell it. You can get more protection by modifying the text, but do you have the money for a tough legal battle? You don’t want to be in the position of having to prove exactly how many changes you made. If you’re going to get Amazon or other big platforms to remove unauthorized copies, your copyright claim needs to be solid.

7. Your Skills Can Atrophy

Fiction writing is a challenging endeavor that requires many years of skill building. We all start out in a rough state, but through years of study and practice, the craft starts to become second nature. When we hit roadblocks, we figure out new ways past them, often learning something important in the process. But what happens if a new writer tries to take a shortcut by using AI in place of learning?
They don’t improve the way they need to. If you use AI to generate ideas, it’s just recycling the most common ideas other storytellers have had, and you’re not learning to cast a wider net for fresh material. If you use AI to suggest edits, that’s practice you won’t get reviewing your own work.

Recently, a research paper by Microsoft found that using AI reduced critical thinking and cognitive practice, leaving users’ cognitive abilities “atrophied and unprepared” when they had to use those skills later. Of course, if a machine is doing something like saving our phone numbers so we don’t have to memorize them, that’s probably okay. But if the AI is doing creative work for us, our creative skills will suffer. And almost every bit of writing is creative work that cultivates skills we need. Even if you don’t like wordcraft, it really matters, and you should build those skills. Idea generation is a creative skill. Outlining is a creative skill. Narrating is a creative skill.

The AI can only offer the help it does because it has stolen the work of millions of people who put in the effort to learn. What happens if no one has those skills anymore, and the AI can only train on its own recycled outputs?

8. AI Is Harming the World

Besides our specific concerns as writers and artists, generative AI, and machine learning in general, is harming everyone in an extraordinary number of ways. While we had these problems before machine learning, it’s truly remarkable how much worse they are getting. Even though I’m a developer myself, before ChatGPT I would never have believed new computer technology could do so much damage. AI is harming us by:

Ideally, some of these problems would be addressed with regulation, but tech companies have been doing everything they can to stop any kind of oversight. That includes pouring hundreds of millions of dollars into the coffers of US politicians.
As a result, Republicans in the House even passed a bill with a ten-year ban on any AI regulation in the States. We can’t count on anyone else to stop these harms. The most powerful tool we have is to boycott AI and anyone who uses it. Change will only happen if we demand it.

9. The Machine Is Not You

Before it dissolved completely, the official National Novel Writing Month (NaNoWriMo) organization did something baffling: it encouraged writers to use generative AI for its yearly challenge to write 50,000 words in November.* As someone who’s completed the challenge in the past, I can tell you that using LLMs would entirely defeat the point. Sure, you can generate 50K words of slop in a month, but it’s meaningless.

Writing a novel, even a terrible draft, is a notable accomplishment because of what we put into it. We sacrifice our time and energy, and we fuel it with our creativity and passion. The end result is deeply personal. The story has our distinctive voice and expresses opinions we didn’t even know we had. It repackages our memories and crystallizes our feelings. Most often, it’s deeply imperfect, but replacing that imperfection with stolen thoughts regurgitated by a machine isn’t an improvement.

Even if it’s better in some ways, AI output isn’t yours. It’s not your accomplishment. It’s not your creative expression. It’s not a showcase of your skills. It doesn’t have your voice.

Anyone can tell great stories. And everyone has a steep learning curve to climb to do so. Riding an escalator past that curve not only cheapens the journey but makes us believe we needed the escalator to get there.

* Possibly because it had an AI software company as one of its sponsors.