Tech News

I tried to replace myself with ChatGPT in my English class


My students call it “Chat,” a cute nickname they all seem to have agreed on at some point. They use it to make study guides, interpret essay prompts, and register for classes, turning it loose on the course catalog and asking it to propose a weekly schedule. They use it to make their writing sound more “professional,” including emails to professors like me, fearing that we will judge them for informal diction or other human errors.

Like many teachers at every level of education, I have spent the past two years trying to wrap my head around the question of generative AI in my English classroom. To my thinking, this is a question that ought to concern all people who like to read and write, not just teachers and their students. Today’s English students are tomorrow’s writers and readers of literature. If you enjoy thoughtful, consequential, human-generated writing—or hope for your own human writing to be read by a wide human audience—you should want young people to learn to read and write. College is not the only place where this can happen, of course, but large public universities like UVA, where I teach, are institutions that reliably turn tax dollars into new readers and writers, among other public services. I see it happen all the time.

There are valid reasons why college students in particular might prefer that AI do their writing for them: most students are overcommitted; college is expensive, so they need good grades for a good return on their investment; and AI is everywhere, including the post-college workforce. There are also reasons I consider less valid (detailed in a despairing essay that went viral recently), which amount to opportunistic laziness: if you can get away with using AI, why not?

It was this line of thinking that led me to conduct an experiment in my English classroom. I ran the experiment in four sections of my class during the 2024-2025 academic year, with a total of 72 student writers. Rather than taking an “abstinence-only” approach to AI, I decided to put the central, existential question to them directly: was it still necessary or valuable to learn to write? The choice would be theirs. We would look at the evidence, and at the end of the semester, they would decide by vote whether AI could replace me.

What could go wrong?

*

Speaking about AI in the classroom, OpenAI CEO Sam Altman has described ChatGPT as “a calculator for words.” This analogy indicates the magnitude of change that ChatGPT is poised to bring about—imagine how radically math class must have changed when calculators became widely affordable—but it also indicates that change itself, even radical change, is not necessarily scary. Most AI skeptics would admit that math class survived the advent of the calculator.

At the beginning of the semester, I asked my students to complete a baseline survey registering their agreement with several statements, including “It is unethical to use a calculator in a math class” and “It is unethical to use a generative AI service in an English class.”
