"Should we be worried about ChatGPT?" No, but I'm getting pretty darn annoyed.
In which technology yet again shows us who we are and we don't like it
First—I appreciate all the well-wishes on the release of Fairy Bargains last month! I’ve enjoyed keeping in touch in advance of the release, and with Twitter threatening to fold in on itself and me having Thoughts I’d like some outlet to share, a new development here: I’ll be writing slightly more regular newsletters. Don’t worry, there will still be chickens.
As I wrap up another semester teaching and the specter of AI continues to seep into conversations about creativity, I’ve heard the same question from all sides: “Should we be worried about ChatGPT?” The development of and access to AI composition has kerfuffled both wordy worlds I inhabit—writing and teaching. In the academic world, the sudden availability of ChatGPT, free and unhindered, caused an immediate flurry of thought pieces: Should we be worried? How will we spot this new form of plagiarism? What will become of the process of evaluating writing if you can’t tell if students are actually writing? Oh, the woe! The horror!
A couple months in and I’m here to tell you, no. We shouldn’t be worried. But I’m getting pretty annoyed.
ChatGPT-composed language is, first and foremost, an absolute breeze to catch. Yes, TurnItIn won’t snag it (yet), but the voice of AI, inserted into most student work, is instantly apparent to anyone with a functioning knowledge of the English language. Students are suddenly using buzzy words from academic fields they know nothing about. They’re citing sources we never used in class and using complex terminology they should never have encountered unless they read academic think pieces for fun (note: students rarely read such pieces for fun). They’re ping-ponging back and forth between uncomfortable run-on sentences and something that could be plonked directly into a textbook. Much of the time, AI sentences are grammatically perfect, but the content is confusing garbage that has little pertinence to the assigned prompt or hooks together ideas that don’t run along the same tracks at all. Long story short, it’s not hard to spot.
And of course, a smidgen of forethought and planning makes it even easier to keep AI-produced work from slipping by—AI is scraping what’s online to train itself, so the less written about and available online in the vein of your assignments, the less sense AI will make. Ask it to tell you about symbolism in “The Yellow Wallpaper” and you’ll get a coherent set of ideas. Ask it to tell you about the use of metaphor in a recently published SFF short story and it falls all over itself. It’s like a student with a decent vocabulary trying to BS you about a reading they didn’t do. “Ah, yeah, the symbolic rapport between the main characters provides an insight into relational aptitudes for…” Nope. Stop it right there, kid. Beyond the source material we assign, asking for depth and creativity in thought prevents effective AI usage. When asked for summary, AI shines. When asked for connection and synthesis between seemingly disparate topics, AI stumbles, but students often shine, even if their grammar isn’t perfect.
In fact, AI is a bit of a wake-up call if we’ve become lackadaisical in our course and assignment design. If it keeps us on our toes and demands we stop recycling the same tired readings in our comp and lit courses, I’m all for it. (Look, I used to tutor, and if I’d seen one more Yellow Wallpaper essay, it’s possible I would have attempted to crawl into the walls myself.)
So if it’s so easy to spot, why am I annoyed?
Because I am tired of catching it already.
Less than one semester living with this new tech.
And I’m tired of it.
I am tired of having The Conversation with students. The “I noticed some discrepancies in your paper” talk. The “I’m afraid you may have used outside sources without attribution” talk. I hate that talk. It gets in the way of the “this is a really interesting comparison! What else could we say about this?” talk and even the “this paper isn’t organized, but let’s grab our pens and start making plans to chop it up and stitch it together again” talk and a million other talks—productive talks, creative talks, encouraging talks—that I love having.
Often, the reason a student cheats—the old-fashioned way or by popping into ChatGPT for some free words—is that they’ve become overwhelmed, they’re in panic mode, they’re scrabbling for a foothold. It’s a mistake. A lapse in judgment. And a (quick albeit awkward) chat straightens it out. I’ve seen chagrined apologies and gratitude for second chances, and usually the outcome, from an educational standpoint, is successful. The student learned something. It’s better to reach out for help before getting snowed under by the pressure. Asking for an extension isn’t the end of the world. There are resources on campus for mental health and academic success, and there’s no shame in seeking them. That mean old bat on the third floor will catch your AI submission, so don’t bother trying. See? Learning.
But sometimes…sometimes the problem is the desire for a shortcut not because of the pressure to do well, but because the student doesn’t actually want to do the work. And this veers right into the stream of conversation about writing books and AI, the pervasive undercurrent that “if only you could just feed some ideas into a word generator, it would free us from having to do the sloggy part of writing!” But here’s the rub, the part that both the student who wants to skip composing their paper and the “writer” (yeah, I’m using sarcastic quotes for it, come at me) who wants to skip drafting their book miss—the real work is in the writing itself.
The real work of writing a paper is learning. It’s not just about learning concepts and regurgitating them (if that is all that’s assigned, that is a very poor paper prompt); it’s about synthesizing ideas, composing thoughts about them, and then organizing those thoughts into a form of communication for others. Skip those steps and you haven’t really learned much of anything. And, once the excess stress or midterm panic is removed from the equation, if you don’t WANT to do these things, you don’t really want to learn. You want to be credentialed. It’s different.
The real work of writing a book is crafting a story, characters, and a world out of words. It’s not just feeding ideas in and stringing some words out—no, the real work is the give and take of drafting, not only the generation of ideas but of the words themselves, the way that words are a medium like paint or clay, and the way that medium is coaxed into shape in turn shapes the ideas themselves. Even in worldbuilding and other “thinky” parts of writing, the work is in how the words encapsulate and represent elements of the world, crafting the voice of not only narrator and characters but the space itself. Then it’s the discovery and refinement and sharpening of edges in revision. Skip those steps and you haven’t really crafted much of anything. And, though yes, we all beat our heads against the wall from time to time and curse the blank page, if you don’t WANT to do these things, you don’t really want to craft anything. You want to be published. It’s different.
And so, yet again, the problems technology dumps into our laps are really the same problems we’ve had all along. People want to be credentialed without learning for a whole host of reasons, from personal ambition to systemic inequities to the idiocy of some company’s HR software that weeds out anyone who doesn’t plug in a specific degree. AI just reminds us, yet again, that our classrooms have many people who don’t truly want to be there, and that’s a damn shame. (I’d rather they be somewhere else, somewhere that gives them engagement and satisfaction, a statement I make with all goodwill and no derision—but I know they feel they have no choice, and perhaps many don’t.)
People want to “have written” for all kinds of reasons, too. Personal ambition, delusions of fame and fortune (we’ll all disabuse you of that one real quick, promise), having an idea but just not loving the process of writing. (Personally, I give a pass to those who would use AI software for personal enjoyment, to have a story to read tailored to their dictated parameters, though I think many a fanfic writer would note with distaste or pity or both that they’re missing something integral by not writing it themselves.) As a writer, I’m not worried about AI taking my job, at least not yet—it’s not good enough for that at this point. But the current state of affairs reminds me that the craft of writing is under-appreciated and that the joy of writing remains undiscovered by many. To them, it’s just the sloggy bit, the part writers have to force themselves through to produce a finished product.
And if the craft—the process—of writing isn’t appreciated, how long until the art—the final result—isn’t appreciated either? The joy of reading isn’t only the plot; it’s the words. Words! They’re delicious! They’re bright and sharp, they’re thick and dolorous, they’re fresh and green and pert or—I digress. The point is, the words are everything. I’ll stop before I tumble right off into a tangential love letter to words. AI makes apparent what we were already grappling with—can our art form survive? Does anyone care? Do enough other people see the value in poetry and prose beyond simple meaning?
So maybe I am worried, after all. But not in the way the think pieces expected, and not because of AI. ChatGPT didn’t invent any of these problems, and though it certainly isn’t helping, it probably isn’t making them any worse, either. For me, I suppose I’ll do the only thing I know to do—keep writing and keep teaching. When it comes to teaching, some people want to be in a classroom, and when it comes to writing, some people still love the words.
I’ll also keep giving you pictures of chickens. It’s what I do.
English Orpington rooster and Blue Plymouth Rock hen hanging out in front of the newly bloomed daffodils.