By Jon Ippolito
Worried that generative AI might have killed the student essay? An “AI sandwich” may resurrect learning through writing for the ChatGPT age.
The death of the term paper, and the birth of something else
Term papers are among the first casualties of the post-ChatGPT classroom. It now takes under five minutes to craft a prompt, refine format and style with some back-and-forth, and then watch a chatbot spit out 7000 words on the French Revolution. Facts and citations will need double-checking, but the structure will be clearer and the prose more polished than the average teen writer can muster.
As a former writing teacher, I lament the loss of the essay as an armature upon which to build a coherent argument. The act of writing helps students brainstorm, structure, and articulate thoughts about everything from how the US Constitution compares to the Haudenosaunee Confederacy to why Madame Bovary strayed from her husband.
But not every course is a writing course, and student essays have for too long been an exercise in summarizing received knowledge. Let’s be honest: before ChatGPT, student research too often consisted of looking up the topic on SparkNotes or Google, jotting down some factoids, and sprinkling in references copied from the bottom of a Wikipedia page. In the best of circumstances, students wrote an outline, a draft, and a final paper–usually a few hours before it was due. In the worst of circumstances, they paraphrased a paper already in their fraternity’s archive or bought one online from a ghostwriter in Kenya.
Can we blame them? Too many teachers still value eloquent regurgitation over students’ own growth and perspective. And, if you’ll pardon the image, eloquent regurgitation is exactly the forte of stochastic parrots.
Mercifully, the death of the term paper is an opportunity to rethink writing in the AI age. Instead of recapitulating information already in the library or on the Internet, we can invite students to lean on generative AI to contribute to knowledge creation themselves. I’m not talking about asking ChatGPT to fabricate new ideas or theories, since it’s notoriously unreliable at that. I’m talking about delegating the mechanics of writing to a large language model and inviting the student to dig out new data that can’t be found in the chatbot’s source material.
Let’s look at an example of a writing assignment that exploits the power of generative AI while also taking advantage of uniquely human capabilities.
The “AI Sandwich” assignment
Imagine you are teaching international relations and want your class to understand the interplay of national politics and local economies. One approach is to fashion a pedagogical sandwich that wraps a payload of human agency in layers of AI support.
Stage one: AI as brainstorming partner
A student prompts a large language model like ChatGPT to brainstorm a list of research directions about how the proximity of Canada has affected local economies in New England over the past 50 years–say, the impact of currency fluctuations, bilingualism, or the 2009 passport requirement. The student chooses whatever seems illuminating and realistic to answer in the timeframe of the assignment. The instructor might urge the student to push ChatGPT for more “out there” responses, such as how things might have been different if the border were completely closed or if God had swapped Mexico and Canada.
Stage two: Human-to-human knowledge creation
In the second stage, the student visits her hometown over break and interviews her mother, her grandfather, local business owners, members of the town council–anyone with a stake in the research question. The student takes notes on the interviews by hand or voice recording.
Stage three: AI as writing partner
Back on campus, the student uses AI to convert audio to text and/or to convert notes full of misspellings and sentence fragments into an outline. The student tweaks the outline and asks the chatbot to generate an essay draft based on it. To encourage more authorial agency, the instructor might ask students to generate three possible conclusions based on their notes. The student may include one or all of them, but should assess how the truth value of each stacks up against what she learned from her own field research. Finally, the student can adjust the draft to match her opinions and submit the essay along with an audit of the prompt history used in the process.
(ChatGPT makes it easy to export a log of your conversation as a zip file, though if you’re worried about feeding ChatGPT this human-gathered knowledge, you can turn off data collection in its settings.)
Surrendering the craft of writing to a machine
Are we throwing out the baby with the bathwater if, as the AI Sandwich assignment suggests, we delegate essential stages of the writing process to ChatGPT? I have relished watching my students shed misplaced modifiers and stilted sentences as they grow into their own authentic voices. But I also recognize that this expectation puts at a disadvantage students who don’t speak English as their first language, are neurodiverse, or didn’t grow up with access to good writing instruction.
As I’ve begun to accept this new perspective, my joy in reading expressive prose is starting to feel like admiring a calligrapher’s fluid strokes and elegant flourishes. Penmanship used to be considered an essential academic skill; I still remember the chagrin I felt when fourth-grade classmates mocked my misshapen cursive Q after I was absent the day that letter was taught. Some schools, like the Montessori and French systems, still value handwriting for developing spelling and hand-eye coordination. Yet in most American classrooms, penmanship has become an admirable but outdated flourish, while critics view it as a class marker once brandished by the elite to distinguish themselves from common folk. An English professor in 1876 admonished, “Both paper and envelopes should be of fine quality. It conduces to fine penmanship and perhaps inspires the writer with fine thoughts. Coarse paper, coarse language, coarse thoughts–all coarse things seem to be associates.”
Victorian snobbery aside, I believe excellent writers will still touch hearts in ways ChatGPT cannot reach, but that’s a higher bar than most writers require. Composition teachers use the word “mechanics” to describe rules like subject-verb agreement that undergird grammatical text. Perhaps most writers don’t need to master those rules if AI can perform the mechanics for them.
Anchoring student research in the here and now
The AI Sandwich exercise expects students to edit and personalize the AI-generated draft. Skeptics might question why students would bother fussing with a fully rendered text with near-perfect grammar and syntax. But if ChatGPT flubs the essay, it won’t be just international law or microeconomic theory it gets wrong. Students will have a warrant and personal investment in any knowledge they know firsthand or helped to create. “Wait, Millinocket isn’t on the Canadian border!” “I wouldn’t say my downtown is ‘thriving’ exactly.” “Grandma would never say that.” Not only might students produce knowledge that generative AI can’t, but they’re also put in a position to care that the AI gets it right.
While oral histories and local reporting are tasks an AI would struggle with, they’re not the only ones. The world abounds in information that never made it into ChatGPT’s training data. Budding citizen journalists can find it in small-town election results; citizen scientists can find it in seasonal temperatures of local ponds. (Fortuitously, GPT-4 has proven adept at converting notes into data and charts for scientific reports.) For more arcane topics, the world’s archives and museums still contain millions of historical, legal, and cultural artifacts that have never been digitized or put online.
Even students who can’t get off campus or travel to a brick-and-mortar collection can take advantage of the deep web, which by some estimates is 500 times larger than the surface websites and online forums that fed today’s AI chatbots. The automated harvesters that scraped together the Common Crawl, for example, cannot reach pages that live behind search forms and database queries: those pages don’t exist until a server-side script, often written in a language like PHP, pulls the information out of a database and renders it as HTML in response to a visitor’s request. That means humans can search for and read information on 1.5 million works in the Metropolitan Museum of Art that ChatGPT has never glimpsed.
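The crawling limitation described above can be illustrated with a toy sketch. (The database, object ID, and function name here are hypothetical stand-ins, not the Met’s actual system; the point is only that the HTML is conjured on demand, so no static page exists for a harvester to scrape.)

```python
# Hypothetical stand-in for a museum's collection database.
COLLECTION_DB = {
    "van-gogh-cypresses": {
        "title": "Wheat Field with Cypresses",
        "artist": "Vincent van Gogh",
        "year": 1889,
    },
}

def render_object_page(object_id: str) -> str:
    """Simulate a server-side script: fetch a record from the database
    and render it as HTML only when a visitor requests that object."""
    record = COLLECTION_DB[object_id]
    return (
        f"<h1>{record['title']}</h1>"
        f"<p>{record['artist']}, {record['year']}</p>"
    )

# A human researcher who submits the query sees the rendered page;
# a crawler that never issues the query never sees any HTML at all.
page = render_object_page("van-gogh-cypresses")
```

Until someone asks for a specific object, there is nothing at a fixed URL to harvest–which is why query-driven collections remain largely invisible to crawl-based training corpora.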
While the deep web represents a vast resource of information currently available only to human researchers, asking students to tap knowledge from present-day, local communities may be a good direction for universities as a whole. Working-class America has grown increasingly frustrated with higher education’s focus on intellectually exotic topics. Many of the lobstermen and construction workers who pay University of Maine tuition wonder why their kids seem to study only far-off places and times, or ideas too abstract to apply to real-world problems. While my job as a teacher is to broaden my students’ horizons, I’m doing them a disservice if I disconnect them from their communities in the process. On the other end of the political spectrum are activists concerned that universities are neglecting pressing issues of the day like climate change and economic empowerment. Both sets of stakeholders might appreciate seeing young people apply big ideas to local issues, like whether to build an east-west highway across Central Maine or how to prevent forever chemicals from contaminating family farms.
By anchoring course assignments in the communities outside of the ivory tower, AI-assisted local research can bring timeliness and relevance to the academy at a time when it needs it most. Render unto ChatGPT the things it is good at, and you can have your cake–er, sandwich–and eat it too.
Jon would like to thank Joline Blais for modeling how to connect student work to local communities and Sarah Newman of Harvard’s metaLab for identifying the investment students would have in local knowledge. Image based on graphic by Mohamed Hassan. No artificial intelligence was used in writing this essay.