The Alternative to AI in the Classroom Is Desiring to Learn
In the end, students want to learn, or they don't
There is a lot of anxiety among teachers right now, especially among composition instructors, about how to deal with the growth and increasing abilities of AI. But this isn't just an issue for teachers. It's a societal problem. The use of AI when writing shortcuts critical cognitive processes that we also use for making moral decisions, engaging in political debate, reading the Bible and other texts, understanding others' perspectives, articulating our own, knowing ourselves and our beliefs, and so on. A risk of AI is an illiterate and convictionless society, one in which we no longer process our own thoughts, values, emotions, or beliefs, and in which we are increasingly incapable of processing those of others. We offload that processing to a machine. If that sounds familiar, it should. Before the rise of AI we were already on this track. People were already reading fewer books and substituting vapid social media activity for civic engagement. Already people were forming their "beliefs" based on trending topics and YouTube influencers. But AI accelerates this trend. So this issue isn't just about teachers complaining that students aren't writing their own papers. This is an issue about the future of civic society. And I think we can all agree that civic society ain't looking so hot as it is.
For teachers, the anxiety comes from the fact that it isn't that difficult to take most freshman composition prompts, input them into ChatGPT, and get a decent B paper, or at least a few solid paragraphs you can shape into a B paper. And it won't be long before it can produce A papers. That's not just true for composition classes. It's a problem for history, political science, philosophy, theology, etc. Anytime you ask a student to write something, AI can produce a facsimile. And these facsimiles are not always easy to detect. Teachers who take the time to get to know each student's style through in-class writing can often detect significant changes in tone and style created by the use of AI. But I want you to consider how much time, attention, and memory that involves. How many writing samples can you memorize? 20? 30? 50? 100? And what if the student matures in their writing or takes their work to a writing center to get help? What you think might be AI could just be growth! You might think that teachers could fix this problem with more AI, but unfortunately, AI "detectors" can be wrong, leading to serious consequences, so that's not a viable option.
Another often recommended solution is to have students write everything in class, but this takes away valuable class time and does not prepare students to write research papers, the kind of work where they spend days or even weeks wrestling with a specific question, researching it, writing and editing, drafting and revising, and so on. That kind of work has tremendous educational value. Research papers require students to engage with ideas and the wider scholarly conversation in a way that prepares them to do their own research after they graduate (a valuable skill as disinformation continues to be a problem) and enter into public debate. And all that can't reasonably be done only in class. So what are we left with? I believe teachers, and society in general, are left with one conclusion: either students will choose to see the pursuit of wisdom and knowledge as good and worth the effort and act accordingly, or they won't.
It is my opinion that the main task for each teacher in confronting the challenge of AI is not to detect it, but to make the virtuous appeal to students that the only path to wisdom and knowledge goes through enduring hardship and persevering. I understand that many students, especially those with still-developing prefrontal cortices, simply don't care about wisdom or virtue, but I think it's wrong of us not to make an appeal to virtue at all. If we only talk in the language of discipline ("If you get caught, you'll get punished!"), they won't be reminded of their telos; they'll only be reminded of danger. Whereas virtue reminds them of what they were created to be. They were created to pursue Christlikeness, including wisdom.
So we begin by explaining to them that the process of writing, when done well, is working magic in their minds, making them into better thinkers, better readers, better neighbors, better citizens. That writing will help them know themselves and those around them. But that writing will also take hard work, just as all good things take hard work. And to use AI to help with that hard work will rob their minds of all those good things. It would be like going to the gym to lift weights only to have someone come along and lift them for you. You'll never grow stronger. You'll only waste your time.
This is the kind of message that students, that all of us, need to hear. Because at the root of AI is the temptation to turn to technique and maximize our efficiency at all costs, even when it hurts us. And right now, many people look at AI and look at their assigned writing prompts and wonder why it's wrong not to do the most efficient thing. After all, isn't efficiency good? At the very least, shouldn't they use ChatGPT for the brainstorming or outline? But of course, the answer is "no." The cognitive process of wrestling with ideas in your mind and ordering those ideas into a coherent argument is precisely the kind of thinking we're trying to mature in writing classes! But I hope you can sympathize with the pull of efficiency. Since we live in a society dominated by technique, it makes perfect sense for our students to think in those terms as well. In fact, they tend to think of education itself in those terms: education is a tool I use to maximize my efficiency; therefore, anything not directly contributing to that maximization is irrelevant, a mere hoop I have to jump through.
Which is one reason I think it's important for teachers to start using the language of "wisdom" in addition to the language of virtue. If, of course, you are imparting wisdom in your classroom, then use that language. You aren't offering a mere commodity. You are facilitating their growth in wisdom, which isn't always an "efficient" process (although it's not inefficient for inefficiency's sake, either).
Now, I grant that what I'm talking about here is swimming upstream. Do I expect most students to buy into my conviction that they should not use AI because wisdom and knowledge can only be attained through effort? I don't know. But here's what I do know. AI is only going to get better at mimicking human language, harder to detect, and cheaper to use. And someday, if my students don't make the ethical choice not to use it, there won't be a whole lot I can do to stop them. And if they make the wrong choice, they will suffer the loss of wisdom and knowledge, and we will suffer the decline in civic virtue.
There are some who would push back and say that my problem is that I'm trying to stop them in the first place: that I should just incorporate AI into the classroom and teach students how to use it responsibly. And while I can imagine some value in using it for things like formatting an annotated bibliography, such cases are few. All the meat of my classes comes in doing the work yourself. That's where the magic happens, the transformational growth and learning. At the time, students may not recognize or appreciate the growth, but later it pays dividends. Incorporating AI into my classroom would be inviting students to shortcut their education.
In the end, the problem of students cheating with AI comes down to choice. We must choose not to do all that we can do, as I've argued before, cribbing from Ellul. AI will only give us more and more options to shortcut our lives, and we must do the hard work of asking whether these shortcuts are ethical and whether they move us closer to our telos or further from it. Collectively, we need to inspire and encourage each other to achieve what we were created for, rather than settling for efficiency.
Yes! The process is the point.
I have tried this appeal. I don't know if it's working, but I am confident it's the only right appeal to make. My students are interested in public service (I teach econ and stats, mostly in professional Masters programs in public admin and international affairs). I appeal to their integrity and, honestly, to their pride, by saying: when you are in a room talking with decision makers and they ask you a question, how embarrassing would it be for you to have to say "I need to go look that up" when you could have *learned* it here and brought that knowledge into every room you enter? Basically, I say that there are going to be fewer and fewer people who are developing these skills for real, and I want my students to be among them. So it's a bit of a different argument, but I think it's related to wisdom, if not quite the same.
I am going to add your metaphor of outsourcing the lifting of weights at the gym - that's perfect! And that actually gets to a point I wanted to note here. The idea of efficiency (hey, I'm an economist, so I think about this sometimes) is always relative to a goal. You efficiently accomplish X. There are more and less efficient ways to, say, clear snow from my driveway. So the way "efficiency" is used in the AI context is not just crass (people always think economists are so crass about efficiency) - it is actually being used incorrectly much of the time. The same thing is not accomplished if AI does it. For our students, the learning is the outcome. The paper to be turned in is not the outcome, so being able to get the paper done more "efficiently" is a misnomer. The paper isn't the goal at all. I think I might talk to the students about that explicitly: they may not think of it this way, but if we all talk together it will be obvious that these papers are not the goal, because - what do they even do? They get handed to me; I look at them and give them back with a grade. They do not inform policy or change public strategies to solve problems. They are exercises. Literally. The weights at the gym capture this. Exercises are used to build muscles so that the actual goals can be reached. AI doesn't increase efficiency; it actually precludes the goal from being reached at all (which is certainly not efficient!). Can't wait to bring this to the students.