Like every other educator across the globe, I’m working every day to adapt my teaching methods to the world of AI. In a recent e-mail exchange with colleagues, a business professor made the claim that we should stop checking essays for AI and treat it just like a spell-checker. He wrote,
First, we must move beyond the outdated notion of “AI detection.” Do we currently penalize students for using spell-check or grammar-check tools? Of course not—we expect them to use them. Similarly, today’s AI writing tools are becoming an integral part of professional and academic writing. For instance, on Mac systems, “Writing Tools” powered by AI are already integrated. If students or employees submit writing without utilizing available tools to enhance clarity and accuracy, it is now perceived as unprofessional.
His claim that spell-check is a tool just like AI reflects a flawed understanding of both AI and the human mind. What follows is the email response I sent to our group.
Let me start by saying that I appreciate AI and I use it every day. I created an entire podcast called The Schaeffer Dialogues which uses AI. In principle, I support the idea that training our students to properly use AI is a critical part of the educational task. But what counts as “proper” and “improper” use? This is where our Christian worldview must shape our strategy.
As Christian educators, we recognize that the human mind is more than a machine. The human person is not reducible to its component parts. We are soulish beings with the capacity to reason and marvel at the wonder of God’s creation. However, this capacity must be nurtured, tested, and refined through academic discipline. This truth about the human mind is critical for discerning the pedagogical difference between spell-checkers as a tool for professionalism and AI as a tool for generating entire papers.
Spell-checkers fix a gap in knowledge or discrete errors which professors (or employers) rightly see as unprofessional. I have a visual impairment, so I often don’t see my spelling errors or missing punctuation marks. As the author of multiple books and articles, spell-check has been a Godsend! The AI built into my MacBook Pro is fabulous for proofreading, catching the errors I simply don’t see. However, AI is not limited to fixing discrete errors. AI is capable of producing entire essays which bypass the uniquely human process of critical thinking, problem-solving, creativity, cultural expression, and innovation.
This past week, I sat with Thomas, a student who used AI to generate his entire Christian worldview paper. I asked Thomas a series of questions about the words and concepts he wrote about.
“Thomas, please tell me: what does the word ‘fundamental’ mean?”
“Thomas, you said that Jesus was the incarnation of God, please tell me what that means?”
Thomas had no idea. His essay was professionally written, but he didn’t understand the words used or the concepts he passed off as his own. So does Thomas earn an A for his ‘professionalism’ and his ability to use AI to generate a document with no spelling errors?
Here’s an example of an essay about the nature of God submitted by my student Miguel. In this case, Miguel started by plagiarizing another student’s paper. He plugged the paper into an AI synonym generator to alter key words, enabling him to fool my university’s plagiarism software. Sadly, Miguel didn’t have the skillset needed to see the incomprehensible nature of the AI-generated changes. Here is just a sample of the paper he submitted.
The nature of God is often AJ question to think about, and even try to answer. To Christian, God is seen as the creator of all things, cleaning the universe. And Genesis war, God is seen as the creator, he created heaven in the earth, he made everything in six days. Yes, God, everything, and yes, he is a ruler of everything, he’s much more than words can say. He is more senate than anything we could imagine. Additionally, Christian that there is only one true God, but also just as a three person called the Trinity. The trinity Christian, in which God is asked three people. are common, in the Holy Spirit, who are united, and want substance of being. The training is fundamental and helps us with those and the nature of God.
Miguel’s use of AI illustrates why our students must learn to think before they can properly use AI. Basic knowledge, critical thinking, problem-solving, creativity, cultural expression, and innovation must be nurtured in our students, yet early reliance on AI short-circuits that process. Why? Because large language models (LLMs) have none of the human qualities we value in our poetry, our business plans, our financial planning, our medical diagnostics, our architecture, or our engineering.
LLMs are merely predictive models which produce content based on what is culturally normative, morally expected, or commonly believed true. But we, as Christian educators, should expect more from our students. We must expect work that reflects their humanity, encompassing their personality, creativity, beauty, reason, and innovation.
Which is more unprofessional? A business plan turned in with some spelling errors, or an error-free business plan that a student doesn’t know how to execute?
Which is better, an engineer who can use AI to predict which equation is typically used to calculate the load-bearing capacity of a steel member, or an engineer who actually knows how to derive that equation for an atypical design? As an engineering student, I learned the hard way about the problem of technological dependence. In my 400-level structures class, I didn’t listen to my professor and relied on my programmable HP calculator for the exams. Instead of developing my own reasoning skills, I ended up failing the course. The second time through the course, I disciplined myself and earned an A.
If you’ll allow me a dramatic illustration, when China launches an EMP to kick off the next world war, we’ll need men and women who know how to use a slide rule.
AI is not a spell checker because these two systems perform radically different functions. The spell-checker fixes a student’s mistake based on a dictionary-defined standard. In contrast, the AI does not “fix” anything. The AI is a stochastic parrot that generates the most likely answer to a prompt based on the law of averages. There is no mind to process logic or ability to weigh each claim based on a specific set of Christian values. The end product is merely a “consensus” answer based on the bias of the programmers who chose what data to feed into the algorithm. The issue here is that the consensus answer isn’t always the right answer, and a student who is unable to understand both the content and context will never spot the errors.
Our students need to hone the skills of thinking outside the black box of AI.
The field of ethics (bioethics, business ethics, etc.) is one example of where our capacity to think about the right use of emerging technologies is hindered by reliance on AI. If educators allow students to use AI before they establish a set of core values or before they can use logic to apply those values, AI will short-circuit the students’ ability to discern right and wrong.
When I ask a student, “What is your view of the nature of humanity?” I need to know not merely what they believe (or what AI generates as the ‘common’ answer). I need to know whether my student apprehends and appreciates the beauty of God’s image in every person. When it comes to debated issues such as IVF or human rights, can my students discern which laws violate our God-given dignity? Do my students have the moral courage to stand for what is right? AI-generated answers tell me none of those things.
As I often explain to students like Thomas and Miguel, you can get a better grade using AI, but the grade tells me nothing about your capacity to synthesize ideas or apply them to difficult circumstances.
Students are more than a grade. Their writing is a window into their heart, mind, and soul. The imperfect words of a human are more valuable to me as their teacher than the perfect words of a machine-generated text.
When I catch a student using AI to generate their entire essay, I remind them, “AI will help you pass the class, but it won't help you succeed in becoming the man or woman God has called you to be.” AI can help students get better grades, but AI can’t make them better humans.
Yes, our pedagogy must adapt, but we’re wrong to think AI is actually some kind of Artificial Intelligence akin to human intelligence. That is a naturalistic worldview of the mind, not a Christian worldview.
AI is not “Artificial Intelligence” at all. AI is merely an “Algorithmic Iteration” of data selected by a programmer that reflects mathematical probabilities, not human potentialities.
My most pressing concern is that AI is being used to keep students from failing when failure is what they need to experience. High school graduates who come into my classroom as freshmen are mentally fragile. They come in ill-prepared to face any intellectual challenge. Maria sat in my office this past week and said to me,
“Yes, I used AI to write my paper, but only because I write like a 5th grader. I was embarrassed to turn in my writing. I didn’t want to fail.”
The failure, however, is not Maria’s. The failure is on the part of Maria’s teachers who passed this young woman along without helping her grow as a human being. I gently helped Maria see that failure is not fatal. Failure can be a good thing for students if they use it to mature. Failure generates friction. Failure creates pressure. Maria’s response to that failure is what has the potential to refine her, to make her stronger, and to equip her for the bigger challenges of life that lie ahead.
Until you learn to fail, AI will only be a hindrance to academic success and human flourishing.
All this to say, to conflate spell-check with AI-generated content is a fundamental error Christian educators cannot afford to make in our mission to train the next generation.