Artificial intelligence (AI) and technologies such as ChatGPT have forced educators to urgently confront difficult questions. Consider the following pedagogical concerns:
- Can these technologies be used effectively without compromising the development of understandings and competencies?
- How and when will they influence curriculum and evaluation?
- Will the use of these technologies interfere with the development of concepts and skills?
Educators raised the anxieties listed above not in response to the late-2022 emergence of ChatGPT technology but in a 1977 National Education Association publication regarding the debate over minicalculator use in the classroom.1 While ChatGPT technology is new, teachers have navigated similar pedagogical concerns before. Furthermore, I believe that theological educators are especially well poised to leverage this opportunity for good in ways rarely imagined before.
Generative pre-trained transformer (GPT) technology utilizes machine learning based on an enormous collection of questions and examples to establish baseline answers, which are then fine-tuned for each specific user query. ChatGPT is a specific chatbot deployment of this technology developed by the company OpenAI.
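The "learn patterns from a large collection of examples, then predict what comes next" idea behind this technology can be illustrated with a toy sketch. The following is a deliberately simplified bigram model in Python, an illustration of pattern-based prediction only, not the actual GPT architecture, which relies on vast neural networks:

```python
from collections import Counter, defaultdict

# A tiny "training corpus" standing in for the enormous text
# collections that real GPT models learn from.
corpus = "in the beginning God created the heavens and the earth".split()

# Count which word tends to follow each word (a bigram model).
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the continuation seen most often in training."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("God"))  # the model has only ever seen "created" follow "God"
```

The point of the sketch is that the system has no understanding of its output; it reproduces the statistical patterns of its training examples, scaled up enormously in the real technology.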
This technology quickly produces bibliographic references, definitions of terms, explanations of scholarly debates, complete essays, and much more. When educators realized that much of this technology was freely available to students, they knew it couldn't be ignored. Knee-jerk responses feared the collapse of learning and an exponential rise in academic dishonesty, and produced pleas to ban the technology and restrict access to it on university networks.
Fortunately, after a brief cooling-off period, most educator responses to ChatGPT avoided the simple binary of “should we or shouldn’t we allow it.” The better responses engaged how to use this technology responsibly—and this is where past pedagogical pauses can begin to guide us. A 1974 article discussed the purely functional and purely pedagogical uses of calculators.2
The purely functional approach utilized the tool to undertake rote tasks in order to save time and trouble. The purely pedagogical method leveraged calculators as teaching and learning machines. We can apply this distinction between "time saver" and "teaching tool" to great effect in our classes. In the 1970s, teachers utilized calculator technology to provide immediate engagement and feedback in the learning process and to arrive at "solutions" more quickly.
The need for "contrived problems"—that is, problems that are intentionally simplistic for the sake of introductory learning—was minimized. This acceleration allowed, instead, the application of technology to real-world problems. Thanks to such advances, the benefits of technology leapt from secluded laboratories into neighborhoods with astonishing speed. I have adjusted (in italics) the 1974 statement on calculators from the National Council of Teachers of Mathematics in order to apply it to our current questions about ChatGPT:
Theological educators should recognize the potential contribution of ChatGPT as a valuable instructional aid. In the classroom, ChatGPT should be used in imaginative ways to reinforce learning and motivate the learner as he becomes proficient in theology and biblical studies.3
The key argument I’m making at this point is that AI technology provides an opportunity to become more efficient in our task as educators. But, before I give examples of how we can do that, I want to apply a theological lens to the discussion of AI pedagogy.
Theologically informed solutions
The scientific study of artificial intelligence often finds its way into theological territory. Noreen Herzfeld, professor of theology and computer science at St. John’s University, explains, “the potentiality of the computer to mimic human thought has … sparked a profound debate as to what it is that we wish to image with artificial intelligence, what it is that makes us truly human.”4 Classic models of the image of God, of what makes us “truly human,” fall into three categories.
First, the structural model highlights capacities that set humans apart from the rest of creation; the rational capacity of the mind and the volitional capacity of the will form the structural model, which brings to light the knowledge and righteousness of God. Second, the relational model illustrates the human capacity to relate, respond, and participate with God.
The third model focuses on the function of humans, examining the unique tasks God has given humans in the created world; these tasks include overseeing and filling creation.5 In sum, a holistic view of the image demonstrates that God created humans to know and participate with God through a special capacity to display the knowledge and righteousness of God in order to carry out the purposes of God in the world.
An addendum to this statement is that the image of God is not manifested in artificial intelligence. In 2019, a group of Christian leaders agreed that technology should not “be assigned a level of human identity, worth, dignity, or moral agency.”6 Their reasoning behind this proclamation is their belief that only humans possess the image of God. ChatGPT does not possess the image of God.
A litmus-test question that educators can bring to our pedagogical questions about ChatGPT is: what do image-bearers uniquely offer that is unavailable to technology? One set of answers to this question flows from the three-fold understanding of the image of God.
The structural view leads us to consider the inquisitive and creative capacities of humans. While ChatGPT can provide pre-packaged answers, humans supply the initiative and form the questions brought to this technology. Helping students form quality questions is a prime opportunity in this new era. Secular observers are making similar claims.
For example, Tomas Chamorro-Premuzic, professor of business psychology at University College London, predicts that "in the future smart humans will differentiate from others not in terms of what they know, but what they want to know."7
Harnessing human creativity is another area for exploration. Technology can quickly provide definitions and content for our students, but we can challenge them to utilize this information as building blocks, to form inventive and original combinations in synthetic ways (more on this below).
The relational view highlights our capacity to relate, respond, and participate with God and others. This approach might lead us to ask students more about their personal experiences in the past and present. No computer database contains the experiences of our students. When we assign presentations, debates, interviews, speeches, and panels, we leverage the relational view as students engage people in real life.
Perhaps our most exciting pedagogical opportunity is to challenge students to ask how education provides us opportunities to carry out the functional aspect of imaging God in the world. Students can deploy technology to do more of the brute work so they can process the question: how do we apply our learning to the needs of the world? ChatGPT might provide some of the ideas, but only humans can provide the “doing.”
Chamorro-Premuzic makes a similar claim: "the ability to turn that knowledge into actions—going from theory to practice—that will epitomize human experts."8 Advances in technology in recent decades and centuries have revolutionized our daily lives. What if the recent advances that ChatGPT provides for retrieving biblical and theological "data" could help us spend less time in retrieval and more time in application, utilizing well-established beliefs to exert our God-given capacities for the sake of God's plans for the world?
Technology can increase the opportunities for synthetic learning, as described above, and the functional approach takes these advances from theory to practice. Rightly utilized, perhaps we are on the verge of not just a technological revolution, but an ecclesiological revolution where the people of God collaborate in new and exciting ways never before seen.9
Practical examples for the ChatGPT-era classroom
We are now seeing some default recommendations regarding the emergence of ChatGPT in our classrooms. These include: reworking academic policies, including language in syllabi addressing what is acceptable and what isn't, reworking existing assignments and prompts, and assessing a student's process—not just the final product—by collecting outlines and drafts.10 Collaborating with our peers in higher education will be more necessary than ever in the coming years. Below are three examples of how we can integrate these reflections into the classroom.
First, as we consider the structural view, we can assign students to write questions as reflections on their reading and learning. One way I do this on a regular basis is my use of an “exit ticket.” In most of my classes, students earn participation points by filling out an online form at the end of class where they describe one thing they learned and a “lingering question.” I address many of my students’ questions in the first few minutes of the next class session. In the future, I plan to provide more assignments in which the student’s primary task is to create high-quality questions based on the reading and lectures. Writing the questions not only serves as an assignment; it also sets up a natural in-class discussion.
Another way to engage the structural view is to create assignments that provide opportunities for creativity. For example, an assignment could list several terms or ideas in one column and another set in a second column, and then ask the student to reflect on “the most natural, most unnatural, and most surprising” combinations of the two columns. For a class I teach later this semester, I could imagine an interesting response to, for instance, how Augustine would have responded to Luther’s 95 Theses (among an assortment of other options).
Second, the relational view provides instructors an opportunity for an “experience.” One of my colleagues begins his undergrad classes—much like a graduate seminar—by assigning a student to open the class with a three to five-minute recap of the pre-class reading to begin a class discussion of the text. While the student could prepare the content of the recap with ChatGPT technology, the experience of delivering the information, as well as the ensuing professor-led discussion requires more than mere retrieval and rephrasing of information.
As I mentioned earlier, presentations, debates, interviews, speeches, and panels are prime pedagogical tools for this theme. These approaches fit naturally for in-person classes, and teachers may appreciate that much of the grading will be completed within the class. Online asynchronous educators can design assignments utilizing data that only that student can obtain. These might include personal interviews (guided by university guidelines) and reflections on personal experiences. Assignments in this category require clear rubrics, since personal reflections and experiences can veer off-course and avoid substantive engagement.
To engage a more experiential mode of learning, I recently transitioned a presentation assignment to an in-class debate. In the past, a group presentation on the topic of dehumanization worked well. But, as my class sizes increased, the groups grew larger and the time for each group decreased, diminishing the usefulness of the assignment. I responded by changing the assignment to an in-class debate.
I kept the group sizes the same as before, but on the day of the debate I used a random number generator to determine which members would represent their group in the debate. This approach made some students nervous, but it also forced them to prepare far more thoroughly than they had for the previously dull presentations. Each team received eighty percent of their points for completing a pre-debate research template I provided and twenty percent for the debate performance. I am now considering altering the pre-debate research template to include a "What does ChatGPT suggest?" section.
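The mechanics are simple enough to sketch for instructors who want to try something similar. The group names, rosters, and point values below are illustrative assumptions, not details from my actual gradebook:

```python
import random

# Hypothetical roster: each debate group and its members.
groups = {
    "Group A": ["Ana", "Ben", "Carlos", "Dina"],
    "Group B": ["Eli", "Fatima", "Grace", "Hiro"],
}

# On debate day, a random draw decides who represents each group,
# so every member has an incentive to come prepared.
rng = random.Random(2023)  # fixed seed here only for reproducibility
representatives = {name: rng.choice(members) for name, members in groups.items()}

def team_score(template_complete: bool, debate_points: float, total: float = 100) -> float:
    """Eighty percent for the pre-debate research template,
    twenty percent for debate performance (0-1 scale)."""
    return 0.8 * total * template_complete + 0.2 * total * debate_points

print(representatives)
print(team_score(template_complete=True, debate_points=0.9))
```

The same draw could be done with slips of paper; the point is that the selection is unknown in advance, which distributes the preparation burden across the whole group.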
The most difficult, yet perhaps the most significant, pedagogical opportunity in light of ChatGPT is to engage the functional view of the image of God where we move from theory to practice—from learning to doing. Some course topics will accommodate this more easily than others. Additionally, we recognize that theological education serves a different purpose than does, for example, a service or social agency.
Meanwhile, I suggest that we should push further into these areas in our teaching and assignments. Colleagues in our mathematics departments frequently collaborate with other departments to find solutions for the needs of the world; why would theological educators hesitate? This is an area I'm challenging myself to apply further. The academic in me is wary of this for some reason; I think it is a matter of my pride and being perceived by my peers as less rigorous. Yet, as we see technology providing more of the brute work of research and writing in the humanities, it might be that we recalibrate our pedagogical aim back toward how the humanities can benefit more humans.
As I reflect on how ChatGPT influences my teaching, I remember the wisdom of Rabbi Jonathan Sacks from a reading I assign to freshman students every semester. Sacks writes, “Religion survives because it answers the three questions that every reflective person must ask: Who am I? Why am I here? How shall I live?”11 These are questions that only humans can truly answer.
This article was originally published in the April 2023 issue of Didaktikos. Slight adjustments, such as the title and subheadings, may have been added by an editor.
- Joseph R. Caravella, Minicalculators in the Classroom (Washington, D.C.: National Education Association of the United States, 1977), 6, 12.
- Leonard Etlinger, “The Electronic Calculator: A New Trend in School Mathematics,” Educational Technology 14.12 (December 1974): 43–45.
- Compare with Caravella, Minicalculators in the Classroom, 15.
- Noreen Herzfeld, In Our Image: Artificial Intelligence and the Human Spirit (Minneapolis, MN: Fortress, 2002), 1–2.
- Joshua A. Farris, An Introduction to Theological Anthropology: Humans, Both Creaturely and Divine (Grand Rapids, MI: Baker Academic, 2020), 84–91.
- Tomas Chamorro-Premuzic, “How ChatGPT Is Redefining Human Expertise: Or How To Be Smart When AI Is Smarter Than You,” Forbes.com, 21 January 2023, https://www.forbes.com/sites/tomaspremuzic/2023/01/12/how-chatgpt-is-redefining-human-expertise-or-how-to-be-smart-when-ai-is-smarter-than-you.
- Chamorro-Premuzic, “How ChatGPT Is Redefining Human Expertise.”
- For some reflections on the potential for collaboration in the church, see chapter 8, “Limited but Not Alone: The Church,” in Sean McGever, The Good News of Our Limits: Find Greater Peace, Joy, and Effectiveness Through God’s Gift of Inadequacy (Grand Rapids, MI: Zondervan, 2022), 141–60.
- A useful resource is: https://www.turnitin.com/resources/academic-integrity-in-the-age-of-AI.
- Jonathan Sacks, The Great Partnership: Science, Religion, and the Search for Meaning (New York: Schocken Books, 2011), 282.