The March Active Teaching Lab opened with a question that’s been quietly troubling many instructors: if students are using AI constantly, and if AI can produce plausible-sounding answers to almost anything we assign, how do we as educators adapt?
Sarthak Singh and Lin Deng, both doctoral students and instructors in the Lubar College of Business, argued that the answer to AI in the classroom is not the latest plagiarism detector; it’s leaning into our human ability to build trust, surface complexity, and create the conditions where students feel safe enough to think out loud.
Watch Singh and Deng walk through their teaching framework in the recording below. The notes beneath the video provide a brief overview of their methodology.
Why “Rehumanizing” Engagement Matters
Students know AI negatively impacts their learning, but the ease with which AI can help complete homework, answer discussion prompts, or fill in quiz responses makes it a siren song too tempting to resist. According to the 2024 Digital Education Council Global AI Student Survey, 86% of students reported using AI tools to help complete coursework, and more than half were doing so on a weekly or daily basis.
At the same time, more than half of the students surveyed believe that an over-reliance on AI in teaching decreases the value of their education and undermines their own academic performance. These students were eager for greater guidance in using AI ethically and effectively, but felt their universities did not provide adequate training in how and when to use AI in their courses.
But how do we guide students in the ethical use of AI, combat AI academic misconduct, and achieve our regular semester-long learning objectives?
Sarthak and Lin suggested that a way forward could be found by rehumanizing student engagement. Instead of just asking students to complete an assignment on their own, we build activities that help students connect to each other, think critically about course content, and move beyond the textbook answers at which AI excels.
The Preview-Practice-Reflection (PPR) Framework
Sarthak and Lin developed a method for putting this ‘rehumanized’ teaching practice into place: the Preview-Practice-Reflection (PPR) Framework.
PPR gives instructors a transferable three-part structure for any lesson. It helps build the empathy, trust, and connection Sarthak and Lin identified as crucial to student engagement. At the same time, it gives students a guided way to encounter the uses and limitations of AI in the context of their own course material. Each of the framework’s three stages has a distinct purpose, and together they move students from passive consumption to active, critical engagement.
Preview
Before class, or at the start of a session, students use AI to generate an initial explanation of the topic or concept at hand. The goal isn’t to get a good answer but to get an answer they can interrogate.
Then, together in groups or as a class, students discuss the problems they identified in the AI’s response. For example, they might explore what information is missing, whose perspective is absent, and what the AI oversimplified.
Lin Deng’s cross-cultural management example is a useful illustration. She asked students to prompt AI about individualist versus collectivist cultures and whether the United States is an individualistic society. The AI gave a confident, textbook-accurate answer: the United States is highly individualistic. Then, in class, Lin asked students: “If you had extra money you didn’t need, would you give it to a homeless person or to your family and friends?” Lin noted that more than 90% said family, which is a very collectivist response.
The AI had told her class what culture theory says about the United States, but it had no way to surface lived reality or the importance of context to any given question. That gap between the AI’s confident generalization and the students’ own experience became the starting point for a discussion of what individualism and collectivism actually mean.
Practice
In the Practice stage, students work in small groups to take one of the gaps or missing voices they identified in the Preview stage and develop it further. Groups use course readings, primary sources, case studies, or other discipline-specific materials to pressure-test their reasoning and compare what the AI gave them against what the evidence actually says.
The Practice stage not only gives students the opportunity to interrogate their use of AI; it also offers instructors a natural opening to discuss AI in the context of their own discipline. Instructors can help students see both where AI use harms their learning and where it can help them develop better ideas more quickly.
Reflection
The Reflection stage is metacognitive. Students are not asked to critique the AI, restate what they learned, or evaluate the lesson. Instead, they reflect on their own thinking process: What question did you ask today that you couldn’t have asked at the start of class? What would you need to find out next?
This stage invites students to see themselves as thinkers who are developing and challenging received content, not just containers for facts. It also surfaces the experience of productive uncertainty: realizing you know more than you did, and that knowing more means you now have better questions, not necessarily the final answer.
Furthermore, Lin noted that students in her courses who use this kind of structured reflection report feeling seen and heard. That experience of being recognized as a person with a perspective, rather than a student performing for a grade, improves engagement and counters many students’ instinct to stay invisible.
Build a PPR Activity for Your Course
To see what a PPR activity might look like in a course you teach, copy this prompt into your preferred AI tool. Fill in the bracketed sections before submitting. Click here for a sample PPR activity drafted using the prompt below.
I am an instructor teaching [COURSE NAME OR SUBJECT AREA] at the college or university level. I want to design a Preview-Practice-Reflection (PPR) activity for my students around the topic of [SPECIFIC TOPIC OR CONCEPT FROM YOUR COURSE].
Please build a complete PPR activity using the following structure:
Preview: Write a student-facing AI prompt on this topic — or note that students can use readings to develop a prompt of their own — that asks for a plain-language explanation and one real-world example or consequence. The prompt should be specific enough to be useful but open enough that students can identify gaps, missing voices, or oversimplifications in the response. Then provide three debrief questions the instructor can use with the full group to surface what the AI got right, what it oversimplified, and whose perspectives are absent.
Practice: Describe how students working in groups of 3 might identify one gap from the Preview debrief and build a scenario around it. Provide one example scenario that could emerge from this process, written from the perspective of someone living inside the historical moment, policy decision, scientific debate, or professional situation relevant to [SPECIFIC TOPIC OR CONCEPT FROM YOUR COURSE]. Include two or three primary sources, case studies, data sets, or discipline-specific readings an instructor might use to help groups pressure-test their reasoning — and two probing questions the instructor can use while circulating.
Reflection: Write one exit ticket prompt (suitable for a 3–5 minute written response) that asks students to reflect on their own thinking and inquiry process — not simply to restate what the AI got wrong.
Where possible, reflect the norms, methods, and source types typical of [DISCIPLINE OR FIELD — e.g., history, nursing, economics, art education, biology].