
Teaching strategies for “the ChatGPT wave”: Transferable lessons from proctoring tools



In my popular culture research, a cultural movement is often given the referent of a “wave.” For example, the Hallyu movement of the 1980s to 2000s (the exact dating is debatable depending on the scholar you consult) refers to a “wave” of Korean popular culture spreading beyond the nation’s borders.

In my day-to-day work, I might apply the same referent to the conversation currently en vogue in the fields of teaching, learning, and academic integrity: in this instance, let’s call it “the ChatGPT wave.”

But first, a quick blast from the past [three years] for context:

Higher education conversations about assessment in digital learning environments rarely avoid a debate on academic integrity. From my experience—and likely yours—this specific debate maps onto a spectrum, ranging from “enforcing academic integrity with the latest and most stringent means available” to “recognizing that no perfect enforcement is possible and that enforcement alone does little to ensure student learning.”

My emphasis here is on two points, to be revisited very soon: (1) that no flawless enforcement of academic honesty is possible with a tool; and (2) that fixating on catching cheating, rather than on fostering student learning, leads to costly outcomes for all.

Perhaps this diversity of positions on assessment and academic integrity emerged most sharply during the emergency move to online learning at the start of the COVID-19 pandemic. The immediate legacy might be summed up in a few phases: faculty unrest demanding a technology-based solution to prevent students from cheating, a hasty adoption of an inadequate solution, uncomfortable and stressful assessments for both the faculty administering them and the students taking them with said inadequate solution, and then a quick abandonment of said inadequate solution over privacy violations (some of which are still in legal dispute, well within our region).

As we embark on the amazing frontier of AI (artificial intelligence) authoring tools, let us brace ourselves for the ChatGPT wave by remembering to prioritize student learning rather than hunting for cheaters. Here are some teaching strategies for AI authoring tools like ChatGPT, very much informed by our recent misadventures with proctoring tools:

Remember that a tool is not a human. Just as the highly touted and speedily adopted proctoring tools of yesteryear could not guarantee against or completely prevent cheating by a human student, ChatGPT and other AI tools share an obvious quality: ChatGPT is not a human student. A human demonstrates learning toward a specific learning outcome, whether by sharing a sentiment or committing an error that is irrevocably human. Looking for signs of life might mean creating space for students to show their human selves, perhaps by engaging them in conversation about something fun to them, posing a writing prompt more specific to their own lives and experiences, or assigning something creative or audio recorded. If you assign work that is generic and without connection to your students, expect machine-like responses.

Revise your learning objectives and corresponding activities for someone who wants to learn. As an instructor, I find my essential job description, whether I am teaching professional business writing or instructional design, is to facilitate meaningful learning experiences for my students. Many times, this essential charge prompts reflection on and revision of my coursework and assessment designs. Rising to the occasion of facilitating meaningful learning is an easy move when students want to learn. National enrollment in higher education has seen better days, so being interesting seems like a project of mutual interest for faculty.

Find help for the things you don’t know. Since my start in the field of teaching and learning support, I have seen resources and services grow rapidly in support of faculty teaching online and with instructional tools. It is highly likely that your place of teaching extends such resources and services to you, if only you seek them out. “Closed mouths don’t get fed,” as the saying goes, and in my experience, if you don’t ask for help, you will only fall further behind. Technologies are always updating and departments may shift in structure, but you can control your own course (pun intended) by seeking out the people whose job descriptions literally include helping you.

Learn about the tool’s development and limitations, and share this with your students. OpenAI, the developer behind ChatGPT, is very transparent about the tool’s testing process and its limitations as an AI authoring tool. Some key and critical limitations noted so far include a proclivity for outputs that are “toxic or biased” or that contain made-up facts, and an English-language bias and, therefore, a cultural bias “towards the cultural values of English-speaking people.” Having a conversation with your students about such limitations makes for transparency in your class while addressing the serious possibilities for misrepresentations of self. Who wants to be seen as toxic or treacherous?

If we have learned anything from the Test Cheating Scare of 2020, let us brace for this ChatGPT wave with clarity of purpose as instructors, and aim for human exchanges with our students.