At a Glance
As AI tools like ChatGPT gain popularity on campus, instructors face new questions around academic integrity. Some worry that they could inadvertently award higher grades to students who use AI for coursework than to those who don’t. Others are concerned that reliance on AI tools could hinder students’ development of critical thinking skills. Whether or not you integrate these technologies into your courses, it’s important to reflect on how you’ll address them with students. How can you foster academic honesty and critical thinking when every student has easy access to generative AI?
In response to these concerns, some companies have developed “AI detection” software that aims to flag AI-generated content in student work. However, AI detectors are far from foolproof: they have high error rates and can lead instructors to falsely accuse students of misconduct (Edwards, 2023; Fowler, 2023). OpenAI, the company behind ChatGPT, even shut down its own AI detection tool because of its poor accuracy (Nelson, 2023).
In this guide, we’ll go beyond AI detection software. We’ll discuss how clear guidelines, open dialogue with students, creative assignment design, and other strategies can promote academic honesty and critical thinking in an AI-enabled world.
Set Clear Policies and Expectations
It’s important to be clear with your students about if, when, and how they should use AI in your courses (Eberly Center, n.d.; Schmidli et al., 2023). Here are some potential strategies:
- Announce your policies on AI use both in person and in writing. First, make sure to talk about these policies with your students during class at the beginning of the semester. It’s also essential to include the policies in your syllabus and course site (as recommended in MIT Sloan’s Generative AI Guiding Principles) so students can easily go back and reference your expectations (Teaching + Learning Lab, n.d.-b).
- Provide definitions of key terms like plagiarism and cheating in the context of generative AI tools.
- Share clear examples of appropriate versus inappropriate AI applications for specific tasks (Eberly Center, n.d.; Columbia Center for Teaching and Learning, n.d.). For example, you might allow students to use ChatGPT to brainstorm ideas or review grammar, but not to generate significant portions of essay content.
Setting clear expectations from the start can help you guide appropriate use of generative AI tools. Furthermore, by aligning our policies and practices with MIT Sloan’s Values, we can foster a culture of academic honesty and ethical leadership even as new technologies emerge.
Promote Transparency and Dialogue
In addition to transparent policies, you can support academic integrity through open conversations with your students. Consider these possible approaches:
- Hold class discussions where students can ask questions and share their perspectives about AI tools (Stanford Center for Teaching and Learning, 2023).
- Explain the rationale behind your AI policies so students understand that the goal is to facilitate meaningful learning—not just enforce compliance (Teaching + Learning Lab, n.d.-a).
- If your students will be using generative AI tools, establish clear expectations for how they’ll acknowledge and cite their use of these technologies (McAdoo, 2023). Note that OpenAI’s terms of use state that users may not “represent that output from the Services was human-generated when it is not.”
Open conversations can help you build trust with your students and learn from them as partners as we navigate these new challenges together.
Foster Intrinsic Motivation
Thoughtfully designed assignments can reduce the temptation to misuse AI by sparking students’ intrinsic motivation. These are some research-backed strategies for enhancing student engagement:
- Help students understand how completing a given assignment will support their learning (CAST, n.d.-a).
- Allow students flexibility to incorporate their interests and creativity through choices in project formats, topics, and methods (Usable Knowledge, 2016; CAST, n.d.-b).
- Build in opportunities for self-reflection and metacognition—for example, by asking students to reflect on what they’ve learned and how they learned it (Smith & Darvas, 2017).
- Scaffold assignments by breaking them into smaller pieces that build on one another—for example, asking students to submit an outline before writing their final paper (Sotiriadou et al., 2019; Eberly Center, n.d.).
- Give students opportunities to revise their work based on feedback before grading (Columbia Center for Teaching and Learning, n.d.).
- Connect assignments to real-world contexts and applications that are meaningful for your students (Sotiriadou et al., 2019; CAST, n.d.-c).
No assignment design can prevent all improper use of AI tools. However, thoughtfully designed activities can motivate students to invest meaningful effort.
Ensure Inclusive Teaching
If you don’t want students using AI in your course, it can be tempting to revert to analog forms of assessment. However, relying on handwritten exams, in-class writing, or oral presentations can raise equity concerns (Ceres, 2023):
- Timed, handwritten exams may present a distracting challenge for many students, since few today are accustomed to composing by hand. This format can especially disadvantage those who are unable to write quickly by hand (Tai et al., 2022).
- Oral presentations put extra stress on students with anxiety and non-native English speakers, who may then face additional challenges that their peers do not (Grieve et al., 2021).
- In-class writing assignments might not fairly assess all students’ written communication skills, especially if they don’t have the chance to revise their work.
Prioritizing student success means creating an environment where everyone has an equitable opportunity to demonstrate their capabilities. Ultimately, using a mix of assessment approaches in your course is the best way to maximize equity and inclusion (Centre for Teaching and Learning, n.d.; Eberly Center, n.d.).
Conclusion
As the educational landscape evolves with new generative AI tools, remember that the heart of teaching and learning is undeniably human. By proactively establishing clear policies around the use of AI in your course, you can help students use AI responsibly. By engaging in open dialogue, you can encourage them to think critically about how and when they use these tools. By designing assignments that align with students’ interests and goals, you can make learning experiences more meaningful. And by adopting fair assessment methods, you can give every student the opportunity to showcase their skills.
Generative AI tools will affect how today’s students experience education. However, it’s still the authentic, human-centered learning experiences that will stand out and leave a lasting impact on students.
References
CAST. (n.d.-a). UDL: Clarify the meaning and purpose of goals. UDL Guidelines. https://udlguidelines.cast.org/engagement/effort-persistence/goals-objectives
CAST. (n.d.-b). UDL: Optimize choice and autonomy. UDL Guidelines. https://udlguidelines.cast.org/engagement/recruiting-interest/choice-autonomy
CAST. (n.d.-c). UDL: Optimize relevance, value, and authenticity. UDL Guidelines. https://udlguidelines.cast.org/engagement/recruiting-interest/relevance-value-authenticity
Centre for Teaching and Learning. (n.d.). IncludED: A guide to designing inclusive assessments. University of Oxford Centre for Teaching and Learning. https://ctl.ox.ac.uk/included-designing-inclusive-assessments
Ceres, P. (2023, January 26). ChatGPT is coming for classrooms. Don’t panic. Wired. https://www.wired.com/story/chatgpt-is-coming-for-classrooms-dont-panic
Columbia Center for Teaching and Learning. (n.d.). Promoting academic integrity. Columbia University. https://ctl.columbia.edu/resources-and-technology/resources/academic-integrity
Eberly Center. (n.d.). Generative AI Tools FAQ. Carnegie Mellon University. https://www.cmu.edu/teaching/technology/aitools/index.html
Edwards, B. (2023, July 14). Why AI detectors think the US Constitution was written by AI. Ars Technica. https://arstechnica.com/information-technology/2023/07/why-ai-detectors-think-the-us-constitution-was-written-by-ai
Fowler, G. A. (2023, April 14). We tested a new ChatGPT-detector for teachers. It flagged an innocent student. The Washington Post. https://www.washingtonpost.com/technology/2023/04/01/chatgpt-cheating-detection-turnitin
Grieve, R., Woodley, J., Hunt, S. E., & McKay, A. (2021). Student fears of oral presentations and public speaking in higher education: A qualitative survey. Journal of Further and Higher Education, 45(9), 1281–1293. https://doi.org/10.1080/0309877X.2021.1948509
McAdoo, T. (2023, April 7). How to cite ChatGPT. APA Style. https://apastyle.apa.org/blog/how-to-cite-chatgpt
Nelson, J. (2023, July 24). OpenAI quietly shuts down its AI detection tool. Decrypt. https://decrypt.co/149826/openai-quietly-shutters-its-ai-detection-tool
Schmidli, L., Harris, M., Caffrey, A., Caloro, A., Klein, J., Loya, L., Macasaet, D., Schock, E., & Story, P. (2023, January 5). Considerations for using AI in the classroom. L&S Instructional Design Collaborative at the University of Wisconsin–Madison. https://idc.ls.wisc.edu/guides/using-artificial-intelligence-in-the-classroom
Smith, V. D. & Darvas, J. W. (2017). Encouraging student autonomy through higher order thinking skills. Journal of Instructional Research, 6, 29-34. https://eric.ed.gov/?id=EJ1153306
Sotiriadou, P., Logan, D., Daly, A., & Guest, R. (2019). The role of authentic assessment to preserve academic integrity and promote skill development and employability. Studies in Higher Education, 45(11), 2132–2148. https://doi.org/10.1080/03075079.2019.1582015
Stanford Center for Teaching and Learning. (2023, June 19). Pedagogic strategies for adapting to generative AI chatbots. Stanford Teaching Commons. https://teachingcommons.stanford.edu/news/pedagogic-strategies-adapting-generative-ai-chatbots
Tai, J., Mahoney, P., Ajjawi, R., Bearman, M., Dargusch, J., Dracup, M., & Harris, L. (2022). How are examinations inclusive for students with disabilities in higher education? A sociomaterial analysis. Assessment & Evaluation in Higher Education, 48(3), 390–402. https://doi.org/10.1080/02602938.2022.2077910
Teaching + Learning Lab. (n.d.-a). Rethinking your problem sets in the world of generative AI. Massachusetts Institute of Technology. https://tll.mit.edu/rethinking-your-problem-sets-in-the-world-of-generative-ai
Teaching + Learning Lab. (n.d.-b). Teaching & learning with ChatGPT: Opportunity or quagmire? Part III. Massachusetts Institute of Technology. https://tll.mit.edu/teaching-learning-with-chatgpt-opportunity-or-quagmire-part-iii
Usable Knowledge. (2016, September 11). Intrinsically motivated. Harvard Graduate School of Education. https://www.gse.harvard.edu/ideas/usable-knowledge/16/09/intrinsically-motivated