Applying IRT to Distinguish Between Human and Generative AI Responses to Multiple-Choice Assessments

Alona Strugatski, Giora Alexandron

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Generative AI is transforming the educational landscape, raising significant concerns about cheating. Despite the widespread use of multiple-choice questions (MCQs) in assessments, the detection of AI cheating in MCQ-based tests has been almost unexplored, in contrast to the focus on detecting AI cheating in text-rich student outputs. In this paper, we propose a method based on the application of Item Response Theory (IRT) to address this gap. Our approach operates on the assumption that artificial and human intelligence exhibit different response patterns, with AI cheating manifesting as deviations from the expected patterns of human responses. These deviations are modeled using Person-Fit Statistics (PFS). We demonstrate that this method effectively highlights the differences between human responses and those generated by premium versions of leading chatbots (ChatGPT, Claude, and Gemini), and that it is also sensitive to the amount of AI cheating in the data. Furthermore, we show that the chatbots differ in their reasoning profiles. Our work provides both a theoretical foundation and empirical evidence for the application of IRT to identify AI cheating in MCQ-based assessments.
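The person-fit approach the abstract describes can be illustrated with the standardized log-likelihood statistic lz (a widely used PFS), computed under a two-parameter logistic (2PL) IRT model. This is a minimal sketch of the general technique, not the paper's implementation; the item parameters, ability value, and response patterns in the usage example are invented for illustration.

```python
import numpy as np

def p_2pl(theta, a, b):
    """2PL item response function: probability of a correct answer
    for ability theta, discrimination a, and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def lz_statistic(responses, theta, a, b):
    """Standardized log-likelihood person-fit statistic (lz).

    Strongly negative values flag response patterns that are unlikely
    for a human respondent of ability theta -- e.g. missing easy items
    while answering hard ones correctly.
    """
    u = np.asarray(responses, dtype=float)
    p = p_2pl(theta, np.asarray(a, dtype=float), np.asarray(b, dtype=float))
    # Observed log-likelihood of the response pattern
    l0 = np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))
    # Expected value and variance of l0 under the model
    e = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
    v = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)
    return (l0 - e) / np.sqrt(v)

# Illustrative five-item test: equal discrimination, difficulties from easy to hard
a = np.ones(5)
b = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
theta = 0.0

# A pattern consistent with theta (easy items right, hard items wrong)
# versus an aberrant pattern (easy items wrong, hard items right)
lz_consistent = lz_statistic([1, 1, 1, 0, 0], theta, a, b)
lz_aberrant = lz_statistic([0, 0, 0, 1, 1], theta, a, b)
```

The aberrant pattern yields a much more negative lz than the consistent one; in a detection setting, patterns below a chosen negative threshold would be flagged as misfitting.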

Original language: English
Title of host publication: 15th International Conference on Learning Analytics and Knowledge, LAK 2025
Pages: 817-823
Number of pages: 7
ISBN (Electronic): 9798400707018
DOIs
State: Published - 3 Mar 2025
Event: 15th International Conference on Learning Analytics and Knowledge, LAK 2025 - Dublin, Ireland
Duration: 3 Mar 2025 - 7 Mar 2025

Publication series

Name: 15th International Conference on Learning Analytics and Knowledge, LAK 2025

Conference

Conference: 15th International Conference on Learning Analytics and Knowledge, LAK 2025
Country/Territory: Ireland
City: Dublin
Period: 3/03/25 - 7/03/25

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Education
  • Information Systems
  • Computer Graphics and Computer-Aided Design
  • Computer Networks and Communications
  • Information Systems and Management
