Metacognitive Overload!: Positive and Negative Effects of Metacognitive Prompts in an Intelligent Tutoring System

Publication Information


  • Kathryn S. McCarthy, Arizona State University
  • Aaron D. Likens, Arizona State University
  • Amy M. Johnson, Arizona State University
  • Tricia Guerrero, Arizona State University
  • Danielle McNamara, Arizona State University


  • Pages: 420-438


  • Keywords: Intelligent tutoring systems, Metacognition, Reading comprehension, Log data


  • Abstract: Research suggests that promoting metacognitive awareness can increase performance in, and learning from, intelligent tutoring systems (ITSs). The current work examines the effects of two metacognitive prompts within iSTART, a reading comprehension strategy ITS in which students practice writing quality self-explanations. In addition to comparing iSTART practice to a no-training control, those in the iSTART condition (n = 116) were randomly assigned within a 2 (performance threshold: off, on) × 2 (self-assessment: off, on) design. The performance threshold notified students when their average self-explanation score fell below an experimenter-set threshold, and the self-assessment prompted students to estimate their self-explanation score on the current trial. Students who practiced with iSTART had higher posttest self-explanation scores and inference comprehension scores on a transfer test than students in the no-training control, replicating previously demonstrated benefits of iSTART. However, neither metacognitive prompt affected these learning outcomes. Moreover, in-system self-explanation scores indicated that the metacognitive prompts were detrimental to performance relative to standard iSTART practice. This study found no benefits of metacognitive prompts for performance during practice or after the completion of training. These findings support the idea that improvement in reading comprehension strategies comes from deliberate practice with actionable feedback rather than from explicit metacognitive supports.