The Sensitivity of a Scenario-Based Assessment of Written Argumentation to School Differences in Curriculum and Instruction

Publication Information

Authors:

  • Paul Deane, Educational Testing Service
  • Joshua Wilson, University of Delaware
  • Mo Zhang, Educational Testing Service
  • Chen Li, Educational Testing Service
  • Peter van Rijn, ETS Global
  • Hongwen Guo, Educational Testing Service
  • Amanda Roth, Educational Testing Service
  • Eowyn Winchester, Educational Testing Service
  • Theresa Richter, Educational Testing Service

Pages:

  • 57–98

Keywords:

  • Scenario-based assessment, SBA, Writing, Assessment, Automated writing evaluation, AWE, Natural language processing, NLP, Automated essay scoring, AES, Writing process, Keystroke log, Argumentation, Interim assessment, Formative assessment

Abstract:

  • Educators need actionable information about student progress during the school year. This paper explores an approach to this problem in the writing domain that combines three measurement methods intended for interim-assessment use: scenario-based assessments (SBAs) to simulate authentic classroom tasks, automated writing evaluation (AWE) features to track changes in performance, and writing-process traits derived from keystroke logs. Our primary goal is to determine whether SBAs designed to measure English Language Arts skills, supplemented by this richer measurement of the writing task, function well as interim assessments that are sensitive to performance differences related to differences in the quality of instruction. We calibrated these measures psychometrically using data from a prior study and then applied them to evaluate changes in performance in one suburban and two urban middle schools that taught argument writing. Of the three schools, only School A (the suburban school, which had the strongest overall performance) showed significant score increases on an essay task, accompanied by distinctive patterns of improvement. A general, unconditioned growth pattern was also evident. These results demonstrate an approach that can provide richer, more actionable information about student status and changes in student performance over the course of the school year.