Short-answer responses to STEM exercises: Measuring response validity and its impact on learning

Andrew Waters, Phillip Grimaldi, Andrew Lan, Richard G. Baraniuk

Research output: Contribution to conference › Paper › peer-review

2 Scopus citations

Abstract

Educational technology commonly leverages multiple-choice questions for student practice, but short-answer questions hold the potential to provide better learning outcomes. Unfortunately, students in online settings often exhibit little effort when crafting short-answer responses, instead producing off-topic (or invalid) responses that do not relate to the question being asked. In this study, we consider the effect of entering on-topic short-answer responses on student learning and retention. To do this, we first develop a machine learning method that automatically labels student open-form responses as either valid or invalid using a small amount of hand-labeled training data. Then, using data from several high school AP Biology and Physics classes, we present evidence that providing valid short-answer responses yields a positive educational benefit on later practice.
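The abstract does not specify the classifier used to label responses. As a rough illustration only, the sketch below shows one common way such a validity classifier could be built with a small hand-labeled set: TF-IDF features plus logistic regression via scikit-learn. The library choice, the example responses, and the labels are all placeholder assumptions, not the authors' actual method.

```python
# Hypothetical sketch of a valid/invalid response classifier.
# Assumption: a simple bag-of-words text classifier; the paper's
# actual model is not described in this record.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Small hand-labeled training set (placeholder examples).
labeled_responses = [
    "Osmosis moves water across the membrane toward higher solute concentration.",
    "idk",
    "The net force is zero, so the object moves at constant velocity.",
    "asdf whatever",
]
labels = [1, 0, 1, 0]  # 1 = valid (on-topic), 0 = invalid (off-topic)

# TF-IDF unigrams/bigrams feeding a logistic regression classifier.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(labeled_responses, labels)

# Automatically label the remaining, unlabeled student responses.
unlabeled = ["Momentum is conserved because no external force acts.", "lol no"]
predictions = clf.predict(unlabeled)
print(predictions)
```

In practice, a classifier like this would be trained on the small hand-labeled sample and then applied to the full corpus of open-form responses before estimating learning effects.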

Original language: English (US)
Pages: 374-375
Number of pages: 2
State: Published - Jan 1 2017
Event: 10th International Conference on Educational Data Mining, EDM 2017 - Wuhan, China
Duration: Jun 25 2017 - Jun 28 2017

Keywords

  • Best educational practices
  • Cognitive psychology
  • Machine learning
  • Mixed effect modeling
  • Natural language processing

ASJC Scopus subject areas

  • Computer Science Applications
  • Information Systems
