Inproceedings

Stochastic Tree-Based Generation of Program-Tracing Practice Questions

Proceedings of the 50th ACM Technical Symposium on Computer Science Education, pages 91-97. ACM, February 2019.
DOI: 10.1145/3287324.3287492

Abstract

Recent work (Ericson et al. 2017; Zavala et al. 2018) has shown that mental program-execution exercises, in the form of Parsons puzzles or program tracing, are effective in improving student performance in introductory CS courses. This form of practice is promising because its low cost of creation and short duration (for the student) can promote the significant practice needed for learning. The goal of this paper is to enable wider use of such exercises through large-scale automated generation of short, multiple-choice mental-execution questions. The challenge in automation is to algorithmically generate effective distractors (plausible but incorrect choices) and to generate questions of varying levels of difficulty, where the difficulty level can be set by the instructor. In this paper, we propose a language-generalizable approach for automatically generating a practically unlimited number of such exercises, each constructed to a designated level of difficulty and incorporating the core programming-in-the-small themes: assignment, conditionals, loops, and arrays. The stochastic tree-based generation algorithm and a subsequent simulation of execution also enable generating effective distractors, since all possible execution paths are readily available in the tree at the time of generation; the distractors therefore correspond to reasonable (but ultimately incorrect) paths of execution. Furthermore, the approach is transferable to other languages with little effort. The generated questions are delivered through a mobile app that can be customized by the instructor to vary the questions generated and to introduce interleaving to take advantage of the spacing effect. Preliminary student feedback on the experience has been positive.
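The core idea the abstract describes — randomly generate a small program, simulate its execution to obtain the correct answer, and derive distractors from alternative execution paths visible at generation time — can be illustrated with a minimal sketch. The program representation, statement forms, and function names below are illustrative assumptions, not the paper's actual implementation; here the distractor is the result of taking the not-taken branch of a conditional.

```python
import random

def gen_program(rng):
    """Stochastically generate a tiny program as a list of statements.

    The statement set (assignment, conditional, loop) is a hypothetical
    stand-in for the paper's tree-based generator.
    """
    a, b = rng.randint(1, 9), rng.randint(1, 9)
    return [
        ("assign", "x", a),
        ("assign", "y", b),
        ("if_gt", "x", "y", ("assign", "x", a + b)),  # if x > y: x = a + b
        ("loop_add", "y", rng.randint(1, 3)),          # repeat n times: y += x
    ]

def simulate(prog, flip_branch=False):
    """Simulate execution and return x + y as the question's answer.

    With flip_branch=True, the conditional's outcome is inverted, yielding
    a plausible-but-incorrect execution path to use as a distractor.
    """
    env = {}
    for stmt in prog:
        if stmt[0] == "assign":
            env[stmt[1]] = stmt[2]
        elif stmt[0] == "if_gt":
            taken = env[stmt[1]] > env[stmt[2]]
            if taken != flip_branch:  # flip_branch inverts the branch decision
                _, var, val = stmt[3]
                env[var] = val
        elif stmt[0] == "loop_add":
            for _ in range(stmt[2]):
                env[stmt[1]] += env["x"]
    return env["x"] + env["y"]

def make_question(seed=0):
    """Generate one multiple-choice item: (program, correct answer, distractor)."""
    rng = random.Random(seed)
    prog = gen_program(rng)
    correct = simulate(prog)
    distractor = simulate(prog, flip_branch=True)  # wrong-branch execution path
    return prog, correct, distractor
```

Because the generator owns the full tree, every branch outcome is known at generation time, so each distractor corresponds to a coherent (if incorrect) trace rather than a random wrong number; difficulty could be tuned by varying program length or nesting depth.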
