Abstract

The advantages offered by natural language processing (NLP) and machine learning enable students to receive automated feedback on their argumentation skills, independent of educator, time, and location. Although there is a growing body of literature on formative argumentation feedback, empirical evidence on the effects of adaptive feedback mechanisms and novel NLP approaches to enhance argumentative writing remains scarce. To help fill this gap, the present study investigates whether automated feedback and social comparison nudging enable students to internalize and improve logical argumentation writing abilities in an undergraduate business course. We conducted a mixed-methods field experiment to investigate the impact of automated feedback on the argumentative writing of 71 students. Students in treatment group 1 completed their assignment while receiving automated feedback, whereas students in treatment group 2 completed the same assignment while receiving automated feedback with a social comparison nudge that indicated how other students performed on the same assignment. Students in the control group received generalized feedback based on rules of syntax. We found that participants who received automated argumentation feedback with a social comparison nudge wrote more convincing texts with higher-quality argumentation than the two benchmark groups (p < 0.05). The measured self-efficacy, perceived ease of use, and qualitative data provide valuable insights that help explain this effect. The results suggest that embedding automated feedback in combination with social comparison nudges enables students to improve their argumentative writing skills by triggering psychological processes. Receiving only automated feedback in the form of in-text argumentative highlighting, without any further guidance, appears not to significantly influence students' writing abilities compared to syntactic feedback.
