
Generating Diverse Code Explanations using the GPT-3 Large Language Model

Proceedings of the 2022 ACM Conference on International Computing Education Research - Volume 2, pages 37-39. ACM, August 2022.
DOI: 10.1145/3501709.3544280

Abstract

Good explanations are essential to efficiently learning introductory programming concepts [10]. To provide high-quality explanations at scale, numerous systems automate the process by tracing the execution of code [8, 12], defining terms [9], giving hints [16], and providing error-specific feedback [10, 16]. However, these approaches often require manual effort to configure and explain only a single aspect of a given code segment. Large language models (LLMs) are also changing how students interact with code [7]. For example, GitHub's Copilot can generate code for programmers [4], leading researchers to raise concerns about cheating [7]. Instead, our work focuses on LLMs' potential to support learning by explaining numerous aspects of a given code snippet. This poster features a systematic analysis of the diverse natural language explanations that GPT-3 can generate automatically for a given code snippet. We present a subset of three use cases from our evolving design space of AI Explanations of Code.
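
For a rough sense of how such an explanation can be requested from GPT-3, the minimal Python sketch below calls the classic OpenAI completions API as it existed in 2022. This is an illustration only, not the authors' actual setup; the model name, prompt wording, and sampling parameters are assumptions made for the example.

    import openai

    openai.api_key = "YOUR_API_KEY"  # placeholder credential

    snippet = '''def mean(xs):
        return sum(xs) / len(xs)
    '''

    # Ask for a line-by-line, student-oriented explanation of the snippet.
    prompt = (
        "Explain, line by line, what the following Python function does "
        "for a student learning to program:\n\n" + snippet
    )

    response = openai.Completion.create(
        model="text-davinci-002",  # a GPT-3 model available in 2022
        prompt=prompt,
        max_tokens=200,
        temperature=0.7,  # higher temperature yields more varied output
    )

    print(response["choices"][0]["text"].strip())

Varying the prompt, for instance asking for a summary, an analogy, or a discussion of possible bugs, is one way to elicit the kinds of diverse explanations the poster analyzes.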



Tags

community
