This study examines the critique resolution strategies that students in introductory computer science and engineering courses employ when responding to automated critiques of their code. Drawing on data from Computer Science (CS) and Engineering Fundamentals (EF) students using an automated feedback system, we investigate how students adapt their coding behavior in response to critiques and how these adaptations evolve across multiple submissions. The tool, a code critiquer designed specifically for educational environments, provides real-time feedback on syntax, logic, and stylistic errors in student code. Through a multi-semester analysis of student behavior data, we compare the performance of students enrolled in CS and EF courses and their approaches to resolving critiques. This cross-disciplinary comparison illuminates distinct patterns of problem solving, engagement, and learning among students with varying technical backgrounds.
The data for this study are drawn from three semesters of course assignments in which the critiquer system was deployed in both CS and EF courses. The system automatically logged each submission, capturing measures such as time between submissions, antipatterns identified, critique counts, and time to critique resolution. We analyze these behavioral patterns by assignment, allowing us to assess trends over time and across academic disciplines.
By understanding how students from different disciplines engage with automated feedback, we can tailor these tools more effectively to meet the diverse needs of learners. These insights have implications for both the design of automated feedback systems and the pedagogical strategies employed in introductory programming courses. The study further proposes enhancements to the critiquer system that include customizable feedback options for different educational contexts and improved usability for both students and instructors.
The full paper will be available to logged-in and registered conference attendees once the conference starts on June 22, 2025, and to all visitors after the conference ends on June 25, 2025.