Abstract 1572

CODE & REFLECT: STUDENT ENGAGEMENT WITH AUTOMATED FEEDBACK
S. Kazamia, J. Appleton, M. Cirovic, J. Lam
University of Surrey (UNITED KINGDOM)
The increasing demand for programming courses in higher education has prompted instructors to adopt tools that automate student support and deliver timely, personalised feedback. Such automated systems promise iterative improvement of student work and have been demonstrated to scaffold learning effectively, as evidenced by increased engagement and improved academic performance. However, a nuanced understanding of students' perspectives remains essential to successfully integrate automated feedback into large-scale programming courses.

Building on previous work, this study analyses survey data to explore students' attitudes, confidence, and self-regulatory behaviours when interacting with an automated, language- and task-agnostic feedback system deployed within a first-year programming module at a Higher Education (HE) institution. The platform is integrated directly into the institution's Learning Management System (LMS), reducing administrative overhead and improving accessibility for both instructors and students. Key features include an interactive "Lab Status Report" that provides immediate feedback, encouraging iterative student engagement, and an "Individual Learning Plan" that supports personalised, self-paced progression through programming concepts.

The findings from our survey contribute to the broader discourse on student engagement and motivation, highlighting that students perceive continuous automated feedback as instrumental in refining code quality and fostering a tangible sense of achievement. These perceptions are accompanied by notable gains in student confidence and improved discipline in self-organisation and time management. Nevertheless, concerns emerged around potential over-reliance on automated feedback, raising questions about its impact on the genuine development of independent programming skills.

This paper addresses two primary research questions based on the analysis of survey data: (1) To what extent do automated feedback tools enhance genuine programming confidence, enabling students to independently solve programming tasks and transfer these skills effectively? (2) Does extensive reliance on automated feedback tools negatively impact students’ ability to manage their time and resources independently, potentially hindering critical self-regulation skills? By investigating these research questions, we hope to highlight effective ways to use automated feedback tools, identify potential pitfalls, and recommend personalised strategies to support student learning outcomes in programming education.

Keywords: Higher education, authentic assessment, programming, automated assessment, formative feedback, student outcomes, confidence, student engagement.

Event: ICERI2025
Track: STEM Education
Session: Computer Science Education
Session type: VIRTUAL