E. Caro, J. Cara, J. Juan, J.M. Núñez de Prado
Ensuring the integrity of assessments in data analysis courses has become increasingly challenging with the rise of AI-assisted tools. This study explores the implementation of a secure digital evaluation framework for courses such as "Estadística" and "Análisis de Datos" at ETSII-UPM, built on the Moodle platform and the Safe Exam Browser (SEB). Our approach involves analyzing different online platforms that support R programming and evaluating their suitability for controlled assessment environments. The study reviews the technical implementation, student feedback, and instructor experiences in maintaining a fair evaluation process while harnessing the benefits of digital platforms. The findings offer insights into best practices for integrating R-based assessments in higher education while mitigating unauthorized AI use during exams.
This framework has been tested in the courses "Estadística," "Diseño de Experimentos y Modelos de Regresión," and "Análisis de Datos," which belong to the second- and fourth-year curricula of the Industrial Technologies Engineering, Chemical Engineering, and Organization Engineering degrees taught at the Escuela Técnica Superior de Ingenieros Industriales of the Universidad Politécnica de Madrid (UPM). Each year, approximately 1,200 students are evaluated through this system, providing a robust dataset for analyzing the effectiveness of the proposed methodology. One of the main challenges is that around 400 to 500 students take their exams simultaneously, which requires secure platforms with minimal risk of crashes or slowdowns to ensure a smooth assessment process.
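As a minimal sketch of the kind of R-based item such a framework can deliver through a Moodle question (the dataset, reference value, and tolerance below are illustrative assumptions, not actual exam material): the student runs a short analysis on the secured platform and submits a numeric result, which is then compared against a reference value.

# Hypothetical exam task: fit a simple linear regression on R's built-in
# 'cars' dataset and report the estimated slope.
model <- lm(dist ~ speed, data = cars)
slope <- unname(coef(model)["speed"])

# Grader-side check of the sort a Moodle numeric question applies:
# accept the submission if it falls within a small tolerance of the reference.
reference <- 3.9324   # known slope of dist ~ speed on 'cars'
tolerance <- 0.01
abs(slope - reference) < tolerance   # TRUE for a correct submission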
Keywords: Digital Assessment, R Programming, Safe Exam Browser, Moodle, AI-Resistant Evaluation.