B. Hübler, V. Göhler
In digital education, keeping students engaged and focused is a significant challenge for educators. The intrinsic sense of connection experienced in a physical classroom often diminishes in a virtual environment, making it difficult for teachers to gauge and respond to students' attention and emotional states.
This research addresses the need for more sophisticated digital teaching tools by using eye-tracking and facial expression recognition technologies to monitor and analyse student engagement and emotional states during virtual classes. The resulting system enables instructors to assess and adapt their teaching methods so that students remain focused and engaged with the material.
We have developed and validated a multimodal attention and emotion analysis system for video data that integrates machine learning techniques, in particular convolutional neural networks (CNNs), to detect facial expressions and classify emotions, and uses eye movement analysis to measure attention levels.
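To make the emotion-classification component concrete, the following minimal sketch shows how a small CNN could map face crops to emotion classes; the architecture, 48×48 grayscale input size, and label set are illustrative assumptions and do not reflect the authors' actual model or training data.

```python
# Illustrative sketch only: a small CNN emotion classifier on 48x48 grayscale
# face crops. Architecture, input size, and labels are assumptions, not the
# system described in the abstract.
import torch
import torch.nn as nn

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

class EmotionCNN(nn.Module):
    def __init__(self, num_classes: int = len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 48 -> 24
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 24 -> 12
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2), # 12 -> 6
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 6 * 6, 256), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(256, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = EmotionCNN().eval()
    face_crop = torch.rand(1, 1, 48, 48)        # placeholder for a detected face crop
    with torch.no_grad():
        probs = model(face_crop).softmax(dim=1)
    print(EMOTIONS[int(probs.argmax())])        # predicted emotion label
```

In a full pipeline of this kind, per-frame emotion predictions would typically be combined with gaze-based attention estimates before feedback is presented to the instructor.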
Data were collected from university student participants (n = 18) in a controlled laboratory experiment to evaluate the accuracy and effectiveness of the system. The validation process included separate and combined assessments of the attention-tracking and emotion-detection methods. A simulated 12-minute videoconference with intentional distractors was used to measure attention, while emotion induction techniques elicited defined emotional states to test emotion detection accuracy. The system achieved a distractor detection rate of 70.6%.
The results highlight the potential of this multimodal analysis approach to provide real-time, actionable feedback to instructors, thereby enhancing interactivity and responsiveness in digital learning environments. Remaining limitations include the availability and quality of suitable cameras, bandwidth constraints, and the difficulty of motivating students to activate their cameras.
This research makes a significant contribution to the field of educational technology by presenting a novel method for augmenting digital instructional formats with advanced attentional and emotional analysis. The integration of these technologies promises to revolutionise the effectiveness of virtual education by fostering an environment where student engagement can be continuously monitored and optimised.
Keywords: Digital Education, Student Engagement, Eye-Tracking, Facial Expression Recognition, Machine Learning, Convolutional Neural Networks, Attention Analysis, Emotion Detection, Virtual Learning Environments, Educational Technology.