RNN Powered: Decoding Student Attentiveness
- DOI
- 10.2991/978-94-6463-858-5_272
- Keywords
- Student attentiveness; Recurrent Neural Networks (RNNs); Video data; Emotion analysis; Hierarchical Video Summarization; Engagement
- Abstract
This paper focuses on enhancing the detection of student attentiveness in online classes through the use of Recurrent Neural Networks (RNNs). The system leverages video data to assess student engagement by analyzing both emotions and behavioral cues over time. RNN-based models are employed to handle the temporal nature of video frames, allowing the model to track changes in student behavior and emotional responses throughout a session. To address the challenge of processing large video datasets, a Hierarchical Video Summarization technique is used to identify key frames that capture the most informative content. These key frames are then fed into the RNN for sequential analysis to detect patterns of engagement and disengagement. Face detection and emotion analysis are performed using deep learning methods, including a multi-scale face detection approach integrated with the RNN, which improves the system's ability to track faces and emotions across different resolutions and frame rates. EmoNet, a neural network tuned specifically for emotion classification, is incorporated to identify emotional states such as interest, boredom, frustration, and confusion. The network is optimized with hyperparameter tuning and regularization techniques to improve accuracy and generalization. The proposed RNN-powered system provides an unobtrusive method for real-time tracking of student attentiveness, offering educators actionable insights for improving online learning environments. The system is evaluated using accuracy, precision, recall, and F1-score, with a specific focus on student engagement in South Asian classrooms.
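To make the pipeline concrete, the following is a minimal sketch, not the authors' implementation: it assumes PyTorch, pre-extracted per-key-frame feature vectors (e.g. from a face/emotion network such as EmoNet), and a binary engaged/disengaged label, and it shows how summarized key frames can be fed to an RNN (here a GRU) for sequential engagement classification. All names, dimensions, and data below are hypothetical placeholders.

```python
# Minimal sketch (not the authors' code): a GRU over per-key-frame feature
# vectors, classifying a summarized video clip as engaged vs. disengaged.
# feature_dim, hidden_dim, and the toy data are assumptions for illustration.
import torch
import torch.nn as nn

class EngagementRNN(nn.Module):
    def __init__(self, feature_dim=128, hidden_dim=64, num_classes=2):
        super().__init__()
        self.rnn = nn.GRU(feature_dim, hidden_dim, batch_first=True)
        self.dropout = nn.Dropout(0.3)        # stand-in for the regularization the paper mentions
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        # x: (batch, num_key_frames, feature_dim) -- key frames from video summarization
        _, h_n = self.rnn(x)                  # h_n: (1, batch, hidden_dim), final hidden state
        return self.classifier(self.dropout(h_n.squeeze(0)))

# Toy usage: a batch of 4 clips, each summarized to 16 key frames.
model = EngagementRNN()
features = torch.randn(4, 16, 128)            # placeholder key-frame features
labels = torch.tensor([1, 0, 1, 1])           # 1 = engaged, 0 = disengaged
logits = model(features)
loss = nn.CrossEntropyLoss()(logits, labels)
loss.backward()                               # gradients for one training step
print(logits.argmax(dim=1))                   # predicted engagement per clip
```

From predicted labels like these, the reported metrics (accuracy, precision, recall, F1-score) could be computed with standard tooling such as scikit-learn's precision_recall_fscore_support.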
- Copyright
- © 2025 The Author(s)
- Open Access
- This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
- Cite this article
TY - CONF
AU - B. Sudha Madhuri
AU - Karri. Janshi
AU - CH. Tejaswi
AU - B. Rajeswari
AU - M. Dhatri
PY - 2025
DA - 2025/11/04
TI - RNN Powered: Decoding Student Attentiveness
BT - Proceedings of International Conference on Computer Science and Communication Engineering (ICCSCE 2025)
PB - Atlantis Press
SP - 3263
EP - 3273
SN - 2352-538X
UR - https://doi.org/10.2991/978-94-6463-858-5_272
DO - 10.2991/978-94-6463-858-5_272
ID - Madhuri2025
ER -