Proceedings of the International Conference on Sustainability Innovation in Computing and Engineering (ICSICE 2024)

Dynamic Gesture Recognition using LSTM and Tf-Pose for Human Action Analysis

Authors
C. M. Karthik Sundar1, *, D. Hitheash1, G. SathyaDevi2
1UG Scholar, Department of Information Technology, St. Joseph’s College of Engineering, OMR, Chennai, 600119, Tamil Nadu, India
2Assistant Professor, Department of Information Technology, St. Joseph’s College of Engineering, OMR, Chennai, 600119, Tamil Nadu, India
*Corresponding author. Email: karthiksundarcm@gmail.com
Available Online 23 May 2025.
DOI
10.2991/978-94-6463-718-2_123
Keywords
Human Activity Recognition; Body Pose Estimation; Recurrent Neural Network; TensorFlow Pose Estimation; Real-time Motion Recognition; Skeletal based Motion Recognition; Movement Temporal Characteristics; Video Processing; Animal; Gesture Recognition
Abstract

Human Action Recognition (HAR) is an important application in many areas, including surveillance systems, human-computer interfaces, and sports monitoring. In this research, we investigate pose estimation as a first step, followed by Long Short-Term Memory (LSTM) networks for real-time action recognition. The system detects human-pose key points using the tf-pose-estimation library and feeds them to an LSTM model that captures temporal movement patterns across video frames. This integration enables the model to encode complementary spatial and temporal information from video sequences. The proposed framework targets action sequences such as jumping, running, and waving, and supports real-time inference on live video streams. The system is validated on several benchmark datasets, where it demonstrates promising accuracy and efficiency against counterpart methods. Additionally, operating on skeletal data rather than raw pixels reduces computational cost, making the model easy to deploy in low-resource settings. Experimental outcomes show that the proposed approach outperforms conventional action-recognition methods, primarily due to its emphasis on the temporal behaviour of the human skeleton. Hence, this study offers meaningful insight into increasing the efficiency and effectiveness of HAR systems within a deep learning framework.
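The pipeline described above — per-frame skeletal key points fed through an LSTM to summarise a clip's temporal dynamics — can be illustrated with a minimal sketch. This is not the authors' code: the 18-keypoint COCO-style skeleton, the hidden size, and the random stand-in weights are all assumptions for illustration; a trained model would learn the weights and add a classification layer over the final hidden state.

```python
import numpy as np

N_KEYPOINTS = 18          # COCO-style skeleton commonly produced by tf-pose estimation
INPUT_DIM = N_KEYPOINTS * 2   # (x, y) per key point, flattened per frame
HIDDEN_DIM = 32               # illustrative hidden size

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """One LSTM cell; weights are random stand-ins for trained parameters."""
    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        # Stacked gate weights: input (i), forget (f), candidate (g), output (o)
        self.W = rng.normal(0.0, 0.1, (4 * hidden_dim, input_dim + hidden_dim))
        self.b = np.zeros(4 * hidden_dim)
        self.hidden_dim = hidden_dim

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        H = self.hidden_dim
        i = sigmoid(z[0 * H:1 * H])   # how much new information to write
        f = sigmoid(z[1 * H:2 * H])   # how much old cell state to keep
        g = np.tanh(z[2 * H:3 * H])   # candidate cell update
        o = sigmoid(z[3 * H:4 * H])   # how much cell state to expose
        c_new = f * c + i * g
        h_new = o * np.tanh(c_new)
        return h_new, c_new

def encode_sequence(keypoint_frames, cell):
    """Run the cell over T frames of flattened key points; return final hidden state."""
    h = np.zeros(cell.hidden_dim)
    c = np.zeros(cell.hidden_dim)
    for frame in keypoint_frames:
        h, c = cell.step(frame, h, c)
    return h

# Example: a 30-frame clip of random key points standing in for detector output.
rng = np.random.default_rng(1)
clip = rng.uniform(0.0, 1.0, (30, INPUT_DIM))
cell = LSTMCell(INPUT_DIM, HIDDEN_DIM)
embedding = encode_sequence(clip, cell)
print(embedding.shape)  # (32,)
```

In a full system, the final hidden state would pass through a softmax layer over the action classes (jumping, running, waving), and frames would arrive from the pose estimator rather than a random generator.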

Copyright
© 2025 The Author(s)
Open Access
Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.


Volume Title
Proceedings of the International Conference on Sustainability Innovation in Computing and Engineering (ICSICE 2024)
Series
Advances in Computer Science Research
Publication Date
23 May 2025
ISBN
978-94-6463-718-2
ISSN
2352-538X
DOI
10.2991/978-94-6463-718-2_123

Cite this article

TY  - CONF
AU  - C. M. Karthik Sundar
AU  - D. Hitheash
AU  - G. SathyaDevi
PY  - 2025
DA  - 2025/05/23
TI  - Dynamic Gesture Recognition using LSTM and Tf-Pose for Human Action Analysis
BT  - Proceedings of the International Conference on Sustainability Innovation in Computing and Engineering (ICSICE 2024)
PB  - Atlantis Press
SP  - 1476
EP  - 1484
SN  - 2352-538X
UR  - https://doi.org/10.2991/978-94-6463-718-2_123
DO  - 10.2991/978-94-6463-718-2_123
ID  - Sundar2025
ER  -