Proceedings of International Conference on Computer Science and Communication Engineering (ICCSCE 2025)

Optimized Energy Consumption Prediction Using Dynamic Long Short Term Memory Based Deep Learning Technique

Authors
Sri Harish Nandigam1, 2, *, K. Nageswara Rao3
1Research Scholar, Department of EEE, Hindustan Institute of Technology and Science, Chennai, 600 016, Tamilnadu, India
2Senior Assistant Professor, Department of EEE, Sri Vasavi Engineering College, Pedatadepalli, Tadepalligudem, Andhra Pradesh, 534101, India
3Assistant Professor, Department of EEE, Hindustan Institute of Technology, Chennai, India
*Corresponding author. Email: rp.22703077@student.hindustanuniv.ac.in
Available Online 4 November 2025.
DOI
10.2991/978-94-6463-858-5_228
Keywords
Smart grids; Power management strategies; Long Short-Term Memory (LSTM); Gated Recurrent Unit (GRU); Dynamic LSTM; Power consumption data
Abstract

This research evaluates how effectively GRU, LSTM, and Dynamic LSTM architectures predict energy consumption from time-varying data, highlighting the distinct strengths and limitations of each. Long Short-Term Memory (LSTM) networks are noted for their ability to model complex temporal dependencies, making them well-suited to datasets with long-term patterns; however, they require careful hyperparameter tuning and extensive training data for optimal performance. Gated Recurrent Units (GRUs) are simpler to train and deploy than LSTMs, needing less training time and computational power, which makes them suitable when resources are limited or temporal patterns are relatively simple. Dynamic LSTMs, particularly those augmented with attention mechanisms, further improve forecasting accuracy by adaptively weighting time steps and features, allowing the model to focus on the most informative temporal and spatial patterns. Attention mechanisms are especially effective on non-stationary, non-linear time series such as energy consumption profiles, because they help the model capture both short-term and long-term dependencies. Applications in carbon price forecasting, demand forecasting, and microgrid energy management demonstrate the superior performance of attention-based models. Hybrid models that combine LSTM with wavelet decomposition or fuzzy polynomial networks raise accuracy further by fusing temporal and spatial features. Overall, LSTMs excel at modeling complex dependencies, GRUs offer computational efficiency, and dynamic LSTMs with attention mechanisms make energy consumption forecasting substantially more flexible and accurate.
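To make the attention-weighted "dynamic" LSTM idea in the abstract concrete, the sketch below shows one common way such a forecaster can be wired in PyTorch: an LSTM encodes the consumption window, an additive attention layer scores each hidden state, and the attention-weighted context vector feeds a one-step-ahead prediction head. This is a minimal illustrative sketch, not the authors' implementation; the class name, layer sizes, and input shape (batch, sequence length, features) are assumptions for demonstration only.

```python
# Illustrative sketch of an attention-weighted LSTM forecaster (hypothetical
# names and sizes; not the paper's model or hyperparameters).
import torch
import torch.nn as nn


class AttentiveLSTMForecaster(nn.Module):
    def __init__(self, n_features: int, hidden_size: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        # Scores each hidden state; softmax over time yields per-step weights.
        self.attn = nn.Linear(hidden_size, 1)
        self.head = nn.Linear(hidden_size, 1)  # one-step-ahead forecast

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.lstm(x)                           # (batch, seq_len, hidden)
        weights = torch.softmax(self.attn(out), dim=1)  # (batch, seq_len, 1)
        context = (weights * out).sum(dim=1)            # weighted sum over time
        return self.head(context).squeeze(-1)           # (batch,)


if __name__ == "__main__":
    # Toy usage: a 24-step window of a single consumption feature.
    model = AttentiveLSTMForecaster(n_features=1)
    x = torch.randn(8, 24, 1)                           # (batch, seq_len, features)
    print(model(x).shape)                               # torch.Size([8])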

Copyright
© 2025 The Author(s)
Open Access
Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.


Volume Title
Proceedings of International Conference on Computer Science and Communication Engineering (ICCSCE 2025)
Series
Advances in Computer Science Research
Publication Date
4 November 2025
ISBN
978-94-6463-858-5
ISSN
2352-538X
DOI
10.2991/978-94-6463-858-5_228
Cite this article

TY  - CONF
AU  - Sri Harish Nandigam
AU  - K. Nageswara Rao
PY  - 2025
DA  - 2025/11/04
TI  - Optimized Energy Consumption Prediction Using Dynamic Long Short Term Memory Based Deep Learning Technique
BT  - Proceedings of International Conference on Computer Science and Communication Engineering (ICCSCE 2025)
PB  - Atlantis Press
SP  - 2731
EP  - 2743
SN  - 2352-538X
UR  - https://doi.org/10.2991/978-94-6463-858-5_228
DO  - 10.2991/978-94-6463-858-5_228
ID  - Nandigam2025
ER  -