Proceedings of the International Conference on Sustainability Innovation in Computing and Engineering (ICSICE 2024)

Efficient Sentiment Classification using DistilBERT for Enhanced NLP Performance

Authors
J. Nirmala Gandhi1, *, K. Venkatesh Guru1, A. Rajiv Kannan2, R. Anandha Sudhan3, S. Arul Kumar3, M. Bharathvaj3
1Assistant Professor, Department of Computer Science and Engineering, K.S.R. College of Engineering, Tiruchengode, Tamil Nadu, India
2Professor, Department of Computer Science and Engineering, K.S.R. College of Engineering, Tiruchengode, Tamil Nadu, India
3Student, Department of Computer Science and Engineering, K.S.R. College of Engineering, Tiruchengode, Tamil Nadu, India
*Corresponding author. Email: nirmalamuthu@gmail.com
Available Online 23 May 2025.
DOI
10.2991/978-94-6463-718-2_125
Keywords
Sentiment analysis; DistilBERT; Natural Language Processing; transformer models; real-time applications; domain adaptation; lightweight architecture; computational efficiency; knowledge graphs; multilingual sentiment analysis
Abstract

Sentiment analysis, one of the most critical Natural Language Processing (NLP) tasks, has recently risen in importance due to the explosive growth of unstructured textual data from social networks, e-commerce websites, and online news. Transformer-based models such as BERT achieve state-of-the-art results in sentiment classification but require heavy computational resources, making them impractical for real-time applications and resource-constrained environments. In this study, we explore whether DistilBERT, a distilled version of BERT, can deliver efficient sentiment classification while maintaining competitive performance. The research demonstrates DistilBERT's adaptability and scalability on custom domain datasets by exploiting its compact architecture and enhancing it with additional optimization techniques, including domain-specific fine-tuning, knowledge graph integration, and attention-mechanism refinements. The results show that DistilBERT remains competitive in accuracy while reducing inference time and resource usage, making it well suited to practical applications. We conclude that DistilBERT is a promising architecture for balancing speed and accuracy, with potential for further improvement in multilingual and low-resource scenarios.
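
To make the workflow concrete, below is a minimal sketch of fine-tuning DistilBERT for binary sentiment classification with the Hugging Face transformers library. The checkpoint, learning rate, sample texts, and label convention are illustrative assumptions, not the exact configuration reported in the paper.

# Minimal sketch: DistilBERT fine-tuning for binary sentiment classification.
# Checkpoint, hyperparameters, and sample data are illustrative assumptions.
import torch
from transformers import (
    DistilBertForSequenceClassification,
    DistilBertTokenizerFast,
)

tokenizer = DistilBertTokenizerFast.from_pretrained("distilbert-base-uncased")
model = DistilBertForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

texts = ["The product exceeded my expectations.", "Terrible service, never again."]
labels = torch.tensor([1, 0])  # assumed convention: 1 = positive, 0 = negative

# Tokenize with padding and truncation to DistilBERT's input limit.
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# One illustrative optimization step; a real run would loop over a dataset.
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
outputs = model(**batch, labels=labels)  # returns loss and logits
outputs.loss.backward()
optimizer.step()

# Inference: argmax over the logits gives the predicted sentiment class.
model.eval()
with torch.no_grad():
    preds = model(**batch).logits.argmax(dim=-1)
print(preds)

In practice, the single optimization step above would be replaced by a full training loop (or the transformers Trainer API) over a domain-specific dataset; the knowledge-graph integration and attention-mechanism refinements described in the abstract would be layered on top of this baseline.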

Copyright
© 2025 The Author(s)
Open Access
Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

Volume Title
Proceedings of the International Conference on Sustainability Innovation in Computing and Engineering (ICSICE 2024)
Series
Advances in Computer Science Research
Publication Date
23 May 2025
ISBN
978-94-6463-718-2
ISSN
2352-538X
DOI
10.2991/978-94-6463-718-2_125

Cite this article

TY  - CONF
AU  - J. Nirmala Gandhi
AU  - K. Venkatesh Guru
AU  - A. Rajiv Kannan
AU  - R. Anandha Sudhan
AU  - S. Arul Kumar
AU  - M. Bharathvaj
PY  - 2025
DA  - 2025/05/23
TI  - Efficient Sentiment Classification using DistilBERT for Enhanced NLP Performance
BT  - Proceedings of the International Conference on Sustainability Innovation in Computing and Engineering (ICSICE 2024)
PB  - Atlantis Press
SP  - 1500
EP  - 1511
SN  - 2352-538X
UR  - https://doi.org/10.2991/978-94-6463-718-2_125
DO  - 10.2991/978-94-6463-718-2_125
ID  - Gandhi2025
ER  -