Proceedings of the 2025 2nd International Conference on Mechanics, Electronics Engineering and Automation (ICMEEA 2025)

Transformer and Its Application and Advances in Text Generation

Authors
Tong Wu1, *, Xiangzhe Xu2
1School of Electronic Engineering, Xi’an University of Posts and Telecommunications, Xi’an, 710061, China
2College of Automation, Hangzhou Dianzi University, Hangzhou, 310018, China
*Corresponding author. Email: atharp1@my.apsu.edu
Available Online 31 August 2025.
DOI
10.2991/978-94-6463-821-9_44
Keywords
Transformer model; text generation; multimodal approaches; domain-specific applications
Abstract

The Transformer model holds a pivotal position in NLP (natural language processing), demonstrating significant advantages in text generation tasks. This paper explores its applications and recent advances across domains. First, multimodal and syntax-controlled text generation approaches are introduced. These include the VCT (Vision-enhanced and Consensus-aware Transformer), which integrates visual information with consensus knowledge, and the syntax-guided model GuiG; both enhance semantic and syntactic quality through cross-modal interactions and dynamic attention mechanisms. Second, a comparative analysis of LLMs (large language models) such as BART (Bidirectional and Auto-Regressive Transformer), T5 (Text-To-Text Transfer Transformer), and PEGASUS (Pre-training with Extracted Gap-sentences for Abstractive Summarization) validates the Transformer’s superiority in handling long-term dependencies and maintaining consistency in automatic summarization. Domain-specific applications are further examined: biomedical text generation models (e.g., BioGPT) improve accuracy via domain adaptation, while task-oriented dialogue systems (e.g., MegaT) improve practicality through hybrid architectures. Addressing security challenges, a hybrid detection framework combining bidirectional LSTM (long short-term memory) and attention mechanisms achieves high-precision classification of AI-generated text. This study systematically reviews the technological evolution and boundaries of Transformer models, offering insights for advancing text generation technologies, refining domain-specific designs, and mitigating security risks posed by generative content.

Copyright
© 2025 The Author(s)
Open Access
Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

Volume Title
Proceedings of the 2025 2nd International Conference on Mechanics, Electronics Engineering and Automation (ICMEEA 2025)
Series
Advances in Engineering Research
Publication Date
31 August 2025
ISBN
978-94-6463-821-9
ISSN
2352-5401
DOI
10.2991/978-94-6463-821-9_44
Cite this article

TY  - CONF
AU  - Tong Wu
AU  - Xiangzhe Xu
PY  - 2025
DA  - 2025/08/31
TI  - Transformer and Its Application and Advances in Text Generation
BT  - Proceedings of the 2025 2nd International Conference on Mechanics, Electronics Engineering and Automation (ICMEEA 2025)
PB  - Atlantis Press
SP  - 425
EP  - 435
SN  - 2352-5401
UR  - https://doi.org/10.2991/978-94-6463-821-9_44
DO  - 10.2991/978-94-6463-821-9_44
ID  - Wu2025
ER  -