Proceedings of the 2025 2nd International Conference on Electrical Engineering and Intelligent Control (EEIC 2025)

Research on Specialized Chips for Artificial Intelligence Inference

Authors
Yixuan Yang1, *
1College of Electronic Information and Optical Engineering, Nankai University, Tianjin, China
*Corresponding author. Email: 2211668@mail.nankai.edu.cn
Available Online 23 October 2025.
DOI
10.2991/978-94-6463-864-6_4
Keywords
AI Inference Chips; Compute-In-Memory (CIM); Edge AI; Hardware–Software Co-Design
Abstract

AI inference chips play a central role in achieving high-performance, low-energy computation across domains such as autonomous systems, smart devices, and cloud services. Because AI models are growing at an unprecedented rate, traditional general-purpose processors struggle to meet their computing and energy-efficiency demands, creating a clear need for hardware-specific solutions. This article focuses on the design and optimization of AI inference chips for both server-side and edge applications. It examines key enabling technologies, including mixed-precision computing, non-volatile compute-in-memory (nvCIM) circuits, and 3D chiplet stacking, and then validates these techniques in real-world settings. Challenges such as thermal management and scalable manufacturing remain significant obstacles. The findings emphasize the interplay between hardware and algorithms, advocate for hybrid architectures, and offer insights for developing future AI chip solutions that are energy-efficient, readily scalable, and operationally flexible.
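To make the mixed-precision idea mentioned above concrete, the following is a minimal illustrative sketch (not taken from the paper): weights are quantized to INT8 while activations and accumulation stay in FP32, which is one common way inference chips trade a small amount of accuracy for large memory and energy savings. The function names quantize_int8 and mixed_precision_matmul are hypothetical and chosen only for this example.

```python
# Hypothetical sketch of mixed-precision inference: INT8 weights, FP32 activations.
import numpy as np

def quantize_int8(w: np.ndarray):
    """Symmetric per-tensor quantization of FP32 weights to INT8."""
    scale = np.max(np.abs(w)) / 127.0                      # map |w|_max to 127
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def mixed_precision_matmul(x: np.ndarray, w_q: np.ndarray, scale: float):
    """FP32 input times INT8 weights, accumulated in FP32, then rescaled."""
    return (x @ w_q.astype(np.float32)) * scale

# Toy usage: one fully connected layer.
rng = np.random.default_rng(0)
w = rng.standard_normal((256, 64)).astype(np.float32)     # FP32 reference weights
x = rng.standard_normal((1, 256)).astype(np.float32)      # one input activation row

w_q, s = quantize_int8(w)
y_ref = x @ w                                              # full-precision result
y_q = mixed_precision_matmul(x, w_q, s)                    # mixed-precision result
print("max abs error:", np.max(np.abs(y_ref - y_q)))
```

The design choice illustrated here, low-precision storage with higher-precision accumulation, mirrors what dedicated inference hardware does in its multiply-accumulate arrays, though real chips typically also quantize activations and use per-channel scales.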

Copyright
© 2025 The Author(s)
Open Access
Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.


Volume Title
Proceedings of the 2025 2nd International Conference on Electrical Engineering and Intelligent Control (EEIC 2025)
Series
Advances in Engineering Research
Publication Date
23 October 2025
ISBN
978-94-6463-864-6
ISSN
2352-5401
DOI
10.2991/978-94-6463-864-6_4

Cite this article

TY  - CONF
AU  - Yixuan Yang
PY  - 2025
DA  - 2025/10/23
TI  - Research on Specialized Chips for Artificial Intelligence Inference
BT  - Proceedings of the 2025 2nd International Conference on Electrical Engineering and Intelligent Control (EEIC 2025)
PB  - Atlantis Press
SP  - 21
EP  - 31
SN  - 2352-5401
UR  - https://doi.org/10.2991/978-94-6463-864-6_4
DO  - 10.2991/978-94-6463-864-6_4
ID  - Yang2025
ER  -