Research on People Following Technology Based on Multiple Sensors
- DOI
- 10.2991/978-94-6463-986-5_38
- Keywords
- Multi-sensor Fusion; Person-following Technology; Target Recognition; Path Planning; Obstacle Avoidance
- Abstract
With the continuous expansion of application scenarios for service robots, person-following technology has become a key capability for human-robot interaction and intelligent control. This paper focuses on multi-sensor fusion-based person-following systems and systematically reviews their core technologies, including target recognition, feature extraction, path planning, and obstacle avoidance. First, the features of typical followable targets are categorized into five aspects: visual appearance, geometric structure, motion trajectory, multi-modal perception, and pose estimation; representative methods such as deep learning, skeleton extraction, and trajectory prediction are discussed. Next, the mainstream path planning strategies and obstacle avoidance mechanisms are reviewed. Finally, an analysis of existing prototype systems shows that current person-following technology still faces challenges in occlusion handling, real-time computation on embedded platforms, and user experience and comfort.
- Copyright
- © 2026 The Author(s)
- Open Access
- Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
Cite this article
TY  - CONF
AU  - Zhongyang Li
PY  - 2026
DA  - 2026/02/18
TI  - Research on People Following Technology Based on Multiple Sensors
BT  - Proceedings of the 2025 International Conference on Electronics, Electrical and Grid Technology (ICEEGT 2025)
PB  - Atlantis Press
SP  - 367
EP  - 377
SN  - 2352-5401
UR  - https://doi.org/10.2991/978-94-6463-986-5_38
DO  - 10.2991/978-94-6463-986-5_38
ID  - Li2026
ER  -