Research on Customer Emotion Recognition on the Side of State Grid Power Customer Service Agents based on Large Language Models
DOI: https://doi.org/10.54691/74tfn712

Keywords: Customer Service; Emotion Recognition; Large Language Model; Online Agent; Model Fine-tuning; Script Recommendation

Abstract
In power customer service scenarios, agents often struggle to accurately perceive and respond to changes in customer emotion, which degrades the service experience; emotional fluctuations directly shape how customers communicate and express their needs. To address this, this paper proposes a three-stage emotion recognition and response scheme: small-model pre-screening, large-model emotion grading, and soothing-script recommendation. First, a domain-specific sensitive lexicon for power customer service is constructed, and a lightweight small model performs fast sensitive-word matching; a match triggers transfer of the call to a human agent. Second, an emotion recognition model is built by fine-tuning the State Grid Guangming large model; once the human agent is connected, it classifies the emotional polarity of the dialogue text (positive, neutral, or negative) and subdivides negative emotion into three levels: mild, moderate, and severe. Finally, soothing scripts matched to the recognized negative-emotion level are recommended to help agents respond accurately. The scheme improves agents' emotion-perception efficiency and response speed, enhances customer experience, and provides technical support for the intelligent upgrading of power customer service.
License
Copyright (c) 2026 Scientific Journal of Intelligent Systems Research

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.