Paper
7 March 2022 Chinese text summarization based on fine-tuned GPT2
Qihang Zhu, Lin Li, Libing Bai, Feng Hu
Proceedings Volume 12167, Third International Conference on Electronics and Communication, Network and Computer Technology (ECNCT 2021); 121671A (2022) https://doi.org/10.1117/12.2629132
Event: 2021 Third International Conference on Electronics and Communication, Network and Computer Technology, 2021, Harbin, China
Abstract
Pre-trained language models perform well on text summarization tasks, so we present a neural text summarization approach based on the powerful pre-trained language model GPT-2. In this paper, we propose a Chinese text summarization model that extends GPT-2 to our downstream task in order to produce relevant, contentful, and coherent summaries. In extensive experiments, our model achieves absolute improvements of 10.75% in ROUGE-1, 13.85% in ROUGE-2, and 9.73% in ROUGE-L on the LCSTS dataset. Compared with a state-of-the-art summarization model, e.g. a BERTSUM-based model, our model also achieves an improvement of 25.22% in ROUGE-1.
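The ROUGE-N scores reported above measure n-gram overlap between a generated summary and a reference. The paper does not include its evaluation code, so the following is only a minimal sketch of ROUGE-N recall in plain Python, under the common assumption that Chinese text is tokenized at the character level; the function names are illustrative, not from the paper.

```python
from collections import Counter

def ngrams(tokens, n):
    """Count all contiguous n-grams in a token sequence."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def rouge_n(candidate, reference, n):
    """ROUGE-N recall: the fraction of reference n-grams that also
    appear in the candidate summary (clipped by candidate counts).
    Strings are split into characters, a common choice for Chinese."""
    cand = ngrams(list(candidate), n)
    ref = ngrams(list(reference), n)
    if not ref:
        return 0.0
    overlap = sum(min(count, cand[gram]) for gram, count in ref.items())
    return overlap / sum(ref.values())
```

For example, `rouge_n("abce", "abcd", 1)` matches 3 of the 4 reference unigrams and returns 0.75. Published LCSTS results typically use the full ROUGE toolkit (including the LCS-based ROUGE-L with F-measure), which this sketch does not reproduce.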
© (2022) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
KEYWORDS: Performance modeling, Data modeling, Neural networks, Data processing