Controllable sentence simplification with a unified text-to-text transfer transformer
Citation
- Sheang KC, Saggion H. Controllable sentence simplification with a unified text-to-text transfer transformer. In: Proceedings of the 14th International Conference on Natural Language Generation (INLG); 2021 Sep 20-24; Aberdeen, Scotland, UK. Aberdeen: Association for Computational Linguistics; 2021. p. 341-52.
Abstract
Recently, a large pre-trained language model called T5 (a unified Text-to-Text Transfer Transformer) has achieved state-of-the-art performance in many NLP tasks. However, no study has been found that uses this pre-trained model on Text Simplification. Therefore, in this paper we explore the use of T5 fine-tuning on Text Simplification, combined with a controllable mechanism to regulate the system outputs and help generate text adapted to different target audiences. Our experiments show that our model achieves remarkable results, with gains of between +0.69 and +1.41 over the current state-of-the-art (BART+ACCESS). We argue that using a pre-trained model such as T5, trained on several tasks with large amounts of data, can help improve Text Simplification.
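The controllable mechanism mentioned in the abstract can be realised by prepending control tokens to each source sentence before fine-tuning and generation. The sketch below illustrates the idea only; the token names and ratio values are illustrative assumptions, not the paper's exact vocabulary, and an off-the-shelf T5 checkpoint would need fine-tuning on control-annotated (source, simplification) pairs before the controls have any effect.

```python
# Minimal sketch of ACCESS-style controllable simplification with T5.
# Assumptions: control token names/values are illustrative; a model
# fine-tuned on control-annotated pairs is required for sensible output.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# Control tokens regulate output attributes (e.g., compression ratio,
# lexical complexity); at inference time they can be set per audience.
# Without being added to the vocabulary they are simply split into
# subword pieces, which the model can still learn to condition on.
controls = "<NC_0.95> <LS_0.75> <DR_0.75> <WR_0.75>"
source = "The cat perched imperiously on the top of the fence."

inputs = tokenizer("simplify: " + controls + " " + source,
                   return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```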
Description
Paper presented at the 14th International Conference on Natural Language Generation (INLG), held from 20 to 24 September 2021 in Aberdeen, Scotland, UK.