On the evolution of syntactic information encoded by BERT’s contextualized representations


  • dc.contributor.author Pérez-Mayos, Laura
  • dc.contributor.author Carlini, Roberto
  • dc.contributor.author Ballesteros, Miguel
  • dc.contributor.author Wanner, Leo
  • dc.date.accessioned 2023-03-01T07:21:40Z
  • dc.date.available 2023-03-01T07:21:40Z
  • dc.date.issued 2021
  • dc.description Paper presented at the 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2021), held online from 19 to 23 April 2021.
  • dc.description.abstract The adaptation of pretrained language models to solve supervised tasks has become a baseline in NLP, and many recent works have focused on studying how linguistic information is encoded in the pretrained sentence representations. Among other information, it has been shown that entire syntax trees are implicitly embedded in the geometry of such models. As these models are often fine-tuned, it becomes increasingly important to understand how the encoded knowledge evolves during fine-tuning. In this paper, we analyze the evolution of the embedded syntax trees along the fine-tuning process of BERT for six different tasks, covering all levels of the linguistic structure. Experimental results show that the encoded syntactic information is forgotten (PoS tagging), reinforced (dependency and constituency parsing) or preserved (semantics-related tasks) in different ways along the fine-tuning process, depending on the task.
  • dc.description.sponsorship This work has been partially funded by the European Commission via its H2020 Research Program under the contract numbers 779962, 786731, 825079, and 870930.
  • dc.format.mimetype application/pdf
  • dc.identifier.citation Pérez-Mayos L, Carlini R, Ballesteros M, Wanner L. On the evolution of syntactic information encoded by BERT’s contextualized representations. In: Merlo P, Tiedemann J, Tsarfaty R. The 16th Conference of the European Chapter of the Association for Computational Linguistics: proceedings of the Conference; 2021 Apr 19-23; online. Stroudsburg: Association for Computational Linguistics; 2021. p. 2243-58. DOI: 10.18653/v1/2021.eacl-main.191
  • dc.identifier.doi http://dx.doi.org/10.18653/v1/2021.eacl-main.191
  • dc.identifier.uri http://hdl.handle.net/10230/55971
  • dc.language.iso eng
  • dc.publisher ACL (Association for Computational Linguistics)
  • dc.relation.ispartof Merlo P, Tiedemann J, Tsarfaty R. The 16th Conference of the European Chapter of the Association for Computational Linguistics: proceedings of the Conference; 2021 Apr 19-23; online. Stroudsburg: Association for Computational Linguistics; 2021. p. 2243-58.
  • dc.relation.projectID info:eu-repo/grantAgreement/EC/H2020/779962
  • dc.relation.projectID info:eu-repo/grantAgreement/EC/H2020/786731
  • dc.relation.projectID info:eu-repo/grantAgreement/EC/H2020/825079
  • dc.relation.projectID info:eu-repo/grantAgreement/EC/H2020/870930
  • dc.rights © ACL, Creative Commons Attribution 4.0 License
  • dc.rights.accessRights info:eu-repo/semantics/openAccess
  • dc.rights.uri https://creativecommons.org/licenses/by/4.0/
  • dc.subject.other Computational linguistics
  • dc.title On the evolution of syntactic information encoded by BERT’s contextualized representations
  • dc.type info:eu-repo/semantics/conferenceObject
  • dc.type.version info:eu-repo/semantics/publishedVersion
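
The abstract's claim that "entire syntax trees are implicitly embedded in the geometry" of BERT is usually operationalized with a structural probe in the style of Hewitt and Manning (2019): a learned linear projection under which squared distances between token vectors approximate parse-tree distances. The sketch below illustrates that idea only; the probe class, the synthetic gold distances, and the single training step are assumptions for illustration and are not the authors' experimental code.

```python
# Minimal sketch of a structural-probe-style analysis (illustrative only).
# Assumes the Hewitt & Manning (2019) formulation: learn a projection B such
# that ||B(h_i - h_j)||^2 approximates the parse-tree distance between words.
import torch
from transformers import AutoTokenizer, AutoModel

class StructuralProbe(torch.nn.Module):
    """Rank-k linear probe over contextual vectors."""
    def __init__(self, hidden_dim: int = 768, rank: int = 64):
        super().__init__()
        self.proj = torch.nn.Parameter(torch.randn(hidden_dim, rank) * 0.01)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # hidden: (num_words, hidden_dim) -> pairwise squared L2 distances
        transformed = hidden @ self.proj                   # (num_words, rank)
        diffs = transformed.unsqueeze(1) - transformed.unsqueeze(0)
        return (diffs ** 2).sum(-1)                        # (num_words, num_words)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
probe = StructuralProbe()

words = "The chef who ran to the store was out of food".split()
n = len(words)

# Hypothetical gold tree distances; in practice these come from a treebank.
gold = torch.randint(1, 6, (n, n)).float()
gold = (gold + gold.T) / 2
gold.fill_diagonal_(0)

enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
with torch.no_grad():
    hidden = bert(**enc).last_hidden_state[0]              # (num_subtokens, 768)

# Average sub-token vectors back into one vector per word (a common convention).
word_ids = enc.word_ids(0)
word_vecs = torch.stack([
    hidden[[i for i, wid in enumerate(word_ids) if wid == w]].mean(0)
    for w in range(n)
])                                                          # (n, 768)

# One optimization step of the probe against the (here synthetic) distances.
opt = torch.optim.Adam(probe.parameters(), lr=1e-3)
loss = torch.abs(probe(word_vecs) - gold).mean()
loss.backward()
opt.step()
print(f"probe L1 loss: {loss.item():.3f}")
```

Conceptually, the study described in the abstract would train such a probe on gold treebank distances and re-evaluate it on BERT checkpoints saved along fine-tuning for each of the six tasks, tracking how well syntax trees can still be recovered from the representations at each step.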