MemoryPrompt: a light wrapper to improve context tracking in pre-trained language models


  • dc.contributor.author Carraz Rakotonirina, Nathanaël
  • dc.contributor.author Baroni, Marco
  • dc.date.accessioned 2024-10-15T06:16:23Z
  • dc.date.available 2024-10-15T06:16:23Z
  • dc.date.issued 2024
  • dc.description Paper presented at the Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), held in Turin, May 20-25, 2024.
  • dc.description.abstract Transformer-based language models (LMs) track contextual information through large, hard-coded input windows. We introduce MemoryPrompt, a leaner approach in which the LM is complemented by a small auxiliary recurrent network that passes information to the LM by prefixing its regular input with a sequence of vectors, akin to soft prompts, without requiring LM finetuning. Tested on a task designed to probe a LM’s ability to keep track of multiple fact updates, a MemoryPrompt-augmented LM outperforms much larger LMs that have access to the full input history. We also test MemoryPrompt on a long-distance dialogue dataset, where its performance is comparable to that of a model conditioned on the entire conversation history. In both experiments we also observe that, unlike full-finetuning approaches, MemoryPrompt does not suffer from catastrophic forgetting when adapted to new tasks, thus not disrupting the generalist capabilities of the underlying LM.
  • dc.format.mimetype application/pdf
  • dc.identifier.citation Carraz Rakotonirina N, Baroni M. MemoryPrompt: a light wrapper to improve context tracking in pre-trained language models. In: Calzolari N, Kan MY, Hoste V, Lenci A, Sakti S, Xue N, editors. Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024); 2024 May 20-25; Torino, Italy. Brussels: ELRA and ICCL; 2024. p. 11187-95.
  • dc.identifier.uri http://hdl.handle.net/10230/61395
  • dc.language.iso eng
  • dc.publisher ELRA (European Language Resources Association)
  • dc.relation.projectID info:eu-repo/grantAgreement/EC/H2020/101019291
  • dc.rights © 2024 ELRA - European Language Resources Association: CC BY-NC 4.0
  • dc.rights.accessRights info:eu-repo/semantics/openAccess
  • dc.rights.uri http://creativecommons.org/licenses/by-nc/4.0/
  • dc.subject.keyword Memory-augmented language model
  • dc.subject.keyword Prompting
  • dc.title MemoryPrompt: a light wrapper to improve context tracking in pre-trained language models
  • dc.type info:eu-repo/semantics/conferenceObject
  • dc.type.version info:eu-repo/semantics/publishedVersion
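
The abstract above outlines the architecture: a small auxiliary recurrent network carries state across input segments and emits a short sequence of soft-prompt vectors that are prepended to a frozen LM's input embeddings, with no LM finetuning. As a rough illustration only, a minimal PyTorch sketch of that idea might look like the following; the class layout, the GRU cell, the mean-pooled segment summary, and the number of prompt vectors are all assumptions made for this sketch, not the authors' implementation.

```python
# Hypothetical sketch of the MemoryPrompt idea; not the paper's code.
import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel, GPT2Tokenizer

class MemoryPromptSketch(nn.Module):
    def __init__(self, lm, n_prompt_vectors=5):
        super().__init__()
        self.lm = lm
        for p in self.lm.parameters():  # the pre-trained LM stays frozen
            p.requires_grad = False
        d = lm.config.n_embd
        self.n = n_prompt_vectors
        # small recurrent memory module (choice of GRU is an assumption)
        self.rnn = nn.GRUCell(d, d)
        # maps the memory state to a sequence of soft-prompt vectors
        self.to_prompts = nn.Linear(d, self.n * d)

    def forward(self, input_ids, memory=None):
        embeds = self.lm.transformer.wte(input_ids)  # (B, T, d)
        if memory is None:
            memory = embeds.new_zeros(embeds.size(0), embeds.size(-1))
        # summarize the current segment (mean pooling is a placeholder
        # assumption) and update the recurrent memory state
        memory = self.rnn(embeds.mean(dim=1), memory)
        prompts = self.to_prompts(memory).view(embeds.size(0), self.n, -1)
        # prefix the soft-prompt vectors to the regular input, as in
        # prompt tuning, and run the frozen LM on the combined sequence
        out = self.lm(inputs_embeds=torch.cat([prompts, embeds], dim=1))
        return out.logits[:, self.n:], memory  # drop the prompt positions

tok = GPT2Tokenizer.from_pretrained("gpt2")
model = MemoryPromptSketch(GPT2LMHeadModel.from_pretrained("gpt2"))
memory = None
for segment in ["Alice's number is 555-0100.", "It changed to 555-0199."]:
    ids = tok(segment, return_tensors="pt").input_ids
    logits, memory = model(ids, memory)  # memory carries fact updates forward
```

In a setup like this, only the recurrent module and its projection would be trained, which is consistent with the abstract's point that MemoryPrompt avoids catastrophic forgetting and leaves the underlying LM's generalist capabilities intact.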