CNNs found to jump around more skillfully than RNNs: compositional generalization in seq2seq convolutional networks

dc.contributor.author: Dessi, Roberto
dc.contributor.author: Baroni, Marco
dc.date.accessioned: 2022-12-02T07:07:27Z
dc.date.available: 2022-12-02T07:07:27Z
dc.date.issued: 2019
dc.description: Paper presented at the 57th Annual Meeting of the Association for Computational Linguistics, held July 28 to August 2, 2019, in Florence, Italy
dc.description.abstract: Lake and Baroni (2018) introduced the SCAN dataset probing the ability of seq2seq models to capture compositional generalizations, such as inferring the meaning of “jump around” 0-shot from the component words. Recurrent networks (RNNs) were found to completely fail the most challenging generalization cases. We test here a convolutional network (CNN) on these tasks, reporting hugely improved performance with respect to RNNs. Despite the big improvement, the CNN has however not induced systematic rules, suggesting that the difference between compositional and non-compositional behaviour is not clear-cut.
dc.format.mimetype: application/pdf
dc.identifier.citation: Dessi R, Baroni M. CNNs found to jump around more skillfully than RNNs: compositional generalization in seq2seq convolutional networks. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 2019; 2019 Jul 28 - Aug 2; Florence, Italy. Stroudsburg: Association for Computational Linguistics; 2019. p. 3919-23.
dc.identifier.doi: https://doi.org/10.18653/v1/P19-1381
dc.identifier.uri: http://hdl.handle.net/10230/55068
dc.language.iso: eng
dc.publisher: ACL (Association for Computational Linguistics)
dc.relation.ispartof: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 2019; 2019 Jul 28 - Aug 2; Florence, Italy. Stroudsburg: Association for Computational Linguistics; 2019. p. 3919-23
dc.rights: © ACL, Creative Commons Attribution 4.0 License
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject.other: Computational linguistics
dc.subject.other: Artificial intelligence
dc.title: CNNs found to jump around more skillfully than RNNs: compositional generalization in seq2seq convolutional networks
dc.type: info:eu-repo/semantics/conferenceObject
dc.type.version: info:eu-repo/semantics/publishedVersion