Improved transition-based parsing by modeling characters instead of words with LSTMs

Citation

  • Ballesteros M, Dyer C, Smith NA. Improved transition-based parsing by modeling characters instead of words with LSTMs. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing (EMNLP 2015); 2015 September 17-21; Lisbon, Portugal. [Stroudsburg]: ACL; 2015. p. 349-59.

Description

  • Abstract

    We present extensions to a continuous-state dependency parsing method that make it applicable to morphologically rich languages. Starting with a high-performance transition-based parser that uses long short-term memory (LSTM) recurrent neural networks to learn representations of the parser state, we replace lookup-based word representations with representations constructed from the orthographic representations of the words, also using LSTMs. This allows statistical sharing across word forms that are similar on the surface. Experiments for morphologically rich languages show that the parsing model benefits from incorporating the character-based encodings of words.
  • Description

    Paper presented at the 2015 Conference on Empirical Methods in Natural Language Processing (EMNLP 2015), organized by SIGDAT and held on 17-21 September 2015 in Lisbon (Portugal).
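  • Illustrative sketch

    The abstract above describes replacing lookup-based word embeddings with word representations composed from LSTMs over each word's characters. The following is a minimal sketch of that idea, not the authors' implementation: it assumes PyTorch, and the names CharWordEncoder and char_vocab are illustrative. It runs a bidirectional LSTM over a word's characters and concatenates the final forward and backward states to form the word vector.

    # Minimal sketch of a character-based word representation (assumed PyTorch API).
    import torch
    import torch.nn as nn

    class CharWordEncoder(nn.Module):
        def __init__(self, char_vocab_size, char_dim=32, hidden_dim=64):
            super().__init__()
            self.char_emb = nn.Embedding(char_vocab_size, char_dim)
            # Bidirectional LSTM over the character sequence of a single word.
            self.lstm = nn.LSTM(char_dim, hidden_dim, bidirectional=True, batch_first=True)

        def forward(self, char_ids):
            # char_ids: (1, word_length) tensor of character indices.
            embedded = self.char_emb(char_ids)
            _, (h_n, _) = self.lstm(embedded)
            # Concatenate the final forward and backward hidden states to obtain
            # the word representation used in place of a word-lookup embedding.
            return torch.cat([h_n[0], h_n[1]], dim=-1)

    # Example: encode the word "parsing" with a toy character vocabulary.
    char_vocab = {c: i for i, c in enumerate("abcdefghijklmnopqrstuvwxyz")}
    encoder = CharWordEncoder(char_vocab_size=len(char_vocab))
    ids = torch.tensor([[char_vocab[c] for c in "parsing"]])
    word_vector = encoder(ids)
    print(word_vector.shape)  # (1, 128)

    Because the representation is built from characters, surface-similar word forms (e.g. inflections sharing a stem) share parameters, which is the statistical sharing the abstract refers to.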