Browsing by Author "Ballesteros, Miguel"

  • Ballesteros, Miguel; Wanner, Leo (ACL (Association for Computational Linguistics), 2016)
    Even syntactically correct sentences are perceived as awkward if they do not contain correct punctuation. Still, the problem of automatic generation of punctuation marks has been largely neglected for a long time. We present ...
  • Barbieri, Francesco; Ballesteros, Miguel; Saggion, Horacio (ACL (Association for Computational Linguistics), 2017)
    Emojis are ideograms which are naturally combined with plain text to visually complement or condense the meaning of a message. Despite being widely used in social media, their underlying semantics have received little ...
  • Ballesteros, Miguel; Bohnet, Bernd; Mille, Simon; Wanner, Leo (Cambridge University Press, 2016)
    ‘Deep-syntactic’ dependency structures that capture the argumentative, attributive and coordinative relations between full words of a sentence have a great potential for a number of NLP-applications. The abstraction ...
  • Ballesteros, Miguel; Bohnet, Bernd; Mille, Simon; Wanner, Leo (ACL (Association for Computational Linguistics), 2015)
    Abstract structures from which the generation naturally starts often do not contain any functional nodes, while surface-syntactic structures or a chain of tokens in a linearized tree contain all of them. Therefore, data-driven ...
  • Kuncoro, Adhiguna; Ballesteros, Miguel; Kong, Lingpeng; Dyer, Chris; Smith, Noah A. (ACL (Association for Computational Linguistics), 2016)
    We introduce two first-order graph-based dependency parsers achieving a new state of the art. The first is a consensus parser built from an ensemble of independently trained greedy LSTM transition-based parsers with different ...
  • Padró, Muntsa; Ballesteros, Miguel; Martínez, Héctor; Bohnet, Bernd (ACL (Association for Computational Linguistics), 2013)
    This paper studies the performance of different parsers over a large Spanish treebank. The aim of this work is to assess the limitations of state-of-the-art parsers. We want to select the most appropriate parser for ...
  • Pérez-Mayos, Laura; Ballesteros, Miguel; Wanner, Leo (ACL (Association for Computational Linguistics), 2021)
    Transformer-based pretrained language models achieve outstanding results in many well-known NLU benchmarks. However, while pretraining methods are very convenient, they are expensive in terms of time and resources. This ...
  • Ballesteros, Miguel; Dyer, Chris; Smith, Noah A. (ACL (Association for Computational Linguistics), 2015)
    We present extensions to a continuous-state dependency parsing method that make it applicable to morphologically rich languages. Starting with a high-performance transition-based parser that uses long short-term memory ...
  • Pérez-Mayos, Laura (Universitat Pompeu Fabra, 2022-06-28)
    Pretrained Transformer-based language models have quickly replaced traditional approaches to modeling NLP tasks, pushing the state of the art to new levels, and will certainly continue to be very influential in the years to ...
  • Barbieri, Francesco; Ballesteros, Miguel; Ronzano, Francesco; Saggion, Horacio (ACL (Association for Computational Linguistics), 2018)
    Emojis are small images that are commonly included in social media text messages. The combination of visual and textual content in the same message builds up a modern way of communication that automatic systems are not ...
  • Lample, Guillaume; Ballesteros, Miguel; Subramanian, Sandeep; Kawakami, Kazuya; Dyer, Chris (ACL (Association for Computational Linguistics), 2016)
    State-of-the-art named entity recognition systems rely heavily on hand-crafted features and domain-specific knowledge in order to learn effectively from the small, supervised training corpora that are available. In ...
  • Pérez-Mayos, Laura; Carlini, Roberto; Ballesteros, Miguel; Wanner, Leo (ACL (Association for Computational Linguistics), 2021)
    The adaptation of pretrained language models to solve supervised tasks has become a baseline in NLP, and many recent works have focused on studying how linguistic information is encoded in the pretrained sentence ...
  • Dyer, Chris; Kuncoro, Adhiguna; Ballesteros, Miguel; Smith, Noah A. (ACL (Association for Computational Linguistics), 2016)
    We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure. We explain efficient inference procedures that allow application to both parsing and language modeling. ...
  • Barbieri, Francesco; Camacho-Collados, Jose; Ronzano, Francesco; Espinosa-Anke, Luis; Ballesteros, Miguel; Basile, Valerio; Patti, Viviana; Saggion, Horacio (ACL (Association for Computational Linguistics), 2018)
    This paper describes the results of the first shared task on Multilingual Emoji Prediction, organized as part of SemEval 2018. Given the text of a tweet, the task consists of predicting the most likely emoji to be used ...
  • Barbieri, Francesco; Espinosa-Anke, Luis; Ballesteros, Miguel; Soler Company, Juan; Saggion, Horacio (ACL (Association for Computational Linguistics), 2017)
    Videogame streaming platforms have become a paramount example of noisy user-generated text. These are websites where gaming is broadcast and which allow interaction with viewers via integrated chatrooms. Probably the best ...
  • Buckman, Jacob; Ballesteros, Miguel; Dyer, Chris (ACL (Association for Computational Linguistics), 2016)
    We introduce a novel approach to the decoding problem in transition-based parsing: heuristic backtracking. This algorithm uses a series of partial parses on the sentence to locate the best candidate parse, using confidence ...
  • Dyer, Chris; Ballesteros, Miguel; Ling, Wang; Matthews, Austin; Smith, Noah A. (ACL (Association for Computational Linguistics), 2015)
    We propose a technique for learning representations of parser states in transition-based dependency parsers. Our primary innovation is a new control structure for sequence-to-sequence neural networks: the stack LSTM. Like ...
  • Ballesteros, Miguel; Carreras, Xavier (ACL (Association for Computational Linguistics), 2015)
    We present a transition-based arc-eager model to parse spinal trees, a dependency-based representation that includes phrase-structure information in the form of constituent spines assigned to tokens. As a main advantage, the ...
  • Soler Company, Juan; Ballesteros, Miguel; Bohnet, Bernd; Mille, Simon; Wanner, Leo (ACL (Association for Computational Linguistics), 2015)
    “Deep-syntactic” dependency structures bridge the gap between the surface-syntactic structures as produced by state-of-the-art dependency parsers and semantic logical forms in that they abstract away from surface-syntactic ...
