Study and correlation analysis of linguistic, perceptual, and automatic machine translation evaluations
- dc.contributor.author Farrús, Mireia
- dc.contributor.author Costa-jussà, Marta R.
- dc.contributor.author Popovic, Maja
- dc.date.accessioned 2018-04-30T12:57:07Z
- dc.date.available 2018-04-30T12:57:07Z
- dc.date.issued 2011
- dc.description.abstract Evaluation of machine translation output is an important task. Various human evaluation techniques as well as automatic metrics have been proposed and investigated in the last decade. However, very few evaluation methods take linguistic aspects into account. In this paper, we use an objective evaluation method for machine translation output that classifies all translation errors into one of the following five linguistic levels: orthographic, morphological, lexical, semantic, and syntactic, in order to analyse its linguistic quality. Linguistic guidelines for the target language are required, and human evaluators use them to classify the output errors. The experiments are performed on English-to-Catalan and Spanish-to-Catalan translation outputs generated by four different systems: two rule-based and two statistical. All translations are evaluated using the following three methods: a standard human evaluation method, several widely used automatic metrics, and the linguistic evaluation. Pearson and Spearman correlation coefficients between the linguistic, human, and automatic results are then calculated, showing that the semantic level correlates significantly with both human evaluation and automatic metrics.
- dc.description.sponsorship This work has been partially funded by the Spanish Department of Science and Innovation through the Juan de la Cierva fellowship program and the BUCEADOR project (TEC2009-14094-C04-01). The authors also want to thank the Barcelona Media Innovation Centre for its support and permission to publish this research.
- dc.format.mimetype application/pdf
- dc.identifier.citation Farrús M, Costa-Jussà MR, Popovic M. Study and correlation analysis of linguistic, perceptual, and automatic machine translation evaluations. J Am Soc Inf Sci. 2012;63(1):174-84. DOI: 10.1002/asi.21674
- dc.identifier.doi http://dx.doi.org/10.1002/asi.21674
- dc.identifier.issn 2330-1635
- dc.identifier.uri http://hdl.handle.net/10230/34518
- dc.language.iso eng
- dc.publisher Wiley
- dc.relation.ispartof Journal of the Association for Information Science and Technology. 2012;63(1):174-84.
- dc.rights This is the peer reviewed version of the following article: "Farrús M, Costa-Jussà MR, Popovic M. Study and correlation analysis of linguistic, perceptual, and automatic machine translation evaluations. J Am Soc Inf Sci. 2012; 63 (1): 174-84.", which has been published in final form at http://dx.doi.org/10.1002/asi.21674. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.
- dc.rights.accessRights info:eu-repo/semantics/openAccess
- dc.subject.keyword Machine translation
- dc.subject.keyword Linguistic evaluation
- dc.subject.keyword Automatic evaluation
- dc.subject.keyword Human evaluation
- dc.subject.keyword Catalan
- dc.title Study and correlation analysis of linguistic, perceptual, and automatic machine translation evaluations
- dc.type info:eu-repo/semantics/article
- dc.type.version info:eu-repo/semantics/acceptedVersion