The latest big leap forward in machine translation: adaptive MT

Technological developments in the field of machine translation are taking place very fast, and as a result computer-aided translation tools keep reinventing themselves with a view to making the translation process more time-efficient and, hence, to better supporting translators’ work. In this post, Heriot-Watt University lecturer and freelance translator Ramón Inglada presents one of the recent innovations in this field, one that is being followed with increasing curiosity by both the industry and academia: adaptive machine translation.

Over the last 18 months, the topic of neural machine translation (NMT) has attracted high levels of interest in the area of translation technology, and the research output over this period seems to confirm the trend.

While NMT undoubtedly has a lot of potential and many interesting applications, another relatively recent development in the area of MT has also attracted a fair amount of attention: adaptive MT. This new technology claims to allow an MT system to learn from corrections on the fly. If you have worked in post-editing, chances are that the idea of not having to enter the same changes time and time again will sound like music to your ears.
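To make the idea concrete, here is a deliberately simplified sketch of the "learning from corrections on the fly" principle. All the names in it are hypothetical, and real adaptive MT engines update the parameters of a statistical or neural model rather than a lookup table; the point is only to show how a post-editor's fix can be reused immediately instead of being re-entered for every segment.

```python
class AdaptiveTranslator:
    """Toy illustration of adaptive MT (not a real engine)."""

    def __init__(self, baseline):
        # baseline: a dict standing in for a static MT engine's output
        self.baseline = baseline
        self.corrections = {}  # learned on the fly from post-editing

    def translate(self, segment):
        # Prefer a previously confirmed correction over the raw MT output
        return self.corrections.get(segment, self.baseline.get(segment, segment))

    def post_edit(self, segment, corrected):
        # The post-editor's fix is stored immediately, so the same
        # change never has to be entered twice
        self.corrections[segment] = corrected


# A static engine would repeat the same mistake; the adaptive one does not.
mt = AdaptiveTranslator({"Home": "Hogar"})   # wrong in a software-UI context
print(mt.translate("Home"))                  # → Hogar (raw MT output)
mt.post_edit("Home", "Inicio")
print(mt.translate("Home"))                  # → Inicio (correction applied)
```

In a production system the adaptation step would adjust translation-model weights from the corrected sentence pair, but the workflow it enables for post-editors is the one shown here.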

Two companies are at the forefront of this technology and in the race to develop it. The first is Lilt, a Silicon Valley start-up founded in 2015, although its work on adaptive MT started at Stanford back in 2011. The second is SDL, which promoted adaptive MT as one of the biggest innovations included in the latest version of Trados Studio.

What are your thoughts on the matter? Do you have any experience with NMT or adaptive MT? In your opinion, which of these technologies seems more promising? Which one do you think will be more useful for professional translators? Is a combination of both the way forward? Do you have any concerns about confidentiality, especially regarding the corrections entered by post-editors? We invite you to browse the sources listed below to learn more about the current state of the art in adaptive MT!


Written by Ramón Inglada, Translator and Lecturer at Heriot-Watt University (Scotland). Ramón Inglada has been a professional translator since 2002 and a university lecturer since 2012, teaching a wide range of courses, including English-into-Spanish Translation, Translation Technologies and Software Localisation.

Post prepared by Doris Fernandes del Pozo – Journalist, Translator-Interpreter and Communication Trainee at the Terminology Coordination Unit of the European Parliament. She is pursuing a PhD as part of the Communication and Contemporary Information Programme of the University of Santiago de Compostela (Spain).

Sources:

  • Ferguson, Neil (2016) “AdaptiveMT for SDL Trados Studio 2017: a self-learning machine translation engine”, SDL Trados. Available at: http://bit.ly/2rjW280 (Accessed 15 June 2017)
  • Fernández Rouda, Juan Martín (2016) “Adaptive Machine Translation in a Nutshell”, Linkedin. Available at: http://bit.ly/2rk4pjI (Accessed 15 June 2017)
  • Lilt (n.d.) Lilt Labs. Available at: http://labs.lilt.com/ (Accessed 15 June 2017)
  • Slator (2017) Neural Machine Translation. Available at: http://bit.ly/2sCSEZN (Accessed 15 June 2017)
  • Estopace, Eden (2017) “Neural Machine Translation Research Accelerates Dramatically”, Slator. Available at: http://bit.ly/2r9f7Jr (Accessed 06 June 2017)
  • Marking, Marion (2017) “SDL Sues Lilt For Patent Infringement”, Slator. Available at: http://bit.ly/2p1tDpy (Accessed 06 June 2017)
  • Bou, Gaelle (2016) “How does Neural Machine Translation work?”, Systran Blog. Available at: http://bit.ly/2otP6Vi (Accessed 06 June 2017)
  • TranslationZone (n.d.) “AdaptiveMT – next generation machine translation”. Available at: http://bit.ly/2qTLbRX (Accessed 06 June 2017)
  • Trusted Translations (n.d.) “What is post-editing?”. Available at: http://bit.ly/2sOMTEN (Accessed 06 June 2017)