July 17, 2017 2:49 pm
As you may know, every now and then TermCoord publishes a new interview with a well-known language professional. This time, we have chosen Lynne Bowker, Full Professor of Translation (FR-EN) and Information Studies at the University of Ottawa, Certified Translator, Member of the Association of Translators and Interpreters of Ontario (ATIO), and pioneering academic in terminology and translation technologies.
Winner of the 2013 Vinay and Darbelnet Prize for her article “The Need for Speed! Experimenting with ‘Speed Training’ in the Scientific/Technical Translation Classroom”, awarded annually by the Canadian Association of Translation Studies (CATS), Prof. Bowker has always been fascinated by the interaction of language and technology and by how the use of tools might affect the quality of translation. She first worked as a professor at Dublin City University in Ireland, where she taught on both the translation and the computational linguistics programs. Since 2002, she has been teaching at the School of Translation and Interpretation at the University of Ottawa, and since 2007 she has also held a cross-appointment to the School of Information Studies, which, in her own words, represents “a natural intersection point” with her experience in terminology and concept analysis.
1. With reference to your awarded article mentioned above, what is Speed Training in Technical Translation and how does it impact the translation industry?
Our School received some feedback from organizations that employed our students during their practica and internships. They found that our students produced work of good quality, but commented that they seemed to have trouble meeting deadlines or achieving quotas. As all professional translators know, quality is important, but so is productivity! However, much of the emphasis in translator training programs addresses the former, rather than the latter. I wanted to see if I could introduce some exercises and activities aimed specifically at helping translation students learn to work faster. In the first iteration, students in a final-year technical translation class did a weekly 15-minute “speed translation” of a 225-word text. Of course, these constraints meant that they didn’t really have time to consult resources or even to revise their work. The idea was simply to prompt them to get into the habit of keeping up a good pace. The full report on the experiment has recently been published in the journal Meta (vol. 61, December 2016). In a second iteration, I used the technique of “speed summarization” where instead of translating, students prepared a short summary of a longer text (in the same language). This has been accepted for publication in The Interpreter and Translator Trainer (vol. 11(4)) to appear later in 2017. Currently, I’m working on speed training and post-editing. Quality will always be a critical consideration for translators, but realistically speaking, speed also counts, and many training programs don’t currently pay enough formal attention to this aspect. Hopefully some of my preliminary efforts will inspire additional work in this area.
2. Translation is not a static field, nor is Translation Studies. We are currently witnessing a wealth of new research methods, approaches and concepts. What’s next for translation?
Well, that’s a BIG question! If I had to pick just one thing, I would say that I think we will see an increasing trend towards interdisciplinarity. Although the practice of translation has a long history, the field of Translation Studies is relatively young. As is normal in the early years of a nascent discipline, we spent quite a lot of time looking inward, reflecting on ourselves and searching for an identity. I think Translation Studies is now beginning to emerge from this stage. As a discipline, we are more confident about our place and our contribution, and as a result we are ready to be more outward looking. A recent volume called Border Crossings: Translation Studies and other disciplines (Y. Gambier and L. van Doorslaer, eds. 2016) is a great example of this new attitude. Academia is changing too. In the past, academic disciplines were very siloed, but universities around the world are now launching interdisciplinary programs and encouraging collaborative research across domains. If we want to solve the world’s big problems, we need to work together!
3. Do you know IATE, Interactive Terminology Database for Europe? Do you use it in your job? How do you think it can benefit translators and language professionals?
Is this a trick question? 😉 Of course I know IATE! I even knew its predecessor Eurodicautom. As a translator trainer, I certainly include IATE among the resources that I teach to my students. Of course, I work in Canada, where we also have some great terminological resources, such as TERMIUM Plus and Le grand dictionnaire terminologique. But one of the things that I emphasize to students is that they need to learn how to identify the right tool for the job at hand. If they are translating a text that is destined for a Canadian audience, then one of the Canadian term banks is likely to provide better insight into the terminology used in this region. However, the internet has revolutionized the translation market, and the global business market in general. A translator living in Canada could easily have a client who asks for a text to be translated for a European audience, in which case, IATE would be a better choice. In addition, although the Canadian translation market does still focus heavily on translation between our country’s two official languages – English and French – it’s beginning to open up to other language combinations too. One of the great advantages of IATE is that it incorporates a much broader range of languages than the Canadian resources.
4. The IATE Management Group has launched an Interinstitutional Terminology Portal called EurTerm to foster cooperation on Terminology among EU Institutions. Do you think it should be accessible to the public as well?
You’ve piqued my curiosity! As we know, a key goal of any terminology effort is to facilitate specialized communication. If we want to make sure that people are able to understand each other effectively, we need to give them the tools – such as a common vocabulary – to do so. Encouraging cooperation on terminology among EU institutions sounds like a great initiative. Opening it up to a wider audience could bring a number of benefits. For instance, it could serve as a model for similar efforts in other parts of the world, and it could also generate feedback that could feed into continuous improvement of EurTerm itself. Without knowing more about its contents and how it works, it’s hard for me to give a more detailed answer. In general though, fostering cooperation is a good thing! However, I do also understand the value of rolling out access to new projects in stages. I assume that there will be some periodic evaluations of the project and maybe some tweaking as a result. Once it has reached a stable state, perhaps that will be the right time to make it accessible to a wider audience? In the meantime, please add me to the waiting list!
5. With reference to Lexicography, Terminology and Corpus Linguistics, Semasiology and Onomasiology have increasingly blended as methods. How is Terminology semasiological nowadays?
It is correct to say that terminologists now integrate both the traditional concept-to-term approach (onomasiology) and the term-to-concept approach (semasiology) that was previously more typical of lexicographers. In terminology, the driving force behind the move towards semasiology has been corpora. A corpus is a large collection of authentic texts stored in electronic form. The most direct access points into a corpus are lexical items, rather than concepts. In the past, when terminologists used much smaller printed corpora and consulted them manually, they were able to read the documents in the corpus in their entirety and to identify concepts in this manner. Nowadays, when corpora are so enormous, terminologists cannot read them from beginning to end; they need ways to zoom into certain sections of the corpus for closer inspection. By adopting a semasiological approach, terminologists can use potential terms as search patterns, and then conduct a conceptual analysis using the concordance lines that are retrieved. The process can also be iterative in that the analysis of the concordances is likely to reveal more term candidates that can be looked up in a subsequent search.
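The search-then-analyse workflow described here can be pictured with a short sketch. The following Python snippet is a toy illustration (not the code of any particular terminology tool; the corpus and search term are invented) of how a candidate term can be used as a search pattern to retrieve keyword-in-context (KWIC) concordance lines:

```python
import re

def concordance(corpus, term, window=30):
    """Return keyword-in-context (KWIC) lines for every match of `term`.

    Each result shows the matched term in brackets with up to `window`
    characters of left and right context, which the terminologist can
    scan to analyse the underlying concept.
    """
    lines = []
    for match in re.finditer(re.escape(term), corpus, re.IGNORECASE):
        start, end = match.start(), match.end()
        left = corpus[max(0, start - window):start]
        right = corpus[end:end + window]
        lines.append(f"...{left}[{corpus[start:end]}]{right}...")
    return lines

# Toy corpus standing in for a large electronic text collection.
corpus = ("A neural network learns from data. The network weights are "
          "updated during training. Training a deep neural network "
          "requires a large corpus of examples.")

for line in concordance(corpus, "network"):
    print(line)
```

Scanning the retrieved lines, the terminologist can analyse the concept behind the term and will often spot further candidates (here, for instance, “neural network” or “network weights”) to feed into the next, iterative search.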
6. Translation and Corpus Linguistics: is this combination fruitful from both an academic and working point of view? Which corpus tools would you recommend the most and for which purposes?
Yes, I think it is very safe to say that the combination of translation and corpus linguistics is a very productive one, for both practitioners and scholars. On the one hand, in research, corpora have been instrumental as a means for investigating recurrent features of translated text (sometimes referred to as “translation universals”), such as simplification, explicitation, and normalization. Meanwhile, practicing translators rely on corpora to provide multiple examples of terms or phrases in context, thus helping them to determine appropriate usage. Personally, if I want to get a quick feel for how a term is used, I’ll often try the free online tools Linguee or Glosbe first because these tools come with large, ready-made language resources, so there’s not much of an investment needed on my part. But for more specialized needs, where I supply my own corpus, then some of my favourites are WordSmith Tools (for monolingual work) and ParaConc (for bi- or multilingual work). For automatic term extraction from a corpus, the online tool TermoStat is usually my go-to tool.
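The automatic term extraction mentioned above is often based on contrasting frequencies between corpora. The sketch below shows a generic relative-frequency heuristic (not the actual algorithm behind TermoStat or any other named tool; the two tiny corpora are invented): words that are proportionally much more frequent in a specialized corpus than in a general reference corpus are ranked as term candidates.

```python
import re
from collections import Counter

def term_candidates(specialized, reference, top=3):
    """Rank single-word term candidates: words that are proportionally
    far more frequent in the specialized corpus than in the general
    reference corpus score highest."""
    def rel_freqs(text):
        tokens = re.findall(r"[a-z]+", text.lower())
        counts = Counter(tokens)
        total = sum(counts.values())
        return {word: n / total for word, n in counts.items()}

    spec, ref = rel_freqs(specialized), rel_freqs(reference)
    # Smooth unseen reference words so the division never hits zero.
    scores = {w: f / ref.get(w, 1e-6) for w, f in spec.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top]

specialized = ("The terminologist compiles a corpus and extracts "
               "terminology from the corpus with concordance tools.")
reference = ("The weather was fine and the children played in the "
             "park with a ball and a kite.")

print(term_candidates(specialized, reference))
```

Real extraction tools go much further – filtering stopwords, handling multi-word units, and using statistically sounder specificity measures – but the core frequency-contrast idea is the same.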
7. You are currently working on a project on “Website Localization and Machine Translation”. Could you tell us more about it and what you are hoping to achieve?
The project’s aim is to investigate the relationship between the translatability of a text and its effect on the User eXperience (UX), using websites as the test case. Essentially, there are lots of guidelines on writing for the web which suggest that web content should be catchy and appealing to readers; however, this type of content is harder to translate, especially for machine translation systems. At the other end of the spectrum, writing in a more controlled way will increase the translatability of the text, but may make it less appealing to readers. The first case results in a situation where the source language readers have a positive UX, but the target text readers have a less positive UX. Meanwhile, the controlled language situation reduces the UX for source text readers but increases it for target text readers. The first phase of the project was to verify this relationship, and the results were published in 2015 in Localisation Focus 14(2). Now we’d like to work on finding the right balance – a writing style that produces a good UX for both groups of readers. The initial experiments were done using raw (unedited) machine translation output, so in a future phase, we’d like to see how post-editing affects the results.
8. Translation is a consistent part of the multilingual news-making process in the EU. The so-called “trans-editors” often give rise to misleading news and reality distortion. Do you think they should be trained in translation too?
I’m not entirely familiar with the job of a “trans-editor”, so I don’t want to speak out of turn. However, I do know that people working in the journalistic field typically work under tight time constraints, which may be a contributing factor. Perhaps some kind of translator training that incorporates aspects of speed training, as described above, could be beneficial? The speed training exercises can include not only translation, but also speed summarization, which might be a handy skill for “trans-editors” to develop. Because summarization cannot be accomplished without textual analysis, this activity helps to train would-be translators to think of meaning in terms of context. In addition, it trains them to write well, to think clearly, and to reformulate meaning accurately, thus making it an exercise that is highly pertinent for translators. But I’d need to know more about the specifics of a trans-editor’s job and current training before I could comment any further.
9. With reference to Prof. Humbley’s statement that “dictionaries and specialist dictionaries are ever increasingly being adapted to correspond more closely to users’ needs”, how would you describe the future of dictionaries?
For many users, an ideal resource is most likely one that offers a “one-stop shop”. We’re seeing evidence of this in many aspects of our lives. Over the years, I have seen my local grocery store transform from a place where I buy food to a so-called “superstore” where I can also pick up books, greeting cards, clothes, furniture, small appliances and many other items. And I must admit, on many occasions, it’s quite handy! Libraries, too, do much more than simply lend out books these days. Now I can go to my public library and borrow CDs, videogames, a laptop computer, a pedometer, and even a card that lets me visit local museums. So it’s not really surprising that dictionary users have become more demanding. Language is about communicating. Lexical items are certainly a key feature of a language, but to communicate effectively, we need more. And so, in their linguistic “superstore”, users would like to see more examples, contexts, usage information, phraseology, and more. They want guidance about how they should use lexical items in broader linguistic structures. Sometimes they even want information about how not to use language. And of course, the biggest challenge for dictionary makers is that user needs are not “static”. One day, a user wants help understanding a term, but another day, the type of help sought is for producing an idiomatic sentence. Second-language learners need some basic information in the early stages of language learning, but their needs become more sophisticated as they become advanced learners. And, in the same way that I want to buy apples one day, but the latest best-seller another day, dictionary users want to have their myriad needs met at their linguistic superstore. As a result, I think we will see lexicographic resources being integrated into portals and linked tightly with other types of resources.
A dictionary alone is not likely to meet all of a user’s needs, but a dictionary that is coherently and cohesively integrated into a larger collection of resources will go a long way.
10. “Invisibility” is considered a buzzword in translation and Lawrence Venuti has criticized the fact that the translator is an invisible figure. In 10 years’ time, will such invisibility turn into disappearance due to machine translation?
The idea of disintermediation, or cutting out the middleman, is one that has concerned translators for many years, although technological advances have certainly exacerbated these concerns. If people have direct access to machine translation, is there any need for translators? Translation is not the only profession struggling to find the best ways of integrating tools into our reality, and to educate the wider public about the value that professionals add to the process. For instance, in my “other” world – Information Science – librarians are in a similar situation. “Why would I consult a librarian when I can just Google it?” is a phrase that they hear all too often. In my opinion, translators could actually learn some valuable lessons from our librarian colleagues in this regard. Librarians have rallied and embraced technologies, making themselves indispensable as educators, informing patrons in areas of information literacy, information credibility and digital literacy, for example. They are vocal advocates of the value offered by the information professions. Translators, in contrast, have been somewhat less vocal and done less outreach. In comparison to librarians, we tend to toil away quietly in the background, understanding our own value, but not always communicating it well to others. We have made fewer efforts to integrate ourselves into our broader communities. Of course, what I’ve just presented here is a generalization, but the main point is, if WE believe that we have something to add and that our efforts add value, then it’s up to US to make that case, to educate others, and to increase our visibility. And we need to find creative and effective ways of doing it. I think we are starting. The EurTerm initiative mentioned above is a good one. Our collective voice will be heard more clearly than individual voices. The previously mentioned Border Crossings volume is another encouraging step. 
Providing evidence of how we have contributed to other disciplines makes it harder for us to be overlooked. Let’s hope that we can keep this sort of momentum going!
Interviewed by Jessica Mariani, PhD candidate in English Language and Translation at the University of Verona, where she works as an assistant in English Language and History of the English Language. A journalism post-graduate, she has worked as a journalist in both Dublin and Brussels, where she trained as a press officer at the European Parliament Press Unit. The multilingual setting of the European Union inspired her to set up an interdisciplinary, ethnographic research project entitled Building a European Perspective in News Translation, in official partnership with the EP Press Unit and the EP Terminology Coordination Unit in Luxembourg. With the Press Unit, she investigates the role of the news translator and analyses the translation processes and practices involved in the information flow from the European institutions to the news media; with TermCoord, she has conducted ethnographic research on the role of terminology in institutional settings and journalism. She also works as a language coach, translator, interpreter and content editor.
Post prepared by Laura Campan – Postgraduate student in Translation and Intercultural Communication at ISIT, and Communication trainee at TermCoord (European Parliament)