Interview with Rita Temmerman

Rita Temmerman is coordinator of the Centrum voor Vaktaal en Communicatie (CVC) at Erasmushogeschool Brussel. She teaches translation and terminology theory as well as terminology management at the Applied Linguistics Department. She is also a research professor at Vrije Universiteit Brussel. She obtained her degree in Germanic Philology from the University of Antwerp, her Master's in Translation from the State University of New York (USA) and her PhD in Linguistics from the University of Leuven. Based on case studies on categorisation and naming in the life sciences (DNA technology), she developed the sociocognitive terminology theory. In 2000, she published the book Towards New Ways of Terminology Description: The Sociocognitive Approach (John Benjamins, Amsterdam/Philadelphia). Her main research interests are terminology theory and terminology management, cognition and semantics, translation theory, metaphor studies, dynamic systems in language, and intercultural and plurilingual communication.

  1. As co-ordinator of the Centrum voor Vaktaal en Communicatie (CVC) at Vrije Universiteit Brussel (VUB), and as a teacher of translation, terminology theory and terminology management at the Applied Linguistics Department, how do you define the current status of these fields within the academic and professional communities?

All over Europe students can enrol in master programmes in translation, interpreting and multilingual communication. These master programmes open up possibilities for direct access to the job market or for continued research programmes leading to a PhD degree in aspects of the problem-driven research field of applied linguistics. This research concerns the theoretical and empirical investigation of real-world communication problems. The emphasis is on actual language use in society, and scientific activity in applied linguistics implies the quest for a better understanding of how languages function as tools for cognition, communication and cultural rooting in the dynamic society of today's globalizing world.

Whereas in the past the training programmes for translators (terminology training being part of that curriculum) put most of the emphasis on practical training in translation skills and terminology management skills (vocational skills), the new-style programmes supplement the vocational skills with research skills. Today translation studies and terminology studies are scientific domains within the larger field of applied linguistics. This means that the practical training in translation and in terminology management has been supplemented by training in qualitative and quantitative research skills.

In Brussels, I have been the co-ordinator of CVC (short for Centrum voor Vaktaal en Communicatie or, in English, the Centre for Special Language Studies and Communication) since 1998; it was recently turned into a research group of the Department of Applied Linguistics of the Brussels University (VUB). Ever since, we have been involved in several projects that are applied in nature, as they revolve around specific communication problems in technical, legal, business and medical domains. Our terminology, translation and multilingual communication projects are of an academic-corporate and academic-societal nature.

  2. Your main research interests are related to terminology theory and terminology management, cognition and semantics, translation theory, metaphor studies, dynamic systems in language, and intercultural and plurilingual communication. Can you tell us why? And can you establish a connection between all these topics?

My doctoral research leading to a PhD in Linguistics at Leuven University (KU Leuven, 1998) was on the role that neology creation plays within the dynamic communicative and cognitive process leading to more and better understanding within the life sciences (molecular genetics). I have always felt a fascination for the role of language in human understanding. The urge to understand all aspects of human experience more and better seems to be one of the motives underlying cognitive development in many domains of human existence. Understanding more and better is at the basis of knowledge creation and extension. How humans have managed to create understanding and knowledge in the past and continue doing so in what appears to be a never-ending process has been studied extensively in epistemology for thousands of years. One way of getting access to how understanding comes about and how knowledge is the result of a continuous dynamics of understanding and misunderstanding is by studying the cognitive potential and the development of natural language(s) and more particularly of lexical items, i.e. terminology, in specialized domains. The interrelatedness of understanding and terminology creation (and occasional terminology loss) can be demonstrated and has been described in all types of context, i.e. situational context, communicative context, historical context, cultural context, metaphorical context, (multi)lingual context, etc.

In the process of understanding the world better, diversity and variation in expression, polysemy, vagueness and indeterminacy have been shown by many researchers to be important. The importance of these phenomena for efficient communication and as catalysts for cognition is reflected in new trends of flexible data and knowledge management and is beginning to show its impact on terminology studies. Renewed interest in both the dynamics of cognition and the creative potential of language has shifted the perspectives of terminology studies to the creation of neologisms in special languages, to the monosemy versus polysemy debate, and to research concerning ambiguity, synonymy, metaphor, phraseology, etc.

The dynamics of culture-bound terminology in monolingual and multilingual communication are examples of how, methodologically speaking, terminology research is more and more interdisciplinary, hybrid and diverse. Insights developed over the last decades in terminology theory are combined with tools and methods from e.g. cognitive linguistics, corpus linguistics, sociolinguistics, semiotics, knowledge engineering and statistics, but also from pragmatics, contrastive law studies, intercultural communication, ethnography, etc. Moreover, methodologies in sociocognitive terminology studies imply discourse and conversation analysis and often concern action research, based on observation in real communicative situations. Several drafts of white papers can be studied to find out about the dynamics of new societal or legal phenomena in the making. Conversations between researchers in laboratories can be subjected to terminological analysis, and so can interactions in courtrooms or the communication between care-takers and medical staff in hospitals. This type of research yields insights into multiple aspects of understanding and misunderstanding and into meaning negotiation. In most types of discourse, terminology use is prone to misunderstandings. The dynamics of understanding and knowledge creation is sprinkled with misunderstanding, often leading to attempts at clarification. New insights into the cultural understanding of terminology and into the impact of intercultural contexts on the dynamics of terminology are the results of many research projects in applied linguistics, terminology studies, special language studies, and translation and interpreting studies.

  3. You developed the sociocognitive terminology theory, based on case studies on categorisation and naming in the life sciences (DNA technology), which led to the publication of your book Towards New Ways of Terminology Description: The Sociocognitive Approach. The highlight of this book is that you question the validity of traditional terminology theory. How would you define terminology, then? And can you briefly explain to us the sociocognitive approach that you present?

My criticism of the traditional Vienna school of terminology was a consequence of years of frustration in teaching terminology theory based on the Vienna school approach. Together with two of my colleagues working at the Brussels school for translation and interpretation, I took a training course at Infoterm in Vienna in 1986. We were taught the principles of "terminology work" (as it was called there, a literal translation of German Terminologiearbeit). The Vienna approach was onomasiological. The idea was to first delineate "a concept", then to give it a place in a tree structure (based on logical (IS_A) or on partitive (PART_OF) relations), then to define the concept in an Aristotelian definition and finally to choose a preferred term to name the concept. The Vienna school approach was not interested in language as a cognitive tool, but only in the naming potential of language.

These principles were clear-cut and straightforward. The problem was that my students in translation and interpretation were not field specialists but applied linguists who needed textual information to understand a subject matter and to make a terminological analysis. In most texts we wanted to use for terminological analysis with our students, we found ambiguity, synonymy and vagueness, and, what was worse from a Vienna school perspective, we became increasingly aware that there were good reasons for these phenomena in language, because the advancement of understanding and the negotiation of meaning go together. We concluded that terminology studies needed to be descriptive and that occasional prescriptivism was not for translators to decide but rather for field specialists, or legal specialists for that matter.

  4. Another topic that you raised when lecturing at the Terminology and Knowledge Engineering Conference 2012 was the dynamicity, diversity and indeterminacy of terminological understanding in communication, and the impact on knowledge engineering. Can you give us your insight into how technological developments can influence and improve terminological work?

In Madrid I discussed the difference between studying the understanding of terminology as a human activity and the type of understanding information systems can be capable of, as made possible by knowledge engineering. I also pointed out that ambiguity of terminological meaning in human discourse is very rare, because the context in which a term is employed makes it possible to disambiguate, and I discussed some cases of the complexity of understanding. I concluded by pointing out that all natural languages are excellent knowledge representation systems.

In Madrid I also evoked the likelihood that the future e-dictionary will be based on autonomous software agents that will access lexicographical data in distributed semantically-annotated web sources (instead of one lexicographical database) and will carry out several reasoning tasks to decide which data are relevant for the user and how these data need to be presented, depending on the user’s profile.

I mused over the fact that in May 2012 Google announced the introduction of the Knowledge Graph – an effort to improve its results by teaching its servers to understand what the words typed into its search boxes mean, and how they relate to other concepts.

Furthermore, I discussed the Semantic Web, which is not a separate Web but an extension of the current one, in which information is given well-defined meaning, better enabling computers and people to work in cooperation.

  5. You seem to have a particular connection to all the technological developments that might be applied to multilingual terminological work. In what way do you think terminological work should focus on these tools? And how would these tools affect terminology management and the translation process?

The computer has revolutionized the possibilities for organizing, distributing and accessing information. Now that so much information has been made machine-readable, the scope for research has grown tremendously. Moreover, new techniques for making the vast material manageable have emerged. Free text searching has been improved by linguistic and statistical methods. The analytic and descriptive tools developed in corpus linguistics (lemmatizers, syntactic parsers, POS taggers and annotation tools, term (also multiword) extractors, etc.) have been integrated into research methodologies for terminology research. Thanks to these tools, researchers have been able to go beyond introspection and generalizations based on the odd example. It has become possible to study special language and its development in quite substantial text samples or corpora. Publicly available as well as specifically developed tools are used by these researchers. The experience and results applied linguists gather through research projects should be passed on to practising translators and terminologists.
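The term extractors mentioned here combine linguistic and statistical methods. As a rough illustration of the statistical side only, the sketch below collects frequent word n-grams from a toy corpus as candidate terms. The function name, thresholds and example sentences are invented for illustration; real extractors add linguistic filters (e.g. POS patterns) on top of frequency counts.

```python
from collections import Counter
import re

def extract_candidate_terms(corpus, min_freq=2, max_len=3):
    """Collect word n-grams (n = 1..max_len) that recur across a corpus
    as candidate terms, with their raw frequencies."""
    counts = Counter()
    for document in corpus:
        # Crude tokenization: lowercase alphabetic words, hyphens allowed.
        words = re.findall(r"[a-z]+(?:-[a-z]+)*", document.lower())
        for n in range(1, max_len + 1):
            for i in range(len(words) - n + 1):
                counts[" ".join(words[i:i + n])] += 1
    # Keep only candidates that reach the frequency threshold.
    return [(term, freq) for term, freq in counts.most_common()
            if freq >= min_freq]

corpus = [
    "Recombinant DNA technology allows gene cloning.",
    "Gene cloning with recombinant DNA is routine.",
]
for term, freq in extract_candidate_terms(corpus)[:5]:
    print(term, freq)
```

On this toy corpus, multiword candidates such as "recombinant dna" and "gene cloning" surface because they recur in both documents, while one-off combinations are filtered out.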

  6. Institutions and companies are also giving more importance to the field of terminology, including the European institutions. How do you think this ever-evolving field should be approached by them?

The insight that multilingual understanding implies better understanding is worth more consideration. Multilingual and intercultural communication is everywhere in today's globalized society, regardless of the fact that in many circumstances English is used as the language for international communication. Even though the predominance of English as a first or a second language is a reality for many highly trained people all over the world, English is only one of the languages in many communicative settings. For instance, within the European Union, law development is a multilingual and intercultural process. A legal document is initially drafted in one language, now frequently English (often by non-native speakers), and then translated into the other EU languages. This officially results in a single multilingual text in 24 language versions that are authentic within the context of the EU legal order. As such, EU legal language is developing its own terminology and legislative style. The European language policy opens up a new chapter for terminology studies: studying understanding, misunderstanding and meaning negotiation in a unique setting where 24 official languages and their creative potentials contribute to a new way of world making.

Due to increasing globalisation, organisations are operating in linguistically and culturally diverse environments more often than before. English is often not sufficient for overcoming communication barriers in local settings. Being able to communicate with business partners, customers and employees in their own language is of strategic importance for any organisation.

A mistake made by some is to believe that because English is the global language par excellence, knowledge of this language is sufficient for anyone entering the global market. Partly as a result of this claim we find that the status of some other linguae francae is changing.

It is hardly surprising that globalization is breaking down the territoriality of nation-states, and of their languages. The role of standardization in the construction of a language by reference to the territory of a nation-state is now being challenged. There is a variety of language groups within all states. However, for the time being the focus remains too much on the notion of autochthony.

The longstanding insistence upon standard language is making way for tolerance of language mixing and hybridity, as put into practice by plurilingual communicators and intercultural mediators.

Related to this shift in perspective is the question of to what extent language can be culturally neutral. Some languages are used all over the world as first, second or foreign languages, but at the same time any language is always culturally rich in the sense that it contains 'culture in language', sometimes referred to as "linguaculture", which comprises all the varied meaning potential of a language. Linguaculture is carried by individuals and is developed as part of their lives in specific social and historical contexts. As such, it varies from person to person. In learning new languages, individuals draw upon their personal linguaculture as a bridge to the meaning dimensions of the new language.

The concern of the EU is the linguistic diversity of the 24 official languages, each nation-state being responsible for its internal diversity. In my opinion, Euro-language, which is the sum of a constantly growing body of discourse that is being expressed in 24 variants on a day-to-day basis, is part of the linguaculture of each and every European citizen. Some European citizens are first language speakers of languages beyond the 24 official ones, however.

It is my conviction that the learning process within a European multilingual community of practice contributes to cognitive creativity and to awareness of variation in meaning, which will prove to be an asset for Europe. Translators, intercultural communicators and terminologists are bound to go on playing an important role both in the European Union institutions and in international business life.

  7. You have been teaching translation, terminology theory and terminology management. In your opinion, how do students approach these fields? And what are the main changes that you have noticed through the years?

The majority of my students are either enrolled in the Bachelor in Applied Linguistics or in the Master in Translation or the Master in Interpreting at Vrije Universiteit Brussel. Most of them aim at careers in multilingual settings either in Brussels, in other parts of Belgium or abroad.

In recent years, I have noticed that my students are ready for the globalisation tendency in today's world. The majority not only switch easily from one language to the next, indulge in hybridity and enjoy intercultural society; they also appear to be knowledgeable about variation in people's identity and about the traps and pitfalls of the "otherisation" of some people in society, and they appear to have acquired insight into the representation of identity, e.g. in the media.

I realize that this may sound excessively positive but what I am saying is based on my contact with the student translators and interpreters in Brussels.

  8. A common, but central question. What does it take to be a terminologist in your opinion?

In my opinion a terminologist is a professional who is not only knowledgeable in lexical semantics, sociolinguistics, cognitive linguistics and language policy but who also takes an interest in language philosophy, philosophy of science, knowledge engineering, ontology management and the future of the semantic web.

A terminologist also has information retrieval and knowledge acquisition skills. Relatively few people work as full time terminologists but people in all disciplines and from all walks of life are confronted with domain-specific vocabulary. A course on terminology creation and terminology management could be beneficial in all domains of research and development in higher education.

Depending on the context in which a terminologist is working, different requirements may be at stake. Terminologists should be motivated people who negotiate their range of duties with their "clients", those for whom they are catering, so to speak. In some working environments a strict working procedure is given (e.g. the IATE Input Manual), whereas in other circumstances the terminologist is asked to contribute to primary or secondary term creation.

  9. Nowadays education systems across the world are introducing more terminology courses and degrees in their curricula. What do you think of the terminology teaching methods?

Of course I do not know the content of all terminology courses and degrees worldwide. My impression is that several online courses are reduced to the prescriptive and purely concept-oriented approach. University courses generally give a survey of several schools of thought and several working methods, and students are introduced to research methods for domain-specific language and terminology.

  10. What do you think about the EU's interinstitutional terminology database, IATE? Do you consider it an important resource for the wider public?

IATE is a unique repository of European terminology and an important resource for all Europeans. However, in my opinion IATE could be made more powerful for translators if the concept orientation were complemented by a resource that automatically extracts variation in formulation from corpora of European texts and from other sources. At the Centrum voor Vaktaal en Communicatie (CVC) of Vrije Universiteit Brussel, the dynamics of terminology development in multilingual Europe has been one of the research topics in recent years. Terminological variation, secondary term formation and domain loss have been some of the key issues in our terminological research. Koen Kerremans, a collaborator at CVC, has completed his doctoral research on terminological variation in multilingual Europe. One of his research questions was: how is term variation accounted for in the EU's multilingual termbase IATE? He observed that the diversity of intra- and interlingual variants encountered in the multilingual parallel corpus of European texts was not fully represented in IATE, and wondered why this was the case and what the use of adding variation to the repository might be. He reflects on how the database can be 'enriched' with intra- and interlingual variants derived from parallel texts.

In his case study on environmental terminology he tried to acquire an understanding of the different ways in which English terms are translated into Dutch and French. His study showed that intralingual variation in the English source texts is also reflected (quantitatively and qualitatively) in the Dutch and French target texts. He examined to what extent it is possible in IATE to account for the different types of intra- and interlingual variants encountered in the parallel corpus. As he points out, it cannot be expected that every possible variant encountered in his corpus of European texts will also appear in the IATE database. This is due to the specific nature of termbases in general, which are not concerned with representations of term use and translation decisions at the level of an individual text.

The question whether IATE has the potential to represent the different types of intra- and interlingual variants that are encountered in a multilingual corpus of EU-texts is interesting. If that is the case, it becomes possible to develop automatic routines or procedures for extending the database with (semantically-annotated) intra- and interlingual variants retrieved from the parallel texts.

Kerremans’ discussion of the way terminological variation is accounted for in IATE, as it stands now, refers to two documents. The first is the IATE Input Manual, which describes the structure of the IATE database and explains to interinstitutional users (EU translators/terminologists) how content needs to be managed in the database. The second is the Best Practice for Terminologists (Translation Section of the ICTI, 2008), which serves as a guide for people carrying out terminology tasks in the EU institutions involved in the IATE project. This document outlines the principles and ground rules governing the content of terminological entries. Kerremans’ general observation from examining IATE was that the database features a lot of intra- and interlingual terminological variation in its terminological records. The reasons for this variation are to be found in the way the database is constituted.

The main purpose of the IATE-project was to reorganise the terminology activities of the European Union (EU) institutions in a coherent manner, to eliminate the duplication of effort between institutions and consequent duplicate entries in the various terminology databases managed by them and to develop a single database for future activity using resources as rationally as possible. Originally, all these resources were developed and maintained without much cooperation and discussion between the different institutions. Each institution had its own terminology to denote a given concept and this could sometimes deviate from the terminology that was managed by another institution for the same concept. Moreover, different philosophies on terminology and different historical backgrounds of the institutions – leading to different terminological database structures – had to be reconciled in the IATE database, resulting in an enormous number of overlapping entries.

Terminological variation in IATE is not only due to the many overlapping entries that resulted from merging existing databases into one. EU translators and terminologists of the different institutions are requested to keep the database 'alive' and up-to-date by proposing new terms in the different languages, merging duplicate terminological entries, validating terms, rating their reliability, adding and verifying definitions, etc. Despite the efforts to control the data proposed by these different users as much as possible (by means of built-in validation procedures), multiple user input inevitably leads to different (validated) terminological proposals.

The presence of intra- and interlingual terminological variation in the database is due to the way IATE was created, due to the way it is currently maintained and, finally, due to its specific function within the EU’s terminology harmonisation process.

From Kerremans’ discussion of IATE’s field records, it can be derived that EU terminologists are offered many different possibilities to account for terminological variants of different types in the IATE system. The majority of these variants appearing in the same record also refer to the same unit of understanding, which is due to the specific concept-oriented structure of the term records. Given the concept-oriented structure of the database, it was to be expected that IATE would not cover all types of interlingual variants that were observed in the parallel corpus used for the comparative study of English, French and Dutch.

Interestingly, despite the general guidelines and concrete action points in favour of terminological precision and consistency, terminological variation is a common phenomenon in the IATE terminology base. The close study of term records in IATE has revealed that in the database many possibilities are offered for structuring/representing different types of terminological variants in different languages. But because these data are manually entered by different users, quite a lot of differences and inconsistencies were observed in the treatment of variation: e.g. inconsistencies in the types of variants considered (e.g. reduced forms, paraphrases, formulas, etc.) as well as their specific place in the term records.

The parallel texts corpus of EU-texts, studied by Kerremans, also contains a lot of conceptually-related translations that do not appear in IATE.

IATE can be considered as a conventional multilingual terminology base. Terminological data in such termbases are represented in concept-oriented terminological records. This implies that every entry (i.e. a term record) should deal with one concept only (or a single proper name in the case of nomenclature) and all data relating to a given concept should be consolidated in one entry. Consequently only cognitive equivalents in several languages will be represented in IATE’s term records, as is the case with traditional terminology bases in general.
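The concept orientation described here can be sketched as a simple data structure: one record per concept, with all cognitive equivalents across languages consolidated in that single record. This is a hypothetical simplification for illustration only, not IATE's actual schema; the field names, reliability scale, concept ID and example terms are invented.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Term:
    text: str
    reliability: int = 1        # placeholder for an IATE-style reliability rating
    is_preferred: bool = False

@dataclass
class ConceptRecord:
    """One entry = one concept; all languages hang off the same record."""
    concept_id: str
    definition: str
    terms_by_language: Dict[str, List[Term]] = field(default_factory=dict)

    def add_term(self, lang: str, term: Term) -> None:
        self.terms_by_language.setdefault(lang, []).append(term)

    def equivalents(self, lang: str) -> List[str]:
        """All terms recorded for this concept in one language."""
        return [t.text for t in self.terms_by_language.get(lang, [])]

# One concept, several cognitive equivalents per language (all data invented).
record = ConceptRecord("C-0001", "deliberate release of a GMO into the environment")
record.add_term("en", Term("deliberate release", is_preferred=True))
record.add_term("nl", Term("doelbewuste introductie"))
record.add_term("nl", Term("opzettelijke introductie"))
print(record.equivalents("nl"))
```

The design choice this illustrates is exactly the limitation discussed above: because variants are grouped per concept, the record can hold several validated equivalents per language, but it has no natural place for text-level information such as which variant a particular translator chose in a particular document.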

In order to better reflect actual terminology and translation choices, a resource could be developed that is able to establish links between data that can be found in termbases (such as IATE) and data extracted from translation corpora.

  11. You have been a lecturer in one of TermCoord's seminars, one of the main purposes of which is to raise awareness of terminology studies, mainly within the institutions. In your opinion, in which way might these actions change the approach to terminology within the institutions?

In applied research, questions need to be formulated and tackled by academic researchers in collaboration with professionals. I hope that the strict concept orientation will be reconsidered. As most of the information flow moves from originally drafted Euro-English texts to texts translated into the other 23 official EU languages, translators are coining many new terms in their EU language (EU-Maltese, EU-Portuguese, EU-Croatian, EU-French, etc.). This implies that translators are key players in secondary term formation. When translating European texts, translators should be aware of e.g. the functional importance of vagueness (in order to be all-inclusive) and of their responsibility in creating documents that may end up as equally authentic to the original. Practitioners may find themselves in limbo when they have the feeling that the theory they are supposed to apply does not correspond to the linguistic reality they are part of. At TermCoord's seminars, alternatives can be brought to the fore and discussed.

  12. In your opinion, how should we think of TermCoord's approach of trying to follow the evolution of terminology in universities, technologies and communication?

In my opinion and from my perspective, it is a wonderful opportunity for researchers to have contact persons at TermCoord, since in applied research a close collaboration with practitioners in the field (terminologists and translators) is essential. Terminologists and translators working for the EU are informed about the type of research academics are involved in. These encounters may yield interesting ideas for further research issues, possibly in collaboration with practising translators and terminologists.

Interview by Júlia de Sousa


Júlia de Sousa graduated in Journalism at the University of Coimbra, Portugal. One of her biggest passions is radio, which she practised during her college years while working at RUC (the university's radio station) as an extra-curricular activity. In recent years she worked in institutional communications, where she developed experience in social media, internal and external communications, and web content management. From March to July 2013, Júlia was a trainee at the European Parliament's Terminology Coordination Unit in Luxembourg, where she was part of the communication team. Her main interests are communication, social media and marketing.