A few months ago, I was flying to an important meeting, flicking through the in-flight magazine (for pitching purposes, you see), when I spotted a short paragraph touting the latest technological development: an in-ear device that promised to translate flawlessly from one language to another. It seems that from now on event managers can dispense with us interpreters for good and just load up on a supply of tiny devices to make sure everyone has a great event, no matter which language they speak.
Obviously that isn’t going to happen.
Despite the breathless headlines in the press and the incredible claims made by marketing departments, the chances of machine-interpreting earpieces doing anything more than replacing phrasebooks are minuscule.
Firstly, there is nothing fundamentally new in the technology used in such devices. Machine translation of one sort or another has been around since the 1940s and is still producing results that range from the plausible to the ridiculous. Remember when Google Translate turned Russia into Mordor? Remember all those websites displaying mangled English because of poorly applied machine translation?
Without going into the fine detail of where machine translation actually stands right now (you can read that in this article), the short version is this: unless you are willing to spend months training it and are happy to restrict your language to controlled phrases, the results of machine translation will be a bit dodgy.
When it comes to magical translation earpieces, machine translation is twinned with voice recognition – the technology that is still giving us frustrating helplines, semi-useful virtual assistants and the fury of everyone who doesn’t have a “standard accent”. Sure, voice recognition technology is advancing all the time, but it still works best when you use a noise-cancelling microphone and speak super-clearly – not quite the thing for crowded cafés or busy conferences.
The second reason why translation headsets are not a cure-all is that interpreting involves far more than matching a word or phrase in one language with one in another. Language is a strange beast: in all communication, people use idioms, metaphors, similes, sarcasm, irony, understatement and implication, and they are tuned to social cues, intentions, body language, atmosphere and intonation. At the moment, and for as much of the future as we can predict, computers will struggle to handle even one of those things.
Human interpreters have to be expert people-readers as well as having enviable language knowledge. Ask the CEO for whom an interpreter helped sort out a cultural and terminological misunderstanding that threatened to lose the company a deal worth several million pounds. Ask the doctor who worked with an interpreter to be culturally aware enough to give a patient the right treatment. Ask the speaker whose interpreter saved him from making a major, if accidental, cultural blunder.
When human interpreters work, they don’t simply function as walking dictionaries. They take what is said in one language, work out its meaning, tone and purpose, and then recreate it in another language in a way that will work in that specific context.
The only way that machines could ever do that would be if meetings and events were just about stuffing information into people’s heads, and if human beings always said exactly what they meant in a completely neutral way. With the current emphasis on the importance of delegate experience, and our newfound awareness that delegates are people rather than robots, it makes sense to realise that their communication deserves to be handled by experts, not machines.
So the next time someone tries to persuade you that you should let machines take over the interpreting at your event, just remember: for information processing, use a computer; for experience and expertise, work with humans.