Google has now added a further 13 languages to its machine-learning-driven Google Translate platform. As a result, it can now converse in 103 languages, covering 99% of the world’s population. (What on earth are the other 1% of us speaking?)
Translate began in 2004 using machine-learning-based translation systems, and it has since grown into one of the behemoths of the translation world. It lets you translate by typing text in, speaking to it, pointing the camera at a picture, or writing by hand.
For more information from Google directly, see their blog here.
They’re not the only ones in the game. Microsoft, for example, have a beta translation tool built directly into Skype. It’s called Skype Translator (catchy), and is available in preview here.
Microsoft claims it allows calls in different languages to take place, complete with an on-screen transcript of the call. You can download it yourself from the URL above and make your own judgement as to how successful it is.
Weighing up machine learning
As machine learning seems to take over the world, many of us ordinary folk are left wondering what all the fuss is about. Sure, we may ask Siri to talk dirty to us (receiving the answer “Humus, Compost…”), or Cortana to sing for us (it will). The real question, though, like that asked of gravitational waves, is how far machine learning, and all the complex maths which sits underneath that catchy label, really affects our lives.
Machine learning is a big, general area. It’s easy to cite hundreds of examples (e.g. Uber’s algorithms) or get bogged down in the technological advances (R Studio developments, etc.). For most of us, automated translation is perhaps a decent barometer of how sophisticated and useful the technology really is, and how far it’s come.
Without boring you here with incidents where the translation engines messed up, such as the gem from Starbucks shown above, it’s clear that even the best AI platforms have a lot to learn.
MIT Technology Review, for example, has lots of praise for Skype Translator, but also says:
“[it’s] deaf to the rhythms of normal spoken conversation, so you can’t be quite sure when its disembodied robot voice is going to break in and start blurting out its translated version”
though the article also points out that this is something which human translators mess up too.
Not quite Star Trek
Machine learning’s usefulness is increasing, particularly when the translations are simple, or where the individual is alive to the quirks of the system. In those cases, the technology can work well. However, to use MIT Technology Review’s analogy of the universal translator in Star Trek (The Next Generation), we’re not quite at the level of Captain Picard’s communicator yet.
In case you’re wondering, the 13 new languages include:

- Kurdish (Kurmanji),
- Scots Gaelic, and
- Pashto.

One of the 13 is the second most spoken Semitic language, and another was spoken by Mohamed Ali’s granddaddy. Makes you wonder why Google didn’t already have those ones.