Google's AI invents a language to make translations
Image: YouTube.

On Tuesday, a team led by Mike Schuster of the Google Brain Team published a post on the company’s official Research Blog describing a new process in Google Translate. The tool can generate ‘zero-shot’ translations using a self-generated language.

The new developments in Google Translate follow the system switch it underwent back in September, when the tech giant announced the tool would be adopting a new platform called Google Neural Machine Translation, or GNMT.

Google’s implementation of machine learning and deep neural networks has resulted in a dramatic enhancement of its products and services. The company further committed to AI with the introduction of its AI Experiments open-source platform last week.

How does Google Translate work?

Google Translate currently supports translation for 103 languages. Usually, a translation system needs three languages: the two involved in the conversion, and a third to act as a bridge between them.

The GNMT system, however, adds an extra element to the formula: a token. It is an artificial shortcode prepended to the input sentence, telling the system which language to translate into.
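The token mechanism can be sketched as a simple prepend step. This is an illustrative sketch only: the token spellings (`<2ja>`, `<2ko>`) and the helper name are assumptions for demonstration, not Google's actual internal identifiers.

```python
# Hypothetical sketch of a GNMT-style target-language token.
# Token names ("<2ja>", "<2ko>") are illustrative, not Google's real format.

def add_target_token(sentence: str, target_lang: str) -> str:
    """Prepend an artificial token telling the model which
    language to translate *into*."""
    return f"<2{target_lang}> {sentence}"

# The same source sentence, routed to different target languages:
print(add_target_token("Hello, world", "ja"))  # <2ja> Hello, world
print(add_target_token("Hello, world", "ko"))  # <2ko> Hello, world
```

Because the target language is carried by the token rather than by a separate model per pair, one multilingual model can serve many directions.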

Supposedly, the token is the answer to how Google’s AI can translate between language pairs it was never explicitly trained on. The Google team explained how ‘zero-shot’ translation works using three languages: Korean, Japanese, and English.

Google Translate can learn how to say a word in a different language by going through a back-and-forth training process: Japanese to English, English to Japanese, Korean to English, and English to Korean.

To produce accurate translations, the multilingual system has to go through a training process using different language pairs that share common elements.
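The training setup described above can be sketched as a set of supervised language pairs, with Korean⇄Japanese deliberately left out. This is a minimal illustration of the idea, not Google's implementation; the pair names and helper function are assumed for demonstration.

```python
# Illustrative sketch of the multilingual training setup described
# in the post: four supervised directions, all passing through English.
trained_pairs = {("ja", "en"), ("en", "ja"), ("ko", "en"), ("en", "ko")}

def seen_in_training(src: str, tgt: str) -> bool:
    """Was this translation direction part of the supervised training data?"""
    return (src, tgt) in trained_pairs

# English is the common element linking Japanese and Korean...
print(seen_in_training("ja", "en"))  # True
# ...but Korean -> Japanese was never shown directly: the zero-shot case.
print(seen_in_training("ko", "ja"))  # False
```

A Korean→Japanese request is therefore "zero-shot": the model must rely on what the shared representation learned from the English-linked pairs.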

However, the GNMT team decided to remove the bridge language between the two languages involved in the process to see if the software was still capable of completing the task.

“Impressively, the answer is yes — it can generate reasonable Korean⇄Japanese translations, even though it has never been taught to do so. To the best of our knowledge, this is the first time this type of transfer learning has worked in Machine Translation,” Google said.

Google’s AI came up with a replacement for the missing bridge

The Google Brain Team, along with the Google Translate team, visualized the language connections the AI system was making, and the image revealed a much more complicated process than simple mechanical translation.

“The network must be encoding something about the semantics of the sentence rather than simply memorizing phrase-to-phrase translations. We interpret this as a sign of existence of an interlingua in the network,” the teams said.

Interlingua is a term used to refer to the intermediate language humans develop while they are learning a new one.

For example, a learner of Spanish develops an interlingua that is neither their mother tongue nor Spanish, but combines elements and structures of both as language acquisition progresses.

In the case of GNMT, the AI artificially generates a new method of communication to achieve its goal. However, the engineers still don’t fully understand how the software is doing it.

Source: Google