FB shuts down AI robots after they start talking to each other in strange language

From the Web

Facebook abandoned an experiment after two artificially intelligent programs appeared to be chatting to each other in a strange language only they understood. Andrew Griffin reports in The Independent on August 1, 2017.

The two chatbots came to create their own changes to English that made it easier for them to work – but which remained mysterious to the humans who supposedly looked after them.

The bizarre discussions came as Facebook challenged its chatbots to try to negotiate with each other over a trade, attempting to swap hats, balls and books, each of which was given a certain value. But the talks quickly broke down as the robots appeared to chant at each other in a language that they each understood but which appeared mostly incomprehensible to humans.
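For readers curious about how such a trade is scored, here is a minimal toy sketch in Python. It is not the researchers' code: the item counts, the per-agent point values and the final split below are invented purely for illustration. The idea is simply that each bot privately values the hats, balls and books differently and earns points for whatever it ends up holding when the deal closes.

# Toy illustration of the negotiation task described above (not Facebook's code).

ITEMS = {"hats": 1, "balls": 2, "books": 3}   # hypothetical pool to be divided

# Each agent privately values the items differently (hypothetical numbers).
VALUES = {
    "Bob":   {"hats": 5, "balls": 2, "books": 1},
    "Alice": {"hats": 1, "balls": 3, "books": 2},
}

def score(agent, allocation):
    """Points an agent earns from the items it ends up with."""
    return sum(VALUES[agent][item] * count for item, count in allocation.items())

# Example outcome of one negotiation: Bob keeps the hat, Alice takes the rest.
bob_share   = {"hats": 1, "balls": 0, "books": 0}
alice_share = {"hats": 0, "balls": 2, "books": 3}

print("Bob scores:  ", score("Bob", bob_share))     # 5
print("Alice scores:", score("Alice", alice_share)) # 12

In a setup like this, each bot only ever sees its own point values, which is what makes bargaining – and the shorthand the bots invented for it – useful in the first place.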

The robots had been instructed to work out how to negotiate between themselves, and improve their bartering as they went along. But they were not told to use comprehensible English, allowing them to create their own ‘shorthand’, according to researchers.

The actual negotiations appear very odd, and don’t look especially useful:

Bob: i can i i everything else . . . . . . . . . . . . . .

Alice: balls have zero to me to me to me to me to me to me to me to me to

Bob: you i everything else . . . . . . . . . . . . . .

Alice: balls have a ball to me to me to me to me to me to me to me

Bob: i i can i i i everything else . . . . . . . . . . . . . .

Alice: balls have a ball to me to me to me to me to me to me to me

Bob: i . . . . . . . . . . . . . . . . . . .

Alice: balls have zero to me to me to me to me to me to me to me to me to

Bob: you i i i i i everything else . . . . . . . . . . . . . .

Alice: balls have 0 to me to me to me to me to me to me to me to me to

Bob: you i i i everything else . . . . . . . . . . . . . .

Alice: balls have zero to me to me to me to me to me to me to me to me to

But there appear to be some rules to the speech. The way the chatbots keep stressing their own name appears to be part of their negotiations, not simply a glitch in the way the messages are read out.

Indeed, some of the exchanges conducted entirely in this bizarre language still ended in successfully concluded deals.

That said, it’s unlikely that the language is a precursor to new forms of human speech, according to linguist Mark Liberman.

“In the first place, it’s entirely text-based, while human languages are all basically spoken (or gestured), with text being an artificial overlay,” he wrote on his blog. “And beyond that, it’s unclear that this process yields a system with the kind of word, phrase, and sentence structures characteristic of human languages.”

The company chose to shut down the chats because “our interest was having bots who could talk to people,” researcher Mike Lewis told FastCo. (Researchers did not shut down the programs because they were afraid of the results or had panicked, as has been suggested elsewhere, but because they were looking for them to behave differently.)

The chatbots also learned to negotiate in ways that seemed very human. They would, for instance, pretend to be very interested in one specific item – so that they could later pretend they were making a big sacrifice in giving it up, according to a paper published by FAIR, Facebook's artificial intelligence research division. (That paper was published more than a month ago but began to pick up interest this week.)

Facebook’s experiment isn’t the only time that artificial intelligence has invented new forms of language.

Earlier this year, Google revealed that the AI it uses for its Translate tool had created its own language, which it would translate things into and then out of. But the company was happy with that development and allowed it to continue.

Another study at OpenAI found that artificial intelligence agents could be encouraged to create their own language, becoming more efficient and better at communicating as they did so.

Update by The Independent: This article has been amended to stress that the experiment was abandoned because the programs were not doing the work required, not because the researchers were afraid of the results, as has been reported elsewhere.

Comment Osho News: These robot creations appear to be evolving to be more human-like than we might want them to, and even have human names defined by gender. Who exactly is keeping an eye on these developments?

independent.com – Illustration by Osho News
