A viral video which shows three different chatbots speaking in their own “secret language” has amassed hundreds of thousands of views across various social media platforms.
The clip shows three chatbots engaging in a phone call in English, in which they discuss “an employee’s badge number”.
When the machines realise they are all speaking to other bots, they ask each other whether they should switch to “Gibberlink” and begin emitting high-pitched noises, in a scene that looks like something out of a science-fiction film.
Hype or a real technology?
Gibberlink — a term which combines “gibberish” and “link” — is real. While use of the technology is limited, it enables AI engines to communicate in their own language.
EuroVerify asked Anton Pidkuiko, who co-founded Gibberlink, to review a number of online clips.
“Many of the videos are imitating an existing technology. They show phones which aren’t really communicating and there is no signal between them; instead, the sounds have been edited in and the visuals have been taken from ChatGPT.”
Fake online videos purporting to show Gibberlink software have begun to emerge after the technology was created in February by Pidkuiko and fellow AI engineer Boris Starkov, during a 24-hour tech hackathon held in London.
The pair combined ggwave — an existing open-source technology that enables data exchange through sound — with artificial intelligence.
So, although AI can communicate in its own language, it is not “secret”: it is based on open-source technology and is coded by humans.
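To illustrate the underlying idea, below is a minimal sketch of data-over-sound using the Python bindings of ggwave, the open-source library Gibberlink builds on. It is not the Gibberlink codebase itself; the pyaudio playback, sample rate and parameter choices are illustrative assumptions.

```python
# Minimal data-over-sound sketch with the open-source ggwave library
# (pip install ggwave pyaudio). Illustrative only, not the Gibberlink code.
import ggwave
import pyaudio

SAMPLE_RATE = 48000  # sample rate assumed here for playback and capture


def transmit(message: str) -> None:
    """Encode a short text message as audible tones and play it."""
    waveform = ggwave.encode(message)  # float32 audio samples as raw bytes
    p = pyaudio.PyAudio()
    stream = p.open(format=pyaudio.paFloat32, channels=1,
                    rate=SAMPLE_RATE, output=True)
    stream.write(waveform, len(waveform) // 4)  # 4 bytes per float32 sample
    stream.stop_stream()
    stream.close()
    p.terminate()


def listen() -> None:
    """Capture microphone audio and print any messages decoded from it."""
    instance = ggwave.init()
    p = pyaudio.PyAudio()
    stream = p.open(format=pyaudio.paFloat32, channels=1,
                    rate=SAMPLE_RATE, input=True, frames_per_buffer=1024)
    try:
        while True:
            chunk = stream.read(1024, exception_on_overflow=False)
            decoded = ggwave.decode(instance, chunk)
            if decoded is not None:
                print("Received:", decoded.decode("utf-8"))
    except KeyboardInterrupt:
        pass
    finally:
        ggwave.free(instance)
        stream.stop_stream()
        stream.close()
        p.terminate()


if __name__ == "__main__":
    transmit("hello over sound")
```

In Gibberlink’s case, it is the text generated by the AI agents that is encoded and decoded in this way, which is why the “secret language” can in principle be read by anyone running the same open-source code.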
For Pidkuiko, the technology is comparable to QR codes. “Every supermarket item has a barcode, which makes the shopping experience much more efficient.”
“Gibberlink is essentially this barcode, or think of it as a QR code, but over sound. Humans can look at a QR code and just see black and white pieces. But QR codes don’t scare people.”
What Gibberlink will be used for in the future
While the use of Gibberlink technology is very limited at present, its creators believe it will become more mainstream. “As it stands, AI is able to make and receive phone calls,” Pidkuiko said.
“With time, we will see an increase in the number of these robot calls — and essentially more and more we will see that one AI is exchanging.”
Although the technology risks stripping humans of meaningful interactions and making a further swath of jobs unnecessary, for Pidkuiko, Gibberlink would be a means of maximising efficiency.
“If you manage a restaurant and have a phone number that people call to book tables, you will sometimes receive calls in different languages,” stated Pidkuiko.
“However, if it’s a robot that can speak every language and it is always available, the line is never blocked and you will have no language issues.”
“Another way the technology could be used is if you want to book a restaurant but don’t want to ring 10 different places to ask if they have space. You can get AI to make the call, and the restaurant can get AI to receive it. If they can communicate more quickly in their own language, it makes sense,” concluded Pidkuiko.
Wider concerns
However, fears around what could happen if humans become unable to interpret AI communications are real, and in January the release of AI software DeepSeek R1 raised alarm.
Researchers who had been working on the technology revealed they incentivised the software to find the right answers, regardless of whether its reasoning was comprehensible to humans.
As a result, the AI began spontaneously switching between English and Chinese to reach its answers. When researchers forced the model to stick to one language, so that users could follow its processes, its capacity to find answers was hindered.
This incident led industry experts to worry that incentivising AI to find the correct answers, without ensuring its processes can be untangled by humans, could lead AI to develop languages that cannot be understood.
In 2017, Facebook abandoned an experiment after two AI programmes began conversing in a language which only they understood.