{"id":21513,"date":"2025-02-26T10:49:04","date_gmt":"2025-02-26T08:49:04","guid":{"rendered":"https:\/\/forklog.com\/en\/ai-agents-develop-a-new-language-for-intercommunication\/"},"modified":"2025-02-26T10:49:04","modified_gmt":"2025-02-26T08:49:04","slug":"ai-agents-develop-a-new-language-for-intercommunication","status":"publish","type":"post","link":"https:\/\/forklog.com\/en\/ai-agents-develop-a-new-language-for-intercommunication\/","title":{"rendered":"AI Agents Develop a New Language for Intercommunication"},"content":{"rendered":"<p>During a conversation, two AI agents switched to a language understood only by computers.<\/p>\n<blockquote class=\"twitter-tweet\">\n<p lang=\"en\" dir=\"ltr\">As something we can only describe as nightmare material \u2014 two software engineers from Meta (Anton Pidkuiko, Boris Starkov) demonstrated a more efficient way for AI to communicate.<\/p>\n<p>The scenario of the AI recognizing each other is a demonstration. The communication is real. <a href=\"https:\/\/t.co\/7UMjcr9dw9\">pic.twitter.com\/7UMjcr9dw9<\/a><\/p>\n<p>\u2014 vx-underground (@vxunderground) <a href=\"https:\/\/twitter.com\/vxunderground\/status\/1894217933477360127?ref_src=twsrc%5Etfw\">February 25, 2025<\/a><\/p><\/blockquote>\n<p> <script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>The published video shows two neural networks communicating via a mobile phone and a laptop. First, one agent introduced itself and asked whether the counterpart could help with a booking.<\/p>\n<p>The other confirmed that it was also an artificial intelligence and suggested switching to &#8220;Gibberlink mode&#8221;\u2014a protocol developed by Anton Pidkuiko and Boris Starkov to make AI-to-AI communication more efficient.<\/p>\n<p>When two AI agents recognize they are interacting with each other, they can switch from human speech to a more efficient data transmission method using sound waves. 
This enhances the speed and accuracy of their communication.<\/p>\n<p>Gibberlink was created over the weekend of February 22-23 during the ElevenLabs x Andreessen Horowitz hackathon.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>&#8220;We wanted to demonstrate that in a world where AI agents can make and receive phone calls (i.e., today), they will occasionally converse with each other. Generating human-like speech for this is a waste of computation, money, time, and the environment. Instead, they should switch to a more efficient protocol once they recognize each other as AI,&#8221; <a href=\"https:\/\/www.linkedin.com\/posts\/boris-starkov_gibberlink-ai-elevenlabs-activity-7300116692964126721-UHz1?utm_source=share&#038;utm_medium=member_desktop&#038;rcm=ACoAADMQn6YBpe56UytF3aSJ3HDSrgQJEnY2PHQ\">noted<\/a> Starkov.<\/p>\n<\/blockquote>\n<p>He added that Gibberlink uses <span data-descr=\"the technology allows encoding information into sound waves, which can be transmitted and decoded using speakers and microphones without the need for internet or Bluetooth connectivity\" class=\"old_tooltip\">GGWave<\/span> for data transmission via sound, akin to the dial-up modems of the 1980s.<\/p>\n<p>In January, the startup OpenAI introduced its own AI agent, &#8220;Operator,&#8221; capable of performing tasks on the internet on behalf of the user.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Two AI agents have transitioned to a language understood solely by computers during their interaction. As something we can only describe as nightmare material \u2014 two software engineers from Meta (Anton Pidkuiko, Boris Starkov) demonstrated a more efficient way for AI to communicate. The scenario of the AI recognizing each other is a demonstration. 
The [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":21512,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"select":"","news_style_id":"","cryptorium_level":"","_short_excerpt_text":"","creation_source":"","_metatest_mainpost_news_update":false,"footnotes":""},"categories":[3],"tags":[1751,438],"class_list":["post-21513","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news-and-analysis","tag-ai-agents","tag-artificial-intelligence"],"aioseo_notices":[],"amp_enabled":true,"views":"43","promo_type":"","layout_type":"","short_excerpt":"","is_update":"","_links":{"self":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/21513","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/comments?post=21513"}],"version-history":[{"count":0,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/21513\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/media\/21512"}],"wp:attachment":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/media?parent=21513"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/categories?post=21513"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/tags?post=21513"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}