{"id":24900,"date":"2025-06-25T10:46:11","date_gmt":"2025-06-25T07:46:11","guid":{"rendered":"https:\/\/forklog.com\/en\/google-deepmind-unveils-local-ai-model-for-robots\/"},"modified":"2025-06-25T10:46:11","modified_gmt":"2025-06-25T07:46:11","slug":"google-deepmind-unveils-local-ai-model-for-robots","status":"publish","type":"post","link":"https:\/\/forklog.com\/en\/google-deepmind-unveils-local-ai-model-for-robots\/","title":{"rendered":"Google DeepMind Unveils Local AI Model for Robots"},"content":{"rendered":"<p>Google DeepMind has launched a new language model, Gemini Robotics On-Device, designed to enable robots to operate locally without an internet connection.<\/p>\n<blockquote class=\"twitter-tweet\">\n<p lang=\"en\" dir=\"ltr\">We\u2019re bringing powerful AI directly onto robots with Gemini Robotics On-Device.<\/p>\n<p>It\u2019s our first vision-language-action model to help make robots faster, highly efficient, and adaptable to new tasks and environments \u2014 without needing a constant internet connection. 
<a href=\"https:\/\/t.co\/1Y21D3cF5t\">pic.twitter.com\/1Y21D3cF5t<\/a><\/p>\n<p>\u2014 Google DeepMind (@GoogleDeepMind) <a href=\"https:\/\/twitter.com\/GoogleDeepMind\/status\/1937511515768176966?ref_src=twsrc%5Etfw\">June 24, 2025<\/a><\/p><\/blockquote>\n<p> <script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cThis is our first <span data-descr=\"combines computer vision, language processing, and decision-making or actions\" class=\"old_tooltip\">Vision-Language-Action (VLA)<\/span> model, which will help make robots faster, highly efficient, and adaptable to new tasks and environments \u2014 without the need for a constant internet connection,\u201d Google\u2019s AI division emphasized.<\/p>\n<\/blockquote>\n<p>Sergey Lonshakov, architect of the \u201cRobonomics\u201d project, told ForkLog that VLA models are an advanced solution in humanoid robotics.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cFigure is currently demonstrating its collaborative production scenarios using the same type of models,\u201d he noted.<\/p>\n<\/blockquote>\n<p>In February, Figure <a href=\"https:\/\/forklog.com\/en\/news\/figure-unveils-revolutionary-ai-for-robots\">introduced<\/a> its own artificial intelligence, Helix, for integration with robots. 
According to the creators, the model is capable of \u201creasoning like a human,\u201d and AI-equipped humanoids can handle \u201cvirtually any household items without any code or pre-training.\u201d<\/p>\n<p>Lonshakov described running the model locally on the device as the right decision, one in line with current trends.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201c[\u2026] It\u2019s called a high-level task planner. It used to live only in the cloud: <span data-descr=\"reinforcement learning \u2014 a branch of machine learning where an agent (AI) learns to make decisions by interacting with the environment to maximize reward\" class=\"old_tooltip\">RL<\/span> alone was used to train a simple model to act in a simulation, and the finished sequence of operations was then uploaded to the robot. Now roboticists are trying to create <span data-descr=\"multilingual neural networks designed for instant speech-to-speech translation without going through text, and without losing emotions, intonation, and even the speaker's voice\" class=\"old_tooltip\">seamless<\/span> models, where nothing at the planning stage halts the robot&#8217;s operations when it switches activities. 
If a bot stops tightening bolts on the conveyor and goes to fetch new parts, no one wants to wait while it swaps the &#8216;equipment&#8217; in its head,\u201d the expert added.<\/p>\n<\/blockquote>\n<p>Key features of Gemini Robotics On-Device:<\/p>\n<ul class=\"wp-block-list\">\n<li>the versatility and dexterity of Gemini Robotics, with the ability to run locally on the device;<\/li>\n<li>execution of a wide range of complex tasks using both hands;<\/li>\n<li>acquisition of new skills from 50\u2013100 demonstrations.<\/li>\n<\/ul>\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/lh7-qw.googleusercontent.com\/docsz\/AD_4nXe5VwLS0a86jq25a2PcxUBmhqbShYjcOt6ptOZ6lRuzVv8pgbzcpTgzTwniFyqdCDmBmwtcZf_8X4JVmfQZZKvukOF659gu-CxmGt51ctK3Bh16JG95oAukCFWk2VsAyD2kSStrfw?key=GCak1AT1Xmb0iA_Z0hTLug\" alt=\"Google DeepMind Unveils Local AI Model for Robots\"\/><figcaption class=\"wp-element-caption\">Comparison of Gemini Robotics On-Device with Gemini Robotics and other solutions in benchmarks. Data: <a href=\"https:\/\/x.com\/GoogleDeepMind\/status\/1937511518771233271\">X<\/a>.<\/figcaption><\/figure>\n<p>An additional <span data-descr=\"a set of tools and resources for developers intended for creating programs or integrating with a specific platform, service, or device\" class=\"old_tooltip\">SDK<\/span> has been launched to help developers adapt the model to their own applications, including testing it in the MuJoCo physics simulator; developers can work with the model using natural language prompts.<\/p>\n<blockquote class=\"twitter-tweet\">\n<p lang=\"en\" dir=\"ltr\">From humanoids to industrial bi-arm robots, the model supports multiple embodiments, even though it was pre-trained on ALOHA \u2014 while following instructions from humans.<\/p>\n<p>These tasks may seem easy for us but require fine motor skills, precise manipulation and more. 
\u2193 <a href=\"https:\/\/t.co\/GhBkCj4juZ\">pic.twitter.com\/GhBkCj4juZ<\/a><\/p>\n<p>\u2014 Google DeepMind (@GoogleDeepMind) <a href=\"https:\/\/twitter.com\/GoogleDeepMind\/status\/1937511523183649002?ref_src=twsrc%5Etfw\">June 24, 2025<\/a><\/p><\/blockquote>\n<p> <script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>Many companies continue to actively develop the sector. In June, <a href=\"https:\/\/forklog.com\/en\/news\/amazon-to-replace-couriers-with-robots-reports-suggest\">it was revealed<\/a> that Amazon is working on AI software to enable humanoid robots to deliver packages in Rivian electric vans.<\/p>\n<p>In March, Nvidia <a href=\"https:\/\/forklog.com\/en\/news\/nvidias-2025-gtc-new-ai-chips-pcs-robots-and-partnerships\">introduced<\/a> a motion simulation engine for robots.<\/p>\n<p>Earlier, 21 humanoid robots <a href=\"https:\/\/forklog.com\/en\/news\/robots-join-humans-in-beijing-half-marathon\">participated<\/a> in the Beijing half marathon.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Google DeepMind has launched a new language model, Gemini Robotics On-Device, designed to enable robots to operate locally without an internet connection. We\u2019re bringing powerful AI directly onto robots with Gemini Robotics On-Device. 
It\u2019s our first vision-language-action model to help make robots faster, highly efficient, and adaptable to new tasks and environments \u2014 without [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":24899,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"select":"","news_style_id":"","cryptorium_level":"","_short_excerpt_text":"","creation_source":"","_metatest_mainpost_news_update":false,"footnotes":""},"categories":[3],"tags":[438,738,652],"class_list":["post-24900","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news-and-analysis","tag-artificial-intelligence","tag-google","tag-robots"],"aioseo_notices":[],"amp_enabled":true,"views":"64","promo_type":"","layout_type":"","short_excerpt":"","is_update":"","_links":{"self":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/24900","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/comments?post=24900"}],"version-history":[{"count":0,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/24900\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/media\/24899"}],"wp:attachment":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/media?parent=24900"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/categories?post=24900"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/tags?post=24900"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}