{"id":96381,"date":"2026-04-21T19:09:27","date_gmt":"2026-04-21T16:09:27","guid":{"rendered":"https:\/\/forklog.com\/en\/?p=96381"},"modified":"2026-04-21T19:10:28","modified_gmt":"2026-04-21T16:10:28","slug":"former-google-engineers-unveil-ai-for-robots-with-untrained-skills","status":"publish","type":"post","link":"https:\/\/forklog.com\/en\/former-google-engineers-unveil-ai-for-robots-with-untrained-skills\/","title":{"rendered":"Former Google Engineers Unveil AI for Robots with &#8216;Untrained&#8217; Skills"},"content":{"rendered":"<p>A startup founded by former Google engineers, Physical Intelligence, has unveiled the \u03c00.7 model. The developers have claimed a &#8220;qualitative leap&#8221; in the AI&#8217;s ability to generalize skills and perform tasks it was not directly trained for.<\/p>\n<blockquote class=\"twitter-tweet\">\n<p lang=\"en\" dir=\"ltr\">Our newest model, \u03c00.7, has some interesting emergent capabilities: it can control a new robot to fold shirts for which we had no shirt folding data, figure out how to use an appliance with language-based coaching, and perform a wide range of dexterous tasks all in one model! <a href=\"https:\/\/t.co\/s9NxKfb7pe\">pic.twitter.com\/s9NxKfb7pe<\/a><\/p>\n<p>\u2014 Physical Intelligence (@physical_int) <a href=\"https:\/\/twitter.com\/physical_int\/status\/2044841263254638862?ref_src=twsrc%5Etfw\">April 16, 2026<\/a><\/p><\/blockquote>\n<p> <script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>The system belongs to the &#8220;Vision-Language-Action&#8221; (VLA) class and is designed for robot control. <\/p>\n<p>Unlike previous solutions, \u03c00.7 has demonstrated signs of compositional generalization\u2014the ability to combine previously learned skills to solve new tasks.<\/p>\n<h2 class=\"wp-block-heading\">Untrained Tasks and Transfer Between Robots<\/h2>\n<p>During experiments, the model exhibited a range of unexpected abilities. 
Notably, \u03c00.7 was able to control a new type of robot and fold t-shirts, despite the absence of training data for this specific platform.<\/p>\n<blockquote class=\"twitter-tweet\" data-conversation=\"none\">\n<p lang=\"en\" dir=\"ltr\">Compositional generalization is a key capability of large models like LLMs, but it has been elusive in robotics. Another emergent ability we found is to control a new robot (UR5e) to fold t-shirts, even though we didn&#8217;t have any laundry folding data on this robot. <a href=\"https:\/\/t.co\/lAXYag002Z\">pic.twitter.com\/lAXYag002Z<\/a><\/p>\n<p>\u2014 Physical Intelligence (@physical_int) <a href=\"https:\/\/twitter.com\/physical_int\/status\/2044842117839864256?ref_src=twsrc%5Etfw\">April 16, 2026<\/a><\/p><\/blockquote>\n<p> <script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>The developers noted that the results are comparable to those of operators with hundreds of hours of teleoperation experience.<\/p>\n<p>The model also figured out how to use previously unfamiliar devices, including kitchen appliances. 
For instance, the robot completed part of a task involving cooking sweet potatoes in an air fryer, even though such scenarios were not in the training set.<\/p>\n<p>According to the developers, this was made possible by combining disparate skills\u2014similar to how language models combine knowledge from different domains.<\/p>\n<h2 class=\"wp-block-heading\">Control Through Language and Context<\/h2>\n<p>One of \u03c00.7&#8217;s key differences is that it can be controlled not only through &#8220;what to do&#8221; commands but also through &#8220;how to do it&#8221; clarifications.<\/p>\n<p>The model accepts:<\/p>\n<ul class=\"wp-block-list\">\n<li>text instructions;<\/li>\n<li>metadata (such as speed and quality of execution);<\/li>\n<li>visual subgoals\u2014images of the expected result of a step.<\/li>\n<\/ul>\n<p>Some of these subgoals can be generated by an auxiliary system during operation. This allows the robot to adjust its behavior without retraining.<\/p>\n<blockquote class=\"twitter-tweet\" data-conversation=\"none\">\n<p lang=\"en\" dir=\"ltr\">\u03c00.7 handles diverse prompts that don&#8217;t just say what to do, but also how to do it, including rich language and multimodal information, such as visual subgoal images. At test time, these images can be produced by a lightweight world model. 
<a href=\"https:\/\/t.co\/cbdovdVjBG\">pic.twitter.com\/cbdovdVjBG<\/a><\/p>\n<p>\u2014 Physical Intelligence (@physical_int) <a href=\"https:\/\/twitter.com\/physical_int\/status\/2044842265982665022?ref_src=twsrc%5Etfw\">April 16, 2026<\/a><\/p><\/blockquote>\n<p> <script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>This approach allows for the integration of data from various sources\u2014video, telemetry from robots, and autonomously collected episodes\u2014into a unified learning system.<\/p>\n<h2 class=\"wp-block-heading\">The First Step Towards &#8216;Universal&#8217; Robots<\/h2>\n<p>Physical Intelligence noted that previously such models required retraining for each task\u2014similar to early versions of language models. In contrast, \u03c00.7 works &#8220;out of the box&#8221; and adapts to new scenarios through language.<\/p>\n<p>The team emphasized that such a level of generalization has long been considered a strong point of <span data-descr=\"large language models\" class=\"old_tooltip\">LLM<\/span>, but remained unattainable in robotics.<\/p>\n<p>Despite the progress, the model still struggles with complex tasks without step-by-step prompts. However, with sequential instructions, the quality of execution significantly improves.<\/p>\n<p>In the future, such instructions could help train more autonomous machines capable of acting without human intervention. Physical Intelligence believes that \u03c00.7 shows the first signs of a transition to universal robots that adapt to new conditions without manual adjustment for each task.<\/p>\n<p>Back in February, the company Carbon Robotics <a href=\"https:\/\/forklog.com\/en\/news\/carbon-robotics-unveils-ai-system-for-weed-elimination-by-robots\">released<\/a> the AI model Large Plant Model, which can recognize plant species to combat weeds. 
<\/p>\n","protected":false},"excerpt":{"rendered":"<p>A startup, Physical Intelligence, has unveiled the \u03c00.7 model. Developers claim a &#8220;qualitative leap&#8221; in AI&#8217;s ability to generalize skills and perform tasks.<\/p>\n","protected":false},"author":1,"featured_media":96382,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"select":"1","news_style_id":"1","cryptorium_level":"","_short_excerpt_text":"Physical Intelligence unveils \u03c00.7 model, claiming a leap in AI's skill generalization.","creation_source":"","_metatest_mainpost_news_update":false,"footnotes":""},"categories":[3],"tags":[438,738,652],"class_list":["post-96381","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news-and-analysis","tag-artificial-intelligence","tag-google","tag-robots"],"aioseo_notices":[],"amp_enabled":true,"views":"12","promo_type":"1","layout_type":"1","short_excerpt":"Physical Intelligence unveils \u03c00.7 model, claiming a leap in AI's skill 
generalization.","is_update":"","_links":{"self":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/96381","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/comments?post=96381"}],"version-history":[{"count":1,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/96381\/revisions"}],"predecessor-version":[{"id":96383,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/96381\/revisions\/96383"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/media\/96382"}],"wp:attachment":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/media?parent=96381"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/categories?post=96381"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/tags?post=96381"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}