{"id":21399,"date":"2025-02-21T17:27:01","date_gmt":"2025-02-21T15:27:01","guid":{"rendered":"https:\/\/forklog.com\/en\/figure-unveils-revolutionary-ai-for-robots\/"},"modified":"2025-02-21T17:27:01","modified_gmt":"2025-02-21T15:27:01","slug":"figure-unveils-revolutionary-ai-for-robots","status":"publish","type":"post","link":"https:\/\/forklog.com\/en\/figure-unveils-revolutionary-ai-for-robots\/","title":{"rendered":"Figure Unveils Revolutionary AI for Robots"},"content":{"rendered":"<p>Figure has introduced its proprietary artificial intelligence, Helix, designed for integration with robots. According to its creators, the model is capable of &#8220;reasoning like a human.&#8221;<\/p>\n<p><iframe loading=\"lazy\" width=\"560\" height=\"315\" src=\"https:\/\/www.youtube.com\/embed\/Z3yQHYNXPws?si=NTwzJ7RUE6BlM_3o\" title=\"YouTube video player\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/p>\n<p>AI-equipped humanoids can pick up &#8220;virtually any household item without any code or prior training.&#8221; <\/p>\n<blockquote class=\"twitter-tweet\">\n<p lang=\"en\" dir=\"ltr\">We find that just by prompting the robot, we can pick up virtually any object<\/p>\n<p>When asked to &#8220;Pick up the desert item,&#8221; Helix identifies the toy cactus, chooses the nearest hand, and executes precise motor commands to grasp it securely <a href=\"https:\/\/t.co\/2l70kKG1GV\">pic.twitter.com\/2l70kKG1GV<\/a><\/p>\n<p>\u2014 Figure (@Figure_robot) <a href=\"https:\/\/twitter.com\/Figure_robot\/status\/1892577876454801453?ref_src=twsrc%5Etfw\">February 20, 2025<\/a><\/p><\/blockquote>\n<p> <script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>&#8220;When asked to &#8216;pick up the 
desert item,&#8217; Helix identifies the toy cactus, chooses the nearest hand, and executes precise motor commands to grasp it securely,&#8221; notes the Figure team.<\/p>\n<\/blockquote>\n<p>According to a <a href=\"https:\/\/www.figure.ai\/news\/helix\">technical report<\/a>, the new universal Vision-Language-Action model enables machines to interpret natural language commands and interact with unfamiliar objects without prior training or per-object programming.<\/p>\n<p>This allows the robots to become more capable over time without needing system updates or specific training on new data. In the video above, the robot had never encountered the items before, yet it could determine which of them should be placed in the refrigerator.<\/p>\n<p>Helix operates on low-power embedded graphics processors, making it suitable for commercial deployment. Figure <a href=\"https:\/\/decrypt.co\/307058\/figure-ai-supercharging-humanoid-robots\">announced<\/a> that it has already secured deals with BMW Manufacturing and an undisclosed major American client.<\/p>\n<p>Earlier in February, Figure exited its collaboration agreement with OpenAI after making a &#8220;significant breakthrough&#8221; in developing artificial intelligence for humanoids.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Figure has introduced its proprietary artificial intelligence, Helix, designed for integration with robots. 
According to its creators, the model is capable of &#8220;reasoning like a human.&#8221; AI-equipped humanoids can pick up &#8220;virtually any household item without any code or prior training.&#8221; We find that just by prompting the robot, we can pick up virtually any [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":21398,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"select":"","news_style_id":"","cryptorium_level":"","_short_excerpt_text":"","creation_source":"","_metatest_mainpost_news_update":false,"footnotes":""},"categories":[3],"tags":[438,652],"class_list":["post-21399","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news-and-analysis","tag-artificial-intelligence","tag-robots"],"aioseo_notices":[],"amp_enabled":true,"views":"112","promo_type":"","layout_type":"","short_excerpt":"","is_update":"","_links":{"self":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/21399","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/comments?post=21399"}],"version-history":[{"count":0,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/21399\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/media\/21398"}],"wp:attachment":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/media?parent=21399"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/categories?post=21399"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/tags?post=21399"}],"curies":[{"name":"wp","href":"https:\/\/ap
i.w.org\/{rel}","templated":true}]}}