{"id":91033,"date":"2025-11-14T14:26:37","date_gmt":"2025-11-14T11:26:37","guid":{"rendered":"https:\/\/forklog.com\/en\/?p=91033"},"modified":"2025-11-14T14:31:32","modified_gmt":"2025-11-14T11:31:32","slug":"google-unveils-ai-updates-notebooklm-a-robot-brain-and-shopping-tools","status":"publish","type":"post","link":"https:\/\/forklog.com\/en\/google-unveils-ai-updates-notebooklm-a-robot-brain-and-shopping-tools\/","title":{"rendered":"Google unveils AI updates: NotebookLM, a robot \u2018brain\u2019 and shopping tools"},"content":{"rendered":"<p>Google unveiled a raft of new agentic-AI features, including a Deep Research mode in NotebookLM, SIMA 2\u2014the \u201cbrain for robots\u201d\u2014and shopping tools.<\/p>\n<h2 class=\"wp-block-heading\">Deep Research in NotebookLM<\/h2>\n<p>Google has updated its note-taking AI assistant <a href=\"https:\/\/notebooklm.google\/\">NotebookLM<\/a>, adding a tool to simplify complex research and support for more file types.<\/p>\n<p>The service introduced Deep Research, an automation tool for online search. The company says it works like an independent researcher: it can produce a detailed report or recommend relevant articles, academic papers and websites.<\/p>\n<p>Deep Research takes a question, drafts a research plan and scans the web. After a few minutes it delivers a source-backed report that can be added directly to the notebook.<\/p>\n<p>The mode runs in the background, allowing other tasks to continue in parallel.<\/p>\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" src=\"https:\/\/forklog.com\/wp-content\/uploads\/img-35f450eec518f302-9511099506469003.webp\" alt=\"image\" class=\"wp-image-269743\"\/><figcaption class=\"wp-element-caption\">Source: Google.<\/figcaption><\/figure>\n<p>The tool is accessible via search. 
Users can choose a research style: the detailed Deep Research or the quicker Fast Research.<\/p>\n<p>NotebookLM now supports Google Sheets, Drive files via URL, PDFs from Google Drive and Microsoft Word documents.<\/p>\n<p>The updates will roll out within a week.<\/p>\n<p>NotebookLM is an AI assistant from Google for notes, research and document work. It lets users upload materials\u2014PDFs, articles, spreadsheets, images, links, legal documents, lectures\u2014and build a structured knowledge base.<\/p>\n<p>The service launched in 2023. Since then, its AI capabilities have steadily expanded. In early 2025, the Video Overviews feature arrived, turning complex multimedia material into clear visual presentations.<\/p>\n<p>In May, NotebookLM became available on Android and iOS.<\/p>\n<h2 class=\"wp-block-heading\">The future of robots<\/h2>\n<p>Google is pushing ahead with a \u201cbrain\u201d for robotics.<\/p>\n<p>Its DeepMind unit <a href=\"https:\/\/deepmind.google\/blog\/sima-2-an-agent-that-plays-reasons-and-learns-with-you-in-virtual-3d-worlds\/\">unveiled<\/a> SIMA 2, a new generation of generalist AI agent. It \u201cgoes beyond simply following instructions\u201d, starting to understand and interact with its environment.<\/p>\n<p>The first SIMA was trained on hundreds of hours of gameplay footage to learn to play various 3D games like a human. Introduced in March 2024, it could execute basic commands across virtual worlds, but managed complex tasks only 31% of the time.<\/p>\n<p>SIMA 2 draws on Gemini\u2019s language and reasoning capabilities and runs on the Gemini 2.5 Flash-Lite model. Its complex-task success rate has risen to 65%.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cSIMA 2 is a qualitative leap over SIMA 1. It is a more generalist agent. 
It can perform complex tasks in a new environment it has not seen before,\u201d <a href=\"https:\/\/techcrunch.com\/2025\/11\/13\/googles-sima-2-agent-uses-gemini-to-reason-and-act-in-virtual-worlds\/\">said<\/a> Joe Marino, a senior research scientist at DeepMind, at a press briefing.<\/p>\n<\/blockquote>\n<p>The assistant can teach itself\u2014improving its skills based on experience. That is a step towards more general robots and systems, Marino noted.<\/p>\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" src=\"https:\/\/forklog.com\/wp-content\/uploads\/img-12a7d36e5d49eccf-9511183289254345.webp\" alt=\"image\" class=\"wp-image-269744\"\/><figcaption class=\"wp-element-caption\">SIMA 2\u2019s performance is twice that of the first version. Source: Google DeepMind.<\/figcaption><\/figure>\n<h3 class=\"wp-block-heading\">A step towards AGI<\/h3>\n<p>Researchers at Google\u2019s AI division stressed that work on so-called embodied agents is critical to developing general intelligence. 
Such an assistant must be able to interact with the physical and virtual worlds through a body\u2014like a person or a robot.<\/p>\n<p>A disembodied assistant, by contrast, can manage a calendar, take notes or execute code, Marino explained.<\/p>\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/forklog.com\/wp-content\/uploads\/img-6611604a2a4a6c6d-9511218856842579-1024x576.png\" alt=\"image\" class=\"wp-image-269745\" srcset=\"https:\/\/forklog.com\/wp-content\/uploads\/img-6611604a2a4a6c6d-9511218856842579-1024x576.png 1024w, https:\/\/forklog.com\/wp-content\/uploads\/img-6611604a2a4a6c6d-9511218856842579-300x169.png 300w, https:\/\/forklog.com\/wp-content\/uploads\/img-6611604a2a4a6c6d-9511218856842579-768x432.png 768w, https:\/\/forklog.com\/wp-content\/uploads\/img-6611604a2a4a6c6d-9511218856842579.png 1440w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\">Source: Google.<\/figcaption><\/figure>\n<p>Senior research scientist Jane Wang, who has a neurobiology background, stressed that SIMA 2 goes far beyond ordinary game behaviour.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cWe require it to truly understand what is happening, what it is being asked to do, and to respond sensibly and meaningfully. That is actually quite difficult,\u201d she said.<\/p>\n<\/blockquote>\n<p>Integrating Gemini allowed SIMA 2 to double its predecessor\u2019s performance. The model combines advanced language and analytical abilities with embodied interaction skills acquired during training.<\/p>\n<h3 class=\"wp-block-heading\">Taught to play video games<\/h3>\n<p>Marino demonstrated SIMA 2 in No Man\u2019s Sky. 
The agent described its surroundings\u2014a planet\u2019s rocky surface\u2014and determined its next actions, using Gemini for internal reasoning.<\/p>\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\">\n<div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"SIMA 2: An agent that plays, reasons, and learns with you in virtual 3D worlds\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/Zphax4f6Rls?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div>\n<\/figure>\n<p>In another game the assistant was asked to approach a house the colour of a ripe tomato. The AI showed its chain of thought: \u201cit is red, so I should go to the house of the corresponding colour.\u201d It then moved in the right direction.<\/p>\n<p>Thanks to Gemini, the agent can even understand emoji instructions. The command \u201c\ud83e\ude93\ud83c\udf32\u201d will make it chop a tree.<\/p>\n<p>SIMA 2 navigates photorealistic worlds generated with Genie. 
It correctly recognises objects such as benches, trees and butterflies, and can interact with them.<\/p>\n<h3 class=\"wp-block-heading\">Self-learning<\/h3>\n<p>With Gemini, the new SIMA can self-learn with minimal human input, using provided data only as a basic guide.<\/p>\n<p>The team drops the agent into a new environment, and a separate model generates tasks for it.<\/p>\n<blockquote class=\"twitter-tweet\">\n<p lang=\"en\" dir=\"ltr\">Generalization \u2602\ufe0f<\/p>\n<p>SIMA 2 is now far better at carrying out detailed instructions, even in worlds it&#8217;s never seen before.<\/p>\n<p>It can transfer learned concepts like \u201cmining\u201d in one game and apply it to \u201charvesting\u201d in another \u2013 connecting the dots between similar tasks.<\/p>\n<p>It even\u2026 <a href=\"https:\/\/t.co\/ANldQVWFd4\">pic.twitter.com\/ANldQVWFd4<\/a><\/p>\n<p>\u2014 Google DeepMind (@GoogleDeepMind) <a href=\"https:\/\/twitter.com\/GoogleDeepMind\/status\/1988986223390572778?ref_src=twsrc%5Etfw\">November 13, 2025<\/a><\/p><\/blockquote>\n<p> <script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>SIMA 2 analyses its shortcomings and gradually improves its skills. 
In essence, it is trial-and-error learning without a human in the loop: another AI system acts as the tutor.<\/p>\n<p>DeepMind sees the new system as a step towards truly general-purpose robots.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cA system for performing tasks in the real world needs two key elements: a high-level understanding of the world and the ability to reason,\u201d noted senior research engineer Fr\u00e9d\u00e9ric Besse.<\/p>\n<\/blockquote>\n<p>If someone asks a humanoid robot to check how many cans of beans are left in a cupboard, it needs to understand what beans and a cupboard are, and be able to get to the right place.<\/p>\n<p>SIMA 2 addresses precisely this \u201chigh level of behaviour\u201d, Besse said.<\/p>\n<p>There is no timeline yet for integrating the new system into physical robots.<\/p>\n<h2 class=\"wp-block-heading\">Shopping<\/h2>\n<p>Another area that interests the search giant is AI-powered shopping. The company <a href=\"https:\/\/blog.google\/products\/shopping\/agentic-checkout-holiday-ai-shopping\/?utm_source=tw&#038;utm_medium=social&#038;utm_campaign=og&#038;utm_content=&#038;utm_term=\">released<\/a> a set of new tools for online shopping. Among them:<\/p>\n<ul class=\"wp-block-list\">\n<li>voice purchases in Google Search;<\/li>\n<li>purchases in the Gemini app;<\/li>\n<li>agentic checkout;<\/li>\n<li>an AI tool that can call shops and check the availability of desired products.<\/li>\n<\/ul>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cWe believe the shopping process should not be so tedious. The idea is to keep all the enjoyable parts\u2014the browsing, the serendipity\u2014and remove the boring and hard steps,\u201d said Vidhya Srinivasan, vice president and head of ads and commerce at Google.<\/p>\n<\/blockquote>\n<p>One update is conversational purchasing in AI Mode. 
Users can talk to Search like a chatbot; it will display product images and add details such as price, reviews and availability.<\/p>\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" src=\"https:\/\/forklog.com\/wp-content\/uploads\/img-261424c2f342f4fb-9511598611750357.webp\" alt=\"image\" class=\"wp-image-269749\"\/><figcaption class=\"wp-element-caption\">Source: Google.<\/figcaption><\/figure>\n<p>The Gemini app can now generate fleshed-out ideas and collections rather than short text tips for shopping queries. For now, the feature is available only in the United States.<\/p>\n<p>Agentic checkout automatically monitors changes related to a product of interest. The service can send price-drop notifications.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cThis is useful for shoppers\u2014they don\u2019t need to keep checking the price of an item. And it\u2019s useful for merchants\u2014customers will come back when otherwise they would have left,\u201d said Lillian Rincon, vice president of product at Google Shopping.<\/p>\n<\/blockquote>\n<p>Another new feature lets the AI call shops on a user\u2019s behalf to ask about stock and current promotions. It is built on the Google Duplex technology introduced in 2018, Shopping Graph and Google\u2019s payments infrastructure.<\/p>\n<p>To use the tool, specify the desired product. 
The AI will call local shops, ask for details and deliver a short report.<\/p>\n<p>In November, Google <a href=\"https:\/\/forklog.com\/en\/news\/google-brings-message-summaries-and-notification-prioritisation-to-pixel\">added<\/a> message summaries, notification prioritisation and other AI features to Pixel smartphones.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Google unveiled new agentic-AI features, including Deep Research in NotebookLM, SIMA 2 for robots and shopping tools.<\/p>\n","protected":false},"author":1,"featured_media":91034,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"select":"1","news_style_id":"1","cryptorium_level":"","_short_excerpt_text":"Google unveils NotebookLM Deep Research, SIMA 2 for robots, and new AI shopping tools.","creation_source":"","_metatest_mainpost_news_update":false,"footnotes":""},"categories":[3],"tags":[1751,438,738],"class_list":["post-91033","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news-and-analysis","tag-ai-agents","tag-artificial-intelligence","tag-google"],"aioseo_notices":[],"amp_enabled":true,"views":"214","promo_type":"1","layout_type":"1","short_excerpt":"Google unveils NotebookLM Deep Research, SIMA 2 for robots, and new AI shopping 
tools.","is_update":"","_links":{"self":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/91033","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/comments?post=91033"}],"version-history":[{"count":1,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/91033\/revisions"}],"predecessor-version":[{"id":91035,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/91033\/revisions\/91035"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/media\/91034"}],"wp:attachment":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/media?parent=91033"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/categories?post=91033"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/tags?post=91033"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}