{"id":15487,"date":"2024-07-24T12:13:21","date_gmt":"2024-07-24T09:13:21","guid":{"rendered":"https:\/\/forklog.com\/en\/meta-unveils-llama-3-1-ai-model-collection\/"},"modified":"2024-07-24T12:13:21","modified_gmt":"2024-07-24T09:13:21","slug":"meta-unveils-llama-3-1-ai-model-collection","status":"publish","type":"post","link":"https:\/\/forklog.com\/en\/meta-unveils-llama-3-1-ai-model-collection\/","title":{"rendered":"Meta Unveils Llama 3.1 AI Model Collection"},"content":{"rendered":"<p>Meta has introduced a new collection of AI systems, Llama 3.1, including the much-anticipated 405B, described as the &#8220;first open-source state-of-the-art model.&#8221; The Meta AI assistant has also been updated.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<blockquote class=\"twitter-tweet\">\n<p lang=\"en\" dir=\"ltr\">Starting today, open source is leading the way. Introducing Llama 3.1: Our most capable models yet.<\/p>\n<p>Today we\u2019re releasing a collection of new Llama 3.1 models including our long awaited 405B. These models deliver improved reasoning capabilities, a larger 128K token context\u2026 <a href=\"https:\/\/t.co\/1iKpBJuReD\">pic.twitter.com\/1iKpBJuReD<\/a><\/p>\n<p>\u2014 AI at Meta (@AIatMeta) <a href=\"https:\/\/twitter.com\/AIatMeta\/status\/1815766327463907421?ref_src=twsrc%5Etfw\">July 23, 2024<\/a><\/p><\/blockquote>\n<p> <script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<\/blockquote>\n<p>Meta&#8217;s flagship AI model was trained on data from 15 trillion tokens using 16,000 Nvidia H100 graphics processors. It boasts 405 billion parameters, a term describing problem-solving skills. The larger the number, the better the performance. The system can perform tasks such as coding, answering basic math questions, or summarizing documents.\u00a0<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>&#8220;Llama 3.1 405B is in a class of its own, offering unmatched flexibility, control, and cutting-edge capabilities that rival the best closed-source models,&#8221; the announcement states.\u00a0<\/p>\n<\/blockquote>\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/lh7-rt.googleusercontent.com\/docsz\/AD_4nXf8-DkkBmn9jpFpkE1tm8Ecz2Jkx0VLXG6-YoRcak3-8EHV9zX5E7ax3LJZK2t7IJqf9DB3Kv7A2y4oHN7qPQBp4mhDTHs5CHrblvlY_YT5aW2iL9_n1YlWWvea78sk11QoMCVAi91fUTR802BPXo_D98NH?key=K-d-qWT7ZURlmlt5UhZZSw\" alt=\"Meta \u043f\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u0438\u043b\u0430 \u043a\u043e\u043b\u043b\u0435\u043a\u0446\u0438\u044e \u0418\u0418-\u043c\u043e\u0434\u0435\u043b\u0435\u0439 Llama 3.1\"\/><figcaption class=\"wp-element-caption\">Comparison of Llama 3.1 405B with competitors. Data: <a href=\"https:\/\/ai.meta.com\/blog\/meta-llama-3-1\/?utm_source=twitter&#038;utm_medium=organic_social&#038;utm_content=video&#038;utm_campaign=llama31\">Meta<\/a>.<\/figcaption><\/figure>\n<p>Llama 3.1 405B is text-only. Developers are experimenting with multimodality, but there are no public models for images, video, or speech yet.\u00a0<\/p>\n<p>The new Meta AI system has a larger context window compared to previous systems\u2014128,000 tokens, equivalent to a 50-page book. A larger context allows for longer conversations without forgetting previously discussed topics.\u00a0<\/p>\n<p>Running Llama 3.1 405B requires powerful hardware. Alongside the flagship neural network, Meta introduced two smaller models, Llama 3.1 8B and Llama 3.1 70B. 
[Figure: Comparison of Llama 3.1 8B and Llama 3.1 70B with other AI models. Data: Meta.]

Mark Zuckerberg Advocates for Open Source

In an open letter (https://about.fb.com/news/2024/07/open-source-ai-is-the-path-forward/), Mark Zuckerberg endorsed the development of open-source artificial intelligence.

"Today, several tech companies are developing leading closed models. But open source is quickly closing the gap. Last year, Llama 2 could only compete with the older generation of models. This year, Llama 3 is competitive with the most advanced models and leads in some areas. We expect that from next year, future Llama models will be the most advanced in the industry. But even now, Llama already leads in openness, modifiability, and cost-effectiveness," he noted.

According to the billionaire, open source will give more people worldwide access to the benefits and opportunities of AI, prevent the concentration of power in the hands of a few companies, and enable a more equitable and safer integration of the technology into society.
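That openness is concrete: the weights of all three models are downloadable, so anyone with suitable hardware can run them. Below is a minimal sketch of chatting with the smallest model via the Hugging Face transformers library; the repository id is the one Meta published at release, and both it and the chat-output format may have changed since, so treat the details as assumptions.

```python
# A minimal sketch of querying an open Llama 3.1 checkpoint through the
# Hugging Face transformers library. The repo is gated: you must accept
# Meta's license on Hugging Face and authenticate, e.g. via
# `huggingface-cli login`, before the download will work.
import torch
from transformers import pipeline

chat = pipeline(
    "text-generation",
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",  # release-time repo id
    torch_dtype=torch.bfloat16,  # 16-bit weights: roughly 15 GiB for 8B
    device_map="auto",           # spread layers across available devices
)

messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Explain what a 128K context window buys me."},
]
reply = chat(messages, max_new_tokens=128)
# For chat-style input the pipeline returns the conversation with the
# assistant's turn appended at the end.
print(reply[0]["generated_text"][-1]["content"])
```

The same call works for the 70B and 405B checkpoints, given enough GPU memory for the larger weights.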
New Features Added to Meta AI

Alongside the release of the new models, Meta AI (https://ai.meta.com/meta-ai/), the AI assistant built into Facebook, WhatsApp, Instagram, and Messenger, has been updated. It can now speak more languages and create stylized selfies.

[Figure: Stylized photos. Data: Meta.]

The Imagine Yourself feature lets users modify a photo of themselves using prompts: for example, take a selfie and ask, "Show me surfing." The feature is currently in beta in the U.S.

New editing features will soon be added to Meta AI: next month, an "Edit with AI" button will provide additional settings.

The new Llama 3.1 405B model is integrated into the assistant, allowing users to direct questions to it.

[Figure: Switching the Meta AI assistant to Llama 3.1 405B. Data: Meta.]

Mark Zuckerberg's Vision for AI Clones

In a new interview (https://www.youtube.com/watch?v=Vy3OkbtUa5k) with Rowan Cheung, Zuckerberg shared his vision of a future in which content creators use AI bots for personal and business purposes, delegating some of their community interaction to them and freeing up time for other work.

"I think there will be a huge breakthrough when every creator can train these systems to reflect their values and goals, and then people will interact with that," the Meta co-founder noted.

Back in June, Zuckerberg's company announced the addition of AI characters to Instagram.

In April, the corporation launched (https://forklog.com/en/news/meta-integrates-ai-assistant-across-its-platforms) the Meta AI assistant across its platforms.