{"id":17849,"date":"2024-10-17T14:36:25","date_gmt":"2024-10-17T11:36:25","guid":{"rendered":"https:\/\/forklog.com\/en\/mistral-introduces-ai-models-for-laptops-and-smartphones\/"},"modified":"2024-10-17T14:36:25","modified_gmt":"2024-10-17T11:36:25","slug":"mistral-introduces-ai-models-for-laptops-and-smartphones","status":"publish","type":"post","link":"https:\/\/forklog.com\/en\/mistral-introduces-ai-models-for-laptops-and-smartphones\/","title":{"rendered":"Mistral Introduces AI Models for Laptops and Smartphones"},"content":{"rendered":"<p>The French AI startup Mistral has <a href=\"https:\/\/mistral.ai\/news\/ministraux\/\">released<\/a> its first generative artificial intelligence models designed for use on edge devices such as laptops and smartphones.<\/p>\n<p>The new family of neural networks, Les Ministraux, can be used and fine-tuned for a range of applications, from simple text generation to more complex functional tasks.<\/p>\n<p>Two models have been introduced: Ministral 3B and Ministral 8B. Both feature a context window of 128,000 tokens, enough to take in roughly a 50-page book.<\/p>\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/lh7-qw.googleusercontent.com\/docsz\/AD_4nXfFkT3QVkTs0Ks9KqS8PHJWYTp_wlbWfdXguB3vYnpdIsZ82MZdxqvOEJaTHfqsItSdR-uoClCkRkVhDWXYBkdWA-aEbK_63vsokGCDaz9zMx8lLPITfVxtgPJ3tIZ13-LH15tHEhZ2ELaeAMki_-pWKbE?key=b1rFunqo2hCrBCfOrMUw7Q\" alt=\"Mistral Introduces AI Models for Laptops and Smartphones\"\/><figcaption class=\"wp-element-caption\">Comparison of Ministral 3B and 8B with Gemma 2 2B, Llama 3.2 3B, Llama 3.1 8B, and Mistral 7B. 
Data: Mistral blog.<\/figcaption><\/figure>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cOur most innovative clients and partners are increasingly asking for local, privacy-focused inference for critical applications such as on-device translation, offline smart assistants, local analytics, and autonomous robotics. Les Ministraux are designed to provide a computationally efficient and low-latency solution for these scenarios,\u201d the team noted in the blog.<\/p>\n<\/blockquote>\n<p>In September, Mistral released its first multimodal model capable of processing images and text. In July, it <a href=\"https:\/\/forklog.com\/en\/news\/mistral-ai-unveils-flagship-ai-model-large-2\">introduced the flagship neural network<\/a> Large 2, which \u201coperates on par with GPT-4o, Claude 3 Opus, and Llama 3 405B.\u201d<\/p>\n<p>Back in June, Mistral AI <a href=\"https:\/\/forklog.com\/en\/news\/mistral-ai-unveils-flagship-ai-model-large-2\">raised<\/a> $640 million at a valuation of $6 billion. Founded by former employees of Google DeepMind and Meta, the startup is considered a European competitor to OpenAI.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The French AI startup Mistral has released its first generative artificial intelligence models designed for use on edge devices such as laptops and smartphones. The new family of neural networks, Les Ministraux, can be used and fine-tuned for a range of applications, from simple text generation to more complex functional tasks. 
Two models have been introduced: [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":17848,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"select":"","news_style_id":"","cryptorium_level":"","_short_excerpt_text":"","creation_source":"","_metatest_mainpost_news_update":false,"footnotes":""},"categories":[3],"tags":[438,1477],"class_list":["post-17849","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news-and-analysis","tag-artificial-intelligence","tag-mistral"],"aioseo_notices":[],"amp_enabled":true,"views":"48","promo_type":"","layout_type":"","short_excerpt":"","is_update":"","_links":{"self":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/17849","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/comments?post=17849"}],"version-history":[{"count":0,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/posts\/17849\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/media\/17848"}],"wp:attachment":[{"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/media?parent=17849"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/categories?post=17849"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/forklog.com\/en\/wp-json\/wp\/v2\/tags?post=17849"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}