{"id":16884,"date":"2024-09-11T16:43:14","date_gmt":"2024-09-11T13:43:14","guid":{"rendered":"https:\/\/forklog.com\/en\/mistral-launches-multimodal-neural-network\/"},"modified":"2024-09-11T16:43:14","modified_gmt":"2024-09-11T13:43:14","slug":"mistral-launches-multimodal-neural-network","status":"publish","type":"post","link":"https:\/\/forklog.com\/en\/mistral-launches-multimodal-neural-network\/","title":{"rendered":"Mistral Launches Multimodal Neural Network"},"content":{"rendered":"<p>The French AI startup Mistral has unveiled its first multimodal model, capable of processing both images and text.<\/p>\n<blockquote class=\"twitter-tweet\">\n<p lang=\"en\" dir=\"ltr\">We dropped a new model \u2014 Pixtral 12B, our first-ever multimodal model. Enjoy! ?? <a href=\"https:\/\/t.co\/uvXnpJf6mQ\">https:\/\/t.co\/uvXnpJf6mQ<\/a><\/p>\n<p>\u2014 Sophia Yang, Ph.D. (@sophiamyang) <a href=\"https:\/\/twitter.com\/sophiamyang\/status\/1833820604618924531?ref_src=twsrc%5Etfw\">September 11, 2024<\/a><\/p><\/blockquote>\n<p> <script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>Pixtral 12B is approximately 24 GB in size and boasts 12 billion parameters, a term used to describe its problem-solving capabilities. By comparison, the flagship <a href=\"https:\/\/forklog.com\/en\/news\/mistral-ai-unveils-flagship-ai-model-large-2\">Llama 3.1 405B<\/a> has 405 billion parameters.<\/p>\n<p>The AI is built on one of the Nemo 12B text neural networks and can answer questions about images.<\/p>\n<p>Currently, the model can be downloaded from GitHub and Hugging Face, and it will later be available in the Mistral chatbot. This was announced by Mistral&#8217;s Head of Developer Relations, Sophia Yang.<\/p>\n<blockquote class=\"twitter-tweet\">\n<p lang=\"en\" dir=\"ltr\">We dropped a new model \u2014 Pixtral 12B, our first-ever multimodal model. Enjoy! ?? <a href=\"https:\/\/t.co\/uvXnpJf6mQ\">https:\/\/t.co\/uvXnpJf6mQ<\/a><\/p>\n<p>\u2014 Sophia Yang, Ph.D. (@sophiamyang) <a href=\"https:\/\/twitter.com\/sophiamyang\/status\/1833820604618924531?ref_src=twsrc%5Etfw\">September 11, 2024<\/a><\/p><\/blockquote>\n<p> <script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>Mistral is considered a European rival to American AI giants like OpenAI. The firm raised \u20ac385 million in December 2023 and another <a href=\"https:\/\/forklog.com\/en\/news\/parisian-ai-startup-mistral-ai-secures-e600-million\">\u20ac600 million<\/a> in June 2024. Microsoft is among the company&#8217;s shareholders.<\/p>\n<p>In May, the startup <a href=\"https:\/\/forklog.com\/en\/news\/mistral-ai-unveils-code-generating-ai-model\">introduced<\/a> Codestral, an artificial intelligence model for code generation.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The French AI startup Mistral has unveiled its first multimodal model, capable of processing both images and text. We dropped a new model \u2014 Pixtral 12B, our first-ever multimodal model. Enjoy! ?? https:\/\/t.co\/uvXnpJf6mQ \u2014 Sophia Yang, Ph.D. 
Mistral is considered a European rival to American AI giants like OpenAI. The firm raised €385 million in December 2023 and another €600 million (https://forklog.com/en/news/parisian-ai-startup-mistral-ai-secures-e600-million) in June 2024; Microsoft is among the company's shareholders.

In May, the startup introduced Codestral, an artificial intelligence model for code generation (https://forklog.com/en/news/mistral-ai-unveils-code-generating-ai-model).