Qualcomm Unveils AI Hub for On-Device Processing

At MWC 2024, Qualcomm announced a new tool, AI Hub. The product simplifies access to neural networks for generative AI developers, who are increasingly shifting workloads from the cloud to devices.

AI Hub features a library of over 75 generative AI models. Users can easily download them to Qualcomm-based devices. The collection will be continuously updated with new neural networks.

“Qualcomm AI Hub provides developers with an extensive library for quick and easy integration of pre-optimized AI models into their applications, leading to a faster, more reliable, and personalized user experience,” said Durga Malladi, Senior Vice President of Technology Planning at Qualcomm.

AI Hub includes some of the industry’s most popular models, such as:

  • OpenAI’s Whisper automatic speech recognition system;
  • Stable Diffusion text-to-image model from Stability AI;
  • Meta’s Llama large language model.

The neural networks are optimized to make the most efficient use of all cores within the Qualcomm AI Engine. According to company representatives, this enhances energy efficiency, reduces memory load, and increases computation speed fourfold.

Running AI models directly on the device enhances privacy. This is particularly important when creating models that use databases with non-public data.

In November 2022, Qualcomm introduced the Snapdragon 8 Gen 2 with a dedicated on-chip AI engine.
