
At Cloud Next, Google touts Gemini 2.5 Flash, the Ironwood chip and GenAI upgrades
- Google announced the Gemini 2.5 Flash AI model.
- It also unveiled a range of tools and services, including an Agent Development Kit for building and deploying AI agents, the new Ironwood chip and updates to the company’s generative AI models.
- Samsung and Reddit announced partnerships with Google to integrate the company’s Gemini AI into their products.
At the annual Cloud Next conference in Las Vegas, Google announced a new AI model, Gemini 2.5 Flash, geared for high performance and efficiency, TechCrunch reports.
The model will soon be available on the Vertex AI developer platform. It offers “dynamic and controllable” compute, allowing users to tune processing time based on the complexity of a request.
“[You can tune] speed, accuracy and cost balance to your specific needs. This flexibility is key to optimizing Flash’s performance in high-volume, cost-sensitive applications,” Google emphasised.
Gemini 2.5 Flash is a “reasoning” model. Like OpenAI’s o3-mini and DeepSeek’s R1, it takes a bit longer to respond to allow for additional self-checking.
Google says 2.5 Flash is well suited to “high-performance” and real-time applications.
“This workhorse model is optimized specifically for low latency and lower cost,” the company says.
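The announcement does not spell out how that speed/accuracy/cost control is exposed to developers, but on Gemini 2.5-series “thinking” models it surfaces as a per-request thinking budget. The sketch below uses the google-genai SDK against Vertex AI; the project ID, model name and budget value are illustrative placeholders, not details confirmed in the announcement.

```python
# Minimal sketch: calling Gemini 2.5 Flash on Vertex AI with a capped
# "thinking" budget. Project, location, model ID and budget are illustrative.
from google import genai
from google.genai import types

client = genai.Client(vertexai=True, project="my-project", location="us-central1")

response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents="Classify this support ticket as billing, technical or other: ...",
    config=types.GenerateContentConfig(
        # A small thinking budget trades some reasoning depth for lower
        # latency and cost on simple, high-volume requests.
        thinking_config=types.ThinkingConfig(thinking_budget=256),
    ),
)
print(response.text)
```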
Tools for AI agents
Beyond the new AI model, Google announced a raft of other offerings, including an Agent Development Kit—an open-source toolkit that makes it easier to create AI agents.
Such digital agents can handle customer support, programming, marketing campaigns and other routine tasks. Google says an agent can be built in fewer than 100 lines of code.
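For a sense of what that looks like, here is a rough sketch of a minimal agent using the Python Agent Development Kit. The agent name, model ID and the lookup_order tool are made-up examples for illustration, not part of Google’s announcement.

```python
# Rough sketch of a minimal agent with the Python Agent Development Kit
# (pip install google-adk). Names, model ID and the tool are placeholders.
from google.adk.agents import Agent


def lookup_order(order_id: str) -> dict:
    """Hypothetical tool: return the status of a customer order."""
    # A real tool would query an order system; this stub just echoes a status.
    return {"order_id": order_id, "status": "shipped"}


root_agent = Agent(
    name="support_agent",
    model="gemini-2.5-flash",
    description="Handles routine customer-support questions.",
    instruction="Answer politely and call lookup_order when asked about an order.",
    tools=[lookup_order],
)
```

The toolkit also ships local test tooling (a command-line runner and a browser-based dev UI), so a sketch like this can be exercised before deployment.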
The company also introduced Agent2Agent, an open protocol that lets AI agents built on different systems communicate and work together.
A new chip
Google also said it is expanding its in-house chip efforts. Ironwood, its seventh-generation TPU (Tensor Processing Unit), is designed to meet the demands of complex AI models.
Launch is expected in late 2025 for Google Cloud customers. The processor comes in two configurations: clusters of 256 and 9,216 chips.
“Ironwood is our most powerful, best-performing and most energy-efficient TPU. It is purpose-built to power reasoning and inferential AI models at scale,” noted Google Cloud vice president Amin Vahdat.
Ironwood delivers up to 4,614 TFLOPs of compute per chip. Each chip has 192 GB of dedicated memory with bandwidth approaching 7.4 Tbit/s.
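A quick back-of-the-envelope calculation, assuming the 4,614 TFLOPs figure is the per-chip peak as reported, puts the larger configuration at roughly 42.5 exaFLOPs of aggregate compute:

```python
# Back-of-the-envelope aggregate for the 9,216-chip Ironwood cluster,
# assuming 4,614 TFLOPs is the per-chip peak figure.
tflops_per_chip = 4_614
chips_per_cluster = 9_216

cluster_tflops = tflops_per_chip * chips_per_cluster
print(f"{cluster_tflops / 1e6:.1f} exaFLOPs")  # ~42.5 exaFLOPs
```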
The processor includes a specialised SparseCore for processing the kinds of data common in ranking and recommendation workloads. Its architecture is designed to minimise on-die data movement and latency to save energy, Google says.
AI model updates
The company released updates for several content-generating AI models available on the Vertex AI cloud platform:
- the text-to-music model Lyria is available to a limited set of users;
- the Veo 2 video generator gained new editing features and visual-effects controls;
- the Imagen 3 image generator delivers “significantly” higher performance;
- the company also launched a voice-cloning feature based on Chirp 3, its audio-understanding model.
Lyria is offered as an alternative to music libraries. It lets users create songs in various styles and genres.
Chirp 3 can synthesise speech in about 35 languages. It includes an Instant Custom Voice feature that is said to clone a voice from a 10-second audio sample. The feature undergoes a process of “rigorous verification” to confirm “proper permissions to use the voice.”
Veo 2 gained the ability to remove backgrounds, logos and objects from existing videos, and to adjust camera tilt and scene tempo.
Imagen 3 improves object removal and the restoration of damaged parts of images.
Workspace
Google is updating its Workspace cloud suite to boost AI-powered productivity.
A new Flows tool automates multi-step processes such as updating spreadsheets and searching documents. It can use Gems chatbots and integrate with Google Drive to pull data.
“Just describe what you need in plain language, and Workspace Flows will design and build complex, logic-driven flows,” wrote Google Workspace vice president Yulie Kwon Kim.
Google Docs will soon be able to turn drafts into podcast-style summaries. An upcoming “Help me refine” feature will suggest structural improvements and stronger arguments.
Google Sheets will add a “Help me analyze” button to offer recommendations, identify trends and help create interactive charts.
In Google Meet, a tool called “Take notes for me” will help summarise and revisit specific topics in video calls.
Partnerships with Samsung and Reddit
Samsung said it is adding Google’s Gemini AI to its Ballie home robot through a partnership with Google Cloud. Users will be able to ask questions and receive answers from the chatbot.
“Through this partnership, Samsung and Google Cloud are rethinking the role of AI in the home. By combining Gemini’s powerful multimodal reasoning with Samsung’s AI capabilities in Ballie, we are harnessing the power of open collaboration to usher in a new era of personalized AI companions—ones that move with users, anticipate their needs and interact more dynamically and meaningfully than ever before,” said Yongjae Kim, an executive at Samsung’s visual display division.
Reddit also said it is expanding its partnership with Google. The platform’s Reddit Answers search tool has been updated through an integration with Gemini, which the companies say makes results faster and more relevant for users.
In March, Google unveiled a new family of “reasoning” AI models, Gemini 2.5.