
Nvidia Commits $100 Billion to OpenAI Partnership
Companies to deploy 10 GW of data centers.
Nvidia Corporation and OpenAI have signed a memorandum on a strategic partnership under which Nvidia will invest up to $100 billion in the company.
.@OpenAI and NVIDIA just announced a landmark AI infrastructure partnership:
⚡ 10 gigawatts of compute
🏭 Millions of NVIDIA Vera Rubin GPUs across gigascale AI factories.
The next era of AI starts now.
— NVIDIA (@nvidia) September 22, 2025
According to Bloomberg, the funding will be provided in stages, with the first $10 billion to be released following the signing of the agreement. As part of the deal, Nvidia will acquire a stake in OpenAI.
The parties will deploy data centers with a total capacity of 10 GW, aiming to relieve the industry’s main constraint: a shortage of computing resources for training complex models.
By expanding its infrastructure, OpenAI plans to develop advanced features—from complex logical inference and multimodal data processing to systems for deep document analysis. The partners believe this will not only reduce the cost of AI-based solutions but also accelerate their transition from laboratories to real-world business applications.
In an interview with CNBC, Nvidia CEO Jensen Huang described the deal as the next breakthrough in artificial intelligence.
“We are talking about the beginning of an industrial revolution in AI,” he stated.
OpenAI co-founder and CEO Sam Altman emphasized that the new computing infrastructure will be the foundation of the future economy.
“Everything starts with computing. Our work with Nvidia aims to achieve new breakthroughs in artificial intelligence and their widespread adoption for people and businesses,” he added.
OpenAI President Greg Brockman confirmed plans to scale the technology’s benefits to a wide range of users.
The first phase of the project is expected to be operational in the second half of 2026 on Nvidia’s Vera Rubin platform.
A Threat to the Planet?
Deploying 10 GW of computing capacity is no simple task, and it is far from an environmentally friendly one.
According to energy consultancy 174 Power Global, cooling systems alone for such a facility could account for up to 40% of its energy consumption.
Deloitte experts warned that by the end of 2025, data centers will account for about 2% of global electricity consumption (536 TWh). Demand from energy-intensive AI could increase this figure to over 1000 TWh by 2030.
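For scale, a rough back-of-envelope calculation (assuming, purely for illustration, that the planned facilities run at their full 10 GW load around the clock, which overstates real utilization) shows how the Nvidia–OpenAI buildout compares with those consumption figures:

```python
# Back-of-envelope: what 10 GW of continuous load means in annual energy terms.
# Assumption (not from the article): full 10 GW load 24/7; real utilization is lower.

capacity_gw = 10            # planned capacity of the Nvidia/OpenAI data centers
hours_per_year = 24 * 365   # 8,760 hours

annual_twh = capacity_gw * hours_per_year / 1000   # GW * h = GWh; /1000 -> TWh
print(f"Annual consumption at full load: {annual_twh:.1f} TWh")   # ~87.6 TWh

# Compare with Deloitte's projected 2025 global data-center consumption cited above.
global_dc_2025_twh = 536
print(f"Share of projected 2025 total: {annual_twh / global_dc_2025_twh:.0%}")  # ~16%
```

Under that (generous) full-load assumption, the 10 GW project alone would draw on the order of 87 TWh per year, roughly a sixth of what Deloitte expects all data centers worldwide to consume in 2025.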
According to UN estimates, a ChatGPT query consumes ten times more energy than a Google search, while cooling data centers requires an amount of water comparable to six times Denmark’s total consumption.
Calculations by the Institute for Energy and Environmental Research indicate that in 2018 the US had 1,000 data centers with a combined power draw of 11 GW, accounting for 1.9% of total US electricity consumption and 31.5 million tons of greenhouse gas emissions. By 2025, their number exceeded 5,000.
“As data centers proliferate, their contribution to carbon emissions is steadily increasing. According to a 2024 study, the carbon footprint of these facilities reached 105 million metric tons, about 2% of total US emissions, compared with 31.5 million tons in 2018,” experts noted.
Back in July, Meta Corporation announced the creation of a 5 GW data center.