OpenAI has reduced its planned expenditure on computing power by 2030 from $1.4 trillion to $600 billion, according to CNBC, citing sources.
The management presented investors with a lowered estimate and clear timelines for the planned expenses. This move comes amid concerns that the company’s initial plans were overly ambitious and might not yield the expected revenue.
OpenAI forecasts that by 2030 its revenue will exceed $280 billion, with equal contributions from both consumer and corporate segments.
In January, CFO Sarah Friar reported that the startup's annualized revenue had surpassed $20 billion in 2025. CNBC journalists, however, cite different figures: actual revenue for the year amounted to $13.1 billion, with expenses of $8 billion against the planned $9 billion.
Stargate Project Fails
The Stargate project, which made headlines in early 2025 with a $500 billion budget, has failed, according to media reports.
Thirteen months later, the joint venture has not hired staff or begun construction on a single facility, and the partners involved cannot agree on their responsibilities.
According to The Information, Stargate lacks a team, separate management, and is not engaged in building OpenAI’s data centers. For months, parties have debated basic issues: who constructs, owns the facilities, and how funding is allocated.
From September to October 2025, top executives of the AI startup repeatedly traveled to Tokyo for negotiations with SoftBank’s Masayoshi Son. However, the parties failed to decide who would be the developer and owner of the flagship campus in Texas.
OpenAI considered executing the project independently, but creditors refused to provide funds.
AI’s Energy Consumption Concerns
At The Indian Express event, Sam Altman addressed concerns about AI’s environmental impact. He stated that fears over massive water consumption are “completely fabricated.”
“This was a real issue back when data centers used evaporative cooling. Now that we don’t, you can see claims online like: ‘Don’t use ChatGPT. It consumes 17 gallons of water per query.’ This is utterly false, absolutely insane, and has nothing to do with reality,” the entrepreneur stated.
He added that it is “fair” to worry about overall energy consumption—not per query, but in general. According to Altman, the world needs to “rapidly transition to nuclear, wind, and solar energy.”
When the host asked whether a single ChatGPT query truly requires energy comparable to one and a half iPhone charges, the OpenAI CEO said no: the actual figure is far lower.
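A back-of-envelope calculation shows the gap between the claim and the figure OpenAI itself has cited. The ~0.34 Wh per query comes from OpenAI's own public estimate; the ~13 Wh iPhone battery capacity is an assumption for illustration, so the exact ratio is indicative only:

```python
# Sanity check on the "1.5 iPhone charges per ChatGPT query" claim.
# Both constants are assumptions for illustration:
QUERY_WH = 0.34        # OpenAI's publicly cited average energy per query, Wh
IPHONE_CHARGE_WH = 13  # rough capacity of a modern iPhone battery, Wh

claimed_wh = 1.5 * IPHONE_CHARGE_WH   # energy the claim implies per query
ratio = claimed_wh / QUERY_WH         # how much the claim overstates usage

print(f"Claimed per-query energy:  {claimed_wh:.1f} Wh")
print(f"Assumed actual per query:  {QUERY_WH} Wh")
print(f"Overstatement factor:      ~{ratio:.0f}x")
```

Under these assumptions the claim overstates per-query energy use by well over an order of magnitude, which is consistent with Altman's "far from these values" remark.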
He lamented that many discuss the high electricity cost of creating AI, but no one talks about how much energy it takes a human to answer a single query.
“A human’s training also requires a lot of energy: about 20 years of life and all the food you consume during that time before you become smart. And it was preceded by the evolution of the 100 billion people who ever lived, who learned not to be eaten by predators, figured out science, and so on,” the entrepreneur stated.
Altman believes it is appropriate to make such a comparison:
“How much energy would it take for ChatGPT to answer a given question compared to a human? Probably, in this context, AI has already achieved comparable energy efficiency.”
Earlier reports noted that AI data centers had already strained the power grids of wealthy countries.
