OpenAI launches a $300 billion hardware expansion plan! 54% of fund managers warn of an AI bubble.

OpenAI has launched a $300 billion hardware expansion plan that ties its chip suppliers, financiers, and energy providers together in feedback loops. However, a Bank of America survey in October found that 54% of fund managers believe AI is a bubble, and if deployment lags behind delivery plans, the gap could amplify market volatility.

The Supply Chain Behind OpenAI's 16-Gigawatt Computing Empire

AMD and OpenAI Announce Strategic Partnership

The multi-year agreements OpenAI has reached with AMD and Broadcom form the core of this $300 billion plan. AMD will supply 6 gigawatts of Instinct GPUs and grant OpenAI equity warrants tied to performance milestones, while Broadcom will co-design and deploy 10 gigawatts of custom silicon and rack systems over the same period. The deals will be delivered in stages between 2026 and 2029, adding roughly 16 gigawatts of new computing capacity in total.

OpenAI Hardware Expansion Plan Key Data:

AMD Collaboration: 6 GW of Instinct GPUs, first deployment in the second half of 2026, with a milestone-based warrant for up to 160 million shares of AMD stock.

Broadcom Partnership: 10 GW of custom accelerators and rack systems, co-designed with OpenAI, to be completed by the end of 2029.

Stargate Project: 4.5–5.5 GW in collaboration with Oracle and SoftBank, phased construction across five sites in the United States, with spending exceeding $300 billion within five years.

Total Computing Power: Approximately 16 GW, a load comparable to the total electricity consumption of some small countries.

These agreements underpin the Stargate build-out with Oracle and SoftBank, an expansion across five US sites that could become the largest privately financed infrastructure project in the history of the technology industry. OpenAI has stated that Broadcom's deployment will be completed by the end of 2029, meaning the entire expansion plan spans roughly four years.

Circular economy model: Supplier equity tied to demand

The structure of these agreements points to a circular economy model for AI infrastructure, where capital, equity incentives, and purchase obligations are interrelated among suppliers, infrastructure providers, and model operators. AMD's arrangement links future GPU deliveries to milestone-based warrants, allowing OpenAI to gain upside exposure from AMD's equity performance, thereby creating a feedback loop between the supplier's valuation and the customer's capacity expansion path.

At the same time, Nvidia disclosed earlier this year that it holds roughly 7% of CoreWeave's shares, while CoreWeave expanded its agreement with OpenAI by up to $6.5 billion, bringing the total contract value to about $22.4 billion as of 2025, linking the chip supplier's equity, the infrastructure rental company's revenue, and OpenAI's computing consumption in a single chain.

Bloomberg has also reported on the vendor-financing cycle: Nvidia has committed up to $100 billion to fund OpenAI's chip procurement, meaning part of the demand is financed by the vendors themselves. The structure creates a self-reinforcing loop: vendors provide financing and equity incentives, OpenAI expands procurement, vendor stock prices rise, OpenAI profits from its warrants, and it then has more funds to keep buying.

This cyclical pattern has sparked intense debate on Wall Street. Supporters argue that it is an innovative model of capital efficiency, while opponents warn that it could evolve into a bubble machine. When the interests of suppliers, customers, and financiers are so closely tied together, if any link encounters a problem, the entire chain could collapse.

AI Energy Demand's Epic Challenge to the Grid

On the energy side, grid availability and the delivered cost per megawatt-hour determine how fast models can feasibly scale. Goldman Sachs forecasts that global data center power demand will grow by roughly 165% by 2030 compared with 2023. With new clusters coming online from 2026 to 2029, this trend will push data center operators toward long-term power purchase agreements, on-site generation, and shifts in site selection.

Industry media, citing a McKinsey report, estimate that by 2030 US data centers could consume more than 14% of the nation's electricity, with demand compounding at roughly 25% per year. If interconnection queues and permitting timelines stretch out relative to hardware delivery, planning risk rises. OpenAI's 16 gigawatts of computing capacity implies roughly 140 terawatt-hours (TWh) of electricity per year, about the same as New York State's total annual consumption.
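As a rough sanity check on that figure, here is a minimal back-of-the-envelope sketch, assuming purely for illustration that the full 16 GW runs continuously, with no allowance for utilization or cooling overhead:

```python
# Back-of-the-envelope: annual energy use for 16 GW of continuous compute load.
# Assumptions (not from the article): 100% utilization, no PUE/cooling overhead.
capacity_gw = 16              # announced computing capacity, in gigawatts
hours_per_year = 24 * 365     # 8,760 hours

energy_gwh = capacity_gw * hours_per_year   # gigawatt-hours per year
energy_twh = energy_gwh / 1_000             # convert GWh to terawatt-hours

print(f"~{energy_twh:.0f} TWh per year")    # ≈ 140 TWh, matching the article's estimate
```

In practice, utilization below 100% would lower the number, while cooling and power-conversion overhead would raise it, so the ~140 TWh figure is best read as an order-of-magnitude estimate.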

This energy demand raises several challenges. First, can grid infrastructure keep up with such rapid growth? The grid in many regions of the United States is aging, and absorbing a load of this scale will require heavy investment in upgrades. Second, how can AI's energy appetite be reconciled with the pressures of the clean-energy transition? If data centers rely on fossil-fuel generation, that will conflict with global decarbonization goals. Third, rising electricity prices could push the cost of AI training and inference sharply higher, eroding the viability of business models.

54% of Fund Managers Warn of Bubble and Execution Risks

A forward-looking view hinges on three execution thresholds: utilization, energy, and cost curves. On utilization, the capacity announced by AMD, Broadcom, and Stargate ramps to double-digit gigawatts by 2029, and enterprise AI revenue must scale with it to keep cluster occupancy above the level at which the investment produces meaningful returns.

A Bank of America survey in October shows that 54% of fund managers believe AI is a bubble, with cash balances near 3.8%. If deployment lags behind delivery plans, that gap could amplify volatility across the broader market. Index concentration adds another macro channel: as of mid-2025, the 'Magnificent Seven' account for roughly one-third of the S&P 500's market capitalization, which makes passive portfolios more sensitive to shifts in AI news flow and capital-expenditure guidance.

The regulatory path remains uncertain. Although the UK Competition and Markets Authority concluded in March 2025 that the partnership between Microsoft and OpenAI does not meet the criteria for a merger investigation, this benchmark may be re-evaluated if the new equity-linked supply arrangements exacerbate concerns regarding market power related to access and pricing.

Execution risk sits in toolchains, packaging, and memory bandwidth, on a timeline that starts in the second half of 2026 and runs through 2029. Financial outcomes for vendors and operators will therefore track how quickly those gains show up in audited profit margins and contract pricing. The challenge for portfolio and financial planning is how the announced gigawatts line up with realized workload growth, regional transmission capacity, and cost trajectories through 2028.
