"It used to be energy, now it's storage!" -- OpenAI COO discusses "AI bottleneck"


Memory chip shortages are replacing power supply as the primary constraint on the expansion of artificial intelligence infrastructure.

OpenAI Chief Operating Officer Brad Lightcap said Tuesday at the Hill and Valley Forum in Washington that the shortage of memory chips and tight US energy supplies are the two major potential bottlenecks facing AI infrastructure expansion. "The current bottleneck is memory; it used to be power," he stated directly on stage.

This statement aligns closely with an earlier assessment by SK Group Chairman Chey Tae-won, who predicted that the global memory chip shortage would persist until around 2030, with the industry's wafer supply gap exceeding 20%.

For the market, this means the core conflict in the AI computing power arms race is undergoing a structural shift: from data center siting and grid capacity to supply chain security and memory chip capacity bottlenecks. The strategic positions of AI accelerator providers like NVIDIA and of HBM (High Bandwidth Memory) manufacturers are thus further reinforced.

Memory Replaces Power as the New Bottleneck for AI Expansion

Lightcap’s statement marks a clear shift in the AI industry’s understanding of infrastructure constraints. Over the past two years, insufficient power supply for data centers has been the most discussed issue, but memory chip shortages have now become the more urgent practical obstacle.

The root of this shortage lies in explosive demand growth. AI companies like OpenAI continue to purchase large quantities of NVIDIA AI accelerators, each equipped with substantial amounts of memory, thereby consuming a significant portion of global memory capacity. Lightcap noted that OpenAI is actively diversifying suppliers and expanding the geographic distribution of its data centers to ensure that infrastructure expansion plans are not limited by any single supply chain.

According to Bloomberg, OpenAI has previously committed to investing $1.4 trillion over the next few years in data center construction and chip procurement to support the development of more advanced AI systems and broader technological adoption.

Shortage May Persist Until 2030; Traditional DRAM Also Under Pressure

Earlier, Chey Tae-won, Chairman of SK Group, stated at NVIDIA GTC that the global memory chip shortage is expected to continue for another four to five years, as expanding wafer capacity will take at least that long, making it difficult for major memory manufacturers to fully meet market demand before 2030.

Chey also warned that the industry’s overemphasis on high bandwidth memory (HBM) could lead to shortages of conventional DRAM, affecting the smartphone and PC markets. In recent years, SK Hynix, Samsung, and Micron have shifted a significant portion of their capacity toward HBM for AI accelerators, resulting in a decline in conventional DRAM output and a sharp rise in consumer electronics prices.

Currently, SK Hynix holds about 57% of the global HBM market and approximately 32% of the overall DRAM market. The company is building a $13 billion HBM packaging and testing plant in Cheongju, South Korea, scheduled to break ground next month and be completed by the end of 2027.

Energy Issues Remain Unresolved; Nuclear Power and Government Support Expected to Play Key Roles

Although memory has become the most urgent bottleneck, energy supply pressures have not eased. Lightcap stated that OpenAI is considering diversifying its power sources, including nuclear energy, to meet rising energy demands, and revealed that the company is in talks with fusion startup Helion Energy.

Notably, OpenAI CEO Sam Altman was previously a backer of Helion and announced on Monday that he has resigned from Helion’s board to avoid involvement in negotiations between the two companies.

Lightcap emphasized that government investment in energy supply is “crucial” for the success of the AI industry, and highly praised the Trump administration’s efforts to promote AI infrastructure development and accelerate government adoption of AI technologies.

On the government front, Lightcap disclosed that OpenAI currently serves over one million US federal, state, and local government employees, and that accelerating product delivery to federal agencies is a strategic priority. At the end of last month, OpenAI reached an agreement with the US Department of Defense to deploy its AI models within the Pentagon's classified networks; the Pentagon had previously announced the termination of its partnership with competitor Anthropic PBC. Lightcap described serving government agencies as a “crucial” direction for the company.

Risk Warning and Disclaimer

Markets carry risk; invest with caution. This article does not constitute personal investment advice and does not take into account individual users’ specific investment goals, financial situations, or needs. Users should consider whether any opinions, views, or conclusions herein are suitable for their particular circumstances. Investment carries risk; responsibility rests with the individual.
