In Samsung’s full earnings report released on April 30, 2026, the company’s memory chief Kim Jaejune warned that “significant shortages” across memory products are expected to continue through at least 2027. According to the SCMP, the company said demand fulfillment rates have fallen to record lows as customers rush to secure future supply. The warning closely mirrors comments made by rival SK hynix during its earnings call just a week earlier.
Together with US-based Micron Technology, Samsung and SK hynix control well over 90% of the global DRAM market. When two of the world’s three biggest memory suppliers simultaneously warn of multi-year shortages, there is good reason to take notice.
The shortages are being driven largely by the need for artificial intelligence infrastructure. Modern AI systems require enormous amounts of high-speed memory to continuously feed data to GPUs and accelerators. At the center of this demand surge is HBM (high-bandwidth memory), a vertically stacked form of DRAM designed to deliver extremely high bandwidth while remaining physically close to processors.
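To put that “extremely high bandwidth” in rough perspective, peak per-stack throughput can be sketched with simple arithmetic from the interface width and per-pin data rate. The figures below use publicly documented HBM3 numbers (a 1024-bit interface at 6.4 Gb/s per pin) purely as an illustration; they are not tied to any specific product mentioned in this article.

```python
def hbm_stack_bandwidth_gbs(interface_bits: int, pin_gbps: float) -> float:
    """Peak theoretical bandwidth of one HBM stack in GB/s.

    (interface width in bits * per-pin data rate in Gb/s) / 8 bits-per-byte
    """
    return interface_bits * pin_gbps / 8

# Illustrative HBM3 figures: 1024-bit bus at 6.4 Gb/s per pin
# works out to roughly 819 GB/s of peak bandwidth per stack.
print(hbm_stack_bandwidth_gbs(1024, 6.4))
```

An accelerator carrying several such stacks multiplies that figure accordingly, which is why a single AI GPU can demand terabytes per second of memory bandwidth.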
HBM has become critical for AI accelerators. However, the technology is difficult and expensive to manufacture, requiring advanced die stacking, precision bonding, and sophisticated packaging techniques. As a result, supply is limited, and demand is outpacing manufacturers' ability to build capacity.
While the shortage is driven primarily by HBM demand, its effects are beginning to spill over into the broader memory market. Because HBM itself is a form of DRAM, manufacturers are increasingly reallocating manufacturing capacity, engineering resources, and investment toward high-margin AI memory products. That shift risks tightening supply for more conventional DRAM products used in servers, PCs, and mobile devices. Enterprise SSD demand is also rising as AI data centers require massive storage infrastructure alongside compute hardware.
Ironically, the industry is simultaneously searching for alternatives because current memory architectures consume enormous amounts of power. We recently reported on efforts to develop next-generation memory technologies such as 3D X-DRAM and ZAM (Z-Angle Memory), which aim to reduce power consumption and ease scaling limitations.
Yet despite massive investment into future alternatives, demand for the existing memory technologies remains overwhelming.
Samsung reportedly stated that some customers have already secured supply allocations through 2027. Earlier this year, SK Group chairman Chey Tae-won suggested that AI-related memory demand pressure may persist even toward 2030.
The shortages are not necessarily bad news for the companies themselves.
Samsung’s semiconductor division posted 53.7 trillion won ($36.1 billion) in operating profit during the first quarter of 2026, accounting for roughly 94% of the company’s total quarterly profit as soaring AI memory demand drove record sales. Meanwhile, SK hynix reported record quarterly revenue of 52.6 trillion won ($35.5 billion), and operating profit of 37.6 trillion won ($27.8 billion), fueled largely by booming HBM sales for AI infrastructure.
Part of the problem is cyclical. The memory industry has historically swung between oversupply and shortages. However, analysts increasingly believe this cycle is different, as growth in AI infrastructure is consuming hardware at unprecedented rates.
To address the crisis, the companies are aggressively expanding production capacity and increasing investment in advanced packaging and memory fabrication. According to the Korea Times, recent regulatory filings show that Samsung Electronics invested 465.4 billion won in its Xi’an memory chip plant in 2025, a 67.5% year-over-year increase. SK hynix also significantly increased spending, investing 581.1 billion won into its Wuxi facilities and 440.6 billion won into its Dalian operations.
However, semiconductor fabrication plants and advanced memory packaging facilities take years to expand and ramp up, meaning supply growth is unlikely to keep pace with AI-driven demand in the near term.
The memory crunch is joining a growing list of resource shortages emerging from the AI explosion.
GPU shortages have already become severe across parts of the industry. Earlier this month, we reported Intel’s confirmation that extreme demand had become so intense that customers were even buying chips that might previously have been discarded or treated as low-value products.
Power is becoming another major bottleneck. AI data centers are consuming enormous amounts of electricity, forcing technology companies to seek increasingly unconventional energy solutions. Earlier this month, Meta Platforms backed plans involving space-based solar power systems that could theoretically beam solar energy back to Earth to help support future AI infrastructure demands.
