Memory chip manufacturing giant SK hynix has confirmed that it is preparing to roll out higher-capacity GDDR7 memory later this year. In its latest earnings call, the company said it will increase the maximum density of its GDDR7 chips from 16Gb (2GB per module) to 24Gb (3GB per module). While the comment was made in reference to AI GPUs, it is hard not to dwell on what these new memory chips could mean for future consumer-grade GPUs.
More VRAM doesn't automatically mean better performance, but in recent years it has become clear that 8GB, or even 12GB, is no longer enough for many modern games, especially at higher resolutions or with ray tracing enabled. Generative AI tools, which are increasingly common in creative and productivity workflows, also benefit heavily from extra memory.
From a practical standpoint, this move could also bring higher VRAM capacities to slightly more affordable GPUs, not just the flagship models. With 3GB modules, a 24GB card could be built with just eight memory chips instead of twelve, which helps reduce power draw, thermal load, and cost, as the simple math below shows.
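To put the module math in perspective, here is a minimal sketch. The figures are purely illustrative arithmetic based on per-module capacity; the specific module counts and board layouts are assumptions for the example, not confirmed designs.

```python
# Illustrative only: total card VRAM from memory module count and per-module capacity.
# Module counts below are example layouts, not confirmed GPU board designs.

def total_vram_gb(modules: int, gb_per_module: int) -> int:
    """Total VRAM in GB for a given number of GDDR7 modules."""
    return modules * gb_per_module

configs = {
    "12 x 2GB (current 16Gb chips)": total_vram_gb(12, 2),  # 24GB using twelve modules
    "8 x 3GB (upcoming 24Gb chips)": total_vram_gb(8, 3),   # the same 24GB with only eight modules
    "6 x 3GB (hypothetical layout)": total_vram_gb(6, 3),   # 18GB, matching the rumored Super figure
}

for layout, capacity in configs.items():
    print(f"{layout}: {capacity}GB")
```

Fewer modules for the same capacity means fewer chips to power and cool, and less PCB area devoted to memory.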
SK hynix's plans to introduce higher-capacity GDDR7 memory coincide with recent rumors surrounding Nvidia's next consumer GPU lineup. According to a recent report, a potential RTX 50 Super series could arrive with refreshed models featuring 24GB and 18GB of video memory, and these 3GB GDDR7 modules would be a natural fit if Nvidia wants to avoid stacking extra chips on the back of the PCB. Notably, Nvidia only recently began using SK hynix GDDR7 chips for its current generation of RTX 50-series Blackwell graphics cards.
Of course, none of this is official confirmation. Nvidia hasn't announced a Super refresh yet, and even if it does, there's no guarantee it will use the higher-capacity modules on every SKU. But when your memory supplier starts talking about higher-capacity chips, it's a strong hint that something bigger is on the table.
Beyond GDDR7, SK hynix says it is pushing ahead across its memory portfolio in response to accelerating AI demand. It is preparing next-generation LPDDR memory modules for servers to enable energy-efficient AI inference in data center environments. It is also continuing to lead in High Bandwidth Memory (HBM), with high-volume production of 12-layer HBM3E alongside plans to introduce HBM4 later this year. Meanwhile, the company remains on track to release enterprise-grade 321-layer NAND SSDs in the second half of 2025, targeting high-capacity storage solutions for AI and hyperscale workloads.