If you bought a laptop in 2025, you might have noticed something odd: memory modules disappearing from parts lists, RAM kits turning into collector’s items on enthusiast forums, and whispers that next year your preferred configuration could cost a lot more. That’s not rumor. It’s the fallout of an AI-driven scramble for high-performance memory that has pushed DRAM prices into rarefied territory and sent shock waves through the PC supply chain.
Why memory suddenly matters
Training and running large AI models eats memory the way rendering eats GPU cycles. Data centers building AI infrastructure have been snapping up specialized memory—especially high-bandwidth memory (HBM) and large pools of DRAM—at volumes chipmakers hadn’t seen in years. The result: market prices spiked through 2025 (analysts reported roughly 40% increases late in the year), and some spot-market quotes shot far higher, squeezing margins for laptop and PC makers and lifting valuations for memory makers.
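To make that appetite concrete, here is a rough back-of-envelope sketch. The model size, precision, and cache allowance are illustrative assumptions, not figures from any vendor:

```python
# Back-of-envelope memory estimate for serving a large language model.
# Assumptions (illustrative only): a hypothetical 70B-parameter model,
# 16-bit weights, and a modest allowance for the key/value cache.

def model_memory_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """Approximate memory needed just to hold the model weights, in GB."""
    return params_billion * 1e9 * bytes_per_param / 1e9

weights = model_memory_gb(70)   # ~140 GB of weights at 2 bytes per parameter
kv_cache = 20                   # rough KV-cache allowance in GB (assumption)
total = weights + kv_cache

print(f"Weights: ~{weights:.0f} GB, total with KV cache: ~{total:.0f} GB")
# A high-end consumer laptop tops out around 32-64 GB of DRAM, which is why
# this kind of demand lands on HBM-equipped accelerators in data centers.
```

The exact numbers vary by model and serving setup, but the order of magnitude is the point: a single AI deployment can consume as much memory as thousands of laptops.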
Investors have noticed. The big memory names—Samsung, SK Hynix and Micron—saw their stocks rally as traders priced in continued demand and a coming “DRAM supercycle.” Equipment suppliers that sell into memory fabs, like those that supply the extreme ultraviolet tools needed to boost capacity, have also benefited. In short: where AI goes, the memory market follows.
How that translates to your next purchase
Manufacturers are reacting in pragmatic, sometimes blunt ways. Some are stockpiling memory. Others are quietly changing launch configurations or raising MSRPs on new machines. ASUS reportedly told channel partners to expect higher prices; OEMs from Dell to HP have signaled they may ship lower-RAM SKUs or nudge configuration options to protect margins. Prebuilt systems occasionally appear without RAM in enthusiast channels because suppliers are simply running short.
For consumers, that means two obvious outcomes: either you’ll pay more for the same spec, or you’ll pay the same and get less RAM. Gamers and creative pros feel this immediately; everyday users will notice it over time as sellers rebundle or raise prices.
Creative detours: can the PC side fight back?
Not everyone is resigned to higher costs. A couple of hardware workarounds that aim to blunt DRAM dependence surfaced at CES and in industry briefings.
One approach uses smarter storage. A Taiwanese controller maker introduced an SSD-based cache product designed to extend a laptop’s effective memory so system designers can ship lower DRAM capacities without sacrificing perceived AI performance. Think of it as using very fast NAND and clever algorithms to do some of the heavy lifting DRAM used to do.
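The vendor hasn’t published its design, but the general idea of keeping hot data in DRAM and spilling colder data to fast NAND can be sketched in a few lines. Everything below (the class, the sizes, the spill directory) is an illustrative assumption, not the product’s actual implementation:

```python
import os
from collections import OrderedDict

class TieredCache:
    """Minimal DRAM-plus-SSD tier: hot items live in an in-memory LRU,
    and evicted items spill to files on fast NAND instead of being lost."""

    def __init__(self, ram_items: int, spill_dir: str = "/tmp/nand_tier"):
        self.ram_items = ram_items
        self.hot = OrderedDict()      # stands in for the DRAM tier
        self.spill_dir = spill_dir    # stands in for the SSD tier
        os.makedirs(spill_dir, exist_ok=True)

    def put(self, key: str, value: bytes) -> None:
        self.hot[key] = value
        self.hot.move_to_end(key)
        if len(self.hot) > self.ram_items:
            # Evict the least recently used item and write it to NAND.
            cold_key, cold_val = self.hot.popitem(last=False)
            with open(os.path.join(self.spill_dir, cold_key), "wb") as f:
                f.write(cold_val)

    def get(self, key: str) -> bytes | None:
        if key in self.hot:           # DRAM hit: fast path
            self.hot.move_to_end(key)
            return self.hot[key]
        path = os.path.join(self.spill_dir, key)
        if os.path.exists(path):      # NAND hit: slower, but not a miss
            with open(path, "rb") as f:
                value = f.read()
            self.put(key, value)      # promote back into the DRAM tier
            return value
        return None
```

Real products do this far below the application layer, in the controller and driver, with prefetching and compression on top; the point of the sketch is simply that the working set which must live in DRAM shrinks when a fast NAND tier sits behind it.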
Another idea is architectural: shrink or rethink cooling so there’s physical room for more memory. A startup unveiled a fanless, solid-state thermal engine that frees board space inside thin laptops. That reclaimed volume can translate into more DRAM slots or different memory topologies, improving overall capacity and bandwidth without changing memory fab economics.
Both ideas share a hopeful premise: if enough AI work can be pushed onto endpoints (your laptop) rather than always into hyperscale data centers, memory makers might rebalance production back toward DRAM for PCs. That’s a big if. To change where fabs invest, OEMs, chip designers and enterprise buyers would have to present a convincing, sustained demand signal.
The market angle (and why chipmakers are smiling)
From the memory makers’ vantage point, this is welcome. Memory firms earn fatter margins when customers will pay a premium for capacity and bandwidth, and investors have rewarded companies positioned to supply AI infrastructure. Firms that make the machines used to build chips are lining up orders to feed expanded fab capacity, which is another reason analysts are talking about a multi-year cycle rather than a short squeeze.
But cycles have winners and losers. If capacity expands for HBM and HBM‑adjacent products, the PC market can still be left fighting for the leftover DRAM, keeping prices elevated for longer.
What you should do if you’re shopping now
If you’re in the market for a new laptop or building a PC, a few practical moves can help:
- Buy sooner rather than later if you care about specific RAM sizes—retail promotions may still be available before OEMs adjust pricing.
- Consider models with upgradeable RAM if you want a path to add capacity when prices normalize.
- If you’re comfortable with alternatives, watch for machines that advertise smarter local AI acceleration and caching; they may offer similar real‑world performance with less headline RAM.
And if you’re tempted to wait for a sale on an Apple laptop, bargain windows are still popping up: check current MacBook Air deals and compare configurations. The MacBook line remains worth a look for value and longevity, whether you buy direct from Apple or through retailers such as Amazon.
This is more than a pocketbook problem. The same tension between centralized cloud AI and on-device computing is reshaping how companies plan hardware investment. Hyperscalers continue building massive facilities (some even exploring outlandish placements for compute capacity), and that keeps the memory market tight. The longer cloud demand dominates, the longer DRAM prices stay elevated and the more creative the PC industry must get to protect margins and preserve user experience. For now, expect price and configuration churn, a few clever hardware detours, and a memory market that looks a lot more like the center of gravity for tech economics than it did just two years ago.
If you want to read more about how hyperscalers are thinking differently about where to put compute, this project on AI data center build-outs is an interesting sign of how far some players will go to scale capacity.