AI memory crunch makes even Samsung compete for Samsung’s RAM

The AI boom has pushed DRAM pricing and allocation to a point where even Samsung’s own device businesses can’t count on sweetheart access from Samsung’s memory unit. Under the hood, the bottleneck isn’t just wafers; it’s advanced packaging and through-silicon via (TSV) capacity tied up by high-bandwidth memory (HBM), which commands far better margins than commodity DRAM. As fabs tilt toward HBM and DDR5, supply for mobile and PC memory tightens, and internal transfer pricing starts to look a lot like the open market. What’s notable here isn’t corporate drama so much as economics: when the profit delta is this wide, vertical integration doesn’t magically mint extra capacity.

The bigger picture: OEMs of every size face bill-of-materials creep, longer lead times, and tougher configuration choices, particularly in servers and high-end laptops. DRAM pricing has always been cyclical, but the HBM-era constraints are structural, dependent on packaging throughput as much as die output, so relief hinges on new lines and process improvements, not just turning up the crank on existing fabs. Expect memory suppliers to keep prioritizing parts with higher ASPs and long-term AI contracts, while downstream teams recalibrate product plans to match what can actually be shipped. It’s a reminder that in 2025, “just buy more RAM” is no longer a simple procurement decision, even for Samsung.
