The narrative in AI infrastructure over the last two years has been dominated by the enormous and growing demand for compute capacity and its economic consequences: the buildout of data centers and the resulting shortages of key resources like land, water, power, and copper.

But of all these bottlenecks, memory is by far the most significant. The demand for memory is now outpacing the demand for other drivers of compute capacity. The implications of this will ripple through not just the economics of data centers, but the cost of every single consumer and enterprise hardware device.

In this piece, we unpack the market action around memory prices, its ripple effects across the consumer and industrial electronics market, and the supply and demand curve that is emerging around AI. Critically, we explain why the amount of memory being purchased by AI companies like OpenAI seems to be more than what they need, and how the threat of on-device inference might actually be incentivizing an engineered memory shortage.

  • GamingChairModel@lemmy.world · 11 hours ago

    I’m just not connecting the dots. The amount of money they’re spending on this is astronomical, and they are burning through the cash they have at a rate they can’t sustain, while they’re fighting for their future against Google, Anthropic, plus xAI and Perplexity and others, and maybe foreign competition like Deepseek that the government can’t fully shield them from. While also competing with major data center companies themselves, who may want to build data centers for other non-AI purposes, too. And those competitors have deep, deep pockets.

    If they don’t have a revenue model that actually keeps them afloat, then all their capital expenditures will end up going to benefit someone else.

    In other words, the central thesis that they want to choke out competition from on-device models kinda ignores that they’re facing a much more immediate, much more pressing threat from their data center competition. It’s like trying to corner the market on snow shovels when a hurricane is bearing down.

    Plus, one thing worth noting: OpenAI purchased an option to buy that much memory, enough to persuade the memory manufacturers to change their own investment decisions for the next 5 years. They're not necessarily going to actually buy that much, and in theory they could sell that option to others. 40% of the market is enough to really move prices, but not enough to actually corner it and exclude others from buying memory. They'd just be making memory more expensive for themselves at the same time that they make it more expensive, but not impossible, for their true competitors who are also outfitting data centers.