Nvidia Isn’t The Only One Getting A Massive AI Boost

Nvidia Corp.’s market value jumped $207 billion in the two days after it gave an amazingly good revenue outlook

The Nvidia headquarters in Santa Clara, California, US, on Friday, May 26, 2023. Nvidia Corp. is within touching distance of $1 trillion market value, poised to become only the ninth firm ever to hit that milestone, as the artificial intelligence frenzy boosts demand for processors that can accelerate computing. Photographer: Philip Pacheco/Bloomberg

Nvidia Corp.’s market value jumped $207 billion in the two days after the US chip designer on May 24 gave an amazingly good revenue outlook following a season of bad news for the semiconductor industry. Yet there’s a handful of other technology companies that may benefit even more from the race to embrace artificial intelligence.

There are numerous ways to put this forecast and subsequent reaction into context. The sales figure is 53% more than analysts had expected, and 33% higher than the company’s previous record achieved in March last year. The first-day pop was the third-largest gain in US history, while the two-day gain eclipsed the market cap of all but 48 stocks across the globe.

Among the companies dwarfed by the $200 billion jump in Nvidia’s value are two of the most important enablers of the AI revolution. Between them, Korea’s SK Hynix Inc. and Boise-based Micron Technology Inc. command 52% of the global market for dynamic random-access memory. Combined, they’re worth just $140 billion. Their only rival, Samsung Electronics Inc., accounts for 43% of the DRAM industry — just one of at least four global sectors it leads — and trades at $317 billion.

If the generative AI sector is going to take off, as Nvidia and its clients believe, then established giants like Microsoft Corp. and newcomers such as OpenAI are set to pound on the doors of Samsung, SK Hynix and Micron. 

Machines that crunch reams of data, analyze patterns in video, audio and text, and spit out replicas of human-created content are going to need memory chips. In fact, AI companies are likely to buy up more DRAM than any other slice of the technology sector in history.

The reason for this demand for memory chips is quite straightforward: Nvidia’s AI chips differ from standard processors by inhaling huge amounts of data in a single gulp, crunching the numbers in one go, then spitting out the results all at once. But for this power advantage to be realized, the information needs to be fed into the computer quickly and without delay. That’s where memory chips come in.

Processors don’t read data directly from a hard drive — that’s too slow and inefficient. The first choice is to keep it in temporary storage within the chip itself. But there’s not enough room to hold much here — chipmakers prefer to devote this precious real estate to number-crunching functions. So, the second-best option is to use DRAM.

When you’re processing billions of pieces of information in a single go, you need that data close at hand and delivered quickly. A lack of adequate DRAM in a system will slow down a computer significantly, neutralizing the value of spending $10,000 on the best processors to run sophisticated chatbots. That means for every high-end AI processor bought, as much as 1 terabyte of DRAM may be installed — 30 times more than a high-end laptop.

Such hunger for memory means that DRAM sold for use in servers is set to outpace that installed in smartphones sometime this year, according to Taipei-based researcher TrendForce Corp. 

[Chart: DRAM sold for servers vs. smartphones]

These systems also need to be able to save large amounts of their output nearby so that it can be read and written quickly. That’s done on NAND Flash, the same chips used in smartphones and most modern laptops. Samsung is the global leader in this space, followed by Japan’s Kioxia Holdings Corp. (a spinoff from Toshiba Corp.) and SK Hynix.

Together, DRAM and NAND accounted for $8.9 billion of revenue at Samsung last quarter, far outpacing the $4.3 billion Nvidia got from its data-center business, which includes products used for AI. To put that in context, though, this was the worst performance for Samsung’s memory division in seven years, and its AI-related memory sales are only a fraction of total revenue.

Both figures are set to grow. For every high-end AI chip sold to customers, another dozen DRAM chips will be shipped, and that means more revenue for Samsung, SK Hynix and Micron. As Nvidia grows, so too will these three companies that collectively control 95% of the DRAM market.

There’s no doubt the AI revolution is here, with makers of cool chatbots, ubiquitous search engines and high-powered processors among the biggest winners. But those churning out boring old memory chips won’t be left out either.

More From Bloomberg Opinion:

  • Chips Are Back. But Not Equally or for Everyone: Tim Culpan
  • Don’t Go Down That AI Longtermism Rabbit Hole: Parmy Olson
  • US Chip Curbs Highlight Cracks in China AI Strategy: Tim Culpan 

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

Tim Culpan is a Bloomberg Opinion columnist covering technology in Asia. Previously, he was a technology reporter for Bloomberg News.

More stories like this are available on bloomberg.com/opinion

©2023 Bloomberg L.P.