A New Bottleneck in the Data Center Supply Chain

Just as it appeared that the tech hardware supply chain was recovering from COVID-induced shortages, a new bottleneck has sprung up that could potentially impact the availability of data center GPUs and the expansion plans of data center developers.

In May, South Korean memory manufacturer SK Hynix announced that its supply of high-bandwidth memory (HBM) chips was sold out for 2024 and most of 2025. One of its competitors in the field, Micron, had issued a similar statement in March. Samsung, the last player in the HBM market, has made no public comment on product availability.

Amid the surge in AI and digital services, the HBM shortage poses a new challenge for data center growth. While much focus has been placed on GPU bottlenecks, rising demand for high-bandwidth memory could ultimately impact the industry's development plans.

What Are HBM Chips and Why Are They Needed in Data Centers?

High-bandwidth memory is used in the GPU package itself, with the chips physically sitting next to the GPU silicon, as opposed to standard DRAM, which is mounted on DIMM sticks and sits next to the CPU.

The HBM design delivers much higher speed and lower latency and is critical to the performance of AI processing.
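For a sense of scale, here is a minimal back-of-the-envelope sketch comparing the peak bandwidth of a single HBM3 stack with that of a single DDR5 DIMM. The interface widths and per-pin data rates below are typical published figures used as illustrative assumptions, not the specifications of any particular GPU or server.

```python
# Rough peak-bandwidth comparison: one HBM3 stack vs. one DDR5-4800 DIMM.
# Assumed (illustrative) figures: HBM3 uses a 1024-bit stack interface at
# ~6.4 Gb/s per pin; a DDR5-4800 DIMM uses a 64-bit interface at 4.8 Gb/s per pin.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbit_per_pin: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits * per-pin rate in Gb/s) / 8."""
    return bus_width_bits * data_rate_gbit_per_pin / 8

hbm3_stack = peak_bandwidth_gb_s(bus_width_bits=1024, data_rate_gbit_per_pin=6.4)
ddr5_dimm = peak_bandwidth_gb_s(bus_width_bits=64, data_rate_gbit_per_pin=4.8)

print(f"One HBM3 stack:     ~{hbm3_stack:.0f} GB/s")   # ~819 GB/s
print(f"One DDR5-4800 DIMM: ~{ddr5_dimm:.0f} GB/s")    # ~38 GB/s
print(f"Ratio:              ~{hbm3_stack / ddr5_dimm:.0f}x per device")
```

Even in this rough comparison, a single HBM stack delivers on the order of twenty times the bandwidth of a single DIMM, which is why AI-focused GPUs ship with multiple HBM stacks mounted directly in the package.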

During the COVID-19 supply chain shortages, carmakers couldn't obtain chips for their vehicles, so they simply built the cars without the chips and mothballed them until they could shore up inventory.

Related: How Operators, Vendors Hope to Tackle Data Center Supply Chain Woes

GPU vendors like Nvidia and AMD don't have that option. No high-bandwidth memory means GPUs can't be assembled, because the HBM has to be added to the GPU package at the manufacturing stage.

It's clearly a sensitive subject. When approached by Data Center Knowledge, data center hardware vendors Hynix, Micron, Samsung, Nvidia, Intel, and AMD all declined to comment on the issue.

[Image: HBM schematic]

TrendForce predicts HBM's share of the overall memory market will more than double in 2024 – from 2% in 2023 to 5% this year. Looking further ahead, HBM's market share is expected to surpass 10% of the overall memory market by 2025.

In terms of market value, high-bandwidth memory is projected to account for more than 20% of total DRAM market value starting in 2024, potentially exceeding 30% by 2025.

Data Center Construction Continues, but HBM Shortage May Pose Challenges

HBM is more expensive to make, harder to make, and takes longer to make than standard DRAM. So it's not as if memory makers could pivot on a dime and switch to increasing their HBM production. Such fabrication plants, like a CPU fab, take time to build.

Related: Nvidia's Next-Generation AI Chip Rollout Slowed by Engineering Snags

A shortage of data center products could impact the industry's expansion and construction plans, but the supply chain is currently holding up. Data center construction marches on despite some companies having to wait for GPU hardware, notes Alan Howard, principal analyst for colocation and data center construction with Omdia.

Demand certainly isn't slowing. Omdia projects that in 2024, across the 100 companies it tracks, 37.7 million square feet and 6 GW of planned capacity are estimated to come online globally.

"A GPU shortage is not likely to have a dramatic impact on data center construction plans in the foreseeable future," Howard told Data Center Knowledge. "The one thing that might put a dent in data center construction, with regard to compute or AI hardware, would be a dramatic and long-term supply chain nightmare. Unlikely, but like a pandemic, not impossible."

And if a significant GPU shortage does occur? "It'll be deal city," said Jon Peddie, president of Jon Peddie Research, a tech hardware research firm. "Whichever one of those companies is willing to pay a premium to get to the head of the list, then they'll get the first shipments, and those kinds of deals are offered all the time."

Survival of the Biggest?

Related: DCW Insights: Experts Discuss Data Center Sustainability, AI, Market Trends

The real problem is that this model doesn't allow for any new entrants into the market in the event of an HBM bottleneck, said Anshel Sag, principal analyst with Moor Insights and Strategy.

"Nvidia is going to get the lion's share of the HBM, but AMD and others have likely already put their orders in for a while," Sag explained. "So, if you're trying to launch something that uses HBM, and you haven't already negotiated your supply, you're probably not getting any."

It also affects smaller players like SambaNova, which makes dedicated AI processing servers using its own custom silicon and non-GPU components. Sag points out that AMD's Versal line of FPGA processors also uses HBM, and it may likewise suffer from further shortages.

Peddie says there's already a backlog of GPUs on the order books. He expects Nvidia to ship 800,000 GPUs in Q2, but it could probably sell more. "They probably can't meet the rise in demand, but they can meet 80% or more of the demand," he said. "It's just a little bit of discomfort. You know, it's like, 'Gee, I don't get dessert tonight, but I had a great dinner.'"

Sag says the only way he could see memory makers expanding capacity aggressively would be if a vendor like Nvidia asked a third party to build up capacity. "Other companies… have done stuff like that with foundries. Companies like Apple and Qualcomm have provided tools and capital to foundries to accelerate their deployment of certain technologies. So, there's a precedent for chip vendors to provide incentives to foundries to expand capacity or improve their yields," he said.

Beyond the HBM Shortage: Broader Expansion Challenges

Peddie predicts data centers currently under construction will get built out, but going forward, GPU and memory supply issues will be the least of their problems, because data center operators and developers are wrestling with other concerns. These include acquiring real estate, sufficient power and cooling, and all the other elements that go into building a data center. So, the GPU supply issue may be moot.

"The installed base will get to a certain point, it'll start to approach demand, and then customers will shrug it off because they'll say we don't need any more boards right now, we've got enough," said Peddie. "And not only do we have enough, we have no place to put new ones."