FX168 Financial News (North America) reported on Monday (February 12) that throughout 2023, demand for AI GPUs, and for AI hardware in general, was seemingly endless, and the average selling price of HBM chips rose by 500%. That's good news for HBM makers Micron, Samsung, and SK hynix.
(Source: TrendForce)
HBM chips are among the most important components of AI GPUs, and companies such as AMD and NVIDIA use state-of-the-art HBM memory on their AI GPUs. Market research firm Yole Group now forecasts HBM supply to grow at a 45% CAGR from 2023 to 2028, with HBM prices expected to remain high for "some time" given the difficulty of scaling production to meet frenzied demand.
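To put that forecast in perspective, a 45% CAGR compounds into a large multiple over five years. A minimal sketch of the arithmetic (the 45% figure is Yole's forecast; the normalized base supply is a placeholder):

```python
# Compound annual growth: supply_n = supply_0 * (1 + cagr) ** n
cagr = 0.45    # Yole Group's forecast CAGR for HBM supply
base = 1.0     # normalized 2023 supply (placeholder value)

for year in range(2023, 2029):
    n = year - 2023
    print(year, round(base * (1 + cagr) ** n, 2))
# By 2028, supply is 1.45**5 ≈ 6.41, i.e. roughly 6.4x the 2023 level.
```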
Samsung and SK hynix dominate the HBM manufacturing business; the two South Korean companies hold 90 percent of the HBM market, leaving the remainder to Micron. SK hynix is also working with TSMC, so expect some interesting developments there to continue over time.
SK hynix recently revealed that its next-generation HBM4 memory will enter mass production in 2026, paving the way for the next-generation AI GPUs that follow NVIDIA's upcoming Blackwell B100. HBM doesn't stop and won't stop... As a key component of AI GPUs, AI GPU makers want (and need) not just faster HBM, but more capacity.
NVIDIA's upcoming H200 AI GPU will come with up to 141GB of HBM3e memory, while the current H100 ships with 80GB of HBM3. AMD's new Instinct MI300X AI GPU comes with 192GB of HBM3: NVIDIA has the faster HBM3e, but AMD has more VRAM (192GB vs. 141GB).
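For quick reference, the capacities cited above, side by side in a minimal sketch (all figures are the ones quoted in this article):

```python
# HBM capacity per accelerator, in GB, as cited above
vram_gb = {
    "NVIDIA H100 (HBM3)": 80,
    "NVIDIA H200 (HBM3e)": 141,
    "AMD Instinct MI300X (HBM3)": 192,
}

for gpu, gb in sorted(vram_gb.items(), key=lambda kv: kv[1]):
    print(f"{gpu}: {gb} GB")
```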
Whichever company adopts HBM4 first, it will be a huge leap forward (how could NVIDIA not use it on its AI GPUs after Blackwell?), and demand is going to go crazy.
SK hynix teased its HBM4 memory at the end of last year, saying that development would begin in 2024. Kim Wang-soo, head of the company's GSM team, explained: "With the planned mass production and sales of HBM3E next year, our market dominance will be maximized again. With development of the follow-up product HBM4 also scheduled to officially launch, SK hynix's HBM will enter a new phase next year. This will be the year we celebrate."
We should see the first HBM4 memory samples at 36GB per stack, allowing up to 288GB of HBM4 memory on a true next-generation AI GPU. NVIDIA's next-generation Blackwell B100 AI GPU will use the latest HBM3e memory, so the generation after it, codenamed Vera Rubin, should use HBM4. We should also expect some upgraded Blackwell AI GPUs to use HBM4.
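The 288GB figure follows directly from stack arithmetic; a minimal sketch (the eight-stack count is an assumption inferred from the article's 288GB and 36GB-per-stack figures):

```python
# Total HBM capacity = number of stacks * capacity per stack
gb_per_stack = 36   # first HBM4 samples, per the article
stacks = 8          # assumed: 288GB / 36GB per stack

total_gb = stacks * gb_per_stack
print(f"{total_gb} GB")  # 288 GB on a next-generation AI GPU
```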
NVIDIA refreshed its HBM3-based H100 AI GPUs with the enhanced H200, which uses the latest HBM3e memory standard, so it might be appropriate to do something similar with Blackwell, stepping from HBM3e to HBM4, or to a faster HBM3e variant, over the next few years.