Samsung says it has developed the industry's first 12-stack HBM3E 12H DRAM, outpacing Micron Technology and potentially setting the stage for the next generation of Nvidia's AI cards.
The South Korean tech giant's HBM3E 12H offers bandwidth of up to 1,280GB/s and an industry-leading capacity of 36GB, a more than 50% improvement over the 8-stack HBM3 8H.
The 12-stack HBM3E 12H uses advanced thermal compression non-conductive film (TC NCF), which allows the 12-layer products to meet current HBM package requirements while maintaining the same height specification as 8-layer parts. These advances have yielded a 20% increase in vertical density compared with Samsung's HBM3 8H product.
The battle heats up
“The industry’s AI service providers are increasingly requiring HBM with higher capacity, and our new HBM3E 12H product has been designed to answer that need,” said Yongcheol Bae, Executive Vice President of Memory Product Planning at Samsung Electronics. “This new memory solution forms part of our drive toward developing core technologies for high-stack HBM and providing technological leadership for the high-capacity HBM market.”
Meanwhile, Micron has started mass production of its 24GB 8H HBM3E, which will be used in Nvidia's latest H200 Tensor Core GPUs. Micron claims its HBM3E consumes 30% less power than rival offerings, making it ideal for generative AI applications.
Despite missing out on Nvidia's most expensive AI card, Samsung's 36GB HBM3E 12H memory outperforms Micron's 24GB 8H HBM3E in terms of capacity and bandwidth. As AI applications continue to grow, Samsung's 12H HBM3E will be an obvious choice for future systems requiring more memory, such as Nvidia's B100 Blackwell AI powerhouse, which is expected to arrive by the end of this year.
Samsung has already begun sampling its 36GB HBM3E 12H with customers, and mass production is expected to start in the first half of this year. Micron is set to begin shipping its 24GB 8H HBM3E in the second quarter of 2024. Competition between the two tech giants in the HBM market is expected to intensify as demand for high-capacity memory solutions continues to surge in the AI era.