News

HBM chips are one of the most important parts of an AI GPU, with both AMD and NVIDIA using the bleeding edge of HBM memory on their respective AI GPUs.
So, what’s fueling this surge for Micron? It’s AI infrastructure demand, which is driving sales of high-bandwidth memory products, also known as HBM.
But that only tells you that TSMC has a manufacturing monopoly on those devices in a way that Micron does not. SK Hynix holds the bulk of the DRAM and HBM business, and Samsung is ...
It also projects that, compared with 2024, unit sales of HBM will increase 15-fold by 2035.
Figure 2: Booming AI and HPC hardware is forecast to increase HBM unit sales 15-fold by 2035.
HBM brings a huge 1024-bit-wide bus with 512GB/sec on tap, plus lower power usage.
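The 512GB/sec figure above can be sanity-checked with simple bus arithmetic. A minimal sketch, assuming first-generation HBM numbers (a 1024-bit bus per stack, roughly 1 Gbit/s per pin, and four stacks as on early HBM-equipped GPUs) — the per-pin rate and stack count here are assumptions, not stated in the article:

```python
# Rough HBM bandwidth arithmetic.
# Assumed HBM1-class figures: 1024-bit bus per stack,
# ~1 Gbit/s per pin, 4 stacks on the package.
BUS_WIDTH_BITS = 1024   # data pins per stack
PIN_RATE_GBPS = 1.0     # Gbit/s per pin (assumed)
STACKS = 4              # stacks on the GPU package (assumed)

# Bandwidth per stack: bits/s across the bus, divided by 8 for bytes.
per_stack_gbs = BUS_WIDTH_BITS * PIN_RATE_GBPS / 8   # GB/s per stack
total_gbs = per_stack_gbs * STACKS                   # aggregate GB/s

print(per_stack_gbs)  # 128.0 GB/s per stack
print(total_gbs)      # 512.0 GB/s aggregate
```

Under those assumptions, four 128GB/s stacks add up to the quoted 512GB/sec.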
The US government has imposed fresh export controls on the sale of high-tech memory chips used in artificial intelligence (AI) applications to China. The rules apply to US-made high bandwidth ...
"I can't trust Samsung Electronics' high-bandwidth memory (HBM) products and engineers. We cannot trust and do business with them because senior executives change frequently."
Figure 2: Host devices like GPUs and FPGAs in AI designs have embraced HBM due to their higher bandwidth needs. Source: Micron.
The neural networks in AI applications require a significant amount of ...
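The bandwidth appetite of neural networks can be illustrated with a back-of-envelope estimate. A minimal sketch, assuming a hypothetical 7-billion-parameter model served in FP16 where every weight is read once per generated token — the model size, precision, and token rate are all illustrative assumptions, not figures from the article:

```python
# Back-of-envelope memory bandwidth estimate for LLM inference.
# Assumptions (illustrative only): all weights are streamed from
# memory once per generated token; weights stored in FP16.
params = 7e9            # model size in parameters (assumed)
bytes_per_param = 2     # FP16 = 2 bytes per weight
tokens_per_sec = 50     # target generation rate (assumed)

bytes_per_token = params * bytes_per_param           # bytes read per token
required_gbs = bytes_per_token * tokens_per_sec / 1e9  # GB/s sustained

print(required_gbs)   # 700.0 GB/s of sustained memory bandwidth
```

Even this modest scenario demands hundreds of GB/s of sustained bandwidth, which is why GPUs and FPGAs in AI designs have gravitated toward HBM.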