News
Rising demand for AI servers is forcing memory makers to shift their production priorities, sending shockwaves through the semiconductor supply chain. As manufacturers focus more on High Bandwidth ...
Discover Micron's dominance in HBM, enabling AI infrastructure with explosive market growth. Learn why its undervalued stock ...
A new KAIST roadmap reveals HBM8-powered GPUs could consume more than 15kW per module by 2035, pushing current infrastructure ...
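As a rough sense check of why a 15kW-per-module figure strains today's facilities, the sketch below works through the arithmetic; the module count per server and the rack power limit are illustrative assumptions, not numbers from the KAIST roadmap.

```python
# Rough illustration (assumed values, not from the KAIST roadmap):
# if a single GPU/HBM module draws 15 kW, a hypothetical 8-module server
# already exceeds what many air-cooled racks are provisioned for.

MODULE_POWER_KW = 15      # projected HBM8-era GPU module (figure cited in the report)
MODULES_PER_SERVER = 8    # assumption, typical of today's 8-GPU servers
AIR_COOLED_RACK_KW = 40   # assumption, a common upper bound for air-cooled racks

server_kw = MODULE_POWER_KW * MODULES_PER_SERVER
print(f"One 8-module server: ~{server_kw} kW "
      f"({server_kw / AIR_COOLED_RACK_KW:.1f}x a {AIR_COOLED_RACK_KW} kW air-cooled rack)")
# -> ~120 kW per server, which is why liquid cooling and new power delivery keep coming up
```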
Nvidia has reportedly tapped Micron Technology as the first supplier of its next-generation small outline compression ...
Highlights the resurgence of the memory market, underpinned by surging AI workloads and a strategic shift toward HBM, even as global ...
In the meantime, JEDEC expanded the maximum package thickness for the HBM3E standard from 720µm to 775µm, which still allows for 40µm-thick chiplets. HBM standards specify a per-pin transfer rate, ...
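For context on how a per-pin rate translates into stack bandwidth, here is a minimal sketch; the per-pin rates and the 1024-bit interface width are commonly cited figures, assumed here rather than taken from the snippet above.

```python
# Illustrative HBM bandwidth arithmetic (assumed figures, not from the article):
# per-stack bandwidth (GB/s) = per-pin rate (Gb/s) * interface width (bits) / 8

def stack_bandwidth_gb_s(per_pin_gbit_s: float, width_bits: int = 1024) -> float:
    """Peak per-stack bandwidth in GB/s for a given per-pin transfer rate."""
    return per_pin_gbit_s * width_bits / 8

# HBM3 at 6.4 Gb/s per pin, and HBM3E-class parts around 9.6 Gb/s per pin
for label, rate in [("HBM3 @ 6.4 Gb/s/pin", 6.4), ("HBM3E @ ~9.6 Gb/s/pin", 9.6)]:
    print(f"{label}: ~{stack_bandwidth_gb_s(rate):.0f} GB/s per 1024-bit stack")
# -> roughly 819 GB/s and 1229 GB/s per stack, respectively
```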
It remains to be seen whether the memory business can defy this cycle somewhat, given that demand for HBM exceeds supply, but overall, NAND flash appears to be riding its usual downturn.
DDR4 memory’s production is set to end this year, with major manufacturers like Micron, Samsung, and SK Hynix discontinuing ...
SK hynix continues to dominate the DRAM business, with the South Korean memory leader hitting 'phenomenal' yield rates with its new 1c-node DRAM.
According to a new post on X by leaker "MEGAsizeGPU", NVIDIA has started shipping GeForce RTX 50 series "Blackwell" GPUs with new SK hynix GDDR7 memory chips, marking a change from using ...
Micron, one of the world's top three manufacturers of memory modules and products, announced that it was bringing DDR4 ...
SOCAMM is a compact, high-performance memory module optimized for AI servers. Unlike traditional server modules built using DDR5, SOCAMM uses LPDDR, the low-power memory typically used in mobile ...