SK hynix will invest 19 trillion won in a new advanced chip packaging facility in Cheongju to boost capacity for AI-driven ...
SPHBM4 cuts pin counts dramatically while preserving hyperscale-class bandwidth performance. Organic substrates reduce packaging costs and relax routing constraints in HBM designs. Serialization shifts ...
A new technical paper titled “Making Strong Error-Correcting Codes Work Effectively for HBM in AI Inference” was published by researchers at Rensselaer Polytechnic Institute, ScaleFlux and IBM T.J.
Micron Technology has done more than fill a capacity shortfall in high-bandwidth stacked DRAM feeding GPU and XPU accelerators for AI and HPC. It has created DRAM that stacks higher and runs ...
Several NAND flash manufacturers were discussing higher ...
TL;DR: SK hynix is advancing AI memory technology with its 12-layer HBM4 and upcoming HBM4E, aiming for mass production this year. The company has shipped 12-layer HBM4 samples, boasting ...
TL;DR: SK hynix has sold out most of its HBM memory chips for 2026, with 2025 supply exhausted. The company is finalizing 2026 sales discussions and expects HBM demand to rise sharply due to AI ...
High Bandwidth Memory (HBM) is the type of DRAM commonly used in data center GPUs such as NVIDIA's H200 and AMD's MI325X. High Bandwidth Flash (HBF) is a stack of flash chips with an HBM interface. What ...