News

HBM chips are among the most important parts of an AI GPU, with AMD and NVIDIA both using bleeding-edge HBM memory on their respective AI GPUs. Market research firm Yole Group ...
High bandwidth memory (HBM) chips have become a game changer in artificial intelligence (AI) applications by efficiently handling complex algorithms with high memory requirements. They became a major ...
Samsung Electronics' latest high bandwidth memory (HBM) chips have yet to pass Nvidia's tests for use in the U.S. firm's AI processors due to heat and power consumption problems, three people ...
High bandwidth memory (HBM) is basically a stack of memory chips, small components that store data. These stacks can store more information and transmit data more quickly than the older technology ...
A next-generation GPU-HBM roadmap teases HBM4, HBM5, HBM6, HBM7, and HBM8, with HBM7 arriving by 2035 in new AI GPUs that use 6.1TB of HBM7 and draw 15,000W.
Micron capitalizes on AI-driven HBM DRAM demand, gaining market share amid Samsung's setbacks.
Rambus recently announced the availability of its new High Bandwidth Memory (HBM) Gen2 PHY. Designed for systems that require low latency and high bandwidth memory, the Rambus HBM PHY, built on the ...
High-bandwidth memory (HBM) is again in the limelight. At GTC 2025, held in San Jose, California, from 17 to 21 March, SK hynix displayed its 12-high HBM3E devices for artificial intelligence (AI) ...