Turning to the company's HBM portfolio ... capacity in an eight-layer design for up to 1.2TB/s of bandwidth per cube. The H200 will feature six modules for 141GB of HBM3e memory.
Paired with GPUs designed for AI training and other high-performance workloads, high-bandwidth memory (HBM) stacks DRAM (dynamic RAM) dies in a 3D architecture. In time, high ...