Advancing the rate of AI innovation
HBM3E built for AI and supercomputing with industry-leading process technology
Frequently asked questions
Micron’s HBM3E delivers an industry-leading pin speed of >9.2 Gb/s and supports backward-compatible data rates down to first-generation HBM2 devices.
Micron’s HBM3E delivers an industry-leading bandwidth of >1.2 TB/s per placement. HBM3E has 1024 I/O pins, and at Micron’s pin speed of >9.2 Gb/s this achieves >1.2 TB/s.
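The bandwidth figure follows directly from the pin count and pin speed quoted above. A minimal back-of-the-envelope check (using exactly 9.2 Gb/s; the “>” in both figures accounts for the remaining gap to 1.2 TB/s):

```python
# Sanity-check the per-placement bandwidth from the quoted interface numbers.
PIN_COUNT = 1024        # HBM3E I/O pins per placement
PIN_SPEED_GBPS = 9.2    # Gb/s per pin (Micron quotes > 9.2 Gb/s)

bandwidth_gbps = PIN_COUNT * PIN_SPEED_GBPS   # aggregate bit rate, Gb/s
bandwidth_tbs = bandwidth_gbps / 8 / 1000     # bits -> bytes, GB -> TB

print(f"{bandwidth_tbs:.2f} TB/s")  # ~1.18 TB/s at exactly 9.2 Gb/s
```

At pin speeds slightly above 9.2 Gb/s, the same arithmetic crosses the >1.2 TB/s mark.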
Micron’s industry-leading HBM3E provides 24GB of capacity per placement with an 8-high stack. Micron plans to announce a 36GB 12-high HBM3E device in the future.
HBM2 offers 8 independent channels running at 3.6 Gb/s per pin and delivers up to 410 GB/s of bandwidth, with capacities of 4GB, 8GB, and 16GB. HBM3E offers 16 independent channels and 32 pseudo channels. Micron’s HBM3E delivers a pin speed of >9.2 Gb/s for an industry-leading bandwidth of >1.2 TB/s per placement. Micron’s HBM3E offers 24GB with an 8-high stack, and a 36GB 12-high stack is planned for the future.
Please see our Product Brief.
Featured resources
1. Data rate testing estimates based on shmoo plot of pin speed performed in manufacturing test environment.
2. 50% more capacity for the same stack height.
3. Power and performance estimates based on simulation results of workload use cases.
4. Based on internal Micron model referencing an ACM Publication, as compared to the current shipping platform (H100).
5. Based on internal Micron model referencing Bernstein’s research report, NVIDIA (NVDA): A bottoms-up approach to sizing the ChatGPT opportunity, February 27, 2023, as compared to the current shipping platform (H100).
6. Based on system measurements using commercially available H100 platform and linear extrapolation.