#hbm3e-memory

#maia-200
from The Register
6 days ago
Artificial intelligence

Microsoft looks to drive down AI infra costs with Maia 200

Microsoft unveiled the Maia 200 inference accelerator: 144 billion transistors, 10 petaFLOPS of FP4 compute, 216GB of HBM3e (7TB/s), and 750W power consumption.
from ComputerWeekly.com
6 days ago
Artificial intelligence

Microsoft introduces AI accelerator for US Azure customers

Azure US Central is the first region to receive the Maia 200 inference accelerator, offering FP8/FP4 tensor cores, HBM3e memory, and improved cost efficiency and performance.
from The Register
2 months ago
Artificial intelligence

TPU v7, Google's answer to Nvidia's Blackwell, is nearly here

Google's TPU v7 Ironwood narrows the performance gap with Nvidia Blackwell, delivering competitive FP8 throughput, high-bandwidth HBM3e memory, and scalable TPU pods.