Google's new chips are a shot at Nvidia and a big hint at where AI goes next
Briefly

"Google's new TPU 8t is designed for training the largest frontier AI models, while its TPU 8i is built for inference, marking a significant shift in focus."
"The split signals a shift happening across the industry, as focus turns to agents and applications that run on top of the models and require more computing power."
"Google's new 8i inference chip is making a big jump in high-bandwidth memory (HBM), solving the 'memory wall' issue critical for running agents effectively."
"Google Cloud CEO Thomas Kurian described the decision to create two new chips as a 'natural evolution' in response to the growing demand for power efficiency."
Google introduced two new AI chips: the TPU 8t, aimed at training the largest frontier models, and the TPU 8i, optimized for inference. The split reflects a broader industry shift toward inference as attention turns to the agents and applications that run on top of the models. Google emphasizes power efficiency and a large jump in high-bandwidth memory to address the "memory wall" that constrains inference performance. Both chips are set to become available later this year, marking a significant advance over previous generations.
Read at www.businessinsider.com