Google Rolls Out TPU 8t and 8i to Supercharge AI Training Speed

Google introduces TPU 8t and TPU 8i chips to accelerate AI training and real-time inference, boosting performance, reducing latency, and enabling scalable, energy-efficient infrastructure for next-generation agentic artificial intelligence systems worldwide.

Written By : Somatirtha
Reviewed By : Sankha Ghosh

Google has introduced two new chips in its eighth-generation Tensor Processing Unit (TPU) lineup, part of a broader effort to keep pace with surging demand for AI workload processing. The two chips, TPU 8t and TPU 8i, are designed for distinct purposes, reflecting the growing trend toward hardware specialization.

Both chips run on Google’s Axion ARM-based CPU host and use advanced liquid cooling systems. This combination improves performance while keeping energy consumption under control. The company said the new TPUs form part of its broader full-stack infrastructure, spanning networking, data centers, and energy-efficient operations.

TPU 8t: Built for Faster AI Training

Google calls the TPU 8t a ‘training powerhouse,’ as its main purpose is to speed up the training of large AI models, a process that typically demands substantial computational resources and time.

The firm asserts that the chip offers nearly three times the computational power of its predecessor, a gain it says could cut the time required to train large AI models from months to weeks.

TPU 8i: Focused on Real-Time Reasoning

TPU 8i is built for inference workloads and is described by Google as a ‘reasoning engine.’ The company positions it as a foundational component for agentic AI systems, in which multiple agents interact continuously and must respond in real time.

The TPU 8i also provides greater memory bandwidth, improving performance for latency-sensitive operations. Low latency is critical for AI systems operating at scale, where even brief delays can cascade into serious disruptions.


Specialization at the Core of AI Scaling

Although the two chips target different tasks, Google emphasized that their specialized designs are complementary. The tech giant views the pairing as a step toward fast, agent-based intelligent systems deployed at global scale.

