3 things to know about Ironwood, our latest TPU
Recorded: Nov. 26, 2025, 1:03 a.m.
Original
Nov 25, 2025

Our seventh-gen Tensor Processing Unit is here! Learn what makes Ironwood our most powerful and energy-efficient custom silicon to date.

Ari Marini, Keyword Contributor

Today's most advanced AI models, like those powering complex thinking and calculations, need speed and efficiency from the hardware that powers them. That's why at Cloud Next in April, we unveiled Ironwood, our seventh-generation Tensor Processing Unit (TPU). Ironwood is our most powerful, capable, and energy-efficient TPU yet, designed to power thinking, inferential AI models at scale. By acting as a hugely efficient parallel processor, Ironwood excels at managing massive calculations and minimizes the internal time required for data to shuttle across the chip, making models run significantly faster and smoother across our cloud. And now, Ironwood is here for Cloud customers. Here are three things to know about it.

1. It's purpose-built for the age of inference

As the industry's focus shifts from training frontier models to powering useful, responsive interactions with them, Ironwood provides the essential hardware. It's custom built for high-volume, low-latency AI inference and model serving. It offers more than 4X better performance per chip for both training and inference workloads compared to our last generation, making Ironwood our most powerful and energy-efficient custom silicon to date.

2. It's a giant network of power

TPUs are a key component of AI Hypercomputer, our integrated supercomputing system designed to boost system-level performance and efficiency across compute, networking, storage and software. At its core, the system groups individual TPUs into interconnected units called pods. With Ironwood, we can scale up to 9,216 chips in a superpod. These chips are linked via a breakthrough Inter-Chip Interconnect (ICI) network operating at 9.6 Tb/s.

Part of an Ironwood superpod, directly connecting 9,216 Ironwood TPUs in a single domain.

This massive connectivity allows thousands of chips to rapidly communicate and access a staggering 1.77 petabytes of shared High Bandwidth Memory (HBM), overcoming data bottlenecks for even the most demanding models. This efficiency significantly reduces the compute-hours and energy required for training and running cutting-edge AI services.
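To make the scale-out idea concrete, here is a minimal, hypothetical sketch in JAX (the public library commonly used to program Cloud TPUs) of the pattern a pod or superpod enables: a toy model's weights are sharded across a mesh of devices so that each chip keeps only a slice in its local HBM, and a single compiled call runs one low-latency inference step while the compiler inserts the cross-chip communication. The model sizes, the `serve` function, and the `"model"` axis name are invented for illustration; this is not Google's production serving stack. For a sense of scale, 1.77 petabytes of HBM spread over 9,216 chips works out to roughly 192 GB per chip.

```python
import numpy as np
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# Arrange whatever devices are available (TPU chips on a Cloud TPU VM, or a
# single CPU device when testing locally) into a 1-D mesh with one axis.
mesh = Mesh(np.array(jax.devices()), axis_names=("model",))

# Hypothetical toy model: a two-layer MLP. Sizes are made up for illustration.
d_in, d_hidden, d_out = 1024, 8192, 1024
k1, k2 = jax.random.split(jax.random.PRNGKey(0))
w1 = jax.random.normal(k1, (d_in, d_hidden), dtype=jnp.bfloat16)
w2 = jax.random.normal(k2, (d_hidden, d_out), dtype=jnp.bfloat16)

# Shard each weight along its large dimension, so every device holds only a
# slice in its own HBM; the interconnect moves data between slices as needed.
w1 = jax.device_put(w1, NamedSharding(mesh, P(None, "model")))
w2 = jax.device_put(w2, NamedSharding(mesh, P("model", None)))

@jax.jit
def serve(x, w1, w2):
    # One inference step. The XLA compiler partitions the matmuls across the
    # mesh and inserts the cross-device collectives automatically.
    h = jax.nn.relu(x @ w1)
    return h @ w2

batch = jnp.ones((16, d_in), dtype=jnp.bfloat16)  # a small request batch
out = serve(batch, w1, w2)
print(out.shape, out.sharding)
```

On an actual TPU slice, `jax.devices()` returns the local chips and the same code spreads the weights over their combined HBM; on a laptop it simply runs on the one available device, which makes the sketch easy to try.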
3. It's designed for AI with AI

Ironwood is the result of a continuous loop at Google where researchers influence hardware design, and hardware accelerates research. While competitors rely on external vendors, when Google DeepMind needs a specific architectural advancement for a model like Gemini, they collaborate directly with their TPU engineer counterparts. As a result, our models are trained on the newest TPU generations, often seeing significant speedups over previous hardware.

Our researchers even use AI to design the next chip generation: a method called AlphaChip has used reinforcement learning to generate superior layouts for the last three TPU generations, including Ironwood.
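AlphaChip itself trains a reinforcement-learning policy on real chip netlists with a much richer cost model, so the sketch below is only a toy stand-in for the underlying idea: floorplanning can be posed as placing blocks on a canvas to minimize an estimated wirelength, and the negative of that estimate is the kind of reward such a policy learns to maximize. The grid size, block count, and netlist here are entirely made up, and plain local search stands in for the learned policy.

```python
import random

GRID = 8                  # toy 8x8 placement grid (hypothetical)
BLOCKS = list(range(12))  # 12 hypothetical macro blocks
# Hypothetical netlist: pairs of blocks that must be wired together.
NETS = [(0, 1), (1, 2), (2, 3), (0, 4), (4, 5), (5, 6), (6, 7),
        (3, 8), (8, 9), (9, 10), (10, 11), (7, 11)]

def wirelength(placement):
    # Sum of Manhattan distances between connected blocks, a common proxy cost.
    return sum(abs(placement[a][0] - placement[b][0]) +
               abs(placement[a][1] - placement[b][1]) for a, b in NETS)

def random_placement(rng):
    # Drop each block onto a distinct grid cell.
    cells = rng.sample([(x, y) for x in range(GRID) for y in range(GRID)],
                       len(BLOCKS))
    return dict(zip(BLOCKS, cells))

def local_search(steps=5000, seed=0):
    rng = random.Random(seed)
    placement = random_placement(rng)
    best = wirelength(placement)
    for _ in range(steps):
        a, b = rng.sample(BLOCKS, 2)  # propose swapping two blocks
        placement[a], placement[b] = placement[b], placement[a]
        cost = wirelength(placement)
        if cost <= best:
            best = cost               # keep the swap if it doesn't hurt
        else:
            placement[a], placement[b] = placement[b], placement[a]  # undo
    return placement, best

if __name__ == "__main__":
    layout, cost = local_search()
    print(f"final wirelength estimate: {cost}")
```

Replacing the local search with a policy network trained by reinforcement learning, over real netlists and a richer objective, is, at a very high level, the step a system like AlphaChip takes.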
Summarized

Ironwood, Google's seventh-generation Tensor Processing Unit (TPU), represents a significant advancement in AI hardware designed to accelerate inference workloads. The new TPU is built for high-volume, low-latency AI inference and model serving, providing a crucial boost to Google's AI capabilities. Here's a breakdown of three key aspects of Ironwood:

1. **Designed for Inference at Scale:** Ironwood is engineered for the shift in the AI landscape from training complex models to deploying and using them for real-time interactions. It is optimized for inference, boosting efficiency for deploying and running AI services. Delivering more than 4x better performance per chip for both training and inference than the prior generation, it significantly improves speed and responsiveness.

2. **Massive Interconnectivity:** Ironwood is part of Google's AI Hypercomputer, a system that groups TPUs into interconnected units called pods. A superpod of 9,216 Ironwood chips is linked by a breakthrough Inter-Chip Interconnect (ICI) network operating at 9.6 Tb/s. This immense connectivity enables thousands of chips to rapidly communicate and access 1.77 petabytes of shared High Bandwidth Memory (HBM), mitigating data bottlenecks for even the most demanding models.

3. **Research-Driven Innovation:** A key differentiator for Ironwood is the close collaboration between Google DeepMind researchers and TPU engineers, which ensures the hardware aligns precisely with the needs of cutting-edge AI models. The loop also runs in reverse: AlphaChip, a method that uses reinforcement learning to generate chip layouts, has helped design the last three TPU generations. This ongoing feedback translates to optimized performance and a continuous stream of improvements for Google's AI systems.