In 2021, we trained PaLM 540B on 6,144 TPU v4 chips. 5 years later -> more than 2 orders of magnitude growth in number of chips.
In the meantime, I’ve become even more scale-pilled. And so will you in 2026.
Today, we announced that we plan to expand our use of Google TPUs, securing approximately one million TPUs and more than a gigawatt of capacity in 2026.
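A quick sanity check on that growth factor, taking the ~one million TPU figure at face value (a rough sketch; the exact 2026 chip count is approximate):

```python
import math

# Chips used to train PaLM 540B (2021) vs. the ~1M TPUs planned for 2026.
palm_chips = 6_144
planned_chips = 1_000_000  # "approximately one million TPUs"

growth = planned_chips / palm_chips
print(f"~{growth:.0f}x growth, ~{math.log10(growth):.1f} orders of magnitude")
```

So the jump from the PaLM 540B run to the planned 2026 fleet is roughly 163x, a bit over two orders of magnitude.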