Compared with the CPUs and GPUs of its time, TPU v1 delivered a 15- to 30-fold improvement in neural-network inference performance and a 30- to 80-fold improvement in energy efficiency, results that stunned the industry.
In 2017 and 2018, Google followed up with the more powerful TPU v2 and TPU v3, which handle both AI training and inference. In 2021 it launched TPU v4, built on a 7 nm process with around 22 billion transistors, delivering roughly a 10-fold performance gain over the previous generation and, by Google's account, 1.7 times the performance of NVIDIA's A100.
This is because designing a custom chip demands very deep R&D capability from a company and comes at an extremely high cost.