Google: Supercomputer with more than 4,000 TPUs
Google has shared new details about its custom-designed Tensor Processing Unit v4 (TPU v4) chip, which powers the Alphabet group's supercomputers. The company promises a tenfold improvement in ML system performance scaling along with higher energy efficiency compared to the previous generation, TPU v3. Measured against other current ML domain-specific accelerators (DSAs), Google says the TPU v4 is two to three times more efficient and emits significantly less CO2.
Google uses its custom chips primarily for training artificial intelligence; by its own account, more than 90 percent of all such work at the company already runs on TPUs. In a scientific paper, Google describes how more than 4,000 of these processors can be connected into a supercomputer. To link the individual chips, the company uses optical switches it developed in-house. Full details can be found in the company's announcement.
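The article does not cover programming details, but as a rough illustration: training workloads on TPU pods are commonly expressed in JAX, Google's numerical computing library, which presents each TPU core as a device and runs a single program data-parallel across all of them. The sketch below is a minimal, generic example; the toy squared_sum function and the shard shapes are illustrative assumptions, not part of Google's paper or announcement.

```python
import jax
import jax.numpy as jnp

# Each TPU chip shows up as one or more JAX devices; on a pod slice,
# jax.device_count() reports every core in the slice.
print("devices:", jax.device_count())

# A toy computation replicated across all local devices via pmap
# (SPMD: the same program runs on every core, one input shard each).
@jax.pmap
def squared_sum(x):
    return jnp.sum(x ** 2)

n = jax.local_device_count()
# One input shard per local device; leading axis = device count.
shards = jnp.arange(n * 4, dtype=jnp.float32).reshape(n, 4)
print(squared_sum(shards))  # one partial result per device
```

On a workstation without TPUs this falls back to a single device; on a pod slice, the same code fans out across all available cores.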
(sve)