Google Cloud on Wednesday announced that its eighth generation of custom-built AI chips, or tensor processing units (TPUs), will be split in two. One chip, the TPU 8t, is geared toward model training, and the other, the TPU 8i, is aimed at inference. Inference is the ongoing usage of …