{"id":316753,"date":"2025-11-29T16:08:05","date_gmt":"2025-11-29T16:08:05","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/316753\/"},"modified":"2025-11-29T16:08:05","modified_gmt":"2025-11-29T16:08:05","slug":"why-googles-custom-ai-chips-are-shaking-up-the-tech-industry","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/316753\/","title":{"rendered":"Why Google\u2019s custom AI chips are shaking up the tech industry"},"content":{"rendered":"<p><img decoding=\"async\" class=\"Image\" alt=\"\" width=\"1350\" height=\"900\" src=\"https:\/\/www.newsbeep.com\/au\/wp-content\/uploads\/2025\/11\/SEI_275896064.jpg\"   loading=\"eager\" fetchpriority=\"high\" data-image-context=\"Article\" data-image-id=\"2506357\" data-caption=\"Ironwood is Google\u2019s latest tensor processing unit\" data-credit=\"\"\/><\/p>\n<p class=\"ArticleImageCaption__Title\">Ironwood is Google\u2019s latest tensor processing unit<\/p>\n<\/p>\n<p>Nvidia\u2019s position as the dominant supplier of AI chips may be under threat from a specialised chip pioneered by Google, with reports suggesting companies <a href=\"https:\/\/www.bloomberg.com\/news\/articles\/2025-11-25\/alphabet-gains-on-report-that-meta-will-use-its-ai-chips\" rel=\"nofollow noopener\" target=\"_blank\">like Meta<\/a> and <a href=\"https:\/\/www.anthropic.com\/news\/expanding-our-use-of-google-cloud-tpus-and-services\" rel=\"nofollow noopener\" target=\"_blank\">Anthropic<\/a> are looking to spend billions on Google\u2019s tensor processing units.<\/p>\n<p>What is a TPU?<\/p>\n<p>The success of the artificial intelligence industry has been in large part based on <a href=\"https:\/\/www.newscientist.com\/article\/2422928-nvidias-blackwell-ai-superchip-is-the-most-powerful-yet\/\" rel=\"nofollow noopener\" target=\"_blank\">graphical processing units<\/a> (GPUs), a kind of computer chip that can perform many parallel calculations at the same time, rather than one after the other like the computer 
processing units (CPUs) that power most computers.<\/p>\n<p>GPUs were originally developed, as the name suggests, to assist with computer graphics and gaming. \u201cIf I have a lot of pixels in a space and I need to do a rotation of this to calculate a new camera view, this is an operation that can be done in parallel, for many different pixels,\u201d says <a href=\"https:\/\/scholar.google.com\/citations?user=A70PCXoAAAAJ&amp;hl=en\" rel=\"nofollow noopener\" target=\"_blank\">Francesco Conti<\/a> at the University of Bologna in Italy.<\/p>\n<p>This ability to do calculations in parallel happened to be useful for training and running AI models, which rely heavily on calculations over vast grids of numbers, known as matrix multiplications, that can be performed at the same time. \u201cGPUs are a very general architecture, but they are extremely suited to applications that show a high degree of parallelism,\u201d says Conti.<\/p>\n<p>However, because they weren\u2019t originally designed with AI in mind, there can be inefficiencies in the way GPUs map an AI model\u2019s calculations onto the chip\u2019s hardware. 
Tensor processing units (TPUs), which Google first developed in 2016, are instead designed solely around matrix multiplication, the main calculation needed for training and running large AI models, says Conti.<\/p>\n<p>This year, Google released the <a href=\"https:\/\/blog.google\/products\/google-cloud\/ironwood-google-tpu-things-to-know\/\" rel=\"nofollow noopener\" target=\"_blank\">seventh generation of its TPU, called Ironwood<\/a>, which powers many of the company\u2019s AI models, like <a href=\"https:\/\/www.newscientist.com\/article\/2505039-googles-gemini-3-model-keeps-the-ai-hype-train-going-for-now\/\" rel=\"nofollow noopener\" target=\"_blank\">Gemini<\/a> and the protein-modelling <a href=\"https:\/\/www.newscientist.com\/article\/2330866-deepminds-protein-folding-ai-cracks-biologys-biggest-problem\/\" rel=\"nofollow noopener\" target=\"_blank\">AlphaFold<\/a>.<\/p>\n<p>Are TPUs much better than GPUs for AI?<\/p>\n<p>Technologically, TPUs are more of a subset of GPUs than an entirely different chip, says <a href=\"https:\/\/www.bristol.ac.uk\/people\/person\/Simon-McIntosh-Smith-73f4a083-2673-41b2-a10d-3350e16e9780\/\" rel=\"nofollow noopener\" target=\"_blank\">Simon McIntosh-Smith<\/a> at the University of Bristol, UK. \u201cThey focus on the bits that GPUs do more specifically aimed at training and inference for AI, but actually they\u2019re in some ways more similar to GPUs than you might think.\u201d But because TPUs are designed with certain AI applications in mind, they can be much more efficient for these jobs and potentially save tens or hundreds of millions of dollars, he says.<\/p>\n<p>However, this specialisation also has its disadvantages and can make TPUs inflexible if AI models change significantly between generations, says Conti. 
\u201cIf you don\u2019t have the flexibility on your [TPU], you have to do [calculations] on the CPU of your node in the data centre, and this will slow you down immensely,\u201d says Conti.<\/p>\n<p>One advantage that Nvidia GPUs have traditionally held is that there is simple software available that can help AI designers run their code on Nvidia chips. This didn\u2019t exist in the same way for TPUs when they first came about, but the chips are now at a stage where they are more straightforward to use, says Conti. \u201cWith the TPU, you can now do the same [as GPUs],\u201d he says. \u201cNow that you have enabled that, it\u2019s clear that the availability becomes a major factor.\u201d<\/p>\n<p>Who is building TPUs?<\/p>\n<p>Although Google first launched the TPU, many of the largest AI companies (known as hyperscalers), as well as smaller start-ups, have now started developing their own specialised AI chips. Amazon, for example, uses its own Trainium chips to train its AI models.<\/p>\n<p>\u201cMost of the hyperscalers have their own internal programmes, and that\u2019s partly because GPUs got so expensive because the demand was outstripping supply, and it might be cheaper to design and build your own,\u201d says McIntosh-Smith.<\/p>\n<p>How will TPUs affect the AI industry?<\/p>\n<p>Google has been developing its TPUs for over a decade, but it has mostly been using these chips for its own AI models. What appears to be changing now is that other large companies, like Meta and Anthropic, are making sizeable purchases of computing power based on Google\u2019s TPUs. \u201cWhat we haven\u2019t heard about is big customers switching, and maybe that\u2019s what\u2019s starting to happen now,\u201d says McIntosh-Smith. \u201cThey\u2019ve matured enough and there\u2019s enough of them.\u201d<\/p>\n<p>As well as creating more choice for the large companies, it could make good financial sense for them to diversify, he says. 
\u201cIt might even be that that means you get a better deal from Nvidia in the future,\u201d he says.<\/p>\n","protected":false},"excerpt":{"rendered":"Ironwood is Google\u2019s latest tensor processing unit Nvidia\u2019s position as the dominant supplier of AI chips may be&hellip;\n","protected":false},"author":2,"featured_media":316754,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[256,254,255,64,63,105],"class_list":{"0":"post-316753","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-au","12":"tag-australia","13":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/316753","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/comments?post=316753"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/316753\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media\/316754"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media?parent=316753"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/categories?post=316753"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/tags?post=316753"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated"
:true}]}}