{"id":303377,"date":"2025-12-07T14:40:19","date_gmt":"2025-12-07T14:40:19","guid":{"rendered":"https:\/\/www.newsbeep.com\/uk\/303377\/"},"modified":"2025-12-07T14:40:19","modified_gmt":"2025-12-07T14:40:19","slug":"google-and-anthropic-approach-llms-differently","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/uk\/303377\/","title":{"rendered":"Google and Anthropic approach LLMs differently"},"content":{"rendered":"<p>On Monday, OpenAI CEO Sam Altman <a href=\"https:\/\/www.wsj.com\/tech\/ai\/openais-altman-declares-code-red-to-improve-chatgpt-as-google-threatens-ai-lead-7faf5ea6?gaa_at=eafs&amp;gaa_n=AWEtsqeLZRp8vALG0nET4Og2R3uc1FhZQhI9E0iljsV45UYtVVmm5hkbAHohZWLptmY%3D&amp;gaa_ts=692f4461&amp;gaa_sig=R1aPKJzXRZy0WBQ7QOBq4Hi0gqzP5ccguTkSCfDmm_glG6APgzHr2YFaA2oCHN-BF3qsAZk_k3T3Z-sa7SYypQ%3D%3D\" rel=\"nofollow noopener\" target=\"_blank\">declared a \u201ccode red\u201d<\/a> in the face of rising competition.<\/p>\n<p>The biggest threat was Google; monthly active users for Google\u2019s Gemini chatbot grew from <a href=\"https:\/\/techcrunch.com\/2025\/07\/23\/googles-ai-overviews-have-2b-monthly-users-ai-mode-100m-in-the-us-and-india\/\" rel=\"nofollow noopener\" target=\"_blank\">450 million in July<\/a> to <a href=\"https:\/\/blog.google\/products\/gemini\/gemini-3\/\" rel=\"nofollow noopener\" target=\"_blank\">650 million in November<\/a> (ChatGPT <a href=\"https:\/\/techcrunch.com\/2025\/10\/06\/sam-altman-says-chatgpt-has-hit-800m-weekly-active-users\/\" rel=\"nofollow noopener\" target=\"_blank\">had 800 million weekly active users<\/a> in October). 
Meanwhile, the Wall Street Journal <a href=\"https:\/\/www.wsj.com\/tech\/ai\/openais-altman-declares-code-red-to-improve-chatgpt-as-google-threatens-ai-lead-7faf5ea6?gaa_at=eafs&amp;gaa_n=AWEtsqeLZRp8vALG0nET4Og2R3uc1FhZQhI9E0iljsV45UYtVVmm5hkbAHohZWLptmY%3D&amp;gaa_ts=692f4461&amp;gaa_sig=R1aPKJzXRZy0WBQ7QOBq4Hi0gqzP5ccguTkSCfDmm_glG6APgzHr2YFaA2oCHN-BF3qsAZk_k3T3Z-sa7SYypQ%3D%3D\" rel=\"nofollow noopener\" target=\"_blank\">reports<\/a>, \u201cOpenAI is also facing pressure from Anthropic, which is becoming popular among business customers.\u201d<\/p>\n<p>Google ratcheted up the pressure on OpenAI two weeks ago with the <a href=\"https:\/\/blog.google\/products\/gemini\/gemini-3\/\" rel=\"nofollow noopener\" target=\"_blank\">release of Gemini 3 models<\/a>, which set new records on a number of benchmarks. The next week, Anthropic released Claude Opus 4.5, which achieved even higher scores on some of the same benchmarks.<\/p>\n<p>Over the last two weeks, I\u2019ve been trying to figure out the best way to cover these new releases. I used to subject each new model to a battery of bespoke benchmarks and <a href=\"https:\/\/www.understandingai.org\/p\/grok-2-and-llama-31-dont-stand-out\" rel=\"nofollow noopener\" target=\"_blank\">write about the results<\/a>. But recent models have gotten good enough to easily solve most of these problems. They do still fail on a few simple tasks (like telling time on an analog clock) but I fear those examples are increasingly unrepresentative of real-world usage.<\/p>\n<p>In the future, I hope to write more about the performance of these new Google and Anthropic models. But for now, I want to offer a more qualitative analysis of these models. 
Or rather, I want to highlight two pieces that illustrate the very different cultures at Google and Anthropic \u2014 cultures that have led them to take dramatically different approaches to model building.<\/p>\n<p><a target=\"_blank\" href=\"https:\/\/substackcdn.com\/image\/fetch\/$s_!1Xw5!,f_auto,q_auto:good,fl_progressive:steep\/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd94aefdc-dfeb-408c-833f-04b1fed9de79_5275x3517.jpeg\" data-component-name=\"Image2ToDOM\" rel=\"nofollow noopener\" class=\"image-link image2 is-viewable-img\"><img decoding=\"async\" src=\"https:\/\/www.newsbeep.com\/uk\/wp-content\/uploads\/2025\/12\/https:\/\/substack-post-media.s3.amazonaws.com\/public\/images\/d94aefdc-dfeb-408c-833f-04b1fed9de79_5275.jpeg\" width=\"1456\" height=\"971\" data-attrs=\"{&quot;src&quot;:&quot;https:\/\/substack-post-media.s3.amazonaws.com\/public\/images\/d94aefdc-dfeb-408c-833f-04b1fed9de79_5275x3517.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2748283,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image\/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https:\/\/www.understandingai.org\/i\/180728069?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd94aefdc-dfeb-408c-833f-04b1fed9de79_5275x3517.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}\" alt=\"\"   fetchpriority=\"high\" class=\"sizing-normal\"\/><\/a>Jeff Dean, a legendary engineer who has worked at Google since 1999, has led a number of AI projects inside the company. 
(Photo by THOMAS SAMSON\/AFP via Getty Images)<\/p>\n<p>Last week the newsletter Semianalysis published a <a href=\"https:\/\/newsletter.semianalysis.com\/p\/tpuv7-google-takes-a-swing-at-the?_gl=1*67txzo*_ga*ODE5MDA2NDEzLjE3NjAxMTExMDY.*_ga_FKWNM9FBZ3*czE3NjQ2MTk5OTkkbzQkZzAkdDE3NjQ2MTk5OTkkajYwJGwwJGgxOTA5NTc1NTE3\" rel=\"nofollow noopener\" target=\"_blank\">deep dive<\/a> on the success of tensor processing units (TPUs), Google\u2019s alternative to Nvidia GPUs. \u201cGemini 3 is one of the best models in the world and was trained entirely on TPUs,\u201d the Semianalysis authors wrote. Notably, Claude Opus 4.5 was also trained on TPUs.<\/p>\n<p>Google has employed TPUs for its own AI needs for a decade. But recently Google has made a serious effort to sell TPUs to other companies. The Semianalysis team argues that Google is \u201cthe newest and most threatening merchant silicon challenger to Nvidia.\u201d<\/p>\n<p>In October, Anthropic <a href=\"https:\/\/www.anthropic.com\/news\/expanding-our-use-of-google-cloud-tpus-and-services\" rel=\"nofollow noopener\" target=\"_blank\">signed a deal<\/a> to use up to one million TPUs. In addition to purchasing cloud services from Google, Semianalysis reported, \u201cAnthropic will deploy TPUs in its own facilities, positioning Google to compete directly with Nvidia.\u201d<\/p>\n<p>Recent generations of the TPU were respectable chips, but Semianalysis argues Google\u2019s real strength is the overall system architecture. Modern AI training runs require thousands of chips wired together for rapid communication. Google has designed racks and networking systems that squeeze maximum performance out of every chip.<\/p>\n<p>This is one example of a broader principle: Google is fundamentally an engineering-oriented company, and it has approached large language models as an engineering problem. 
Engineers have worked hard to train the largest possible models at the lowest possible cost.<\/p>\n<p>For example, Gemini 2.5 Flash-Lite costs 10 cents for a million input tokens. Anthropic\u2019s cheapest model, Claude Haiku 4.5, costs 10 times as much. Google was also the first company to <a href=\"https:\/\/www.understandingai.org\/p\/gemini-advanced-is-not-that-advanced\" rel=\"nofollow noopener\" target=\"_blank\">release an LLM<\/a> with a million-token context window.<\/p>\n<p>Another place Google\u2019s engineering prowess has paid off is in pretraining. Google <a href=\"https:\/\/deepmind.google\/models\/gemini\/pro\/#performance\" rel=\"nofollow noopener\" target=\"_blank\">released this chart<\/a> showing Gemini 3 crushing other models at SimpleQA, a benchmark that measures a model\u2019s ability to recall obscure facts.<\/p>\n<p><a target=\"_blank\" href=\"https:\/\/substackcdn.com\/image\/fetch\/$s_!kZgf!,f_auto,q_auto:good,fl_progressive:steep\/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0058720e-fc02-4a6c-ab5b-87675b4f7f2a_2036x1128.png\" data-component-name=\"Image2ToDOM\" rel=\"nofollow noopener\" class=\"image-link image2 is-viewable-img\"><img decoding=\"async\" src=\"https:\/\/www.newsbeep.com\/uk\/wp-content\/uploads\/2025\/12\/https:\/\/substack-post-media.s3.amazonaws.com\/public\/images\/0058720e-fc02-4a6c-ab5b-87675b4f7f2a_2036.png\" width=\"1456\" height=\"807\" 
data-attrs=\"{&quot;src&quot;:&quot;https:\/\/substack-post-media.s3.amazonaws.com\/public\/images\/0058720e-fc02-4a6c-ab5b-87675b4f7f2a_2036x1128.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:807,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}\" alt=\"\"   loading=\"lazy\" class=\"sizing-normal\"\/><\/a><\/p>\n<p>As a perceptive Reddit commenter points out, this likely reflects Google\u2019s ability to deploy computing hardware on a large scale.<\/p>\n<p>\u201cMy read is that Gemini 3 Pro\u2019s gains in SimpleQA show that it\u2019s a massive model, absolutely huge, with tons of parametric knowledge,\u201d <a href=\"https:\/\/www.reddit.com\/r\/singularity\/comments\/1p3xrky\/comment\/nq7tle5\/\" rel=\"nofollow noopener\" target=\"_blank\">wrote jakegh<\/a>. \u201cGoogle uses its own TPU hardware to not only infer but also train so they can afford to do it.\u201d<\/p>\n<p>So Gemini 3 continues the Google tradition of building solid, affordable models. Public reaction to the new model has been broadly positive; the model seems to perform as well in real-world applications as it does on benchmarks.<\/p>\n<p>The new model doesn\u2019t seem to have much personality, but this may not matter. 
Billions of people already use Google products, so Google may be able to win the AI race simply by adding a good-but-not-amazing model like Gemini 3 to products like search, Gmail, and the Google Workspace suite.<\/p>\n<p><a target=\"_blank\" href=\"https:\/\/substackcdn.com\/image\/fetch\/$s_!avbP!,f_auto,q_auto:good,fl_progressive:steep\/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F157d25f8-534a-482f-aba5-40fddeedd557_2261x1443.jpeg\" data-component-name=\"Image2ToDOM\" rel=\"nofollow noopener\" class=\"image-link image2 is-viewable-img\"><img decoding=\"async\" src=\"https:\/\/www.newsbeep.com\/uk\/wp-content\/uploads\/2025\/12\/https:\/\/substack-post-media.s3.amazonaws.com\/public\/images\/157d25f8-534a-482f-aba5-40fddeedd557_2261.jpeg\" width=\"1456\" height=\"929\" data-attrs=\"{&quot;src&quot;:&quot;https:\/\/substack-post-media.s3.amazonaws.com\/public\/images\/157d25f8-534a-482f-aba5-40fddeedd557_2261x1443.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:929,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:311433,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image\/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https:\/\/www.understandingai.org\/i\/180728069?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F157d25f8-534a-482f-aba5-40fddeedd557_2261x1443.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}\" alt=\"\"   loading=\"lazy\" class=\"sizing-normal\"\/><\/a>Philosopher Amanda Askell described her work at Anthropic in a recent 60 Minutes interview.<\/p>\n<p>Last week\u2019s release of Claude Opus 4.5 also got a positive reception, but the vibes were different.<\/p>\n","protected":false},"excerpt":{"rendered":"On Monday, OpenAI CEO Sam Altman declared a \u201ccode red\u201d in 
the face of rising competition. The biggest&hellip;\n","protected":false},"author":2,"featured_media":303378,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[554,733,4308,86,56,54,55],"class_list":{"0":"post-303377","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-technology","12":"tag-uk","13":"tag-united-kingdom","14":"tag-unitedkingdom"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/303377","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/comments?post=303377"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/303377\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media\/303378"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media?parent=303377"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/categories?post=303377"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/tags?post=303377"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}