If Google’s deal with Apple and its recent Gmail integration are any indication, the search giant is pulling ahead of the AI pack. It’s planning an unprecedented buildout of accelerated-compute data centers to run and train AI models, and it has reoriented itself around the generative AI boom kicked off by OpenAI’s ChatGPT. London-based DeepMind, a research lab that for years operated without pressure to contribute to Google’s bottom line, is now central to transforming the company’s core products.

But the AI boom is reshaping many of the dynamics that produced past research breakthroughs. The massive wave of investment in data centers is no longer just about inventing new technology and generating new breakthroughs, but about running the AI we have for ever-growing numbers of users. “We’re now also in the era of full commercialization of these systems, so you’ve also got to balance serving with training,” Hassabis said.

The global buildout of AI data centers means more entities are now competing for a limited supply of resources, including semiconductors, memory, and energy. “Right now, it’s memory chips, but it’ll probably be something else tomorrow,” he said.

Adoption in areas that create real-world economic impact is slow and uneven. Companies have changed drastically since the ChatGPT moment, but it’s hard to get a read on any measurable return on investment.

Another source of friction is that companies are now opting to keep private research they might otherwise have shared publicly, slowing the cross-pollination of ideas that once helped AI go from a research backwater to a hotbed of activity. “There’s so much commercial pressure, so some of those things can’t be shared quite as openly anymore, which is a shame on the one hand, but it’s understandable,” Hassabis said.