It is often said, and with good reason, that the most compelling and persuasive arguments against the AI industry and all who sail in it come not from its detractors but from its most vocal advocates. Take, for example, Sam Altman, chief executive of OpenAI, who last week, during a public conversation in India, was asked to address widespread concerns about his industry's extraordinary consumption of fossil energy and water. Such concerns, he claimed, were misplaced. The water usage figures in particular were wildly overstated.
He then went on to make a broader argument about the amount of energy consumed by AI relative to that required to produce an economically productive human being.
“People talk about how much energy it takes to train an AI model,” he said, “relative to how much it costs a human to do one inference query. But it also takes a lot of energy to train a human. It takes like 20 years of life and all of the food you eat during that time before you get smart. So the fair comparison is if you ask ChatGPT a question, how much energy does it take, once its model is trained, to answer that question versus a human. And probably AI has already caught up on an energy efficiency basis, measured that way.”
In one sense, the argument is a fairly banal one, amounting to a defence of the energy usage needed to run what he and many others believe, with some justification, is a socially and economically transformative technology. Its framing, though, seems to reveal something deeper and more troubling about the way in which Altman and people like him view their fellow human beings.
I’m tempted to suggest here, to avoid just credulously feeding the current iteration of the AI hype cycle, that Altman is being deliberately provocative. That hype cycle, after all, is largely predicated on the perception that products such as OpenAI’s are on the verge of making human intellectual labour obsolete, that within a very short space of time, perhaps as little as a few years, all white-collar jobs will be effectively eliminated. This, needless to say, would be economically devastating for the vast majority of people, and, more importantly, incredibly lucrative for a very tiny minority. No one, least of all the investor class, wants to get left behind in that kind of epochal market transition.
What Altman is really worried his company is burning through in unprecedented quantities, after all, is not water or fossil fuel but raw capital. In order to keep drawing more and more money into the voracious capital-burning engine of AI, he needs to continually stoke the hype furnace.
All of that is true, but I’m not convinced it explains why he would say a thing like this. I’m not convinced that AI evangelists such as Altman really understand how strange and creepy they come across when they talk about the future they want the rest of us to get excited about. I think Altman is saying what he believes here, and in doing so he is revealing something crucial about his distinctly anti-human worldview.
Essentially, what he is saying here is that humans are much less efficient, considered in terms of energy requirements and informational outputs, than LLMs such as OpenAI’s ChatGPT.
We shouldn’t be surprised that Altman would make this sort of claim. The Silicon Valley milieu of which he and his company are such exemplary products has long been animated by the transhumanist movement, among whose central tenets is the idea that a human being is already a kind of machine. It is therefore desirable and inevitable, they believe, that we should either merge with technology or be made obsolete by it. (Among the more canonical and memorable descriptions of this worldview came from the mathematician Marvin Minsky, an early pioneer of AI. “The brain,” as he put it, “happens to be a meat machine.”)
This is a deeply impoverished understanding of human existence, in which our value is calculated using, as metrics, intelligence and productivity. And if you believe that intelligence is a matter of solving complicated mathematical problems or winning games of chess, then AI has already begun to displace humanity. But if you believe that a human being counts for more than computational power or intelligence or productivity, then the very notion of obsolescence can only ever be what philosophers call a category error, a confusion of one sort of thing (human) for another (machine).
If, as Altman argues, AI is more resource efficient than humans – with their exorbitant requirements for food, water, education, housing, liveable income and so on – then it makes sense to prioritise it economically. What would be the point in allocating resources to people when those resources are more reliably and profitably converted into returns when channelled towards AI?
And if the framing of this question seems naggingly familiar – if its logic seems to characterise more than just the incipient antihumanism of a tech CEO’s apparently off-the-cuff remark – that’s because it’s also essentially the inner logic of capitalism in its extreme and unfettered form. The reason why so much of the stock market is determined by activity around AI, and why there is no more valuable company on the planet than Nvidia, is that the automation of vast swathes of the labour market is, from the point of view of capital, an extremely desirable outcome.
I personally believe that claims – and fears – about AI’s likely destruction of the entire employment economy are overblown. Such apocalyptic narratives are necessary to the industry, though, because they feed the hype cycle. But neither am I an AI sceptic, as such. It’s pretty obvious that this technology is transformative, and that it will have profound economic and social effects – not necessarily next month, or next year, as the investment-hungry AI CEOs would have us believe, but in 10, 20 or 30 years from now.
What I am sceptical about is the idea that people like Sam Altman, who seem to think of human beings as an inefficient use of energy and resources, should wield such control over this technology and its economic dividends.