The industry’s discussions on the adoption of AI in development processes appear to have quietly but firmly shifted into a new stage.

The question is no longer whether this technology will be used, but rather about where the line will be drawn. AI tools exist, and won’t cease to exist; we need to figure out what uses are appropriate, and acceptable to consumers.

At least, that’s what I’ve been told repeatedly enough in the past few weeks that, to be honest, I’m starting to wonder if this apparent fait accompli is actually just a strategic communication line being trotted out by one of the expensive PR firms who are doing very nicely from the crusade to buff up AI’s public image.

It’s a smart line because there’s a ring of truth to it – these tools are real, they’re useful in some situations, and the technology won’t get “un-invented” now that it exists.

It relies, however, on sweeping a crucial distinction under the rug. “AI” is a catch-all term that’s here being used to encompass just about any computer system based on a learning process: not just the wide range of use cases for large language models (LLMs), agents, and other Transformer-type generative tools, but also all manner of far more established and proven use cases for other, older machine learning algorithms.

So yes, it’s a fairly clever rhetorical trick. Smart upscaling algorithms for images? AI. Voice recognition? AI. Code autocompletion in your integrated development environment (IDE)? AI. Spellcheckers, voice assistants, smart lasso tools in image editors? AI, AI, AI.

“AI tools won’t cease to exist; we need to figure out what uses are appropriate, and acceptable to consumers”

The argument goes: how can you be “against” AI? What kind of luddite would you have to be to deny developers the ability to use all those tools?

I’ve even heard people – who absolutely do know better – try out the outlandish claim that the games industry has always embraced AI, because enemies and NPCs in games have had “AI” since the dawn of the medium, as if the imp from Doom is to be found tossing its low-pixel fireballs somewhere up in ChatGPT’s family tree.

The problem is that while this rhetoric may muddy the waters significantly in terms of internal debate or online discourse, consumers actually seem to be pretty clear about what they do and don’t like in terms of AI.

Nobody really cares if programmers turn on LLM-driven autocompletion in their IDE. Nobody is having a meltdown about your deep learning upscaling algorithm. What they care about is, to use the word of the moment, slop.

AI slop – assets, be they art or audio, churned out using model prompts, rather than being created by a human being – is very much an issue for a lot of consumers.

There are differing levels of sensitivity to it, of course. Some consumers hate it on environmental grounds, or on moral and ethical grounds, since we’ve never actually come around to any kind of resolution to the whole “these tools were created through the single largest act of brazen IP theft in the history of humanity” issue.

Others just don’t like how it looks. AI slop is often pretty recognisable, at least to some subset of consumers. On social media, that’s often masked by deliberately uploading images and videos in low resolution to hide the over-glossy AI sheen. Games don’t really allow for that kind of obfuscation; AI-generated assets are usually there to be seen in all their high-res glory.

For other consumers, of course, it’s not an issue. We reported recently on how uncontroversial AI assets have been in the mobile game market, at least thus far. That’s no doubt partially due to the different demographics of mobile game players, but it’s also contextual to the medium – people are far more used to seeing AI generated content on their phone screens than in PC and console games, after all.

Oh, and there’s also the minor point that almost all mobile games are free to play – because in the spirit of figuring out where consumers draw the line, and what they find acceptable, one takeaway from the past week or so has been that a lot of consumers really firmly draw the line at paying full-price for premium games with AI slop assets.

Call of Duty: Black Ops 7 is facing a pretty striking consumer backlash; it’s not just because of AI assets by any means, but the use of very blatant AI assets in key parts of the game has become a rallying point for consumers disappointed by this latest instalment in the series.


It’s not just that the assets themselves are bad – though to be clear, they’re awful, right down to uses of that terrible Studio Ghibli type filter which, for a few weeks last spring, was the most popular way to tell the whole internet that you have the media literacy of a drunken seagull.

More than that, however, what seems to have riled consumers is the whole attitude that the use of these assets suggests.

That in an expensive game, part of a franchise that has grossed billions, there is such a lack of regard for the players that nobody saw a problem with using AI instead of human artists for key assets, including some of the banner images players earn for achievements. Feed ’em slop; they won’t know the difference.

Sure, some people won’t know the difference, and others simply won’t care. But there’s a very significant chunk of consumers who do care, and it’s not an attitude driven just by some lefty ethical stance over AI or datacenters – it’s something far more fundamental and likely to become a problem for any industry segment hellbent on embracing this technology.

Consumers crave authenticity. They value “realness”, and they’re willing to pay for it – whether it’s in food, in clothing, in experiences, or in anything else, including media, this is an absolutely fundamental truth of how consumers think.

People pay a premium for genuine brand goods over replicas; they pour scorn on knock-offs. “Hand-made” commands a higher price than the output of the most precise and efficient machine ever will. We pay more for local crafts, for art and music that makes us feel a sense of connection to a creator, for food that feels authentic and “real”, connected to a place and a history.

It’s an instinct that suffuses all areas of consumer activity. From the ludicrous prices of a brand like Hermes to the hand-made souvenirs you overpaid for on holiday because they had a story to them; we crave authenticity. To most of us, there’s nothing more valuable you can be than “real”, and nothing worse than being “fake”.

“There’s no faster way to sink your brand value than purporting to provide authenticity and serving up something machine-made”

Of course, this doesn’t mean that there isn’t room in the market for fast food or fast fashion, where we switch off those instincts and just go for something cheap that fills a gap. (With the greatest respect to the mobile sector, developments there in the past decade have absolutely pushed it into the category of being “fast games”.)

If you want to sell premium products, though, and to convince people to pay premium prices for them, you don’t get to take such blatant short-cuts.

There’s no faster way to sink your brand value than purporting to provide authenticity or realness and instead serving up something artificial and machine-made. Consumers feel it instantly; it feels cheap and worse, disrespectful.

The invention of generative AI hasn’t changed that fundamental relationship between authenticity, realness, and consumer perceptions of premium products. Nor has it changed the most basic trade-off that everyone in any line of business must face over and over again: “cheap, fast, good; pick two”. There’s always a catch to anything that claims to offer all three, and AI is no exception.

Companies who want to adopt AI as part of their development processes – and I’m ignoring the hand-wavy rhetoric over IDE code completions and image upscalers, I mean the creation of assets with generative AI – need to sit down and seriously ask themselves what business they think they’re in.

If they can honestly answer that they provide fast, cheap experiences for consumers who are entirely uninterested in how the sausage is made; if they’d genuinely embrace comparisons with fast food and fast fashion; then sure, perhaps there’s an argument to be made for AI.

On the other hand, if you believe your company is providing premium products – and if you’re pricing them at $70, I’d suggest that you should believe that by default – then you need to appreciate the risk you’re taking.

“People pay for authenticity, and will hate you for trying to hoodwink them with slop”

Like a high-end clothing brand advertising its garments as “hand-stitched” while actually churning them off a factory production line, you’re one leak, one consumer backlash away from completely losing your brand’s premium identity and your capacity to maintain your price points.

That’s where the line is, and all the PR campaigns and smart lines of rhetoric in favour of AI won’t change it.

People pay for authenticity, and will hate you for trying to hoodwink them with slop. It’s a pattern of consumer behaviour that far predates AI, and it’s going to take more than a few billion LLM tokens and a ton of capital allocation to change such a basic part of human nature.