For decades, there was a rumour that Socrates (the footballer not the philosopher) had a stint with University College Dublin football club. This was, of course, wrong; he actually played for the graduate club Pegasus.

Only this was also wrong, as he merely studied medicine in UCD and didn’t play football there. Only that was also wrong, as he actually studied at the Royal College of Surgeons in Ireland and opted against playing after seeing the state of their football team.

Only this is also wrong. In more than 30 years of following the Students in the League of Ireland, I’ve heard all of these variants.


Fog has been a factor in the relaying of information, irrespective of the originator’s intentions, for as long as humans have been able to communicate. It’s no surprise then that plenty of misleading narratives spread wildly during last week’s fuel protests.

The age of artificial intelligence (AI) has unfortunately only made it tougher for anyone to see through the gloom and work out what was and was not accurate as tensions heightened on Irish roads last week.

There were fake documents in circulation, with An Garda Síochána having to warn about bogus memos being spread. There was also the usual run of misleading or recycled videos and images to add fuel to the fire.


The Defence Forces had to clarify that they were conducting routine training for Unifil [United Nations Interim Force in Lebanon] exercises in Limerick rather than staging a takeover. Anyone who lives within a mile of an Army base would probably think a takeover was happening daily if they based their assumptions on what one X poster in Limerick observed.

Ciarán Mullooly, an MEP for Independent Ireland and a former journalist, referenced the imagery in a European Union meeting before rowing back later.

There were also videos shared on social media that were claimed to show the Irish Defence Forces, even though the uniforms bore the insignia of other countries. And Kevin Sorbo, whom you may remember playing Hercules on TV, shared a video from an anti-immigration protest last year, presenting it as footage of the recent fuel protests.

AI has allowed this sort of information to spread faster, but it has also added another element of mutation. Ask an AI chatbot, such as Grok, for a summary and it can easily draw on the false narratives being spread, presenting them as a simple factual report.

That’s not ideal and, as this column tends to say a lot, it gets worse. In the pre-AI world, even with social media, the dissemination of misinformation was bad, but the pace wasn’t impossible to match. Errors persisted, but there was a limit to their reach. Moreover, the ways in which stories mutated took time, and the stories tended to burn out before the amendments had time to take hold.

Think back to the pandemic and the sheer number of times you received a message that the government was announcing a total lockdown at midnight or dawn. At least then, if you asked five people what a total lockdown meant, you’d get six answers and quickly realise it was likely bunk.

AI has accelerated not only the speed at which rumours and false information spread, but has also weaponised the fog that comes with such tension. The same rumour can be repurposed, with pictures and video, to target multiple audiences in ways each would find plausible.

That makes confirmation bias far more likely to affect people’s views, especially as we were all unfortunately raised to believe that where there’s smoke, there’s fire. I’ve burned plenty of dinners, but the only ones that had flames were meant to.

The aim of spreading misinformation isn’t some explosive change in how we consume the media. It’s the far easier goal of eroding trust in professional media outlets and sowing distrust in institutions.

We’ve already seen what this kind of erosion and embracing of the fog can do in other democracies. It’s a brutal battle to fight because those spreading misinformation are selling simple messages and using high-tech tools to do it.

There is no way to wholly eliminate the dissemination of misleading information. That doesn’t mean we can’t work to reduce its impact. It begins with common sense.

There are two reasons the “Socrates to UCD” story lasted so long: it was fun and it was low stakes. Its veracity never really mattered. See also the tale of Andre the Giant and Samuel Beckett, which is partially true but wildly overblown.


If something doesn’t really matter, don’t waste energy on it. Leave that to the likes of me. When it comes to something that does affect you, like the fuel protests and the responses to them, then be sceptical of everything and look to see how and why something is being reported or shared.

We all know there is a lot of AI-sourced material in the world; we’ve all probably complained about some terribly dull images made with it. Remember that when it comes to anything being shared online, especially something likely to reinforce your own views.

The danger isn’t just in misinformation itself but in becoming a participant in its endless reframing and allowing it to fester.