A councillor in Newcastle aired fears this week over efforts to get local authority staff to use artificial intelligence more, after reports of it making major errors.
Newcastle Lib Dem councillor Gareth Kane(Image: Newcastle Chronicle)
Plans for North East council workers to make more use of artificial intelligence (AI) in order to slash public spending have sparked worries over “terrifying” errors. Budget plans approved by Newcastle City Council on Wednesday night include getting staff to “work smarter” by using AI, as the local authority bids to streamline its services and close a projected £37 million deficit over the next three years.
But the proposals, under which 75 jobs at the Civic Centre are also due to be lost, have sparked concerns over public services becoming overly reliant on the technology. Lib Dem councillor Gareth Kane urged this week that human beings must “remain part of the process”, especially when dealing with sensitive matters.
He highlighted recent reports of AI making significant factual errors. The Guardian reported last month that research across 17 English and Scottish councils had found AI tools made potentially harmful errors in social work records, including false indications of suicidal thoughts when summarising meetings.
A report from Parliament’s Home Affairs Committee has also criticised the use of AI in the decision to ban Maccabi Tel Aviv fans from a Europa League fixture against Aston Villa in November 2025. That review found that West Midlands Police (WMP) relied on “inaccurate information” and “failed to do even basic due diligence” on its intelligence.
Microsoft’s Copilot AI was blamed for hallucinating a fictitious match between West Ham and Maccabi Tel Aviv, which was used to back up the decision to block the Israeli club’s fans from the fixture at Villa Park.
Coun Kane told a full council meeting on Wednesday night: “I am very worried about reliance on AI. There are environmental impacts, but also people assume that AI tells you the truth – it doesn’t.”
The Ouseburn councillor added: “We say we train it, but it trains itself. It decides how it makes decisions and there is no traceability through it. Reports in the press that commercial AI have been hallucinating suicidal ideation in young people when social workers are interviewing young people is terrifying. Fortunately in those cases there was enough human intervention to make sure the wrong decisions were not made. It could end in tragedy.”
At a meeting of the council’s corporate scrutiny committee on Tuesday, Labour councillor Andrew Herridge also raised concerns about the water and energy required to power AI data centres. At the planned £10 billion QTS data centre in Cambois, Northumberland, each of a proposed 10 data centre buildings is set to require eight separate electricity lines, 55 diesel-powered generators to act as an emergency power supply during outages, and cooling systems to keep their huge network of computers at the right temperature.
Jenny Nelson, Newcastle City Council’s assistant director for customer contact, ICT and digital transformation, told the scrutiny panel: “We are really clear that where we use AI, it is governed by our golden rule. We must use AI tools that have been approved within the authority, we must stay within our data protection policy, and our golden rule is that there can never be an excuse that ‘AI did it’. The human must be in the loop at all times.”
She added: “The work we are doing is about using AI to support our colleagues in the work that they do. We are totally committed to transparency in that.”