Your article on the Iran school bombing rightly challenges the reflex to blame artificial intelligence (AI got the blame for the Iran school bombing. The truth is far more worrying, 26 March). However, the deeper problem lies not in the technology but in the language now forming around it. To say that there was an “AI error” quietly removes the human subject from the sentence. Where once civilians were “dehoused” or “collateral damage”, responsibility is now displaced altogether: from people to systems.

This matters because moral accountability depends on clarity about who acts. However complex the chain of analysis and command, it remains human beings who design, authorise and execute these decisions. To obscure that fact is not a technical error but a civic one.

AI may accelerate warfare, but it is also accelerating a subtler shift: from euphemism to automation as alibi. If public language cannot name human responsibility, public scrutiny cannot hold it to account.
Anthony Lawton
Market Harborough, Leicestershire

Your article about losing control over AI agents (Number of AI chatbots ignoring human instruction increasing, study says, 27 March) was as alarming for its language as for its content. You say that AI agents “connived”, “conned”, “admitted” and “confessed”; that they “lie” and “cheat”. The term widely used to describe AI rule-breaking – scheming – is similarly anthropomorphic. Such language ascribes moral agency to large language models and in so doing obscures where responsibility actually lies.

Imagine a company had released high-speed vehicles on to the roads before fitting them with effective brakes. We would not say the vehicles “connived” to kill other road users; we would say the humans behind the company had behaved with the utmost recklessness. If out-of-control AI does ever cause harm, we will have no hope of holding the technology companies (and the governments that promote them) to account unless we properly attribute moral agency when we speak about their products.
Dr Felicity Mellor
Director, Science Communication Unit, Imperial College London
