A trial of controversial live facial recognition technology that scanned thousands of faces of passengers travelling from Dublin to Holyhead produced no matches, the UK Home Office has said.
Thousands of passengers on the ferry route to the Welsh port were scanned in February as part of a UK immigration enforcement pilot scheme.
Over three days, passengers were checked against a watch list of 6,535 suspected immigration offenders, with no matches detected. The outcome raises fresh questions about the effectiveness and proportionality of the technology, which Irish and UK authorities are planning to deploy more widely.
Despite no passengers matching the watch list, two people were arrested during the operation. The Home Office did not say how the arrests were linked to the facial recognition system.
A separate trial of the powerful surveillance technology was carried out by police forces in England and Wales over six days in November. Across the two trials, more than 10,000 faces were scanned against watch lists that grew from 1,942 to 6,535 individuals. The initiative has resulted in just two alerts overall, with none arising in the most recent leg.
The operation is a “proof-of-concept pilot” by the Home Office’s immigration enforcement division, which plans to use the technology to locate people within their “Population of Interest”.
The Home Office said the six-day pilot in November cost £50,000.
A Home Office spokesman said they would not comment on operational matters, but that live facial recognition technology was an essential part of “safeguarding the integrity of the UK’s immigration system”.
Olga Cronin, a senior policy officer at the Irish Council for Civil Liberties, questioned the proportionality of the programme. “How can it be necessary and proportionate to subject more than 10,000 individuals with no connection to wrongdoing to indiscriminate biometric facial data processing?” she asked.
“Processing a person’s biometric data, in any context, constitutes a serious interference with people’s privacy and data protection rights.”
Cronin said that, as the UK expands the technology, the “trajectory is stark” and “should serve as a warning to us here in Ireland”.
Attempts to grant gardaí such powers since 2022 have faced opposition from TDs and civil liberties groups.
Last year, the Government’s expert group on artificial intelligence warned that plans for facial recognition technology risk “gradual mission creep towards an untargeted mass surveillance state”.
An Garda Síochána will soon be granted powers to use live facial recognition, but the Coalition has said it will only be used in emergency situations such as terrorism incidents or missing persons cases.
The UK pilot goes much further, using the technology to detect immigration offences.
Sinéad Gibney, Social Democrats TD for Dublin-Rathdown, has previously raised concerns in the Dáil about the technology, which she said was “deeply flawed, and prone to errors”.
She said there was an “increasing push for the public to accept that every aspect of their lives is subject to surveillance”.
Elizabeth Farries, of University College Dublin’s Centre for Digital Policy, said the policing benefit appeared “minimal” and that the results demonstrated facial recognition technology “does not work reliably in real-world conditions”. However, she said, “the problem isn’t accuracy, it’s bias”.
In the UK, a man is suing the police after being arrested for burglary in a city he had never visited. Facial recognition misidentified him, suggesting he was another person of south Asian heritage.
Separately, the Metropolitan Police is being taken to court by Shaun Thompson, a black community worker, who was wrongly identified as a criminal suspect.