The scenes of unhinged violence and looting on Dublin’s O’Connell St less than two years ago are still fresh in people’s memories.

The task of identifying those involved in the rioting and criminality on November 23, 2023, has proved to be a colossal undertaking — one largely done manually.

The following February, then garda commissioner Drew Harris told the Oireachtas justice committee that digital footage from that night ran to 22,000 hours, the equivalent of 916 days.

He said the investigation had a “viewing team of eight people” who had been working round the clock since the riot.

He said that this manual processing was “unfeasible and ineffective”, adding that AI technology would speed up this process.

Mr Harris said there were people on the night of the riot who had masks over their faces, but were wearing “distinctive clothing or have distinctive characteristics”.

AI filter

He said gardaí sought digital methods to “pursue them, tag them in effect, and follow their movements until we have an image that identifies them”.

For example, if the person concerned had a red jacket, a baseball hat, or was carrying a distinctive bag, the AI software could rapidly go through all the imagery and follow the person until they took their mask off.

“We are doing that manually but, if we were able to apply software to look for specific characteristics of a person’s clothing or whatever it might be, it would speed up the process,” said Mr Harris.
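In software terms, what Mr Harris describes is attribute-based filtering: detections in each frame of footage are tagged with visual attributes, and the system keeps only the tracks that match a query. The Python sketch below is purely illustrative; the Detection structure and the attribute labels are hypothetical stand-ins for the output of a computer vision model.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    frame: int         # frame number within the footage
    track_id: int      # identifier the tracker assigns to one person
    attributes: set    # e.g. {"red_jacket", "baseball_cap"}

def matching_tracks(detections, query):
    """Return IDs of tracks whose attributes include every queried feature."""
    return {d.track_id for d in detections if query <= d.attributes}

# Example query: anyone seen with a red jacket and a baseball cap.
footage = [
    Detection(frame=10, track_id=1, attributes={"red_jacket", "baseball_cap"}),
    Detection(frame=11, track_id=1, attributes={"red_jacket", "no_mask"}),
    Detection(frame=10, track_id=2, attributes={"black_coat"}),
]
suspects = matching_tracks(footage, {"red_jacket", "baseball_cap"})
# Pull every frame on which a matching track appears, to follow the person.
frames = sorted({d.frame for d in footage if d.track_id in suspects})
print(suspects, frames)   # {1} [10, 11]
```

The point of the design is that the software narrows thousands of hours of footage to candidate frames; a human examiner still makes the identification.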

The offences could range far beyond taking part in a riot to assault, a sexual attack, a shooting, murder, or child abduction.

Also at the committee, then garda chief information officer Andrew O’Sullivan said: “What we are primarily talking about is the use of the technology to filter, cluster, or sift evidence, and to boil it down to a series of suggested cases at which the examiner would look.

“It does not make definitive identifications.”

Now, gardaí have even more digital footage at their disposal from individual garda body-worn cameras, which are completing a pilot stage and are set for national rollout.

Facial recognition

Committee members expressed concern at the processing of images of innocent people or of the wrong person being identified.

This points to the heart of concerns around, and objections to, the technology — which the Government is in the process of legislating for this year — particularly AI that involves facial recognition technology (FRT).

The Government initially plans for “retrospective” use, for crimes that have happened, with subsequent proposals for AI — including FRT — in “live” situations.

FRT is a tool used in many countries such as Britain, Canada, the US, Russia, and China.

Global police organisation Interpol uses FRT, which it says has helped identify “several thousand” individuals — including terrorists, criminals, fugitives, persons of interest, and missing persons.

FRT analyses the patterns, shapes, and proportions of facial features and contours, and tries to match those against facial images in a reference database. Interpol says good quality images are “crucial”.
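In practice, such matching is typically implemented by converting each face image into a numeric “embedding” vector and scoring a probe face against the reference database by similarity. The sketch below is a minimal illustration, assuming a hypothetical embedding model; the names, vectors, and threshold are toy values.

```python
import numpy as np

def cosine_similarity(a, b):
    """Score two face embeddings; values near 1.0 suggest the same face."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe, database, threshold=0.8):
    """Return the best database entry above the threshold, else None."""
    name, score = max(
        ((n, cosine_similarity(probe, vec)) for n, vec in database.items()),
        key=lambda pair: pair[1],
    )
    return (name, score) if score >= threshold else None

# Toy reference database; real embeddings come from a trained face model.
reference_db = {
    "person_A": np.array([0.9, 0.1, 0.3]),
    "person_B": np.array([0.2, 0.8, 0.5]),
}
probe_face = np.array([0.88, 0.15, 0.28])
print(best_match(probe_face, reference_db))   # ('person_A', ~0.998)
```

Interpol’s point about image quality maps onto the threshold: a poor image yields a noisy embedding and a score that hovers near the cut-off, which is where misidentifications tend to arise.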

Invaluable tool

Europol, the EU police agency, says FRT has become an “invaluable tool” for law enforcement agencies, swiftly identifying suspects by comparing facial data collected during investigations against criminal databases.

It said it also plays a “crucial role” in locating missing persons and children.

Europol notes concerns about “bias” in the technology, saying some studies indicated discrepancies in the system’s efficiency, particularly in identifying people from “specific ethnic backgrounds, genders, or age groups”.

FRT is a legally and technically complex area, and it is opposed by civil rights and digital rights groups and many academics both here and abroad.

At the committee, Mr O’Sullivan dismissed fears of mass surveillance: “It is not a question of a blanket identification of everybody on the street. It is only those individuals that we can identify for specific incidents, where we suspect they have committed a serious crime.”

Mr Harris said once AI technology went through all the footage and got to a point where the software could suggest clear images of a suspect, gardaí would try to identify the person “through normal policing methods”, by consulting divisional and regional colleagues, “before we would consider other steps”.

FRT requires a database of images to compare against. Mr O’Sullivan told the committee that, as it stood, “we do not have a reference database”.

Observers expect that to be dealt with under the amending legislation.

Both Mr Harris and Mr O’Sullivan pointed out that biometric technology was currently being used in processing child sex abuse imagery, by sifting and clustering large numbers of images, although not using FRT.

However, FRT could be of huge benefit in this area, both in terms of speed and in limiting the mental and emotional damage to police viewing the material.

A November 2022 study by the Australian Institute of Criminology said the “proliferation” of child sex abuse imagery video files highlighted a “growing need” for technology to analyse videos.

It said manual processing not only compounds “unmanageable workloads and burnout”, it also results in “significant psychological harms” and secondary traumatic stress disorder, as well as “intrusive thoughts” and interpersonal and marital problems.

It said face and voice AI technology can automatically group victims and/or offenders by face and/or voice matches, something that would be very difficult, “if not impossible”, to accomplish manually, and can “dramatically reduce” workloads and strain.
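The grouping the study describes can be sketched as simple clustering over face embeddings, assuming those embeddings have already been computed per file; the file names, vectors, and threshold below are invented for illustration. Items whose embeddings are close are clustered, so an examiner reviews one representative instead of every file.

```python
import numpy as np

def cluster_by_face(embeddings, threshold=0.9):
    """Greedy clustering: each item joins the first cluster whose
    representative it resembles, otherwise it starts a new cluster."""
    clusters = []   # list of (representative_vector, [item_ids])
    for item_id, vec in embeddings.items():
        for rep, members in clusters:
            sim = np.dot(vec, rep) / (np.linalg.norm(vec) * np.linalg.norm(rep))
            if sim >= threshold:
                members.append(item_id)
                break
        else:
            clusters.append((vec, [item_id]))
    return [members for _, members in clusters]

items = {
    "file_001": np.array([1.0, 0.1]),
    "file_002": np.array([0.98, 0.12]),   # same face as file_001
    "file_003": np.array([0.1, 1.0]),     # a different face
}
print(cluster_by_face(items))   # [['file_001', 'file_002'], ['file_003']]
```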

Mr Harris told the committee that seized digital devices can hold over 1m child sex abuse images, adding that viewing this material can have a “traumatic impact” on gardaí.

Live FRT

Europol said live FRT (LFR) involves real-time reading of the faces of all people passing an FRT camera, which can be compared against a closed “watch-list” of persons of interest, whether criminals, terrorists, or missing persons.
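The pattern Europol describes can be sketched as a simple loop: every detected face is scored against a closed watchlist, a match above a threshold raises an alert for human review, and non-matches are discarded rather than identified. The watchlist entries, vectors, and threshold here are hypothetical.

```python
import numpy as np

WATCHLIST = {
    "missing_person_01": np.array([0.7, 0.6, 0.2]),
    "wanted_suspect_02": np.array([0.1, 0.3, 0.9]),
}
THRESHOLD = 0.85

def check_face(face_vec):
    """Return an alert for officer review if the face matches the list."""
    for name, ref in WATCHLIST.items():
        sim = float(np.dot(face_vec, ref) /
                    (np.linalg.norm(face_vec) * np.linalg.norm(ref)))
        if sim >= THRESHOLD:
            return {"match": name, "score": round(sim, 3),
                    "action": "flag for human review"}
    return None   # non-matches are dropped, not identified or stored

for face in [np.array([0.68, 0.62, 0.19]), np.array([0.9, -0.3, 0.1])]:
    print(check_face(face))
```

Each step in that loop carries a cost: every passer-by must be scored in real time, and every alert needs an officer to adjudicate it.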

It said live FRT “poses significant challenges” from both a technical and a human perspective, including system load, human capacity, and biases.

“Police forces in the UK and in some EU countries have trialled LFR applications with varying degrees of success,” it said.

Live FRT has been in use in England and Wales for years, and Scottish police are about to start.

Last July, Scottish police said they envisaged it in four scenarios: policing the night-time economy, where “the risk of sexual offending or violence” was high; searching for vulnerable or missing people; providing security for indoor events, which could be targets for terrorism; and monitoring offenders subject to restrictions, such as registered sex offenders.

Scottish rights groups oppose the development and have called on Police Scotland to “immediately abandon” the plans.

They cite tests on the use of FRT by London’s Met and South Wales Police, which they say identified disproportionately higher inaccuracy rates when attempting to identify people of colour and women.

Last July, the London Met said it planned to more than double its use of live FRT.

A recent investigation by The Guardian and UK rights group Liberty said internal police documents showed that 4.7m faces were scanned by live FRT in 2024 in England and Wales, twice as many as in 2023. It also found there were more than 250,000 retrospective facial recognition searches in 2024, compared to almost 139,000 the year before.

The legal basis of the British system is different to what is intended in Ireland, in that its use of FRT is still not set out in law.

Proposals by our Government have stressed that AI and FRT technology would not be for “indiscriminate surveillance”, and that its use would be clearly defined in law. It says prior approval would have to be sought, adding that there would be judicial oversight.

Analysis

A detailed analysis of Government policies and legislation, EU laws, as well as legal and human rights concerns can be found in a recent publication by the Oireachtas Library and Research Service’s Karen McLaughlin.

Government plans were deferred until December 2023, when the general scheme of the Garda Síochána (Recording Devices) (Amendment) Bill 2023 was published. That led to the pre-legislative scrutiny the commissioner took part in the following February.

The schedule of offences for which AI and FRT technology could be used includes: false imprisonment; sexual offences and child abuse imagery; homicide; public order offences (assault and obstruction of a garda); arson and criminal damage with intent to endanger life; and drug supply.

The programme for government committed to FRT for serious crimes and missing persons, along with the introduction of live FRT in cases of terrorism, national security, and missing persons.

Ms McLaughlin said this was the “first time” the Government had indicated plans for live FRT.

Asked for an update, the Department of Justice told the Irish Examiner: “Drafting of the Garda Síochána (Recording Devices) (Amendment) Bill is currently ongoing. It is intended to publish this bill during the coming Dáil term.

“A second amending bill is also proposed to allow gardaí to compare an image of a person who is reasonably suspected of being involved in the commission of a serious crime against certain sources of images that An Garda Síochána is legally entitled to capture and retain.

“This bill will transpose the ‘Prum II Regulation’, and will permit the use of live facial recognition in certain limited circumstances. It is expected that the scheme of this bill will be published during 2026.”

The EU Prum II regulation came into force in March 2024, and requires participating countries to establish a national database of facial images that can be shared through the system.

Ms McLaughlin said: “Although there was no provision in the general scheme for the establishment of a national database of facial images, it is evident that Ireland will be obliged to set up such a database under Prum II.”

She said that, given developments in law and policy, the full bill is likely to be “quite different” from the general scheme. It could be some time yet before AI is part of the garda arsenal.

A technology that spotted a sex offender, but misidentified a man

Last January, David Cheneler, aged 73, picked up a six-year-old girl from school in London as a favour for her mother.

He had done so twice before after building up a relationship with them both over the course of a year.

However, unknown to the girl’s mum, Cheneler was a registered sex offender.

Under the conditions of his court order, he was prohibited from being alone with a child under the age of 14.

Cheneler happened to pass a live facial recognition (LFR) camera, which captures faces of people walking by and compares them against a database of people on a watchlist.

This includes offenders who have conditions they must adhere to.

Once a match is detected, the system generates an alert which, in this case, was reviewed by an officer, who moved to intervene and found Cheneler with the girl.

The Met’s lead on LFR said that, without the technology, Cheneler could have had the opportunity to cause further harm.

The following month, Shaun Thompson was outside the London Bridge tube station when police exited a van and told him he was a “wanted man”.

Perplexed, he asked what he was wanted for. He said the police responded: “That’s what we’re here to find out.” Mr Thompson, aged 39 and black, said the officers asked if they could take his fingerprints, but he refused.

He said they held him for up to 30 minutes, and he was only let go when he showed them a photo of his passport.

Mr Thompson had just finished a shift in Croydon with community group Street Fathers, which aims to protect young people from knife crime.

He is now bringing a legal challenge over live facial recognition technology wrongly identifying him as a suspect.

Rights group Big Brother Watch is taking the court challenge with Mr Thompson, saying it is the first time a misidentification case has come before the High Court.

Mr Thompson described the technology as “stop-and-search on steroids”, and said his experience of being stopped and searched had been “intimidating” and “aggressive”.

The London Met Police said it could not comment as proceedings were ongoing, but said its use of LFR was lawful. It said that LFR had led to 457 arrests, with seven false alerts, so far in 2025.

Facial recognition technology ‘intrusive’ and ‘dangerous’

Facial recognition technology is an “intrusive, unreliable, and dangerous” form of surveillance, according to the Irish Council for Civil Liberties.

In a statement to the Irish Examiner, the rights body said risks include the misidentification of individuals as suspects of crime.

It also said there are “inherent biases” in the data sets used to train FRT algorithms, meaning women and ethnic minorities are at a “significantly higher risk” of being wrongly identified.

It has called on Garda HQ and the Department of Justice to “urgently clarify” what facial image reference databases will be used, and to address concerns about FRT’s accuracy and potential for discrimination.

The council is concerned by the possibility of gardaí using FRT to monitor people attending a protest or public assembly. It said this would have a “chilling effect” on citizens taking part in protests.

Darragh Murray, a senior lecturer at Queen Mary University of London, penned a 2023 academic article for The Modern Law Review which examined FRT from a “human rights law perspective”.

He also highlighted this chilling effect, which would give rise to “compound human rights harms”.

He said this would affect the “rights of privacy, expression, and assembly”, but also interfere with “the societal processes by which individuals develop their identity and engage politically”.

The Artificial Intelligence Advisory Council told the Government in June 2024 that, while FRT had the potential to speed up investigations, the apprehension of offenders, and the finding of missing persons, these benefits “must be balanced” against the impact on rights.

It urged that “satisfactory independent evaluations” be conducted before deploying FRT.

It said FRT must comply with the EU AI Act, including a fundamental rights impact assessment before procurement. There must also be a complaints procedure as well as periodic independent auditing.

The Irish Human Rights and Equality Commission said it considered FRT “a serious interference with individual rights”. However, it added that it recognised a need for An Garda Síochána to “transform its digital technologies” in order to support a modern police service.

It said respect for human rights is an essential part of democracy and the rule of law, adding that “an appropriate balance must be struck between competing rights”.