Two experts who work in Liverpool fear an AI-powered tool in Ring doorbells could be used for something more worrying
12:39, 08 Mar 2026Updated 12:39, 08 Mar 2026

Some have raised concerns about the search party function on Ring doorbells(Image: Smith Collection/Gado/Getty Images / Stylised by Marianna Longo/Reach Plc)
Two Liverpool academics fear that a feature within Ring doorbells could be misused. Doorbells with video cameras attached have become an increasingly common part of everyday life, allowing people who have one to answer the door when they’re not in the house or check where a parcel has been left.
But these doorbells have also come to play a huge part in the investigation and reporting of crime. The ECHO has often used doorbell footage to illustrate crime stories and to showcase heartwarming acts of goodwill.
Police forces meanwhile use doorbell footage as critical evidence in high-profile cases. Last month, Ryan Walsh-Westhead was jailed for the murder of Rikki Berry in Kirkby in 2024 after he was caught on his own Ring doorbell footage discussing the killing.
Finding lost dogs – or something more worrying?
Concerns were raised last month however when Ring showcased the ‘search party’ feature in its doorbells during an advert in the Super Bowl. The tool allows people to upload pictures of their missing pet. After this, search party uses artificial intelligence (AI) to scan nearby Ring accounts for any animals that resemble the photo of the lost pet.
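Ring has not published how search party’s matching works. As an illustration only, visual-similarity systems of this kind typically convert each image into a numeric feature vector (usually with a neural network) and then compare vectors, for example with cosine similarity. The sketch below is hypothetical; the function names, threshold, and toy vectors are invented for the example and are not Ring’s implementation:

```python
from math import sqrt

def cosine_similarity(a, b):
    """Similarity between two feature vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def find_matches(query_vec, neighbour_vecs, threshold=0.9):
    """Return indices of sightings whose vectors resemble the uploaded photo's."""
    return [i for i, v in enumerate(neighbour_vecs)
            if cosine_similarity(query_vec, v) >= threshold]

# Toy vectors - a real system would embed full images in many dimensions.
lost_dog = [0.9, 0.1, 0.3]
sightings = [[0.88, 0.12, 0.31],  # very similar -> likely the same dog
             [0.1, 0.9, 0.8]]     # dissimilar -> ignored
print(find_matches(lost_dog, sightings))  # -> [0]
```

The same comparison machinery is indifferent to what the images contain, which is why the academics quoted below argue a pet-finding feature could, in principle, be repurposed.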
This brought the feature to widespread public attention. The search party tool is switched on upon activating your Ring doorbell but can be turned off.
Search party was highlighted again when investigative news site 404 Media reported on a leaked email, sent by Ring CEO Jamie Siminoff, which suggested that the company was planning to use the feature to “zero out crime in neighbourhoods.”
Another email, also obtained by 404 Media and allegedly sent by the CEO on September 4 last year, saw Siminoff discussing how “public agencies” could use the tool.
A Ring spokesperson previously told The Independent that Mr Siminoff’s leaked emails were “intended to speak broadly to the long-term potential of customer-controlled features and technologies working together to support safer communities.”
On the search party function, a Ring spokesperson told the ECHO that the tool “does not work to find people or conduct other searches, and participation is optional”.
‘An invasion of privacy’
The ECHO asked two experts in AI at Liverpool John Moores University (LJMU) about what they think about the function.
Dr Áine Mac Dermott, senior lecturer in cyber security and digital forensics at Liverpool John Moores University, says video camera doorbells have infringed on people’s privacy, despite the good they can do.
Dr Mac Dermott, 35, from Liscard, said: “I think (they have) invaded a lot of privacy. When people go missing or there’s been certain cases, I know it’s up to people if they want to share doorbell footage or CCTV.

Dr Áine Mac Dermott, senior lecturer in cyber security and digital forensics at Liverpool John Moores University (LJMU)(Image: Supplied)
“But I think being able to quickly search (using search party) could potentially be misused, or government or police might use that as a justification to start trying to search for people or use it in different ways.
“It could be used to look for lost pets. But I think it could be applied to other things. I think more and more people have Ring doorbells and footage.
“I think it is quite invasive. I can’t walk down my street without (being on them), every other house has one and a lot of the time people don’t actually know they’re being recorded.
“I know in the EU, when GDPR first came in, people were trying to find out like, is there a way to not consent (to being recorded)? How do you basically argue that you don’t want to be recorded? I do think there’s a lot of ethical concerns over that.”
In her line of work, Dr Mac Dermott has seen the positives and negatives of this technology. She said: “I know some researchers in my department, they do a lot of stuff on AI for gait analysis – so looking at people from a distance or in black and white, and then trying to identify or de-identify them.

Ring doorbell camera footage of Ryan Walsh-Westhead returning home with his uncle Connor Walsh following the murder of Rikki Berry(Image: Merseyside Police)
“At Edge Hill (University) they did some stuff on body language, so looking at CCTV to see if their body language shows that maybe they’re being coerced or something.
“Again, I can see the benefits of that, but I feel like the more we help identify people by large data sets, then the easier it is for people to potentially misuse it or track people. I could see the benefits, but I can also see the issues associated (with it).”
‘AI has become embedded in everyday devices’
Dr Wasiq Khan is a research leader in AI and data science within the School of Computer Science and Mathematics at LJMU. He has worked on analysing body language and facial de-identification. This is the process of concealing or altering personally identifiable facial features to prevent recognition through methods such as blurring or pixelation.
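Pixelation, one of the de-identification methods mentioned, works by averaging pixel values over small blocks so that fine facial detail is destroyed while the overall scene remains visible. A toy sketch, treating a grayscale image as a 2D list of 0–255 values (the image data and block size here are made up for illustration):

```python
def pixelate(image, block=2):
    """De-identify a grayscale image (2D list of 0-255 ints) by replacing
    each block x block region with that region's average value."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            ys = range(by, min(by + block, h))
            xs = range(bx, min(bx + block, w))
            vals = [image[y][x] for y in ys for x in xs]
            avg = sum(vals) // len(vals)
            for y in ys:
                for x in xs:
                    out[y][x] = avg
    return out

face = [[10, 200],
        [30, 240]]
print(pixelate(face, block=2))  # -> [[120, 120], [120, 120]]
```

Larger blocks remove more detail; the tension Dr Khan describes is that the same imaging pipeline can be tuned either to identify a face or to erase it.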
Dr Khan, 43, from Preston, said: “I started a project on a facial de-identification system. These two goals (identification and privacy preservation) are essentially reversed and can contradict each other, which reflects the broader tension within this technology. As someone who supports AI, I would still say: why not use AI to do this? It can be very effective for safety, security, and surveillance.
“But at the same time, as someone who works in ethical and responsible AI, I believe we must give equal attention to privacy and responsible use. Most people assume that AI surveillance only means facial recognition, but that is not always the case.

Ring doorbells are becoming a common feature on front doors(Image: Chip Somodevilla/Getty Images / Stylised by Marianna Longo/Reach Plc)
“For instance, in our recent research on walking manner (gait) identification, we can identify individuals simply from the way they walk, even when their face is not visible. This sort of capability raises important ethical concerns for me.
“So the positive side is that technology can help the community and stakeholders such as the police and local neighbourhoods. However, the major concern for me is ethical AI. For example, whether consumers, users, or even a person at the door know that they are being recorded, and whether we have their consent. These are such concerning things we need to consider.”
Dr Khan feels the search party function is a good example of the ethics of surveillance and AI in wider society. He said: “With its increasing use, we work from technical contributions to real-world applications across different sectors including policing, personal identification, de-identification, education system, healthcare technologies, and many more.
“We know that AI has become embedded in everyday devices such as doorbells, security cameras, and even systems within our kitchens and homes. However, we must think carefully about how these systems are designed, ensuring that they protect both safety and privacy.
“One of the key areas where I would say I stand out is responsible and ethical AI. Over the past seven years, we have been working on explainable AI models, as well as human-AI models and applications.
“Given this, my expertise lies in ethical and responsible AI. I believe that related stakeholders must ensure that such systems protect safety and privacy, and that the public is also assured of these protections.”
Another concern is deepfakes. These are pictures and videos that seem real but are actually AI-generated fakes, which people are now widely seeing on their social media feeds. The criminal use of deepfakes is a key plotline of BBC drama The Capture, which is returning for its third series today (Sunday, March 8).
Dr Mac Dermott specialises in this topic and is writing a book on it, entitled ‘The Deepfake Dilemma: Technology, Truth, and Forensic Integrity’.
She said: “There’s a lot of reports from Interpol and Homeland Security where they’ve basically started saying that they think deepfake crimes, or AI-enabled crimes, were on the rise.”
What do Ring say?
Ring say the search party function “does not work to find people or conduct other searches”(Image: Joe Raedle/Getty Images / Stylised by Marianna Longo/Reach Plc)
The ECHO approached Ring about the arguments made by the academics. In response, a Ring spokesperson said: “Ring’s products are not designed to track people, and privacy and customer control are built into how our features are designed and how they work.
“Search party was purpose-built to help reunite lost dogs with their families, does not work to find people or conduct other searches, and participation is optional.
“When a Ring camera spots a potential lost dog, it lets the camera owner know and the camera owner chooses whether to contact the pet owner – similar to calling the number on a collar. Ring does not automatically share video as part of that process.”
The spokesperson added: “No single feature is designed to ‘zero out crime,’ and tools like search party for dogs are purpose-built for specific use cases — like helping reunite lost pets — with privacy and user choice at the centre. Jamie writes these emails knowing they may be shared externally, this isn’t the first (or last) time his notes have been shared.”