“Since 2018, Telegram has virtually eliminated the public spread of CSAM on its platform through world-class detection algorithms and cooperation with [non-governmental organisations],” it told the BBC.

The company added: “We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy.”

The investigation is part of a wider crackdown by Ofcom on services it suspects could be flouting the UK’s sweeping online safety requirements – including toughened-up rules for tech firms to tackle CSAM, which is illegal to possess or share in the UK.

“Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities,” said Suzanne Cater, director of enforcement at Ofcom.

She added that while there had been progress in tackling CSAM on smaller services, including file-hosting and sharing platforms, the issue “extends to big platforms too”.

Children’s charity the NSPCC welcomed Ofcom’s Telegram probe.

“Recent NSPCC research revealed around 100 child sexual abuse image offences are being recorded by police every day,” said Rani Govender, its associate head of policy.

“The scale of this abuse is stark and we strongly welcome Ofcom ramping up action to tackle it, including opening this investigation into Telegram.”

The probe was also welcomed by the Internet Watch Foundation (IWF), which works to identify and remove CSAM online, including on Telegram.

IWF communications director Emma Hardy said the organisation shared concerns about “bad actor networks” on the platform, and “that not enough is being done to prevent known, detected, child sexual abuse imagery from being distributed”.

She said that while the company had taken some steps, for these “to be truly effective, they need to do more”.

This, Hardy said, should see safeguards expanded across Telegram, including to chats users can protect with end-to-end encryption.