Coimisiún na Meán’s review of whether the two platforms have breached the Digital Services Act arises from concerns that the mechanisms they provide to allow people to report illegal content are not easy to access and not user-friendly.

They may also have deceptive interface designs, the regulator said.

This arises from a review launched in September 2024 into whether a number of online providers – also including YouTube, X, Meta and Pinterest – were in compliance with Article 16 of the Digital Services Act (DSA).

There are concerns that TikTok's reporting mechanisms are not easy to access. Photo: Getty

Coimisiún na Meán said that a number of these platforms have made changes to their reporting mechanisms as a result, while “supervision engagement” with others is ongoing.

Launching the investigation last year, it said that concerns had been raised by an initial review, by information gathered through its contact centre, and by complaints passed on by other European regulators.

“Concerns arose in relation to potential ‘dark patterns’, or deceptive interface designs, of the illegal content reporting mechanisms, specifically that they were liable to confuse or deceive people into believing they were reporting illegal content, as opposed to content in violation of the provider’s terms and conditions,” the commission said.

“If correct, this might mean that the illegal content reporting mechanisms are not effective in preventing the dissemination of illegal content and the rights of people under the DSA might be undermined.”

John Evans, the digital services commissioner at the regulator, said that at the core of the DSA is the right of people to report content they suspect to be illegal, and the requirement on providers to have reporting mechanisms that are easy to access and user-friendly.

“Providers are also obliged not to design, organise or operate their interfaces in a way that could deceive or manipulate people, or which materially distorts or impairs people’s ability to make informed decisions,” he said.

In the cases of TikTok and LinkedIn, there was reason to suspect that their reporting mechanisms were not easy to access, did not allow people to report child sexual abuse material anonymously, and that the design of their interfaces might deter the public from reporting content as illegal.

“We have requested further information from several other providers to assess their compliance with Article 16 and Article 25 of the DSA, and we are not ruling out further regulatory action, if needed, to ensure compliance,” he added.

A LinkedIn spokesperson said: “We’re committed to keeping LinkedIn safe, trusted and professional, and have effective mechanisms for users to report content that may be illegal. We will continue to engage with regulators and adhere to the laws and regulations of the markets in which we operate.”

TikTok said it was committed to keeping its platform safe and meeting its obligations under the DSA. “We have received a notice of investigation today,” said a spokesperson. “We will review it in full and engage with Coimisiún na Meán as required.”

Mr Evans urged people across the EU who use platforms that are based in Ireland to report illegal content that they see online. If they can’t find an easy way to do this, or if they are not happy with a platform’s response, the regulator can provide advice and support.

Last month the regulator also launched an investigation into whether Elon Musk’s X has breached the DSA by not giving people the opportunity to appeal decisions made by moderators.