The companies have defended their existing safeguards, with YouTube owner Google saying it was surprised by Ofcom’s approach and urging the regulator to focus on higher-risk services instead.

But both regulators said the social media platforms needed to strengthen their commitment to stopping children under 13 from signing up.

Currently, many platforms rely on users to self-report their age when they sign up.

“As self-declaration is easily circumvented, this means underage children can easily access services that have not been designed for them,” the ICO said in an open letter to social media and video platforms.

Most social media platforms have a minimum age limit of 13, but Ofcom research suggests 86% of children aged 10-12 have their own social media profile.

Ofcom wants firms to use “highly effective age checks”, which are currently required by law only for certain services that provide over-18 content, such as pornography.

Implementing similar methods for young children’s social media use would require the big tech firms to bring in the most robust measures voluntarily.

The ICO’s focus is on the handling of young children’s data.

“Where services have set a minimum age – such as 13 – they generally have no lawful basis for processing the personal data of children under that age on their service,” its letter, from Chief Executive Paul Arnold, said.

Technology Secretary Liz Kendall said no platform would get a “free pass” when it came to protecting children and that Ofcom had her full support in holding the platforms to account.

“No company should need a court order to act responsibly to protect children,” she added.