Many platforms are now required to use secure methods like facial scans (Image: PA)

The way children experience the internet has fundamentally changed. New laws under the Online Safety Act have come into force to protect under-18s from harmful online content they should never be seeing, including content relating to pornography, self-harm, suicide and eating disorders.

Gov.uk has shared information online about keeping children safe and the changes that have been made. These cover data privacy, virtual private networks (VPNs), legal adult content and protecting freedom of speech. Gov.uk says: “To protect the next generation from the devastating impact of this content, people now have to prove their age to access pornography or this other harmful material on social media and other sites.”

Platforms are required to use secure methods like facial scans, photo ID and credit card checks to verify the age of their users. This means it will be much harder for under-18s to accidentally or intentionally access harmful content.

The measures platforms have to put in place must confirm your age without collecting or storing personal data, unless absolutely necessary. For example, facial estimation tools can estimate your age from an image without saving that image or identifying who you are.

The government and the regulator, Ofcom, are clear that platforms must use safe, proportionate and secure methods, and any company that misuses personal data or doesn’t protect users could face heavy penalties.


While virtual private networks (VPNs) are legal in the UK, under the Act platforms have a clear responsibility to prevent children from using them to bypass safety protections.

Online safety laws do not ban any legal adult content. Instead, the laws protect children from viewing material that causes real harm in the offline world. Gov.uk says: “Under the Act, platforms should not arbitrarily block or remove content and instead must take a risk-based, proportionate approach to child safety duties.”

Technology Secretary Peter Kyle said: “This marks the most significant step forward in child safety since the internet was created. The reality is that most children aren’t actively seeking out harmful, dangerous, or pornographic content – unfortunately it finds them. That’s why we’ve taken decisive action.

“Age verification keeps children safe. Rather than looking for ways around it, let’s help make the internet a safer, more positive space for children – and a better experience for everyone. That’s something we should all aspire to.”

Internet Matters added: “This milestone matters because the risks children face online remain high. Our latest Internet Matters Pulse shows that 3 in 4 children aged 9-17 experience harm online, from exposure to violent content to unwanted contact from strangers. With the Codes now enforceable, Ofcom must hold platforms accountable for meeting their obligations under the law.”