The Online Safety Act’s central aim is to make the internet safer for people in the UK, especially children.
It is a set of laws and duties that online platforms must follow, implemented and enforced by Ofcom, the UK's communications regulator.
From 25 July, under its Children’s Codes, platforms must prevent young people from encountering harmful content relating to suicide, self-harm, eating disorders and pornography.
This will see some services, notably porn sites, start checking the age of UK users.
Ofcom’s rules are also designed to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.
Firms which wish to continue operating in the UK must adopt a range of required safety measures.
Failure to comply could result in businesses being fined up to £18m or 10% of their global revenue, whichever is greater, or in their executives being jailed.
In very serious cases Ofcom says it can apply for a court order to prevent the site or app from being available in the UK.