Australia will require pornography websites to check all users’ ages and restrict access to only proven adults under a set of new internet rules registered by the eSafety commissioner.

On Tuesday, eSafety commissioner Julie Inman Grant officially registered six regulations for various internet industries, including websites, social media platforms and app stores, which are set to come into effect six months from now.

One of these regulations, known as the “designated internet services” industry code, which was drafted by the industry itself, establishes enforceable rules under the Online Safety Act for most websites and apps that aren’t considered social media.

Providers of “pornography websites, gore websites, and/or pro-suicide websites that contain sexually explicit, shocking violent and/or high impact self-harm end-user generated content, that qualifies as online pornography, high impact violence or self-harm material” must now enforce a minimum age requirement.

“The provider of the service must, where technically feasible and reasonably practicable, implement: (a) appropriate age assurance measures; and (b) access control measures, before providing access to the designated internet service or the relevant high-risk materials,” the code says.

These websites and apps are also expected to test and monitor the effectiveness of these policies over time.

The codes do not specify which age-assurance and access-control measures will be required to comply with the rule. Last week, the federal government published the report of its trial of age-assurance technologies that can be used to enforce these restrictions. These include face scans, uploading your ID, data analysis and other methods.

The other registered codes include a new requirement restricting AI chatbots from having sexual, violent or harmful discussions with children, which will similarly oblige providers to determine a user’s age.

Failure to comply with industry codes comes with a penalty of up to $49.5 million.

Inman Grant said the codes were developed with industry to deliver “stronger protections and safer online spaces for children”.

“We know this is already happening to kids from our own research, with 1 in 3 young people telling us that their first encounter with pornography was before the age of 13 and this exposure was ‘frequent, accidental, unavoidable and unwelcome’ with many describing this exposure as being disturbing and ‘in your face’,” she said in a media release.

The decision to age-check users marks the culmination of a long process laid out by the eSafety commissioner’s 2023 roadmap. In July, the UK introduced similar online restrictions, leading to a spike in VPN downloads and traffic towards websites that didn’t comply with age-checking requirements.

Australia’s new industry codes will come into effect on March 9, 2026.