Australia’s internet watchdog has accused the world’s biggest social media firms of still “turning a blind eye” to online child sex abuse material on their platforms, and said YouTube in particular had been unresponsive to its enquiries.

In a report released on Wednesday, the eSafety Commissioner said YouTube and Apple failed to track the number of user reports they received of child sex abuse material appearing on their platforms, and could not say how long they took to respond to such reports.

The federal government decided last week to include YouTube in its world-first social media ban for teenagers, following the commissioner’s advice to overturn a planned exemption for the video-sharing site, which is owned by Alphabet’s Google.

“When left to their own devices, these companies aren’t prioritising the protection of children and are seemingly turning a blind eye to crimes occurring on their services,” eSafety Commissioner Julie Inman Grant said in a statement.

“No other consumer-facing industry would be given the licence to operate by enabling such heinous crimes against children on their premises, or services.”

eSafety commissioner Julie Inman Grant says large technology companies are failing to answer basic questions about how they are handling reports of child abuse material on their platforms. (ABC News: Ian Cutmore)

Google has said previously that abuse material has no place on its platforms and that it uses a range of industry-standard techniques to identify and remove such material.

Meta — owner of Facebook, Instagram and Threads, three of the biggest platforms with more than three billion users worldwide — has said it prohibits graphic videos.

The eSafety Commissioner, an office set up to protect internet users, has mandated that Apple, Discord, Google, Meta, Microsoft, Skype, Snap and WhatsApp report on the measures they take to address child exploitation and abuse material in Australia.

The report on their responses so far found a “range of safety deficiencies on their services which increases the risk that child sexual exploitation and abuse material and activity appear on the services”.

Safety gaps included failures to detect and prevent live-streaming of the material or block links to known child abuse material, as well as inadequate reporting mechanisms.

The report said platforms were also not using hash-matching technology across all parts of their services to identify images of child sexual abuse by checking them against a database of known material.

Google has maintained its anti-abuse measures include hash-matching technology and artificial intelligence.

The Australian regulator said some providers had not made improvements to address these safety gaps on their services despite it putting them on notice in previous years.

“In the case of Apple services and Google’s YouTube, they didn’t even answer our questions about how many user reports they received about child sexual abuse on their services or details of how many trust and safety personnel Apple and Google have on-staff,” Ms Inman Grant said.

Reuters