An inquiry into the online harm experienced by young New Zealanders is finding little industry support for an overall social media ban for under-16s – but regulation is still desperately needed, say experts.

On Monday, the education and workforce committee received a mixed bag of advice at a day-long hearing into the online harm faced by young New Zealanders, but with one unifying message: age-restricting social media likely won’t be the best way to counter it. The inquiry, called by Act MP and committee member Parmjeet Parmar, seeks to understand the scope of digital harm experienced by young people in New Zealand, and what controls the committee could recommend to prevent this. It coincides with National MP Catherine Wedd’s member’s bill to ban social media for under-16s, and another member’s bill lodged by Parmar’s caucus colleague Laura McClure, seeking to criminalise the creation of pornographic deepfakes.

Submitters included children’s rights activists, victims of online exploitation, digital safety advocates and major tech platforms Meta and TikTok. Parmar told The Spinoff the breadth of perspectives shared in the committee showed this was a “very complex issue”, and hearing from experts as to how online restrictions operate internationally as well as locally allowed the committee to carefully consider the next steps. “Instead of us being the guinea pig, it’s good to learn from others, and bring in a system that balances the concerns,” Parmar said. “We can sit there and say ‘we’ll ban this, restrict this, verify age here’ but if it’s practically not possible, it will be a waste of time.”

The Free Speech Union’s Steph Martin warned the committee that protections for children from harmful content should not come at the expense of freedoms and privacy rights for adults. Martin warned “over-broad definitions” in legislation and “vague language” would open the door to “weaponisation and asymmetrical justice … on individuals silenced and punished for expressing an alternative perspective”. She recommended the government “enable grown-up approaches” to online safety in the form of opt-in digital literacy tools for individuals and parents.

Act MP Laura McClure presented a pornographic deepfake of herself to the House in May (photo: Act Party).

Asked by committee member and National MP Vanessa Weenink whether the harm of adults not being able to express themselves freely should overrule the harm of children exploited online, Martin replied that the committee needed “to look down the road”. “It’s more about how exactly is the implementation of this going to work, and what are the impacts on adults going to be.”

Antonia Lyons, health psychologist and director of the Centre for Addiction Research, likened the detriment to health caused by social media to that of the tobacco industry. Lyons shared findings from her recent research that found young people used an average of nine social media platforms, had largely positive experiences online and were savvy as to the methods social media platforms used to keep them coming back.

But recognising the ways harmful substances – such as nicotine and alcohol – are marketed online is usually more difficult for young users, Lyons said. For example, primary school-aged children playing Roblox have the option of taking their avatar to a bar to consume as many as 10 drinks, while the screen becomes progressively fuzzier to mimic the sensation of being drunk.

“In terms of unhealthy commodity marketing, we know that it influences the age at which young people start drinking,” Lyons said. The onus shouldn’t be on young people to recognise online marketing, but for platforms to provide transparency with how their algorithms target vulnerable people, she recommended.

15-year-old Honour, alongside NextGen Leader’s director Danny Kettoola.

Speaking to the committee on behalf of NextGen Leader, 15-year-old Honour said she found it “disgusting that young women like me” were being exposed to misogynistic content online, and suggested establishing the likes of an e-safety commissioner. Asked by committee member and National MP Grant McCallum why she didn’t just leave social media herself, Honour responded that she knew “it’s not just a personal experience … it’s not just me that’s going through it”.

Privacy commissioner Michael Webster said he welcomed the inquiry, but warned that the reliance on data mining that an online age verification would require would put young people’s privacy at risk. Instead, Webster recommended “responsible self-policing” and modernising the privacy regulatory framework as more effective methods of online protection for young New Zealanders. “No one wants social media to become a place dominated by violence, hate and despair,” Webster said. “But we cannot rely on hoping that won’t happen. Hope is simply not a viable option.”

Privacy commissioner Michael Webster.

Former Netsafe CEO Martin Cocker, submitting on behalf of Online Safety Exchange, told the committee that Aotearoa should first establish an e-safety commissioner to understand which specific regulations should be implemented to curb digital harm. “We can regulate and push the big tech companies, but are we going to push them harder than Europe is?” Cocker said. “Our regulatory efforts are not going to have the same impact … Let’s spend our time thinking about what it is that we need for New Zealand.”

Mia Garlick, director of policy for Meta Australia and New Zealand, told the committee the platform’s recent introduction of “teen accounts” showed its commitment to protecting the safety of young people online. Asked by Weenink how Meta would monitor algorithmic transparency – concerning controls such as infinite scrolling, which Weenink likened to “bio-hacking the human dopamine axis” – Garlick said that describing the platform’s design as “intentionally addictive really misrepresents the intentions and work we do”. She said Meta users and monitoring agencies could already access algorithmic transparency by way of its systems cards.

On the social media ban for under-16s in Australia, due to come into force in December, Garlick said Meta was still waiting for the Australian government to provide a full scope of services affected and privacy guidance. “I do think the law in Australia was rushed through,” Garlick said, adding that more conversations were needed around the industry approach to online age assurance.

Emma Woods-Joyce of TikTok.

TikTok public policy lead Emma Woods-Joyce warned that blunt social media bans risked driving young people to online platforms that didn’t have safety restrictions as strong as TikTok’s. The “online world” provided benefits such as “cultural learnings”, Woods-Joyce said, and TikTok had the ability to remove 99% of violating content within 24 hours, and had already removed around 230,000 Aotearoa-based accounts belonging to under-13s.

Asked about monitoring agencies and academics being allowed access to TikTok’s For You Page system, Woods-Joyce said the platform already offered “some access and controls” for researchers. She said other stakeholders should step up and introduce the appropriate tools and settings to combat online harm, and if New Zealand were to introduce a social media ban for under-16s, she recommended watching Australia’s implementation of the law for “bumps in the road”.