As part of our research for the series, we carried out two experiments, one on TikTok and another on Instagram, setting up accounts as a 13-year-old girl to see what the algorithms would serve up. We said we liked animals, Taylor Swift and netball. On a brand new account, it took TikTok just 21 minutes and 15 seconds to surface the first suicide-related content.
Instagram was awash with distressing eating disorder and mental health content. One teenage girl looks to camera and tells us she hasn’t been outside for two years; the next unlocks the deadbolts on her bedroom door, explaining she’s a paranoid sleeper … When I was 13, I had no idea you could have a fear of going outside or be a paranoid sleeper. Why is Instagram’s algorithm pushing this content to teenage accounts?
We know the answer: their business models and profits are driven by time on device. To anyone who thinks teens should just be able to scroll on past, I ask: what did you do the last time you drove past a car accident? We don’t want to look, but we do. These companies know we are biologically wired to pay attention to distressing content, and they are weaponising it back at us.
Meta’s defence tried to argue that protecting young users is a priority for them. Across our teen experiments, we tried to report pro-anorexia content sent to our 13-year-old test account but were told it did not breach their community standards for eating disorders. Take a look and decide for yourself if their standards are humanly designed.
The LA verdict came hot on the heels of a separate New Mexico trial: just one day earlier, Meta was also found guilty of concealing the risks of harm, including child sexual exploitation. Thousands of other cases are waiting in the wings. Mark Zuckerberg famously described his approach to development as “move fast and break things”. The collateral damage of that approach is all too human, and finally there is a legal avenue to challenge it.
Outside the US, the hope is that these cases may push the platforms towards safer design. For too long, as parents, we’ve been told to accept these risks as an unavoidable cost of our kids growing up in a digital world. Gen Z, and now Gen Alpha, have been handed addictive platforms and the expectation to be accountable for their own wellbeing.
We have travelled up and down the motu as part of our upcoming docuseries, gathering the lived experience of Gen Z and speaking with those at the forefront of these issues. Three things everyone can agree on: the harm is real; there is a knowledge gap between what young people are encountering online and what we as parents think they’re navigating; and more needs to be done to hold these companies accountable for designing safer services.
Big Tech will appeal, and it will drag on (Meta employs one lobbyist for every six members of Congress), but right here, right now, this feels like a monumental victory.