Survivor parents listen as a lawyer speaks to the press outside the Los Angeles Superior Court on Wednesday. Jill Connelly/Getty Images

Gus Carlson is a U.S.-based columnist for The Globe and Mail.

As Meta’s chief executive officer Mark Zuckerberg faced a grilling in Los Angeles Superior Court this week over allegations that his company creates highly addictive social-media hooks that harm young users, there was a nagging but provocative question no one seemed willing to ask: Where were the parents when things were going wrong?

The lawsuit that landed Mr. Zuckerberg in court was filed by a 20-year-old California woman who alleges that Meta and other big technology companies deliberately engineer their platforms to target young users, particularly “tweens” under 13.

She claims that as a young user in elementary school, she became addicted to Meta’s Instagram and Google’s YouTube social-media platforms, which caused her anxiety, depression, body dysmorphia and suicidal thoughts. She seeks to hold the companies accountable.

The blockbuster trial, which could cost Meta billions of dollars and redefine social-media accountability, is seen as a bellwether for hundreds of similar lawsuits against Meta, Google and other social-media platform providers.

Meta and Google have denied the allegations that they target young users by rigging their platforms with addictive hooks. Mr. Zuckerberg told the court that Meta’s goal for Instagram is to provide long-term utility and value, not a short-term addictive fix.

As he did two years ago when he appeared before a U.S. Congressional committee looking into the issue, he pointed to efforts Meta has made to impose safety guidelines and guardrails – including age screening of users and parental controls – to protect young users from inappropriate content such as sexually explicit posts. And he has committed to doing more.

The dirty secret of many of these safety measures is this: Tracking shows that precious few parents have enabled the parental controls available on behalf of their children. No matter how advanced or effective safety measures may be, they won’t do any good unless they are turned on by the grown-ups in charge.

But in a hyper-litigious victim culture, where the term “personal responsibility” is considered by some to be an offensive social construct of past generations, it is so much easier to blame someone else for parental shortcomings – and maybe make a few bucks in settlement payouts in the process.

Sadly, many parents have been quick to delegate responsibility and accountability for their children’s online well-being to companies like Meta, and then have been outraged when problems arise that could have been avoided if they had been paying attention to what their kids were doing.

One pending case alleges that a six-year-old became addicted to Instagram. Really? What responsible parent allows their six-year-old to cruise social media unsupervised? Are they the same parents who would let their young child ride a bicycle in traffic unsupervised, then blame the bike manufacturers for negligence if their child were hurt or killed?

Are they the same parents who would leave Tide pods accessible to their infants at home, then sue Procter & Gamble if their child consumed one and got sick?

Car companies have spent billions of dollars on safety features such as anti-lock brakes, airbags and camera systems that warn drivers about everything from unsafe lane changes to following another car too closely. Yet more than two million people are injured – and more than 40,000 people die – in traffic accidents every year in the U.S.

There’s a purely commercial – if perhaps callous, in this context – argument as well that suggests it’s not worth Meta’s time to engage in targeting. Mr. Zuckerberg testified that teens make up less than 1 per cent of Instagram’s revenue. “Most teens don’t have disposable income, so they’re not valuable for advertisers,” he said.

As social media becomes more ubiquitous – and increasingly powerful with the infusion of artificial intelligence – there’s no question platform providers need to do their part to ensure the safety of users to the best of their ability.

But accountability is a two-way street. Parents need to step up and take responsibility, too. Pay attention. Take the phone away if they must. Block certain sites. Turn on the safety guardrails. Active parenting is as important as technology in the era of social media and AI – maybe more so.