Legislators and tech experts are hopeful the back-to-back jury verdicts could reshape how tech giants are held accountable for harmful social media use by youth
SACRAMENTO, Calif. — Back-to-back court rulings against Meta and Google are offering California legislators hope as they continue to push for tighter regulations on social media companies, and as officials weigh how to protect users while preserving technological innovation.
The decisions are being described as a potential turning point for the tech industry, with some comparing the moment to “Big Tobacco” as social media giants face increasing accountability for harm linked to their platforms.
“These platforms have a duty of care,” Democratic Assemblymember Josh Lowenthal of Long Beach said. “They need to live up to the standards in order for our products to be out in the marketplace safely for our kids.”
California leaders say the rulings could significantly shift how Silicon Valley companies are held responsible.
“I’m happy to see it, and I think it establishes a trend,” Democratic state Sen. Steve Padilla of San Diego said. “And I think it would apply very much nationwide.”
In one case, a jury in New Mexico found Meta — the parent company of Instagram and Facebook — liable for failing to protect young users from online child predators and for misleading consumers about platform safety. The company was ordered to pay $375 million for violating state law.
Just a day later in California, a jury found Meta and Google — the parent company of YouTube — responsible for the mental health struggles of a 20-year-old woman who had used social media since age 6. She was awarded $6 million.
“So for them, it’s in their best interest to make these platforms not only appealing, but desirable or some would even use the word addictive to bring more viewership and ad revenue,” Matthew Smith, a tech expert from the company Grapevine MSP, said. “From a business model it makes sense. But when you’re looking at it from a morality perspective?”
Both companies have said they will appeal the rulings.
“They’re not leaning into their moment,” Lowenthal said in response to the continued pushback. “They need to have an inflection period.”
California, which is home to more than 60% of the world’s top artificial intelligence companies, has been at the forefront of efforts to increase online safety. The state has recently clashed with the federal government over authority to regulate AI.
“When cars came out, it was really essential that we had stoplights, lanes. Without them, nobody would drive, because it would be chaos and totally unsafe,” Democratic Sen. Christopher Cabaldon said of the need for basic online regulations.
Cabaldon added, “The algorithms, the operations, what, the things, the services feeding you to addict you or polarize you, that is subject to law.”
One proposal gaining traction — including support from Gov. Gavin Newsom — would set a minimum age of 16 to create a social media account, modeled after a ban in Australia.
“What’s healthy, based on brain development at a given age. What’s appropriate, just like the way we look at driving or drinking,” Lowenthal, author of the bipartisan bill AB1790, said.
Lowenthal is also backing a bill — AB1700 — that would create an e-safety commission, a central watchdog entity for social media that would help with age verification.
“We need help. We have a generation that’s never been more anxious, less free,” Newsom said when asked about the “age-gating efforts” during an unrelated press conference in February. “It’s long overdue that we’re having the debate we’re now having in the Legislature. And I’m very grateful the Legislature is taking this very seriously.”
Newsom commended the rulings out of New Mexico and California, writing on the social media platform X, “Big Tech is finally answering for the harm it has caused our children — after years of fighting against common sense regulations, today’s verdict shows that they can’t escape accountability. California isn’t backing down. We’ve enacted the nation’s strongest protections, and we’ll keep holding companies accountable if they put profit over the lives and well-being of children.”
California Attorney General Rob Bonta also applauded the rulings, writing on X, “Juries in Los Angeles and New Mexico have found Meta responsible for what we at CA DOJ know to be true: Meta is prioritizing profits over the safety of children and violating consumer protection laws… CA DOJ looks forward to holding Meta accountable in our own upcoming August trial in the Bay Area.”
That ongoing lawsuit is part of a multistate effort alleging Meta intentionally designed addictive social media features harmful for young users.
This state-led lawsuit is separate from the more than 1,000 suits filed by a large group of individual plaintiffs in California state court over harmful platform features. One of those cases produced the California ruling, K.G.M. v. Meta et al.
Still, experts warn that age verification systems could require users to share more personal information and may be difficult to enforce.
“Where does the responsibility and accountability lie? Is it with the parents or with the platform?” Smith said, noting the complexities of the debate.
“Most of these systems get put in place, and day 1, people are finding ways around them. Especially kids, they’re crafty,” Smith warned, emphasizing that the implementation of such well-intentioned bills could make or break their success.
Both bills will be discussed in the Assembly Privacy and Consumer Protection Committee, as the debate over how to regulate social media continues.