Three years ago, I wrote this piece about Meta CEO Mark Zuckerberg’s quest to dominate the virtual reality industry.

His desire to corner the market on VR headgear had been abundantly clear before then — after all, the name change from Facebook to Meta was a nod to the company’s intent to focus on building products for the “metaverse,” where virtual reality activities are hosted. And although Meta’s VR efforts have struggled at times over the years, the company’s push to be top dog has been obvious.

Nonetheless, it was stunning to hear Tuesday’s testimony by two Meta whistleblowers before a Senate subcommittee. The two former Meta researchers, Jason Sattizahn and Cayce Savage, allege that some of the company’s executives blocked them from thoroughly researching the impact of Meta’s VR headset on children as it pursued its goals.

NBC News reported:

“Meta is aware that children are being harmed in VR,” Savage testified at the hearing.

“I quickly became aware,” she said, “that it is not uncommon for children in VR to experience bullying, sexual assault, to be solicited for nude photographs and sexual acts by pedophiles, and to be regularly exposed to mature content like gambling and violence, and to participate in adult experiences like strip clubs and watching pornography with strangers.”

Savage, who said she left Meta in 2023, testified that she was stymied when she wanted to find out how common the problems were.

“I wish I could tell you the percentage of children in VR experiencing these harms, but Meta would not allow me to conduct this research,” she said.

Yikes. Not exactly a shining portrayal of Meta’s approach to protecting America’s children — or children anywhere, for that matter.

Sattizahn’s experience sounded equally disturbing:

“Meta’s immediate response to congressional concern was not to do the right thing but rather roll out new processes and policies to manipulate control and erase data,” he said. “We researchers were directed how to write reports to limit risk to Meta. Internal work groups were locked down, making it nearly impossible to share data and coordinate between teams to keep users safe.”

He told the senators that when his team “uncovered that underage children using Meta VR in Germany were subject to demands for sex acts, nude photos and other acts that no child should ever be exposed to, Meta demanded that we erase any evidence of such dangers that we saw.”

Meta spokesperson Andy Stone called the allegations “nonsense” and said they were “based on selectively leaked internal documents that were picked specifically to craft a false narrative.”

In recent years, reports have raised concerns about minors being preyed upon by users of Meta’s VR technology, and the company has often responded by touting efforts it says it has made to curb such behavior and make its VR platforms more age-appropriate.

But this is just the latest in a growing list of concerns about Meta products and their potential harms to children. You may remember Meta whistleblower Frances Haugen’s congressional testimony back in 2021, when she alleged that the company was aware of its social media platforms’ harms to children. And earlier this week, Sen. Ed Markey — a Massachusetts Democrat who wrote to Meta in 2023 with concerns about its platforms’ potential risks for children — urged the company to stop allowing minors to interact with its artificial intelligence chatbots after multiple reports found that the bots had engaged in sexually charged conversations with children.

Tuesday’s allegations on Capitol Hill raised concerns about one of Meta’s signature products and one of Zuckerberg’s pet projects — its VR headset. But safety issues at the company seem far more widespread.