HARRISBURG — This week, the Pennsylvania state Senate voted 49-1 to pass a bill regulating artificial intelligence companion chatbots, with specific protections for minors.

Dubbed the SAFECHAT Act, SB1090 follows through on a policy priority for Democratic Gov. Josh Shapiro, who raised the issue during his budget address in early February.

A recent Pew Research Center study found that nearly one in three teenagers use chatbots on a daily basis. The number of users who rely on chatbots for emotional support is also growing; some seek therapy or mental health help, while others treat chatbots as “companions.”

The bill sets specific limits on what bots can do in conversations and mandates that users be informed they are talking with a bot and not a human.

“Requiring a disclosure that says, ‘hey, you’re interacting with a computer and not a human,’ hopefully will be enough to trigger them to say, OK, I’m done,” said Republican Sen. Tracy Pennycuick. “Let’s go out and play. Let’s, you know, talk to our siblings, talk to our parents.”

Pennycuick and Democratic Sen. Nick Miller co-sponsored the legislation as chairs of the Senate Communications and Technology Committee.

Both say the trend of companion chatbots coaching people toward self-harm or harming others has added urgency to the legislation. In the United States, multiple cases have emerged in which AI chatbots may have contributed to suicide or other harms.

“We want to lead on innovation, but also protect our community members as well,” Miller said. “Especially our minors that may turn to this like it’s a companion, but it’s really not. And if it’s giving bad advice, we need to address that.”

The SAFECHAT Act, if enacted, would mandate seven rules for an “AI Companion” program. For all users:

- The bot must disclose its non-human status.
- Companies must implement “suicide and self-harm” safeguards, including preventing chatbots from producing content about suicide, self-harm, or harm to others. A bot must also recognize when a user references these topics, stop the conversation, and provide mental health resources.
- Protocol details must be published on the program’s public website.

For minors using the program, the bill adds these rules:

- The bot must again disclose its non-human status.
- Every three hours, the bot must remind the user of its non-human status and encourage the user to take a break.
- The bot must not create sexual material or directly encourage sexual activity.
- The program must warn that AI companions might not be suitable for all minors.

Pennycuick says these rules can protect minors who may not recognize they are talking to a bot rather than a real person, or who are deliberately choosing connection with AI over connection with people.

“We have to kind of get back to basics. Crack a book. Play with your siblings. You know, help mom and dad in the kitchen making dinner,” Pennycuick said. “… I think that’s where kids get their foundation. And I think that’s where kids learn about their self-esteem, their likes, their dislikes.”