NEW YORK (WBNG) — Governor Kathy Hochul announced new protocols and guidelines for AI companies operating in New York State.
The law requires AI companion operators to detect expressions of suicidal ideation or self-harm and to implement a safety protocol in response, including referring the user to a crisis center.
AI companion platforms are also required to remind users every three hours of continued use that they are not interacting with a human.
“It is the responsibility of leaders to make sure that the innovative technologies shaping our world also protect those that use them, especially our young ones across the state,” Hochul said. “I look forward to working with companies and my partners in state government to continue to ensure that New York State remains a global leader in carving the path towards the future of responsible AI.”
The law went into effect on Nov. 5.
Non-compliance will result in penalties enforced by the New York Attorney General, and any fines collected will fund suicide prevention programs in New York State.
Copyright 2025 WBNG. All rights reserved.