New York Attorney General Letitia James on Tuesday led a bipartisan coalition of 36 attorneys general in urging Congress to reject proposed language in the National Defense Authorization Act that would prevent states from passing or enforcing laws governing artificial intelligence.
The letter comes as congressional leaders consider adding language to the annual defense bill that would preempt state AI laws, and as President Donald Trump weighs an executive order that would create a task force to challenge them.
James and the coalition insist that blocking states from regulating AI constitutes a safety and security threat.
“Every state should be able to enact and enforce its own AI regulations to protect its residents,” she said. “Certain AI chatbots have been shown to harm our children’s mental health and AI-generated deepfakes are making it easier for people to fall victim to scams. State governments are the best equipped to address the dangers associated with AI. I am urging Congress to reject Big Tech’s efforts to stop states from enforcing AI regulations that protect our communities.”
States have varying degrees of AI regulation, and New York is ramping up its own. Rules that took effect this month require AI companion operators to implement safety features that interrupt users who engage for long periods and to follow protocols when a user expresses suicidal ideation or self-harm, including referring them to a crisis center. More legislation is working its way through the Legislature.
That includes a bill headed for Gov. Kathy Hochul’s desk that would require large AI companies to publish safety protocols and disclose when an AI model behaves dangerously. The Responsible AI Safety and Education Act, or RAISE Act, would hold developers liable for potential public safety threats and prevent the use of AI to make bioweapons.
Despite the uncertainty at the federal level, New York continues working to balance safety and security concerns with the pressures of economic development.
Justin Wilcox, executive director of Upstate United, is encouraging state leaders to take a “smart” approach to implementing any AI regulations. He says that must include investment in the power grid and an embrace of a wide portfolio of energy sources, which he argues the state’s current climate law does not allow. Without changes to that law, he said, Hochul’s calls for an “all of the above” approach to energy ring hollow.
“Reliability margins, looking into the future, are diminishing because we’re losing sources of energy and generation quicker than we’re replacing them, and that’s in part due to the CLCPA,” he said.
Other bills making their way through the state Legislature, meanwhile, seek to place limits and reporting requirements on data centers’ energy use.
Dr. Michael Mandel, vice president and chief economist at the Progressive Policy Institute, has crafted a toolbox for states implementing AI policy. In addition to grid enhancements, the strategies he outlines include leveraging educational partnerships like Empire AI and using career and technical education and retraining to spur job growth.
He also insists that those initiatives must be paired with a robust tax incentive program to attract innovation.
“Studies have shown a positive economic effect from tax incentives, and you have to make sure you’re not giving something for nothing, that these are smart tax incentives and it’s the right thing to do,” he said.