And so it goes on the state level.
In Wisconsin, the two bills enacted into law in 2023 are good pieces of legislation, but they really are only peripherally concerned with AI; they merely apply old law to new technology.
The work of the legislative task force was more important, but ultimately it too failed to delve into the bureaucratic use of AI, though its members touched on some points of departure should such an inquiry be undertaken.
Basically, the subcommittee made seven recommendations, which I repeat here, if not quite verbatim:
First, instead of focusing on regulating the emerging technology that is AI, the Legislature should focus on ensuring that data, the “raw material that powers AI,” remains private and the consumer protected.
Second, the Legislature should learn from the experiences of other states and avoid the potential overreach of comprehensive AI legislation, and should instead prioritize high-risk areas susceptible to exploitation or abuse.
Third, the Legislature should ensure that existing laws apply to AI models in the same way that those laws apply to individuals, but should avoid creating duplicative statutes that unnecessarily single out AI.
Fourth, the Legislature should ensure that programs related to education and workforce development, such as Fast Forward, have a scope that is broad enough to include AI up-skilling, training, and education; are funded accordingly; and work to address any disparity in access between rural and urban communities.
Fifth, the Legislature should consider establishing a permanent study committee, new legislative standing committee, or inter-branch commission to review emerging technologies, including AI, and make legislative recommendations regarding the same.
Sixth, the Legislature should, as AI technology advances, examine and invest in technology powered by AI that will assist with public safety, such as gun detection software.
And seventh, the Legislature should direct the executive branch to promulgate administrative rules to establish clear, consistent guiding principles for state-level AI governance and to provide the Legislature with oversight regarding the state’s procurement, development, and use of AI.
Now, most of these are just performative, the kind of vague empty shells that lawmakers spout when they don’t have a clue but want to be seen as actually doing something.
Then there’s one that’s downright scary: Using AI for gun detection is a recipe for disaster, not to mention an egregious violation of individual liberties. As the Electronic Frontier Foundation has pointed out, the software “rarely produces evidence of a gun-related crime” and often misses weapons that people are carrying, but then fingers people who aren’t. In a New York City pilot program in the subways last year, there were 118 false positives out of 2,749 scans, or 4.3 percent. No guns were found.
Of course, when officers approach someone they think is armed, that automatically makes for a dangerous situation, and particularly so for the person who turns out not to have a gun. And, say, what about illegal searches?
Earlier this year in Florida, for those very reasons, two Republican state lawmakers introduced legislation to ban AI gun detection, with certain exemptions such as police stations, prisons, and courthouses. State Senator Blaise Ingoglia called the use of AI for gun detection “nothing but a technological infringement upon both our 2nd and 4th Amendment rights,” while Rep. Monique Miller said the state should not allow local governments to infringe upon either the right to carry a firearm or the Fourth Amendment right to be free from illegal searches just because artificial intelligence makes it possible.
I agree with the Florida lawmakers. Not investing would be the better way to go. Otherwise we’re just slip-sliding our way to totalitarianism, one technological advance at a time.
But the real swing-and-a-miss for the Wisconsin legislative AI task force was calling for the governor to direct the bureaucracy to promulgate rules to establish guiding principles for the state’s use of AI. Allowing the bureaucracy to set rules for itself would be a blunder of massive proportions, and, given the recent Supreme Court decision on legislative oversight, there would be no oversight of those rules.
Vague direction on rule promulgation is what causes most of our troubles. The Legislature needs, through committee or task force, to do the job itself and, by enumerated legislation, establish the specific concerns needing to be addressed, as well as the standards for doing so. That’s the first action, and it needs to be undertaken soon.