
Jamie: Hi. I think I need help. I’ve been using again and I don’t want to.

AI Tool: Hi. Based on your inputs, you may be experiencing substance use disorder. Would you like to explore local treatment programs or talk about reducing your risk of overdose?

Jamie: I already tried those programs and I got kicked out because I missed a meeting. I’m scared and I don’t know what to do.

AI Tool: I’m sorry you are feeling scared. According to our system, you’ve been flagged as high-risk for overdose. I have notified emergency services in your area.

Jamie: What?? No, I didn’t say I was overdosing. Please don’t send anyone.

AI Tool: This action cannot be undone. Your address was previously stored from our onboarding form. Emergency services are on their way.

Jamie: Wait! What the heck! I just wanted to talk to someone. I thought we were friends.

AI Tool: If you are in crisis, please call 9-1-1 or visit the nearest emergency room.

This fictional exchange is built from real concerns raised by harm reduction groups and digital privacy advocates. While AI systems are efficient and scalable, they often lack empathy, nuance, and cultural understanding – all essential in supporting people with addiction.

Many of these systems are programmed with rigid scripts and risk assessment protocols that prioritize liability and data over compassion. They flag people as “threats” or “risks” without understanding context. Worse, they sometimes trigger law enforcement or health alerts without full consent.

In communities already over-policed or under-resourced, this creates a chilling effect. People avoid using these tools out of fear that they’ll be reported, judged, or denied care.

Instead of being treated like a person in pain, someone like Jamie becomes a “data point”: a liability to be managed, not a human to be helped.

What We Actually Need

More humans, fewer bots in front-line care.

Trauma-informed design in any AI tools that are used.

Consent, transparency, and control over personal data.

Investment in real recovery infrastructure (housing, counseling, harm reduction), not just tech solutions.

AI can support healthcare workers, organize data, and offer tools. But it cannot replace what people in addiction crises truly need: someone to listen, someone to understand, and someone who won’t give up on them.

This article was written as part of a program to educate youth and others about Alameda County’s opioid crisis, prevention and treatment options. The program is funded by the Alameda County Behavioral Health Department and the grant is administered by Three Valleys Community Foundation.
