Therapists in Winnipeg have started using artificial intelligence-powered tools to listen in and transcribe sessions, which some say helps provide better patient care — but it’s raising concerns around privacy and ethical risks for some patients and experts.

Wildwood Wellness Therapy director Gavin Patterson has been using a tool called Clinical Notes AI at his Osborne Village practice for the past 11 months to summarize and auto-generate patient assessments and treatment plans with the click of a button.

Once he has consent from clients to use the software, he turns it on and it transcribes the sessions in real time. Patterson said it’s improving care for his 160 patients.

“Notes before were good, but now it’s so much better,” he said. “When I’m working with clients one-on-one, I’m able to free myself of writing down everything and be fully present in the conversation.”

Patterson sees up to 10 patients daily, making it difficult to remember every session in detail. But AI lets him capture the entire appointment.

“It gives me a lot of brain power back, and it helps me deliver a higher product of service,” he said.

The software also cuts down the time it would normally take to write clinical notes, letting him take on more patients.

Tools like Clinical Notes AI can listen to, and transcribe, therapy sessions in real time. (Jeff Stapleton/CBC)

Once patient notes are logged, Patterson said the transcripts from the session are deleted.

As an extra layer of security, he makes sure to record only information the AI absolutely needs.

“I don’t record the client’s name,” he said. “There’s no identifying marks within the note,” which is intended to protect patients from possible security breaches.

But 19-year-old Rylee Gerrard, who has been going to therapy for years, says while she appreciates that an AI-powered tool can help therapists with their heavy workloads, she has concerns about privacy.

“I don’t trust that at all,” said Gerrard, noting she shares “very personal” details in therapy. Her therapist does not currently use AI, she says.

Rylee Gerrard, 19, says she has concerns about privacy when it comes to AI use in therapy. (Travis Golby/CBC)

“I just don’t know where they store their information, I don’t know who owns that information … where all of that is kind of going,” Gerrard said, adding she’s more comfortable knowing that her therapist is the only person with details from her sessions.

Unlike the tool Patterson uses, other software, like Jane — used by some clinics in Winnipeg — can record audio and video of a patient’s session and produce transcriptions.

Recordings are stored until a clinician deletes them, but even then, they stay in the system for seven days before being permanently removed, according to the software’s website.

CBC News reached out to the company multiple times asking about its security protocols but didn’t receive a reply prior to publication. A section on security on the Jane website says it has a team whose “top priority is to protect your sensitive data.”

Caution, regulation needed: privacy expert 

Ann Cavoukian, the executive director of Global Privacy and Security by Design Centre — an organization that helps people protect their personal data — says there are privacy risks when AI is involved.

“AI can be accessed by so many people in an unauthorized manner,” said Cavoukian, a former privacy commissioner for the province of Ontario.

“This is the most sensitive data that exists, so you have to ensure that no unauthorized third parties can gain access to this information.”

Privacy expert Ann Cavoukian says the use of AI increases the risk of sensitive information ending up in the wrong hands. (Dave MacIntosh/CBC)

She says most, if not all, AI transcription technologies used in health care lack adequate security measures to protect against external access to data, leaving sensitive information vulnerable.

“You should have it in your back pocket, meaning in your own system — your personal area where you get personal emails … and you are in control,” she said.

In Manitoba, AI scribes used in health care or therapy settings are not subject to any provincial regulation, according to a statement from the province.

Cavoukian said she understands the workload strain therapists face, but thinks the use of AI in therapy should be met with caution and regulated.

“This is the ideal time, right now, to embed privacy protective measures into the AI from the outset,” she said.

She wants governments and health-care systems to proactively create regulations to protect sensitive information from getting into the wrong hands.

“That can cause enormous harm to individuals,” she said. “That’s what we have to stop.”

Recording sessions not a new technology

The concept of recording therapy sessions is not new to Peter Bieling, a clinical psychologist and a professor of psychiatry and behavioural neurosciences at McMaster University in Hamilton. Therapists have been doing that in other ways, with the consent of patients, for years, he said.

“It used to be old magnetic tape and then it was cassettes,” said Bieling, adding there was always the risk of recordings falling into the wrong hands.

He understands the apprehension around the use of AI in therapy, but encourages people to see it as what it is — a tool and an updated version of what already exists.

Clinical psychologist Peter Bieling agrees there are concerns around AI and security, but says its use could improve patient care. (CBC)

The use of scribing tools will not change therapy sessions, nor will it replace therapists, he said — artificial intelligence cannot diagnose a patient or send in documentation, he noted, so practitioners still have the final say.

“Electronic health records have been making recommendations and suggestions, as have the authors of guidelines and textbook writers, for many years,” said Bieling.

But like Cavoukian, he believes more regulations are needed to guide the use of AI. Failing to implement those may lead to problems in the future, he said.

“These agencies are way too late and way too slow,” said Bieling.

For now, Cavoukian advises patients to advocate for themselves.

“When they go in for therapy or any kind of medical treatment, they should ask right at the beginning, ‘I want to make sure my personal health information is going to be protected. Can you tell me how you do that?'”

Asking those types of questions may put pressure on systems to regulate the AI they use, she said.
