BATTLE CREEK, Mich. — Protecting our children from harmful online content just reached a terrifying new level.
According to the National Center for Missing and Exploited Children, reports of artificial intelligence used in child sexual exploitation increased more than 1,000% in the last year.
Michigan State Police (MSP) said sophisticated software and even free apps allow anyone to take a real photo of someone found online and alter it to make them appear naked.
News Channel 3 talked to the father of a Michigan teen who took his life after being victimized in an online sextortion scheme.
We also sat down with a state police investigator who said the technology is getting so complex, it’s hard to tell if an image is real or fake.
“Jordan was an amazing young man. He really was the epitome of the all-American kid. He was smiling, he was happy,” said Jordan DeMay’s father, John DeMay.
DeMay told News Channel 3 that late one night, Jordan was chatting with someone he thought was a young woman on Instagram, but who turned out to be running a vicious online sextortion scheme.
“He was eight weeks away from his 18th birthday, eight weeks from graduating high school, and he was brutally victimized for six hours in the middle of the night while I was sleeping. And I woke up to him and found him in his bedroom…and he decided to take his life over it, and that can never happen again,” DeMay said.
Since that tragic day in 2022, DeMay has traveled the world raising awareness, advocating for tougher federal legislation that would strip big tech companies of immunity and hold them accountable for hosting harmful images of children on their sites.
“Literally every teenager from 13 to 17 is going to be targeted for sextortion at some point on their online space, so they really need to recognize it,” DeMay said.
In January of this year, Jordan D’s Law was signed by Governor Gretchen Whitmer, establishing greater protections against threats of online sextortion targeting minors in Michigan.
The law also mandates collaboration between schools, the Department of Education and law enforcement.
But Michigan State Police told News Channel 3 that investigating cases of AI sextortion is becoming extremely challenging.
“So the genie’s out of the bottle. It’s never going to go back in. We’re going to see more and more of these cases,” Michigan State Police D/Sgt. Gerald Yott said.
D/Sgt. Yott said in the last year, his computer crimes unit investigated six cases of AI sextortion in Southwest Michigan.
The biggest concern, he said, is how accessible this software has become and the number of people who will be victimized.
“The sky’s the limit for them to choose their own adventure, to do whatever they want with these images now,” D/Sgt. Yott said.
The statistics are jaw-dropping: according to Stacy Garrett with the National Center for Missing and Exploited Children, 4,700 cases were reported in 2023, skyrocketing to 67,000 last year.
“About more than 1,000% increase in the number of reports of child sexual exploitation involving GAI that were being made to the cyber tip line. And again, as with sextortion, we see that number going even higher this year so far in 2025,” NCMEC Vice President of Content & Community Engagement Stacy Garrett said.
D/Sgt. Yott said the people creating this harmful content typically aren’t cyber criminals, but teenagers using free “nudify apps.”
“Kids are doing this in school now to their classmates,” D/Sgt. Yott said.
“So cyberbullying has taken on a whole new level,” News Channel 3 Anchor Jessica Harthorn said.
“And the sophistication is taking out our ability to detect forensically what is real and what is fake,” D/Sgt. Yott said.
There’s now a push for these highly advanced AI generation models to include a watermark within the code, so police can detect if it’s a fake image, but D/Sgt. Yott said it may be too late.
“This became open sourced and everybody had access to it, meaning anybody could write language models that didn’t adhere to any of these standards, and that’s already out there,” D/Sgt. Yott said.
“It sounds like prosecution would be a nightmare,” Harthorn said.
“It can be, because the laws were lacking for a long time,” D/Sgt. Yott said.
It’s why Democratic State Representative Matt Longjohn co-sponsored bipartisan House Bill 4048, which would attach felony penalties to the misuse of AI deepfakes.
He also signed a letter, saying a moratorium should not be placed on regulating AI.
“And the epidemic of youth suicide and suicidal ideation and just mental health in general is deeply concerning to me,” Rep. Matt Longjohn (D-Portage) said.
As for DeMay, he’ll continue to raise awareness, creating a lasting legacy for his son Jordan.
“He wanted to be a professional football player, and he wanted to be the ‘King Dog’ all the time, you know? And I think this is kind of, unfortunately, his legacy as a hero in some sense, because his story is saving lives, and it’s much greater than the short 18 years that we got physically with him on this planet,” DeMay said.
If you become a victim of AI sextortion, first, file a police report.
Next, report the incident to the National Center for Missing and Exploited Children’s CyberTipline at 1-800-THE-LOST.
That way, they can contact big tech companies on your behalf to remove the picture.
Finally, contact the tech company directly to report the harmful image.