YouTube creators looking for easy ways to make a quick buck have finally found an audience that the lowest-effort AI-generated video content resonates with: those of us with the least developed frontal lobes. According to a report from Bloomberg, AI slop videos are starting to fill YouTube feeds and are particularly targeting young users, guided by business and “hustle” YouTubers who see the content as an easy way to stack their hallowed passive income.

To be clear, Bloomberg didn’t offer hard numbers on just how much of YouTube Kids is now dominated by AI-generated trash, but that particular corner of the platform appears to have become fertile ground for opportunists. Honestly, it’s kind of a no-brainer that this is a strategy lots of people would try. The publication highlights several creators with over one million subscribers who have made tutorials on how to create low-effort videos that will get played by young kids with less-than-discerning taste. One creator, Monique Hinton, reportedly claimed that people could make hundreds of dollars a day with these types of videos.

Part of that is surely a bit of creative bluster. A simple rule when it comes to online hustlers is that if the techniques they sell actually made money, they would just do that instead of making tutorials or selling courses. But two things are nonetheless true: it is very easy to create AI slop, and YouTube videos aimed at kids are historically a cesspool of largely unchecked garbage.

You might remember way back in the day when YouTube was flooded with knock-off videos of popular children’s characters, full of inappropriate and often adult-oriented content that would end up on kids’ screens either because autoplay queued it up after a legitimate video or because no one noticed it was fake until it was too late. The New York Times reported on the problem and found a video titled “PAW Patrol Babies Pretend to Die Suicide by Annabelle Hypnotized,” just to give one example of what slipped through the cracks of both YouTube’s moderation and parents’ supervision.

Combine that with the fact that young kids are a growing audience on YouTube, and you have a recipe for kids getting zapped by the monetized slop ray. Per Pew Research Center, about 60% of parents with a child under the age of two say their kid watches YouTube, including roughly a third whose kids watch videos on the platform every day. That’s a lot of eyes racking up watch hours, and most of those viewers don’t have the ability to discern when they are being fed AI-generated nonsense.

That makes them targets for people looking to make a quick buck with low-effort slop, but it could also have long-lasting negative impacts on those kids, who may have increasing trouble telling real from fake and who get fed misinformation that goes unchecked. The American Academy of Pediatrics recommends that media use for children two years old and younger be “very limited” because of the important brain development that happens during that time.

AI slop for adults is bad enough, but there is something particularly malicious about viewing children as a viable money-making audience, especially when the product you are selling is extremely low-effort garbage. But hey, if it becomes ubiquitous enough, everyone’s children will be able to perfectly recite their A-T-Ps, or however else AI manages to mangle basic, foundational knowledge.

When reached for comment, YouTube assured us that there’s nothing to worry about here. “YouTube has become one of the biggest streaming services in the country precisely because families trust our age-appropriate experiences, like YouTube Kids,” a spokesperson said in a statement. “Mass-producing low-quality content is not a viable business strategy on YouTube, as our systems and monetization policies are designed to penalize this type of spam.”

We will leave this one to you, reader. In your experience, is “mass-producing low-quality content” something you think of when you think about YouTubers?