As a YouTuber myself, and as a heavy user of YouTube for both work and my own entertainment, I watch a lot of content on the platform. That puts me in a good position to pick up on trends and changes in the videos people produce and publish, and it’s how I noticed a subtle but awful one—AI slop content.
Now, I’m not talking about the blatantly obvious AI-generated videos that are effectively a new generation of YouTube poop. No, I’m talking about videos that have the veneer of good, classic YouTube content, but if you scrape away that thin veneer, it’s utter nonsense underneath.
First, as someone who spends quite a bit of time and money on video production for my own YouTube work, I have no personal issue with using AI tools to improve my workflow or the quality of my videos.

Credit: Sydney Louw Butler / How-To Geek / Veo-3
I’ve used AI-generated images to illustrate points that stock photos can’t, tools like Google’s Veo 3 to make animated interstitials for segments, and music generators to make the sort of background noise that’s often mistaken for music but is actually perfect behind a video. Some creators are making incredible, creative AI content, like those amazing Bigfoot vlogs.
AI Slop Videos Are, However, a Big Problem
Like any other tool, like Photoshop or a video effects suite, generative AI tools can be used to elevate your work, or to create slop. It’s the resulting slop I have an issue with.
So what sort of slop are we talking about here? Well, it’s now possible to use a combination of text generation, voice generation, and image or video generation to create an entire video with very little effort. There are even AI tools to edit it all together, though I suspect most of these videos are still edited by hand, probably using cheap labor from online freelancer sites.
These tools have become quite good, so the images don’t immediately raise alarms, the voice doesn’t sound robotic, and the script feels as authoritative and plausible as AI hallucinations tend to be.
There’s Zero Quality Control With AI on YouTube
The fastest way to clock that these videos are almost entirely AI-generated is the lack of quality control. If you spend the time, it’s possible to create AI images or videos without obvious glitches and errors, but the slop producers make their money on volume, not quality of content.
So errors in images, errors in the facts or writing in the script, or issues with the AI voice glitching or mispronouncing words are just left as-is. The second a voice in a tech video pronounces “2MB” as “two em-bee” I know it’s AI slop.
I’d Much Rather Listen to a Real Human Voice

Credit: Lucas Gouveia/How-To Geek | Roman Samborskyi/Yellow Cat/Shutterstock
As a second-language English speaker with an accent muddled by contact with different places and languages, I can understand why content creators who want to make videos for an English-speaking audience feel they need these AI tools to produce professional-sounding voice content. Personally, though, I’d much rather listen to your real voice, even if it doesn’t sound like a BBC documentary.
In fact, one of the best things about YouTube is that it lets us make human connections with creators, and having generic AI voices everywhere destroys that.
YouTube’s Own Policies Aren’t Being Followed

Credit: Sydney Louw Butler/How-To Geek | GPT-4o
As outlined on YouTube’s blog, the platform has a policy about disclosing when you’ve used generative AI in a video. It doesn’t require that all videos that use generative AI disclose it, but it does mandate disclosure in cases where its use could be misleading or dangerous. So, for example, if you’re creating a video about history, and you generate realistic “historical” photos or images of things that could be misconstrued for reality, you need to disclose that.
One obvious issue with this policy is that it’s far too vague and open to interpretation, but plenty of these AI slop channels are clearly in violation of it. However, YouTube isn’t doing anything about them. Realistically, that’s because there are just too many of these channels, and if you take one down, two more spring up in its place. Like some sort of weird, six-fingered hydra.
The best we can do is to report these videos as being in violation of YouTube policies, but I’m not very optimistic about what sort of difference this would make. Especially since YouTube itself is adding to the AI slop problem.
If You’re Going to Use AI, Use It Right!
If you’re a creator or want to be a YouTube creator, there’s nothing wrong with embracing generative AI tools to make the best content you can. All I’m asking is that you steer away from making buckets of slop and flooding the platform with it.
While you might make a quick buck in the short term, slop will drive away YouTube viewers, which in turn drives away advertisers. You’ll be drowning the goose that laid the golden egg in buckets of cheap nonsense, and spreading potentially harmful content while you’re at it.