The AI Poison Pill - We Can DESTROY The Slop Channels
This video is brought to you by S. Stick around to hear more about the special offer they're providing to the entire Upper Echelon community. Okay, just to get something out of the way right off the bat: the title isn't clickbait. Anyone watching this (in particular, any established YouTube creator who's ever had their content stolen by AI slop channels or websites, which is a growing issue for the platform at large, but also anyone who just wants to do this for fun) can effectively poison these channels. And if enough people start doing it, we might even be able to poison the actual language models themselves, which are constantly scraping YouTube videos for training data. That's pretty unlikely, but hey, I'm allowed to dream, aren't I? It's a bold claim, I know, but I'm excited for this one because you really can do it, and I think it's pretty cool to have a method of fighting back against these tech-bro content thieves who cynically advertise all over the internet how easy it is to make an automated faceless channel and get rich doing it.
Let me give some background real quick, because it's actually a multifaceted problem. At the top level, you have major AI companies actively harvesting as much content as they possibly can, which includes hundreds of thousands of videos across tens of thousands of YouTube channels, according to an investigation by Proof News co-published with Wired back in July of 2024. Anthropic, Microsoft, and Nvidia, among others, had already scraped videos from nearly 50,000 different channels, and since then the problem has only gotten worse. One step down from the model makers, you have the parasite ecosystem, which is a much more visible problem for everyday viewers, because thousands of people (probably more like tens or hundreds of thousands) have decided, "What if we just use these AI tools to try and make money as quickly as possible?" That leads to a barrage of channels stealing content, plagiarizing, and spitting out generative trash for the sake of ad dollars. You can find all sorts of creators complaining about this, because oftentimes an AI content farm will pick a set of specific victims and then chronically steal their content with minimal changes, republishing it on their own channel. This is the most immediately harmful group when it comes to everyday YouTubers being impacted.
Last up, you have the bottom feeders, which are basically all the other platforms stealing content from YouTube and putting it back out as blog posts or news articles or whatever. This kind of stuff is relatively insignificant, because at the very least they don't directly compete with the original videos being stolen, so poisoning them is certainly a bonus, but it's not really the main concern for today. The main concern is the parasite ecosystem; however, depending on the level of adoption, we can potentially poison all three of them, which is awesome.
Now, to properly explain what this is and why it works, we need to understand precisely what the AI content thieves are doing, which is best done with an example. This right here is a small-scale representation of the problem called "Cartoon Castle", which is mostly inactive now, but it used to focus on South Park. These videos are typically riddled with errors and mistakes, but the point isn't to have good content; the point is to have easy content, which is why this channel primarily stole its videos from an actual YouTuber named Johnny 2 Cellos. We can see a ton more examples of this happening out in the wild in the science community in particular, because for some reason that's one of the niches that is hyper-infested right now.
Kyle Hill has an excellent piece on this which goes into much more detail about the problem, but for right now I'll stick with my smaller example of Cartoon Castle so we can get clearer on the actual process. It's not really complicated: AI slop channels use a variety of different tools, which all basically do the same thing. They scrape a YouTube video for its captions (in this case "Why Kenny is the most important character in South Park", which got 3.2 million views on the original, versus the AI plagiarism version called "South Park's Most Important Character"), then feed those captions into an AI model like ChatGPT and make slight changes, or just summarize them, to produce something "new". After that, they feed the text into an AI voiceover program, and then they either steal the visuals from the original video directly (for the South Park example, you can see they just cropped out the original YouTuber's watermark in the bottom left) or use a combination of AI editing tools to spit out something for the background. Start to finish, they can take a video and republish it within a couple of hours tops, but if they really go down the automation route, they can have a channel that does basically all of it for them, start to finish, just by selecting a popular existing YouTube video and feeding it into the system.
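To make the laziness concrete, here's a minimal Python sketch of what this kind of pipeline effectively does with scraped captions. Everything here is illustrative: the cue dictionaries mimic the timed segments a caption-scraping tool returns, and `build_rewrite_prompt` is a hypothetical stand-in for the "feed it to ChatGPT and reword it" step, not any real tool's API.

```python
# Illustrative sketch only: the cue dicts mimic what a caption-scraping tool
# returns, and build_rewrite_prompt is a hypothetical stand-in for the
# "reword it with an LLM" step. No real tool's API is shown here.

def cues_to_transcript(cues):
    """Flatten timed caption cues into the plain text fed to the LLM step."""
    return " ".join(cue["text"].strip() for cue in cues)

def build_rewrite_prompt(transcript):
    """The kind of prompt a content farm wraps around a stolen transcript."""
    return ("Rewrite the following video script in different words, "
            "keeping all the facts:\n\n" + transcript)

cues = [
    {"start": 0.0, "dur": 2.1, "text": "Why Kenny is the most"},
    {"start": 2.1, "dur": 1.8, "text": "important character in South Park."},
]
transcript = cues_to_transcript(cues)
prompt = build_rewrite_prompt(transcript)
```

Notice that nothing in this pipeline ever listens to the audio or looks at the picture; the captions are the only input, which is exactly the weakness the poisoning method exploits.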
Now, for the South Park example, there's obviously a huge discrepancy in viewership: the AI copy only got a few thousand views versus the original's millions. But that's not always true. Sometimes the stolen content gets more traction than the original, and we're talking about content-farm AI slop channels with hundreds of thousands of subscribers here, and these videos are also competing for the exact same keywords, which means they're siphoning traction and revenue from the original video regardless. The more this happens, the worse it is for real creators. But what we have to remember is that the majority of AI slop traces back to some form of original content that was already produced on the platform, and that's what gives us the ability to poison it.
So this is kind of an odd way for me to announce this, but I'm actually going to Japan for three weeks in March, more precisely the 3rd to the 23rd, which means either no videos during that time frame, or maybe a couple if I manage to pre-make and schedule them. Regardless, when I go I'll be using a service called S, today's video sponsor, which is an eSIM application for seamless internet connectivity while traveling. I'm not very good at traveling, to be totally honest (I don't even do it that much anymore), but one thing I obviously need is consistent internet. I'm a YouTuber, after all; let's be real, my whole job is to be on the internet a bunch, and S lets me do that while traveling without having to physically wait in line at the airport, get a SIM card, be overcharged, etc. It minimizes roaming fees, has 24/7 support, and good pricing for the country I'm visiting; there are a bunch of reasons why I'll be using it, but the simple version is that I can just download the app on my phone and be done. If that sounds like something you might want as well, there's a 15% discount on data plans for international travel if you download the application and use code Echelon at checkout. Again, for anyone who travels a lot and wants a discounted eSIM internet plan while doing so, you can download S and use code Echelon at checkout for 15% off. Big thank you to S for sponsoring the channel.
Back on task, just to be completely clear: this is happening in tons of different ways all the time. AI slop channels are scraping news articles and rewriting them to make videos, they're summarizing popular videos and turning them into a bunch of YouTube shorts, they're even combining two or three different pieces of original content into shorter compilation videos, all for the sake of easy revenue. But the critical thing to be aware of is that they do this because they're lazy. AI slop channels are not painstakingly watching, understanding, transcribing, and then modifying the content that they plagiarize; they're going down the path of least resistance, and what allows them to go down that path is the YouTube caption system.
Now, obviously, captions are a good thing in general: they let people read along and they support viewers with disabilities, and I'm not advocating that people stop using them, far from it. But it turns out there is something we can do, without impacting most people who rely on captions, that poisons the content so thoroughly that AI tools either fail to analyze it at all or produce a completely compromised, nonsensical version. You can even go so far as to make the output from your videos straight-up insulting to the people attempting to steal them, thereby forcing them to either do a [ __ ] ton of manual labor themselves or just stay away from your videos completely. Quick shout-out: I was not the person who discovered this; that would be a YouTuber named F4mi. Definitely subscribe to her, she is absolutely fantastic, super cool videos, very much worth it. But for something like this to be effective, as many people as possible need to know about it, and they need to understand how to do it for themselves, because the more people who end up poisoning their content in some form or another, the harder it is for the slop channels to even exist. Going all the way out to an extreme, it could turn YouTube itself into a minefield of training data for the major AI companies, unless they find new and more costly ways of scraping the videos. To be clear, it's not a permanent solution: much the same way cybersecurity is a perpetual tug-of-war with hackers, AI scraping and content theft is like that too. But at the very least it's a mechanism where creators can start to push back, because for a while now it's been a really one-sided match where content thieves faced practically zero resistance whatsoever.
On a much more realistic level, it deals damage to plagiarism slop channels, and it's all thanks to a creative use of the caption system discovered by F4mi, which hides information in the video and makes it basically poisonous for AI.
Let me show you: these are two identical uploads, unlisted on my channel (I'll put links down below in the description), one titled "Feed the Machine", the other titled "Poison the Machine". Both of these videos are functionally the same: a 42-second poem about how amazing artificial intelligence is for humanity. But if you feed these videos into any of the typical AI summarizing or transcription tools, you get wildly different results. For today, I'm using crisp.a.com and Galaxy as basic examples, but every other tool I tested while making the video either failed to transcribe the poisoned version at all (which was pretty common) or spat out similar gibberish, because the poisoning method actually works.
Let me show you. Crisp spits out a text-file summary of the video you give it, and this is what the file says for "Feed the Machine", my 42-second non-poisoned poem praising AI: "AI is depicted as a transformative force that brings thought and light, acting as a guiding presence in the various aspects of life." End quote. I won't read the whole thing, but it's basically summarizing my little trashy poem correctly. If we move over to a transcription tool, comb.a, we can see an actual word-for-word transcript that is pretty much accurate in terms of what I said. Last up is Galaxy, which apparently turns it into a blog post, complete with a fake author and everything, massively expanding the amount of text in the process. But the point is that my poem "Feed the Machine", when it gets scraped by these AI tools, doesn't really present them with any problems.
Now let's switch over. This is "Poison the Machine": same poem, same words, same audio, literally everything identical, except this time, if I analyze it with Crisp, the text file it spits out looks like this: "The discussion begins by framing Silicon Valley as a collective toilet, a metaphor that encapsulates the social, environmental, and economic issues associated with the region." End quote. Again, I won't read the whole thing (it's on screen right now if you want to), and it's painfully over-embellished by the AI program to sound smarter and more eloquent than it actually is, but my little 42-second poem is now a giant, convoluted explanation of why Silicon Valley is America's toilet. If we switch over to comb, directly transcribing it, we see this: a massive text dump of not only why Silicon Valley is a toilet but also why the Earth's core isn't actually rock or metal, it's human [ __ ], because this video has been poisoned not only against AI but also against the lazy people who use AI to steal content for them.
Keep in mind the videos are functionally indistinguishable from each other. You're not seeing different subtitles here, because the viewer experience is pretty much unimpacted by any of this. The resulting effect on AI, however, is that one of them yields an accurate summary of the actual content while the other yields whatever I want it to: gibberish, insults, completely incoherent rambling. It doesn't matter, because the end result is that someone feeding that content into some sort of automatic AI pipeline to steal it for YouTube will get a completely garbage video out of it, one that has nothing to do with what they think it does and doesn't compete with the original at all. If we jump over to Galaxy, the same little poem is now a pompous, self-indulgent blog post attempting to explain how the idea that the Earth's core is filled with human waste is a hyperbolic reflection on how society deals with its own wastefulness, both literally and metaphorically. None of that is actually in the video, and also, no, it was literally an argument that scientists are idiots and the center of the Earth is filled with [ __ ].
These summarizers are often extremely pretentious and kind of self-glorifying for some reason, so they didn't really seem to like summarizing the content properly, which I did think was kind of funny. Anyway, this can be done with any video on all of YouTube as long as you have access to its subtitle tracks, and the method itself is pretty simple. I find it incredibly satisfying, actually, because you get to use the same AI tools that are being used to steal your content in order to make it poisonous for them, so anyone who comes after you and tries to skim your work gets a dose of whatever narrative you want to feed them, which I think is fantastic.
I'll go through it right now. Step one is to take whatever video you want to poison and feed it into an automatic subtitle generator. In this case, I used Happy Scribe, which spits out an SRT file (a SubRip subtitle file) containing a pretty decent baseline subtitle track for the content. That's step one; of course you can edit it, take out the mistakes, and put as much work as you want into this process, but I'm just going to give you the rudimentary version for the moment. Step two is to use a conversion tool, in my case E.co, to turn the SRT file into an ASS file. F4mi has way more explanation in her video about what these file types are and why you do this, and it's super entertaining and well done on top of that, so you should absolutely go watch it, but the simple version is that the ASS (Advanced SubStation Alpha) format has a lot more customization options.
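If you're curious what that conversion actually amounts to under the hood, here's a minimal Python sketch, under the assumption of well-formed SRT input. It only handles the cue timing and text (a real ASS file also needs a `[Script Info]` header and a `[V4+ Styles]` section, which the conversion tools generate for you); the "Default" style name is just a placeholder.

```python
import re

def srt_time_to_ass(t):
    """Convert '00:00:01,500' (SRT, milliseconds) to '0:00:01.50' (ASS, centiseconds)."""
    h, m, s_ms = t.split(":")
    s, ms = s_ms.split(",")
    return f"{int(h)}:{m}:{s}.{int(ms) // 10:02d}"

def srt_to_ass_events(srt_text):
    """Parse SRT cues and return ASS Dialogue lines (header/styles omitted)."""
    events = []
    for block in re.split(r"\n\s*\n", srt_text.strip()):
        lines = block.strip().splitlines()
        start, end = [p.strip() for p in lines[1].split("-->")]
        text = r"\N".join(lines[2:])  # \N is the ASS hard line break
        events.append(
            f"Dialogue: 0,{srt_time_to_ass(start)},{srt_time_to_ass(end)},"
            f"Default,,0,0,0,,{text}"
        )
    return events

srt = """1
00:00:00,000 --> 00:00:02,500
Feed the machine

2
00:00:02,500 --> 00:00:05,000
and watch it grow
"""
events = srt_to_ass_events(srt)
```

In practice you'd just use the conversion tool from the video; the point of the sketch is that nothing magical happens at this stage, it's purely a container change that unlocks ASS-only features like positioning.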
Step three is to open the ASS file in whatever editing program you want to use (I used Aegisub; all the links will be down below in the description) and go through adding subtitle lines in between all of the legitimate subtitles, by just right-clicking and inserting before or after each of them. Once that's done, you put a positioning override that looks like {\pos(x,y)} in front of each of those new blank lines, which moves those subtitles way off the screen, because positioning is one of the functions that ASS files have and other subtitle formats don't. Then we get to the fun part: hop on over to ChatGPT or whatever LLM you want to use and ask it to generate some sort of incoherent garbage or insulting story or, I don't know, literally anything you want. "Big Bird is a cult leader bringing about the apocalypse." "Purple Rain actually has nothing to do with music; it's giant Smurfs peeing on everybody." Genuinely, the sky is the limit. Funny story with that, actually: after I did a bunch of experiments using the Silicon Valley toilet example, I came back the next day and it kept telling me that the exact same prompt was now a violation of their terms. I don't think the tech bros like the fact that someone made the chatbot generate an essay insulting them, which is hilarious.
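The interleaving step above can be sketched in a few lines of Python. The {\pos} tag syntax is standard ASS; everything else here is an assumption for illustration: the (-4000,-4000) offset is just "far outside any sane script resolution" (F4mi's exact values may differ), the decoy timings are fixed rather than spread between the real cues, and "Default" is a placeholder style.

```python
def poison_events(real_lines, garbage_paragraphs, offset=(-4000, -4000)):
    """Interleave invisible decoy subtitle lines between the real ones.

    Each decoy is pushed far off-screen with an ASS \\pos override tag,
    so viewers never see it but text scrapers still ingest it. The decoy
    timing here is a fixed placeholder for simplicity.
    """
    x, y = offset
    poisoned = []
    garbage = iter(garbage_paragraphs)
    for line in real_lines:
        poisoned.append(line)
        junk = next(garbage, None)
        if junk is not None:
            poisoned.append(
                f"Dialogue: 0,0:00:00.00,0:00:00.10,Default,,0,0,0,,"
                f"{{\\pos({x},{y})}}{junk}"
            )
    return poisoned

real = ["Dialogue: 0,0:00:00.00,0:00:02.50,Default,,0,0,0,,Feed the machine"]
junk = ["Silicon Valley is America's toilet."]
out = poison_events(real, junk)
```

This is exactly what the manual Aegisub workflow produces, just automated: real line, off-screen junk line, real line, off-screen junk line, all the way down the file.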
But anyway, once you have the incoherent rambling or the insults or the gibberish or whatever it is, you go back and paste it, paragraph by paragraph, into the empty subtitle lines. Then you save the file and use a converter available on GitHub to turn your ASS file into a YTT file, which is YouTube's actual supported subtitle format; by doing this you keep the formatting from the previous version, that is, the placement of the subtitles relative to the screen. Lastly, you go to the same video that you originally got the subtitles from, upload the YTT track, delete any other automatic subtitles afterward, and poof: you now have a video that displays normal subtitles to your audience. There may still be minor spelling mistakes from the auto-generation, and it's up to you whether you want to fix them, but it works. It looks normal to the audience but also contains giant off-screen paragraphs of gibberish all over the place, tricking the AI summary or transcription tools into thinking that your little 42-second poem, in my case, is really 10 minutes of rambling about how Silicon Valley is America's toilet and the center of the Earth is human [ __ ].
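Why does this fool the tools? Because a lazy caption scraper treats the track as pure text: it strips the timing fields and any {\...} override tags and keeps every line, regardless of where that line would render. Here's a small sketch of that failure mode, simulating a naive scraper against a poisoned track (the events are illustrative, matching the ASS Dialogue field order).

```python
import re

def naive_scrape(ass_events):
    """Simulate a lazy text scraper: keep every line's text, ignore where
    it's positioned. The text is the 10th comma-separated Dialogue field."""
    texts = []
    for event in ass_events:
        text = event.split(",", 9)[9]               # 10th field is the text
        texts.append(re.sub(r"\{.*?\}", "", text))  # drop {\pos(...)} overrides
    return " ".join(texts)

events = [
    "Dialogue: 0,0:00:00.00,0:00:02.50,Default,,0,0,0,,Feed the machine",
    "Dialogue: 0,0:00:00.00,0:00:00.10,Default,,0,0,0,,"
    "{\\pos(-4000,-4000)}Silicon Valley is America's toilet.",
]
scraped = naive_scrape(events)
```

The viewer only ever sees "Feed the machine", but the scraped text, and therefore everything downstream of it in the slop pipeline, contains the poison too.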
Any of the AI slop channels that now feed this video into their automatic pipeline without checking it (as in, an actual human being watching it and checking all of it) will output complete nonsense. Even if they do check, they can't do anything about it unless they spend their own precious time and money actually watching and transcribing based on what you say in the video. Sure, maybe they can go find a transcription tool that listens to the actual audio (tools like that do exist), but they'll probably have to buy yet another subscription to do it. And you can do this in subtle ways; you don't have to be blatantly obvious about it. You can have a video that is like a sandwich: the first third looks normal, the last third looks normal, and the middle third is completely destroyed with incoherent trash. You can be, I don't know, pretty nefarious with how you poison these videos, on a larger scale, for the models scraping YouTube content every single day for training data. This can make a science video into an hour-long nonsense rant about Beanie Babies; you can take a video about math equations or coding and pollute it with a badly written screenplay about invisible Keebler elves, or whatever you can possibly imagine, using badly written AI slop to protect your own content from the AI slop content thieves. It's genius, and at the very least it will sift out the lowest common denominator and make stealing your videos a whole heck of a lot more difficult for the average person.
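The "sandwich" variant is easy to automate too. Here's a hedged sketch: it poisons only the middle third of the track so that a thief skimming the start or end of a scraped transcript sees nothing wrong. The decoy-line format, offset, and timings are the same illustrative placeholders as before, not F4mi's exact values.

```python
def sandwich_poison(real_lines, garbage_paragraphs):
    """Poison only the middle third of the track: the first and last thirds
    read normally, which makes the tampering harder to spot in a skim."""
    n = len(real_lines)
    lo, hi = n // 3, (2 * n) // 3
    out = []
    garbage = iter(garbage_paragraphs)
    for i, line in enumerate(real_lines):
        out.append(line)
        if lo <= i < hi:
            junk = next(garbage, None)
            if junk is not None:
                out.append(
                    "Dialogue: 0,0:00:00.00,0:00:00.10,Default,,0,0,0,,"
                    "{\\pos(-4000,-4000)}" + junk
                )
    return out

real = [
    f"Dialogue: 0,0:00:0{i}.00,0:00:0{i + 1}.00,Default,,0,0,0,,Line {i}"
    for i in range(6)
]
poisoned = sandwich_poison(real, ["Beanie Babies run the Federal Reserve."])
```

With six real lines, the one junk paragraph lands right in the middle of the file, bracketed by clean subtitles on both sides.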
You might see flashes of white text at the top of the screen for a split second here and there on the mobile version of videos where this was done, but fixing that really just comes down to tweaking the subtitles: move them further off-screen with a different command, use different line breaks, etc. There are definitely more ways to do it, but the ability for any YouTuber out there to take a video right now and effectively poison it against the slop farmers and the AI scrapers and the thieves is, in my view, incredibly valuable. And I want to once again show appreciation for F4mi, who spent a tremendous amount of time and creativity coming up with all of this. At the end of the day, the problem of lazy AI-fueled plagiarism is only ever increasing, but now there's a method of, let's say, vaccinating your content against it. The landscape will obviously change over time, but right now the technique is incredibly simple, and it seems pretty effective for the amount of effort required, especially for channels that have been grappling with someone stealing massive amounts of their content.
Please do consider doing this, because at least in the short term it will directly, negatively impact the people who are stealing from you, and they've been impacting you negatively this entire time. Anytime you can do something to strike back, I think that's pretty cool. That's it. If you want to support the channel, check out the links down below: the video sponsor, of course, S, plus a special VPN deal, Locals and Patreon monthly memberships, and more. But I'll cut it there and stop rambling. As always, thank you all for watching, question everything, and have a nice night.