Deepfake Adult Content Is a Serious and Terrifying Issue
As of 2019, 96% of deepfakes on the internet were sexual in nature, and virtually all of those depicted non-consenting women. With the release of AI tools like DALL-E and Midjourney, making these deepfakes has become easier than ever before, and the repercussions for the women involved are more devastating than ever. Recently, a teacher in a small town in the United States was fired after her likeness appeared in an adult video. Parents of her students found the video and made it clear they didn't want this woman teaching their kids. She was immediately dismissed from her position.
But this woman never actually filmed an explicit video; generative AI created a likeness of her and superimposed it onto the body of an adult film actress. She pleaded her innocence, but the parents of the students couldn't wrap their heads around how a video like this could be faked. They refused to believe her, and honestly, it's hard to blame them. We've all seen just how convincing generative AI can be.
This incident, and many others just like it, shows how dangerous AI-generated adult content is, and if left unchecked, it could get so much worse. The truth is, the technology itself isn't the problem; it's the way people are using it and the lack of regulations surrounding its use. Tech has given us amazing things, from the connectivity of social media to the ability for everyday people like you and me to invest in art through the sponsor of today's video, Masterworks.
Masterworks is an award-winning fintech company in New York City that allows everyday investors with little capital to invest like billionaires and reap the potential benefits. By allowing ordinary people to invest in shares of contemporary art from legends like Picasso, Basquiat, and Banksy, Masterworks has sold over 45 million dollars' worth of artworks and distributed the net proceeds to investors.
Why invest in art, though? Art has outpaced the S&P 500 by a stunning 131 percent over the past 26 years, and even as the banking crisis continues, Masterworks has sold two more pieces in just the last month. Outlets like CNBC, CNN, and the New York Times have taken notice, and over 700,000 people have signed up so far. Demand is currently so high that art can sell out in minutes, but subscribers of this channel can claim a free, no-obligation account using the link in the description below.
Back to our story. At first glance, AI pornography might seem harmless. If we can generate other forms of content without human actors, why not this one? It may well reduce work for performers in the field, but it could also create deeper problems in the industry. If AI were only used to create entirely artificial people, it wouldn't be so bad. But the problem is that generative AI has mainly been used for deepfakes, convincing viewers that the person they're watching is a specific real person, someone who never consented to be in the video.
Speaking of consent: by convincingly portraying women in explicit situations, the perpetrators depict sexual acts without the victim's permission, and that, by definition, is a form of sexual abuse. But does using generative AI to produce these videos cause any actual harm? Beyond the violation itself, there are numerous consequences for the victims portrayed in these videos. This is what it looks like to see yourself naked, against your will, spread all over the Internet.
QTCinderella is a Twitch streamer who built a massive following for her gaming, baking, and lifestyle content. She also created the Streamer Awards to honor her fellow content creators, one of whom was Brandon Ewing, a.k.a. Atrioc. In January of 2023, Atrioc was live streaming when his viewers saw a tab open in his browser for a deepfake website. After the stream was screenshotted and posted on Reddit, users found that the site featured deepfake videos of streamers like QTCinderella performing explicit sexual acts.
QTCinderella began getting harassed with these images and videos, and after seeing them, she said, "The amount of body dysmorphia I've experienced seeing those photos has ruined me. It's not as simple as just being violated; it's so much more than that." For months afterward...