The Rising Threat of Deepfakes


Deepfakes, a term blending “deep learning” and “fake,” are AI-generated videos and images that look remarkably real but are entirely fabricated. The misuse of this technology has sparked significant ethical and legal concerns. Recently, South Korea has drawn global attention due to the rampant spread of deepfake content targeting its female citizens. These disturbing videos and images are often shared on Telegram and other encrypted platforms, with creators and distributors ranging from adults to middle school students. Tragically, no one is safe from these digital crimes: students, teachers, mothers, and sisters are all being exploited by men and boys who create and share this content to gain access to these illicit sites.


As public figures, female K-pop idols and actresses are particularly targeted, their faces superimposed onto bodies in an overwhelming number of sexually explicit videos and images. These deepfakes not only violate the idols' privacy and dignity but also have the potential to cause significant emotional and psychological harm. This disturbing trend has sparked outrage among fans and given rise to a movement aimed at protecting these artists from further exploitation.


K-pop fans, known for their fierce loyalty and dedication, have taken it upon themselves to help Korean feminists combat the rise of deepfakes targeting Korean women and girls. Online fan communities have mobilized to report and take down deepfake content, using social media platforms and fan forums to spread awareness and gather support. These fans are not passively observing the issue; they are actively participating in campaigns to protect their idols and urging the agencies of female idols to take legal action.


One prominent example of fan involvement is the Global Hashtag Campaign Against Telegram Deepfake Sex Crimes, which emerged in response to the particularly egregious deepfake incidents currently affecting women and girls in South Korea. This campaign, along with many others, serves as a rallying cry against objectification and exploitation.



Although there is still a long way to go, small victories can be celebrated. Fans joined together to report videos by the YouTuber PPKKa, who downplayed the seriousness of deepfake crimes. The YouTuber, who has over a million subscribers, is known for attacking people he regards as feminists in his videos, leading them to be harassed by his viewers. Due to the diligence of international fans, some of the videos were taken down by YouTube, and his channel was eventually demonetized.


Collaboration with Industry and Authorities


The movement to protect female idols from deepfakes is beginning to gain traction within the K-pop industry itself. Entertainment companies are starting to recognize the importance of addressing this issue and have begun taking legal action against those who create and distribute deepfake content. JYPE was the first to release a statement, on August 30, 2024, revealing that it is collecting evidence to pursue action with a leading law firm. Other agencies have followed suit, such as FCENM, the company behind the girl group ILY:1, which will be doing the same.



People worldwide are calling for the South Korean government to implement harsher penalties for those caught creating or sharing these videos, emphasizing the need for a legal framework that adequately protects victims of this digital abuse.


Under the Act on Special Cases Concerning the Punishment of Sexual Crimes, individuals caught producing or distributing deepfake pornography can face up to five years in prison or a fine of up to 50 million won (approximately $37,740). If the victim is a minor, stricter penalties under the Juvenile Protection Act apply, with sentences ranging from a minimum of five years to life imprisonment. To combat this issue, the Ministry of Gender Equality and Family has announced plans to support victims by assisting in the removal of deepfake content detected through systems used by investigators to track illegal videos. 


Conversely, some fellow K-pop fans are making the situation worse by falsely accusing random male artists of participating in the Nth rooms through social media posts and TikTok videos, causing mass panic and confusion among other K-pop fans who trust information from fan social media accounts rather than reputable news sources. Creating this content with the intent to stir up drama and social media engagement is dangerous for several reasons, including the defamation of innocent people and the diversion of attention from the victims of sexual crimes in South Korea. Fans should only trust reputable sources for news regarding these crimes. Due to fans diligently calling out the posts, many have been deleted with apologies from the posters.


As the battle against deepfakes continues, it is clear that the combined efforts of fans, the K-pop industry, and authorities will be crucial in protecting women and female idols from this insidious threat. In a world where technology can be used to both create and destroy, the determination of K-pop fans to shield their idols from harm serves as a powerful reminder of the impact that collective action can have in the digital age.
