
AI and Gender
The Dark Shadow of AI: Deep Fake Porn and Gender Bias

Image of the event "The Dark Shadow of AI: Deep Fake Porn and Gender Bias"


© FNF Korea

The rise of deep fake technology has opened Pandora’s box in the digital realm, blending innovation with significant ethical challenges. South Korea has found itself at the forefront of this issue, grappling with the devastating consequences of deep fake exploitation.

On October 21, 2024, the Friedrich Naumann Foundation for Freedom Korea office, in collaboration with social cooperative Parti, the German Embassy in Seoul, and Goethe-Institut Korea, hosted a panel discussion event, “The Dark Shadow of AI: Deep Fake Porn and Gender Bias,” at the Goethe-Institut Korea. This event brought together experts, activists, and the public to explore the intersection of artificial intelligence, gender, and digital rights.

The evening opened with remarks from Frederic Spohr, Head of FNF Korea, David Bieger, First Secretary of the German Embassy in Seoul, and Dr. Clemens Treter, Director of Goethe-Institut Korea. Central to the discussion were the panelists: Won Eunji from Team Flame, a group known for pioneering awareness around the "Nth Room" case in South Korea; Sandi, the founder of the AI Ethics Newsletter; and Raphael Rashid, an experienced journalist who has recently reported on deep fake issues in Korea for The Guardian. Moderated by Ohyeon Kweon, chairman of social cooperative Parti, the panel delved into the distinct characteristics of digital sexual violence in Korea and broader questions about the sociocultural and technological factors exacerbating these issues.

Panelists on stage
© FNF Korea

Won Eunji is a reporter and activist from Team Flame, the group that first exposed the Nth Room case, a network of private chatrooms used to share and sell sexually exploitative content, including material involving minors. Team Flame not only investigates digital sex crimes but also supports victims, helping them connect with support centers and accompanying them to law enforcement.

As the first presenter, Won Eunji shared her perspective on the unique challenges of combating deep fake exploitation in Korea. She highlighted how platforms such as Telegram have become hotbeds for such crimes, enabling interconnected networks of perpetrators to operate with impunity. Despite legal advancements, including the recent enactment of anti-deep fake legislation, victims continue to face significant barriers in seeking justice. For its pivotal role in exposing digital exploitation, Team Flame was honored with a special award at the Amnesty International Media Awards.

Sandi from AI Ethics Newsletter provided a technical and ethical lens on the issue, exploring how biases ingrained in AI models contribute to harmful outcomes. She noted that deep fake technology itself emerged as a tool for exploitation, with its earliest uses targeting women and producing non-consensual explicit content. Sandi emphasized that AI is not a neutral force; the biases embedded in its design and application can deepen societal inequalities if left unchecked.

Adding a journalistic perspective, Raphael Rashid addressed the critical role of media in shaping public understanding of deep fake exploitation. He argued that journalists must move beyond sensationalism and instead focus on the lived experiences of victims while holding institutions accountable for their responses, or lack thereof, to this growing crisis. Rashid also emphasized the importance of raising international awareness of South Korea's struggles and successes in confronting these issues, as they offer valuable lessons for other nations.

Image of the audience
© FNF Korea

The panelists explored the widespread consequences of deep fake pornography in South Korea, a country shaken in recent years by crimes involving deep fake exploitation, with cases even extending to minors. The speakers shared insights into why this digital crime is alarmingly prevalent in South Korea and what unique social, technological, and regulatory factors contribute to the issue. Won Eunji, who has long worked to address digital sexual exploitation, described the emotional and legal struggles victims face and the gaps in current protections.

In addition to deep fake pornography, the panel also examined gender bias within AI systems. Sandi highlighted how AI models, when trained on biased data, risk perpetuating discrimination and reinforcing stereotypes, which can lead to biased outcomes in areas from hiring practices to law enforcement. Raphael Rashid drew from his journalistic experience to emphasize the importance of media exposure and societal awareness, which can push for more robust policies to protect individuals from such technological abuse.

Throughout the evening, an interactive digital tool named 'Townhall,' developed by Parti, allowed the audience to engage directly, posing questions and sharing reflections, ensuring a dynamic, community-oriented dialogue.

*Lin Choi is the Program & Communications Officer of the Friedrich Naumann Foundation for Freedom Korea office.