International Center for Ethics in the Sciences and Humanities (IZEW)

News

30.04.2025

Policy paper on sexualising deepfakes

Policy paper published by CEE Digital Democracy Watch

Together with Mateusz Łabuz from the Institute for Peace Research and Security Policy at the University of Hamburg, Maria Pawelec has published a policy paper on the urgent issue of non-consensual sexualised deepfakes, also known as pornographic deepfakes or deepnudes.

Non-consensual sexualised deepfakes are on the rise. The increasing quality and accessibility of deepfake technology make it possible for anyone to create deepfakes of almost anyone who has shared images online (e.g. on social media) or privately with the perpetrator. The result: a flood of synthetic non-consensual intimate images (NCII) with devastating consequences for those affected, especially women.

Policymakers around the world are responding. A new EU directive, the Directive on Combating Violence Against Women and Domestic Violence, will criminalise non-consensual sexualised deepfakes, but it must first be transposed into national law by 2027. The issue is so urgent that earlier measures are needed, and legislative discussions are underway in Germany, for example.

The policy paper analyses the growing threat posed by non-consensual sexualised deepfakes and past and ongoing regulatory efforts in Europe, with a focus on case studies from Germany and Poland. It also contains a set of key recommendations for policymakers and society to combat this threat to individuals' privacy and life chances, but also to social cohesion and democracy.

The policy paper is published by CEE Digital Democracy Watch, a Warsaw-based non-governmental organisation that promotes democracy in times of technological innovation. The target audience includes European and national decision-makers.

You can find the policy paper here: https://ceeddw.org/wp-content/uploads/2025/04/NCII_DeepFakes_ThreatsRecommendations.pdf