It astonishes me that society apparently believes women and girls should accept becoming the subjects of demeaning imagery.
The New York Times has an opinion piece by Nicholas Kristof on deepfake porn, The Deepfake Porn of Kids and Celebrities That Gets Millions of Views. The piece says what is becoming obvious: deepfake tools are being used overwhelmingly to create porn of women, whether celebrities or girls the creators know. This artificial intelligence technology is not neutral; it harms a specific group, girls and women.
The article points to research such as the 2023 State of Deepfakes study by Home Security Heroes. Some of the key findings:
- The number of deepfake videos is exploding (up 550% from 2019 to 2023)
- 98% of the deepfake videos are porn
- 99% of that porn features women as its subjects
- South Korean singers and actresses make up 53% of those targeted
It takes only about half an hour and almost no money to create a 60-second porn video from a single picture of someone. The ease of use and low cost are making these tools and services mainstream, so that any yahoo can do it to his neighbour or schoolmate. It shouldn’t be surprising that we are seeing stories about young women being harassed by schoolmates who create and post deepfake porn. See stories here and here.
One might think this would be easy to stop, that the authorities could simply find and prosecute the creators of tools like ClothOff, which lets you "undress" a girl from a photo you have taken. Alas, no. The companies hide behind false fronts. The Guardian has a podcast about trying to track down who owned or ran ClothOff.
What we don’t talk about is the responsibility of research projects such as LAION, which have created open datasets for training text-to-image models that include pornographic images. They know their datasets contain porn, but speculate that keeping it in will help researchers.
You can learn more about deepfakes from AI Heelp!!!