Explicit deepfake imagery is one of the most real and tangible threats facing society today, far more so than deepfake-driven disinformation, yet our industry's discussions tend to focus on the latter. This issue disproportionately affects women across the globe and, terrifyingly, young girls, who are most often the victims of explicit AI imagery. Reported cases of women and girls targeted with deepfake imagery are on the rise, and reporters have found sites generating not just deepfake pornography of adults but child sexual abuse material as well. A growing number of websites offer services that let users create AI pornography of any woman they choose for a small fee.
In this panel, our speakers will discuss their experiences investigating and reporting on this rising threat. They will describe the difficulties of reporting on the shadowy environments in which these websites often operate and offer practical advice on navigating those challenges during an investigation. The session will cover the legal concerns that may arise, firsthand technical and contextual perspectives on the risks of this technology and practice, tips on reporting and publishing the findings of such an investigation, and the toll this kind of investigation can take on the reporter.
The session will conclude with each panelist's thoughts on what the future holds for AI-generated deepfake sexual imagery, and on what they hope will be done to protect future victims of this technology.