The Monash team interviewed 25 individuals: 15 who identified as victims and 10 who admitted to creating or sharing deepfake sexual content. Among the key findings:
Many perpetrators described creating fake nude or sexual images not with financial motive, but as a way to bond with peers, show off technical skill, or elevate status within a social group.
Some rationalised their actions, claiming that because “AI tools make it so easy,” it did not feel like wrongdoing. Others treated it as a “prank” or dismissed it as harmless, echoing the victim‑blaming attitudes seen in broader sexual violence.
On the victims’ side, the harm was real and severe: participants described the emotional and psychological impact of seeing their likeness misused, often with no meaningful recourse. In many cases, reports to police led to no legal consequences.
According to the report, women are overwhelmingly the main targets, especially in cases involving sexualised or controlling deepfakes, although men were also victimised in scenarios involving sextortion, humiliation or blackmail.
As generative‑AI tools proliferate and become easier to use, the barrier to creating convincing, damaging fake intimate content is falling rapidly. The study warns that normalisation among certain peer groups (especially younger males) may lead to an increase in both creation and distribution of non-consensual deepfake content.
Moreover, current legal frameworks and enforcement in Australia remain limited. A separate statistical review by the Australian Institute of Criminology (AIC) of image‑based sexual abuse (IBSA) offences across several jurisdictions found that the majority of cases in 2022–23 involved the distribution of explicit material, but it did not distinguish between digitally manipulated and real‑image offences.
Because deepfake creation itself is often not criminalised as a separate offence (or is difficult to prove), many perpetrators escape legal liability even when victims come forward. This reflects a gap both in legislation and in the structures for victim support.