A growing number of people view the creation and sharing of non-consensual sexual deepfakes as harmless entertainment, according to a recent police-commissioned survey. The study found that nearly one in four respondents felt neutral about, or accepting of, such behaviour, even when it involves digitally altering intimate content without consent.
The trend has prompted warnings from law enforcement officials, who say the rise of AI technology is fuelling an epidemic of violence against women and girls (VAWG). Senior officer Det Ch Supt Claire Hammond described the creation and sharing of such images as a "serious crime" requiring urgent action. She urged victims of deepfakes to report any images to the authorities, assuring them that they would receive support.
The survey also found that 7% of respondents had been depicted in a sexual or intimate deepfake, and only around half of those reported it to the police. The most common reasons for not reporting were embarrassment, uncertainty about how the authorities would handle the situation, and concern about not being taken seriously.
Research suggests that men under 45 are more likely to view creating and sharing deepfakes as acceptable, a tendency linked to broader patterns of misogyny and attitudes towards pornography. Experts caution, however, against drawing firm conclusions, noting that further research into the phenomenon is needed.
The report also highlights an alarming normalisation of deepfake creation as the technology becomes increasingly accessible and affordable: one in 20 respondents admitted to having created a deepfake, while more than one in 10 said they would create one in the future.
Activists are sounding the alarm about the devastating impact of these technologies on victims of abuse. They argue that a lack of education, safeguards, and laws has led to a generation of young people growing up without adequate protection from online harassment.
As the technology continues to evolve, it is clear that urgent action is needed to prevent the spread of non-consensual sexual deepfakes and protect vulnerable individuals from exploitation.