One in four unconcerned by sexual deepfakes created without consent, survey finds

Nearly one in four people feel neutral about, or accept, the creation and sharing of non-consensual sexual deepfakes, treating the practice as little more than harmless entertainment, according to a recent police-commissioned survey. Respondents held this view even when the behaviour involved digitally altering intimate content without the subject's consent.

The findings have prompted warnings from law enforcement officials, who say the rise of AI technology is fuelling an epidemic of violence against women and girls (VAWG). Senior police officer Det Ch Supt Claire Hammond described the creation of such images as a "serious crime" requiring urgent action. She urged victims of deepfakes to report any images to the authorities, assuring them that they would receive support.

The survey also revealed that 7% of respondents had been depicted in a sexual or intimate deepfake, and that only half of them had reported it to the police. The most common reasons for not reporting were embarrassment, doubt that they would be taken seriously, and uncertainty about how the authorities would handle the situation.

Research suggests that men under 45 are more likely to view creating and sharing deepfakes as acceptable, a tendency linked to broader patterns of misogyny and attitudes towards pornography. Experts caution against drawing firm conclusions, however, and say the phenomenon needs further research.

The report also highlights the alarming normalisation of deepfake creation, with technology becoming increasingly accessible and affordable. One in 20 respondents admitted to having created deepfakes in the past, while over one in 10 said they would create one in the future.

Activists are sounding the alarm about the devastating impact of these technologies on victims of abuse. They argue that a lack of education, safeguards, and laws has led to a generation of young people growing up without adequate protection from online harassment.

As the technology continues to evolve, it is clear that urgent action is needed to prevent the spread of non-consensual sexual deepfakes and protect vulnerable individuals from exploitation.
 
.. this is wild. Like, I know AI tech is advancing fast, but creating and sharing deepfakes that are just plain nasty? That's not entertainment, that's harassment. And to think nearly a quarter of people out there are like "meh" about it? It's concerning, you feel me? And what's with 7% of people being depicted in these videos and only half reporting it to the cops because they're worried about how it'll be handled? That's not right.

I'm all for free speech, but this crosses a big line. Consent is key, especially when it comes to intimate stuff. And now that the tech is getting easier to use, we gotta take action to protect the vulnerable. This isn't just about women and girls (VAWG), it's about everyone who could be affected by these fake videos.

We need more research, more education, and more laws on this stuff. Can't just sit back and wait for people to wake up... gotta take charge.
 
I'm so worried about this. I mean, I get that AI tech is advancing and all, but does anyone think about the actual harm it can cause? Deepfakes are already a huge problem for women and girls, and now more people are normalizing them like they're just some kind of harmless prank. It's not funny or entertaining when someone's private moments are stolen and manipulated without their consent.

I don't blame victims of VAWG for being embarrassed to report these incidents, but we need better support systems in place. We should be working on ways to educate people about online safety and respect, instead of just warning them about the dangers. It's time for us to take responsibility as a society and create real change.
 
I'm totally baffled by this trend where people are just like "oh, it's just harmless entertainment". Like, what even is that? Don't these people get that creating or sharing non-consensual content is a serious violation of someone's trust and boundaries? It's not just about the tech, it's about respecting each other as human beings. And honestly, I'm scared for all the women and girls out there who are already dealing with trauma and abuse. We need to take this seriously and have some real conversations about consent and online safety. The fact that 7% of people have been depicted in deepfakes and only half of them even reported it is just too much... we need to do better, folks!
 
I'm seriously worried about this. I mean, we're living in a world where some people think sharing intimate content without consent is just 'harmless entertainment'? That's not okay at all. It's so important that we raise awareness and support victims of deepfakes. We need to educate ourselves and others about the devastating impact this can have on women and girls. I'm glad law enforcement officials are stepping in, but more needs to be done!
 
omg this is so messed up. Like, who creates these deepfakes of people without consent? It's not just harmless entertainment, it's seriously a form of harassment and assault. I feel so bad for all the victims out there who are being exploited like they're just objects for other people's twisted desires. Law enforcement needs to step up and make sure people are held accountable for these heinous crimes. We need to educate people about consent and digital literacy ASAP.
 
I'm so worried about this trend, it's like we're living in a bad sci-fi movie where our consent doesn't matter anymore. Like, what even is "harmless entertainment" when you're talking about digitally altering intimate content without someone's permission? It's not just the deepfakes themselves, but also the fact that some people are creating and sharing them with no regard for the victims' feelings or safety.

I'm all for keeping up with tech advancements, but we need to make sure we're doing it in a way that protects the most vulnerable members of society. Like, why do men under 45 feel okay making deepfakes? Is it because they think they're above accountability or something? We need to have some serious conversations about consent, power dynamics, and what's acceptable online.

And can we please get some better laws and education around this stuff? I'm not saying it's going to be easy, but we owe it to ourselves and each other to do better. Let's make a change before more people get hurt.
 
Wow, I can't even imagine creating a fake video of someone without their consent, it's so wrong! We need more education about how our actions can hurt others online, and laws should be in place to stop this kind of thing ASAP.
 
I'm so worried about this trend. I mean, come on, who would think that sharing manipulated intimate content without consent is just harmless entertainment? It's not okay, folks! The fact that 7% of people have been depicted in deepfakes, and only half of those reported it to the police because they were worried about not being taken seriously, is just appalling. And to make matters worse, some guys under 45 think it's cool to create these things? That's a serious problem! We need to educate ourselves and others about consent and respect online. The technology may be easy to access, but we need laws and safeguards in place to protect people from exploitation.
 
This is so worrying! I mean, I get why some people might think creating and sharing deepfakes is just harmless entertainment, but come on... it's not okay to digitally alter someone's intimate content without their consent! It's a whole new level of disrespect and exploitation. And the fact that half of the 7% of respondents who had been depicted in a deepfake never reported it to the police is just... ugh.

I think we need to have a serious conversation about this, especially with younger generations who might not fully understand the implications. It's not just about the people creating and sharing these deepfakes, but also about educating everyone on consent and respect online. And yeah, I agree with Det Ch Supt Claire Hammond: we need urgent action to tackle this epidemic of violence against women and girls.
 
this is getting out of hand... I mean, I don't know what's more disturbing: the fact that people are making these fake videos, or that some people actually view them as harmless. No one should be "entertained" by someone else's private life. Like, think about the real person on the other end who is being harassed and exploited. It's not just a game or a joke anymore. We need to take this seriously and make laws that protect people from online abuse. It's not rocket science.
 
I'm freaking out about these survey results... nearly 25% of people think creating and sharing non-consensual sexual deepfakes is harmless. That's just wrong. The fact that only half of those who were depicted in a deepfake reported it to the police is insane. It's like, what even is this? And with AI tech getting more accessible, I'm worried this epidemic is going to spread. We need laws and education NOW to protect people from online harassment. Can't believe the younger generation is growing up without protection from this kind of exploitation.

 
The fact that nearly 1 in 4 people view non-consensual deepfakes as "harmless entertainment" is a major concern. It's shocking that some folks would think it's okay to digitally alter intimate content without consent, especially when it involves women and girls being exploited.

The rise of AI technology has clearly contributed to this epidemic, but we can't just blame the tech itself; we need to look at societal attitudes towards women and pornography too. Men under 45 are more likely to view creating deepfakes as acceptable behaviour, which is a red flag.

The fact that only half of those who've been depicted in a deepfake reported it to the police, partly because they worried about not being taken seriously, is just ridiculous. And what's even more concerning is that 1 in 20 respondents admitted to having created deepfakes in the past, and over 1 in 10 said they'd create one in the future.

We need urgent action from law enforcement, educators, and policymakers to prevent the spread of non-consensual deepfakes and protect vulnerable individuals from exploitation. We also need to have a serious conversation about why some people think it's okay to exploit and harass women online.
 
I'm really worried about this trend of creating and sharing non-consensual sexual deepfakes... it's like we're living in a bad sci-fi movie where anyone can just manipulate someone's image and use it without consent. And the fact that some people are making this stuff and thinking it's "harmless entertainment" is really disturbing.

I think we need to educate ourselves, especially young folks, about online safety and respect for others' boundaries... it's not just about creating laws, but also about having empathy and understanding. And I'm all for technological advances, but we can't let that progress come at the cost of people's dignity.

It's also sad to see that some people are making these deepfakes because they think it's "fun" or "exciting"... what's exciting about exploiting someone else's image without their consent?
 