
Kids are making deepfakes of each other, and laws aren’t keeping up

Technology
  • Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.

    I would categorise it as sexual harassment, not abuse. Still serious, but a different level.

    I hope it might lead to a situation where dirty pics/vids are no longer a problem for the people in them, since they could be a deepfake. There were cases where a surfacing dirty pic was used for blackmail, ruined someone's career, or got them kicked out of some committee, but since it could be a fabrication now, I hope this will be a thing of the past soon.

  • Schools generally means it involves underage individuals, which makes any content using them CSAM. So in effect, the "AI" companies are generating a ton of CSAM and nobody is doing anything about it.

    Disagree. Not CSAM when no abuse has taken place.

    That's my point.

  • I hope it might lead to a situation where dirty pics/vids are no longer a problem for the people in them, since they could be a deepfake. There were cases where a surfacing dirty pic was used for blackmail, ruined someone's career, or got them kicked out of some committee, but since it could be a fabrication now, I hope this will be a thing of the past soon.

    That could be a socially healthy place to end up at. I don't see it anytime soon though. Just look at the other response I got.

  • AI can do penises just fine though; there's just no market demand for it, so quick and easy deepfake sites are focused on female bodies.

    But I disagree with this anyway, this will be the "bullied kid brings a knife to class" of AI.

    You're disagreeing with my unserious suggestion? I just... okay. No. Micropenises aren't a solution. I just don't think there is one.

    If you want to disagree with that, let's hear it. I have 15- and 13-year-old daughters. Anyone can buy a $400 computer, install Linux, install AI, and undress people all day long. There is no legal restraint capable of stopping that, only punishing it.

    Shut down model distribution and it'll move to torrents. Put the kids in the legal system and they are going to face lifelong consequences for 12-year-old assholery. (To be fair, victims often face long repercussions for being targeted, but that's not imposed by the state, which demands a higher standard.) Hold parents accountable and it will disproportionately impact families who spend more hours working and can't supervise their kids 24/7.

    So I'm short on answers, but open to discussion.

  • Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.

    I would categorise it as sexual harassment, not abuse. Still serious, but a different level.

    Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photorealistic images of.

    If the person in the image is underage then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this then it should be classified as sexual exploitation.

    Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

  • Schools generally means it involves underage individuals, which makes any content using them CSAM. So in effect, the "AI" companies are generating a ton of CSAM and nobody is doing anything about it.

    Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?

  • Disagree. Not CSAM when no abuse has taken place.

    That's my point.

    If someone put a camera in the girls’ locker room and distributed photos from it, would you consider that CSAM? No contact would have taken place, and the kids would be unaware they were being photographed. Is it still abuse?

    If so, how is the psychological effect of a convincing deepfake any different?

  • Lawmakers are grappling with how to address ...

    Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money

    Even in countries a lot less corrupt than the US this is an issue.

    Especially because the US government/companies don't do jack shit for people.

  • Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?

    Drawing a sexy cartoon that looks like an adult, with a caption that says "I'm 12", counts. So yeah, probably.

  • Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photorealistic images of.

    If the person in the image is underage then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this then it should be classified as sexual exploitation.

    Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

    Thank you. Focusing on the harm to the victims is the right way to understand this issue. Too many people in here hunting for a semantic loophole.

  • If someone put a camera in the girls’ locker room and distributed photos from it, would you consider that CSAM? No contact would have taken place, and the kids would be unaware they were being photographed. Is it still abuse?

    If so, how is the psychological effect of a convincing deepfake any different?

    Taking secret nude pictures of someone is quite a bit different from... not taking nude pictures of them.

    It's not CSAM to put a picture of someone's face on an adult model and show it to your friend. It's certainly sexual harassment, but it isn't CSAM.

  • That could be a socially healthy place to end up at. I don't see it anytime soon though. Just look at the other response I got.

    Anyone with half a brain will certainly claim as much. Even if people don't fully believe it, it will blunt the most serious of social consequences.

  • You're disagreeing with my unserious suggestion? I just... okay. No. Micropenises aren't a solution. I just don't think there is one.

    If you want to disagree with that, let's hear it. I have 15- and 13-year-old daughters. Anyone can buy a $400 computer, install Linux, install AI, and undress people all day long. There is no legal restraint capable of stopping that, only punishing it.

    Shut down model distribution and it'll move to torrents. Put the kids in the legal system and they are going to face lifelong consequences for 12-year-old assholery. (To be fair, victims often face long repercussions for being targeted, but that's not imposed by the state, which demands a higher standard.) Hold parents accountable and it will disproportionately impact families who spend more hours working and can't supervise their kids 24/7.

    So I'm short on answers, but open to discussion.

    They may be little sociopaths, but they don’t run around murdering each other. Our culture hammers it into their brains that murder is wrong and they will be severely punished for it. We need to build a culture where little boys are afraid to distribute naked photos of their classmates. Where their friends will turn them in for fear of repercussions. You do that by treating it like a crime, not by saying “boys will be boys” and ignoring the issue.

    Treat it like a crime, and address separately the issue of children being tried as adults and facing lifelong consequences. The reforms needed to our juvenile justice system go beyond this particular crime.

  • Taking secret nude pictures of someone is quite a bit different from... not taking nude pictures of them.

    It's not CSAM to put a picture of someone's face on an adult model and show it to your friend. It's certainly sexual harassment, but it isn't CSAM.

    How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?

  • How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?

    It's absolutely sexual harassment.

    But, to your question: you can't just say something has underage nudity when the nudity is of an adult model. It's not CSAM.

  • Lawmakers are grappling with how to address ...

    Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money

    Oh I just assumed that every Conservative jerks off to kids

  • Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photorealistic images of.

    If the person in the image is underage then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this then it should be classified as sexual exploitation.

    Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

    It's bullying with a sexual element. The fact that it uses AI or deepfakes is secondary, just as it was secondary when it was Photoshop, just as it was secondary when it was cutting out photos. It's always about using it to bully someone.

    This is different because it's easier. It's not really different because it can be more realistic, because it was never about being realistic; otherwise blatantly unrealistic images wouldn't have been used to do it. Indeed, the fact that it can be realistic will help blunt the impact of real nudes leaking.

  • It's absolutely sexual harassment.

    But, to your question: you can't just say something has underage nudity when the nudity is of an adult model. It's not CSAM.

    Yes, it’s sexual abuse of a child, the same way taking surreptitious locker room photos would be. There’s nothing magical about a photograph of real skin vs. a fake. The impact to the victim is the same. The impact to the viewer of the image is the same. Arguing over the semantic definition of “abuse” is getting people tangled up here. If we used the older term, “child porn,” people wouldn’t be so hesitant to call this what it is.

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    So is this a way to take away rights by making it about kids?

    I mean, what the fuck. We did much less and got punished, right? It didn't matter if we were on the property. Schools can hold students accountable for conduct with other students.

    The leaded-gas adults of the time had no problem dealing with the emergence of cell phones. It was a distraction. They didn't need lawmakers to call it something specific. My Pokemon cards caused fights and were banned. No lawmakers needed.

    The problem is surely with the interaction between parents and schools. Or maybe it's just the old way of thinking. Maybe it's better to have police and courts start taking over discipline in schools.

  • I don't know personally. The admins of the fediverse likely do, considering it's something they've had to deal with from the start.
    So, they can likely answer much better than I might be able to.