Kids are making deepfakes of each other, and laws aren’t keeping up

Technology
  • Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.

    I would categorise it as sexual harassment, not abuse. Still serious, but a different level.

    Schools generally means it involves underage individuals, which makes any content using them CSAM. So in effect, the "AI" companies are generating a ton of CSAM and nobody is doing anything about it.

  • Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.

    I would categorise it as sexual harassment, not abuse. Still serious, but a different level.

    I hope it might lead to a situation of dirty pics/vids not being a problem for the people in them any more, as they could be a deepfake. Like there were cases where a surfacing dirty pic was used for blackmail, ruined someone's career, or got them kicked out of some committee, but since it could be a fabrication now, I hope this will be a thing of the past, soon.

  • Schools generally means it involves underage individuals, which makes any content using them CSAM. So in effect, the "AI" companies are generating a ton of CSAM and nobody is doing anything about it.

    Disagree. Not CSAM when no abuse has taken place.

    That's my point.

  • I hope it might lead to a situation of dirty pics/vids not being a problem for the people in them any more, as they could be a deepfake. Like there were cases where a surfacing dirty pic was used for blackmail, ruined someone's career, or got them kicked out of some committee, but since it could be a fabrication now, I hope this will be a thing of the past, soon.

    That could be a socially healthy place to end up. I don't see it anytime soon, though. Just look at the other response I got.

  • AI can do penises just fine, though; there's just no market demand for it, so quick-and-easy deepfake sites are focused on female bodies.

    But I disagree with this anyway; this will be the "bullied kid brings a knife to class" of AI.

    You're disagreeing with my unserious suggestion? I just... okay. No. Micropenises aren't a solution. I just don't think there is one.

    If you want to disagree with that, let's hear it. I have 15 and 13 year old daughters. Anyone can buy a $400 computer, install Linux, install AI, and undress people all day long. There is no legal restraint capable of stopping that, only punishing it.

    Shut down model distribution and it'll move to torrents. Put the kids in the legal system and they are going to face lifelong consequences for 12-year-old assholery. (To be fair, victims often face long repercussions for being targeted, but that's not imposed by the state, which demands a higher standard.) Hold parents accountable and it will disproportionately impact families who spend more hours working and can't supervise their kids 24/7.

    So I'm short on answers, but open to discussion.

  • Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.

    I would categorise it as sexual harassment, not abuse. Still serious, but a different level.

    Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely, disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies to vivid sexual fantasies that they can quickly generate photorealistic images of.

    If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.

    Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this, and it should be prosecuted when and where it is found to occur.

  • Schools generally means it involves underage individuals, which makes any content using them CSAM. So in effect, the "AI" companies are generating a ton of CSAM and nobody is doing anything about it.

    Do explicit deepfake images created from a non-explicit image actually qualify as CSAM?

  • Disagree. Not CSAM when no abuse has taken place.

    That's my point.

    If someone put a camera in the girls’ locker room and distributed photos from that, would you consider it CSAM? No contact would have taken place, and the kids would be unaware they were being photographed; is it still abuse?

    If so, how is the psychological effect of a convincing deepfake any different?

  • Lawmakers are grappling with how to address ...

    Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money.

    Even in countries a lot less corrupt than the US, this is an issue.

    Especially because the US government/companies don't do jack shit for people.

  • Do explicit deepfake images created from a non-explicit image actually qualify as CSAM?

    Drawing a sexy cartoon that looks like an adult, with a caption that says "I'm 12", counts. So yeah, probably.

  • Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely, disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies to vivid sexual fantasies that they can quickly generate photorealistic images of.

    If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.

    Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this, and it should be prosecuted when and where it is found to occur.

    Thank you. Focusing on the harm to the victims is the right way to understand this issue. Too many people in here are hunting for a semantic loophole.

  • If someone put a camera in the girls’ locker room and distributed photos from that, would you consider it CSAM? No contact would have taken place, and the kids would be unaware they were being photographed; is it still abuse?

    If so, how is the psychological effect of a convincing deepfake any different?

    Taking secret nude pictures of someone is quite a bit different from... not taking nude pictures of them.

    It's not CSAM to put a picture of someone's face on an adult model and show it to your friend. It's certainly sexual harassment, but it isn't CSAM.

  • That could be a socially healthy place to end up. I don't see it anytime soon, though. Just look at the other response I got.

    Anyone with half a brain will certainly claim as much. Even if people don't fully believe it, it will blunt the most serious of social consequences.

  • You're disagreeing with my unserious suggestion? I just... okay. No. Micropenises aren't a solution. I just don't think there is one.

    If you want to disagree with that, let's hear it. I have 15 and 13 year old daughters. Anyone can buy a $400 computer, install Linux, install AI, and undress people all day long. There is no legal restraint capable of stopping that, only punishing it.

    Shut down model distribution and it'll move to torrents. Put the kids in the legal system and they are going to face lifelong consequences for 12-year-old assholery. (To be fair, victims often face long repercussions for being targeted, but that's not imposed by the state, which demands a higher standard.) Hold parents accountable and it will disproportionately impact families who spend more hours working and can't supervise their kids 24/7.

    So I'm short on answers, but open to discussion.

    They may be little sociopaths, but they don’t run around murdering each other. Our culture hammers it into their brains that murder is wrong and they will be severely punished for it. We need to build a culture where little boys are afraid to distribute naked photos of their classmates. Where their friends will turn them in for fear of repercussions. You do that by treating it like a crime, not by saying “boys will be boys” and ignoring the issue.

    Treat it like a crime, and address separately the issue of children being tried as adults and facing lifelong consequences. The reforms needed to our juvenile justice system go beyond this particular crime.

  • Taking secret nude pictures of someone is quite a bit different from... not taking nude pictures of them.

    It's not CSAM to put a picture of someone's face on an adult model and show it to your friend. It's certainly sexual harassment, but it isn't CSAM.

    How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?

  • How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?

    It's absolutely sexual harassment.

    But, to your question: you can't just say something has underage nudity when the nudity is of an adult model. It's not CSAM.

  • Lawmakers are grappling with how to address ...

    Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money.

    Oh, I just assumed that every Conservative jerks off to kids.

  • Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely, disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies to vivid sexual fantasies that they can quickly generate photorealistic images of.

    If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.

    Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this, and it should be prosecuted when and where it is found to occur.

    It's bullying with a sexual element. The fact that it uses AI or deepfakes is secondary, just as it was secondary when it was Photoshop, just as it was secondary when it was cutting out photos. It's always about using it to bully someone.

    This is different because it's easier. It's not really different because it can be more realistic; it was never about being realistic, otherwise blatantly unrealistic images wouldn't have been used to do it. Indeed, the fact that it can be realistic will help blunt the impact of the leaking of real nudes.

  • It's absolutely sexual harassment.

    But, to your question: you can't just say something has underage nudity when the nudity is of an adult model. It's not CSAM.

    Yes, it’s sexual abuse of a child, the same way taking surreptitious locker room photos would be. There’s nothing magical about a photograph of real skin vs. a fake. The impact on the victim is the same. The impact on the viewer of the image is the same. Arguing over the semantic definition of “abuse” is getting people tangled up here. If we used the older term, “child porn,” people wouldn’t be so hesitant to call this what it is.

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    So is this a way to take away rights by making it about kids?

    I mean, what the fuck. We did much less and got punished, right? It didn't matter if we were even on the property. Schools can hold students accountable for conduct with other students.

    The leaded-gas adults of the time had no problem dealing with the emergence of cell phones. It was a distraction. They didn't need lawmakers to call it something specific. My Pokémon cards caused fights and were banned. No lawmakers needed.

    The problem is surely with the interaction between parents and schools. Or maybe it's just the old way of thinking. Maybe it's better to have police and courts start taking over discipline in schools.
