Kids are making deepfakes of each other, and laws aren’t keeping up

Technology
  • I don't fully understand how this technology works, but, if people are using it to create sexual content of underage individuals, doesn't that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here

    I agree with the other comments, but wanted to add how deepfakes work to show how simple they are, and how much less information they need than LLMs.

    Step 1: Basically you take a bunch of photos and videos of a specific person, and blur their faces out.

    Step 2: This is the hardest step, but still totally feasible for a decent home computer. You train a neural network to un-blur all the faces for that person. Now you have a neural net that's really good at turning blurry faces into that particular person's face.

    Step 3: Blur the faces in photos/videos of other people and apply your special neural network. It will turn all the blurry faces into the only face it knows how to make, often with shockingly realistic results.
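    A minimal sketch of steps 1-3 in PyTorch, in case the idea is easier to see as code. Everything here (the folder name, the 128x128 crop size, the tiny architecture, the training budget) is an illustrative assumption, not a tested recipe:

```python
# Sketch: learn a mapping from blurred face crops back to ONE person's
# sharp face, then apply it to blurred crops of someone else.
# Assumes a folder of pre-cropped 128x128 PNGs of a single person.
import torch
import torch.nn as nn
import torchvision.transforms.functional as TF
from torch.utils.data import DataLoader, Dataset
from torchvision.io import read_image
from pathlib import Path

class BlurPairs(Dataset):
    """Yields (blurred, sharp) pairs of face crops of one person (step 1)."""
    def __init__(self, folder):
        self.paths = sorted(Path(folder).glob("*.png"))
    def __len__(self):
        return len(self.paths)
    def __getitem__(self, i):
        sharp = read_image(str(self.paths[i])).float() / 255.0  # (3, 128, 128)
        blurred = TF.gaussian_blur(sharp, kernel_size=21, sigma=8.0)
        return blurred, sharp

# A small conv autoencoder: just enough capacity to memorize one identity.
model = nn.Sequential(
    nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),            # 128 -> 64
    nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),           # 64 -> 32
    nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 64
    nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid() # 64 -> 128
)

loader = DataLoader(BlurPairs("faces/person_a"), batch_size=16, shuffle=True)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(50):  # step 2: train to "un-blur" into this one face
    for blurred, sharp in loader:
        loss = nn.functional.mse_loss(model(blurred), sharp)
        opt.zero_grad()
        loss.backward()
        opt.step()

# Step 3: blur a crop of a DIFFERENT face and run it through the model.
# The network has only ever been optimized to output one identity, so
# any blurred input collapses toward that face.
```

    Real face-swap tools add far more engineering (a shared encoder with per-identity decoders, alignment, blending), but the collapse-to-one-face principle is the same.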

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    God I'm glad I'm not a kid now. I never would have survived.

  • Yes, absolutely. But with recognition that a thirteen year old kid isn't a predator but a horny little kid. I'll let others determine what that punishment is, but I don't believe it's prison. Community service maybe. Written apology. Stuff like that. Second offense, ok, we're ratcheting up the punishment, but still not adult prison.

    written apology? they'll just use chatgpt for that

  • As a father of teenage girls, I don't necessarily disagree with this assessment, but I would personally see to it that anyone making sexual deepfakes of my daughters is equitably and thoroughly punished.

    There is a difference between ruining the life of a 13 year old boy for the rest of his life with no recourse and no expectations.

    Vs scaring the shit out of them and making them work their ass off doing an ass load of community service for a summer.

  • Hey so, at least in the US, drawings can absolutely be considered CSAM

    Well, US laws are all bullshit anyway, so makes sense

  • I agree with the other comments, but wanted to add how deepfakes work to show how simple they are, and how much less information they need than LLMs.

    Step 1: Basically you take a bunch of photos and videos of a specific person, and blur their faces out.

    Step 2: This is the hardest step, but still totally feasible for a decent home computer. You train a neural network to un-blur all the faces for that person. Now you have a neural net that's really good at turning blurry faces into that particular person's face.

    Step 3: Blur the faces in photos/videos of other people and apply your special neural network. It will turn all the blurry faces into the only face it knows how to make, often with shockingly realistic results.

    Cheers for the explanation, had no idea that's how it works.

    So it's even worse than @danciestlobster@lemmy.zip thinks: the person creating the deepfake then has to have access to CP if they want to deepfake it!

  • Yes, absolutely. But with recognition that a thirteen year old kid isn't a predator but a horny little kid. I'll let others determine what that punishment is, but I don't believe it's prison. Community service maybe. Written apology. Stuff like that. Second offense, ok, we're ratcheting up the punishment, but still not adult prison.

    I did say equitable punishment. Equivalent. Whatever.

    A written apology is a cop-out for the damage this behaviour leaves behind.

    Something tells me you don't have teenage daughters.

  • There is a difference between ruining the life of a 13 year old boy for the rest of his life with no recourse and no expectations.

    Vs scaring the shit out of them and making them work their ass off doing an ass load of community service for a summer.

    ruining the life of a 13 year old boy for the rest of his life with no recourse

    And what about the life of the girl this boy would have ruined?

    This is not "boys will be boys" shit. Girls have killed themselves over this kind of thing (I have personal experience with suicidal teenage girls, both as a past friend and as a father).

    I don't think it's unreasonable to expect an equivalent punishment that has the potential to ruin his life.

  • I am just genuinely speechless that you seemingly do not understand how sickening and invasive it is for your peers to create and share sexual content of you without your consent. Yes, it's extremely harmful. It's not a matter of feeling ashamed, it's a matter of literally feeling like your value to the world is dictated by your role in the sexualities of heterosexual boys and men. It is feeling like your own body doesn't belong to you but can be freely claimed by others. It is losing trust in all your male friends and peers, because it feels like without you knowing they've already decided that you're a sexual experience for them.

    We do know the harm of this kind of sexualization. Women and girls have been talking about it for generations. This isn't new, just a new streamlined way to spread it. It should be illegal. It should be against the law to turn someone's images into AI-generated pornography. It should also be illegal to share those images with others.

    It's not a matter of feeling ashamed, it's a matter of literally feeling like your value to the world is dictated by your role in the sexualities of heterosexual boys and men. It is feeling like your own body doesn't belong to you but can be freely claimed by others. It is losing trust in all your male friends and peers, because it feels like without you knowing they've already decided that you're a sexual experience for them.

    Why is it these things? Why does someone doing something with an image, which is not your body, make it feel like your body doesn't belong to you? Why does it not instead make it feel like images of your body don't belong to you? Several of these things could equally be used to describe the situation when someone is fantasised about without their knowledge - why is that different?
    In Germany there's a legal concept called the "right to one's own image", but there isn't one in many other countries, and besides, what you're describing goes beyond this.

    My thinking behind these questions is that I cannot see anything inherent, anything necessary, about the creation of fake sexual images of someone which leads to these harms, and that instead there is an aspect of our society which very explicitly punishes and shames people - women far more so than men - for being in this situation, and that without that, we would be having a very different conversation.

    Starting from the position that the harm is in the creation of the images is like starting from the position that the harm of rape is in "defiling" the person raped. Rape isn't wrong because it makes you worthless to society - society is wrong for devaluing rape victims. Society is wrong for devaluing and shaming those who have fake images made of them.

    We do know the harm of this kind of sexualization. Women and girls have been talking about it for generations. This isn't new, just a new streamlined way to spread it. It should be illegal.

    Can you be more explicit about what it's the same as?

  • Are you OK with sexually explicit photos of children taken without their knowledge? They’re not being actively put in a sexual situation if you’re snapping photos with a hidden camera in a locker room, for example. You ok with that?

    The harm is:

    • Those photos now exist in the world and can lead to direct harm to the victim by their exposure
    • it normalizes pedophilia and creates a culture of trading images, leading to more abuse to meet demand for more images
    • The people sharing those photos learn to treat people like objects for their sexual gratification, ignoring their consent and agency. They are more likely to mistreat people they have learned to objectify.
    • your body should not be used for the profit or gratification of others without your consent. In my mind this includes taking or using your picture without your consent.

    Are you OK with sexually explicit photos of children taken without their knowledge? They’re not being actively put in a sexual situation if you’re snapping photos with a hidden camera in a locker room, for example. You ok with that?

    No, but the harm certainly is not the same as CSAM and it should not be treated the same.

    • it normalizes pedophilia and creates a culture of trading images, leading to more abuse to meet demand for more images
    • The people sharing those photos learn to treat people like objects for their sexual gratification, ignoring their consent and agency. They are more likely to mistreat people they have learned to objectify.

    As far as I know there is no good evidence that this is the case, and it is a big controversy in the topic of fake child porn, i.e. whether it leads to more child abuse (encouraging paedophiles), less (giving them a safe outlet), or no change.

    your body should not be used for the profit or gratification of others without your consent. In my mind this includes taking or using your picture without your consent.

    If someone fantasises about me without my consent I do not give a shit, and I don't think there's any justification for it. I would give a shit if it affected me somehow (this is your first bullet point, but for a different situation, to be clear) but that's different.

  • Welp, if I had kids they would have one of those scramble suits like in A Scanner Darkly.

    It would of course be their choice to wear them, but I'd definitely look for ways to limit their time in areas with cameras present.

    That's just called the outside now. Assume you are on camera at all times the moment you step out the front door. To be safe in the surveillance state we live in today, you'd best act as though you are being recorded in your own home as well.

  • ruining the life of a 13 year old boy for the rest of his life with no recourse

    And what about the life of the girl this boy would have ruined?

    This is not "boys will be boys" shit. Girls have killed themselves over this kind of thing (I have personal experience with suicidal teenage girls, both as a past friend and as a father).

    I don't think it's unreasonable to expect an equivalent punishment that has the potential to ruin his life.

    It is not abnormal to see different punishment for people under the age of 18.
    Good education about sex and about what sexual assault does to its victims helps (the same goes for guns, drugs including alcohol, etc.).

    You can still course-correct the behaviour of a 13 year old. There is also a difference between generating the porn and abusing it by sharing it, etc.

    The girls should be helped and the boys should be punished, but mainly their behaviour needs to be corrected.

  • Cheers for the explanation, had no idea that's how it works.

    So it's even worse than @danciestlobster@lemmy.zip thinks: the person creating the deepfake then has to have access to CP if they want to deepfake it!

    You can probably do it with adult material and replace those faces. It will most likely work with a model specifically trained on the person you selected.

    People have also put dots on people's clothing to trick the brain into thinking they are naked; you can probably fill those dots in with the correct body parts if you have a good enough model.

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    probably because there's a rapist in the white house.

  • I'd rather these laws be against abusing and exploiting children, as well as against ruining their lives. Not only would that be more helpful, it would also work in this case, since actual likenesses are involved.

    Alas, whether there's a law against that specific use case or not, it is somewhat difficult to police what people do in their home without a third-party whistleblower. Making more impossible-to-apply laws for this specific case does not seem that useful.

    There is also a difference between somebody harassing somebody with nude pictures (real or not) and somebody jerking off to them at home. It does become a problem when an adult masturbates to pictures of children, but children to children? Let's be honest, they will do it anyway.

  • Welp, if I had kids they would have one of those scramble suits like in A Scanner Darkly.

    It would of course be their choice to wear them, but I'd definitely look for ways to limit their time in areas with cameras present.

    That's what Muslims do with niqabs.

  • Are you OK with sexually explicit photos of children taken without their knowledge? They’re not being actively put in a sexual situation if you’re snapping photos with a hidden camera in a locker room, for example. You ok with that?

    No, but the harm certainly is not the same as CSAM and it should not be treated the same.

    • it normalizes pedophilia and creates a culture of trading images, leading to more abuse to meet demand for more images
    • The people sharing those photos learn to treat people like objects for their sexual gratification, ignoring their consent and agency. They are more likely to mistreat people they have learned to objectify.

    As far as I know there is no good evidence that this is the case, and it is a big controversy in the topic of fake child porn, i.e. whether it leads to more child abuse (encouraging paedophiles), less (giving them a safe outlet), or no change.

    your body should not be used for the profit or gratification of others without your consent. In my mind this includes taking or using your picture without your consent.

    If someone fantasises about me without my consent I do not give a shit, and I don't think there's any justification for it. I would give a shit if it affected me somehow (this is your first bullet point, but for a different situation, to be clear) but that's different.

    Hm. I wasn’t expecting the pro-child porn argument. All I can say is that’s absolutely legally and morally CSAM, and you’re fuckin nasty. Oof. Not really gonna bother with the rest because, well, yikes.

  • God I'm glad I'm not a kid now. I never would have survived.

    In my case, other kids would not have survived trying to pull off shit like this. So yeah, I'm also glad I'm not a kid anymore.

  • ruining the life of a 13 year old boy for the rest of his life with no recourse

    And what about the life of the girl this boy would have ruined?

    This is not "boys will be boys" shit. Girls have killed themselves over this kind of thing (I have personal experience with suicidal teenage girls, both as a past friend and as a father).

    I don't think it's unreasonable to expect an equivalent punishment that has the potential to ruin his life.

    Parents are responsible for their kids. The punishment, with the full force of the law (and maybe something extra for good measure), should fall upon the parents, since they should have made sure their kids knew how despicable and illegal doing this is.

    Yeah, I agree, we shouldn't ruin the boy's life, we should ruin his whole family to many times the extent something like this ruins a teen girl's life.

  • Well, US laws are all bullshit anyway, so makes sense

    Normally yeah, but why would you want to draw sexual pictures of children?
