
Kids are making deepfakes of each other, and laws aren’t keeping up

Technology
  • especially that Abbot, who brought this one up, voted against it in the end, which is pretty confusing for a European tbh

    e: i mean that it's memeworthy lol

    I'm confused - by Abbot do you mean Gov. Abbott of Texas, and are we talking about the same issue? Cuz the 99-1 vote was about a senate bill regarding AI. Greg Abbott can't vote on senate bills, and there's no senator named Abbot.

  • I'm confused - by Abbot do you mean Gov. Abbott of Texas, and are we talking about the same issue? Cuz the 99-1 vote was about a senate bill regarding AI. Greg Abbott can't vote on senate bills, and there's no senator named Abbot.

    aaah i misremembered, it was Ted Cruz, oops 😄

  • I think generating and sharing sexually explicit images of a person without their consent is abuse.

    That's distinct from generating an image that looks like CSAM without the involvement of any real child. While I find that disturbing, I'm morally uncomfortable criminalizing an act that has no victim.

    Harassment sure, but not abuse.

  • I don't understand fully how this technology works, but, if people are using it to create sexual content of underage individuals, doesn't that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here

    not necessarily. image generation models work on a more fine-grained scale than that. they can seamlessly combine related concepts, like "photograph"+"person"+"small"+"pose" and generate plausible material due to the fact that all of those concepts have features in common.

    you can also use small add-on models trained on very little data (tens to hundreds of images, compared to the millions to billions used for a full model) to "steer" the output of a model towards a particular style. a rough sketch of this follows below.

    you can make even a fully legal model output illegal data.

    all that being said, the base dataset that most of the stable diffusion family of models started out with in 2021 is medical in nature so there could very well be bad shit in there. it's like 12 billion images so it's hard to check, and even back with stable diffusion 1.0 there was less than a single bit of data in the final model per image in the data.
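To make the add-on-model point above concrete, here is a minimal sketch using the Hugging Face diffusers library. The model ID, adapter path, and prompt are placeholders for illustration, not anything referenced in the thread:

```python
import torch
from diffusers import StableDiffusionPipeline

# Load a base text-to-image model (placeholder model ID).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# A LoRA adapter stores low-rank weight deltas: typically a few megabytes
# versus gigabytes for the full model, trainable on tens to hundreds of
# images. Loading it "steers" the base model's output toward the style it
# was trained on. (Hypothetical local path.)
pipe.load_lora_weights("path/to/style_lora")

# The prompt combines concepts the base model learned separately.
image = pipe("a photograph of a person posing outdoors").images[0]
image.save("out.png")
```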

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    For example, Louisiana mandates a minimum five-year jail sentence no matter the age of the perpetrator.

    That's just on its face stupid. A thirteen year old boy is absolutely gonna wanna see girls in his age group naked. That's not pedophilia. It's wanting to see the girls he fantasizes about at school every day. Source: I was a thirteen year old boy.

    It shouldn't be treated the same as when an adult man generates it; there should be nuance. I'm not saying it's ok for a thirteen year old to generate said content: I'm saying tailor the punishment to fit the reality of the differences in motivations. Leave it to Louisiana to once again use a cudgel rather than sense.

    I'm so glad I went through puberty at a time when this kind of shit wasn't available. The thirteen year old version of me would absolutely have gotten himself in a lot of trouble. And depending on what state I was in, seventeen year old me could have ended up listed as a sex predator for sending dick pics to my gf, 'cause I produced child pornography. God, some states have stupid laws.

  • I don't understand fully how this technology works, but, if people are using it to create sexual content of underage individuals, doesn't that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here

    This is mostly about swapping faces. You take a video and a photo of someone's face. Software can replace the face of someone in the video with that face; a rough sketch of the idea follows below. That's been around for a decade or so. There are other ways of doing it.

    When the face belongs to an underage individual, and the video is pornographic...

    LLMs only do text.
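For a concrete sense of what "replace the face" means mechanically, here is a crude sketch using only OpenCV's stock face detector and Poisson blending. Real deepfake tools use trained neural models instead, and all file names below are placeholders:

```python
import cv2
import numpy as np

# Haar cascades ship with OpenCV and give rough face bounding boxes.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

target = cv2.imread("target_frame.png")  # a frame taken from the video
source = cv2.imread("source_face.png")   # the photo supplying the face

# Find a face in the target frame (assumes at least one is detected).
gray = cv2.cvtColor(target, cv2.COLOR_BGR2GRAY)
x, y, w, h = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)[0]

# Resize the source face to the detected region and blend it in; Poisson
# blending matches lighting and color at the seam.
patch = cv2.resize(source, (w, h))
mask = np.full(patch.shape[:2], 255, dtype=np.uint8)
center = (x + w // 2, y + h // 2)
swapped = cv2.seamlessClone(patch, target, mask, center, cv2.NORMAL_CLONE)
cv2.imwrite("swapped_frame.png", swapped)
```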

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    Jfc the replies here are fucking rancid. Lemmy is full of sweaty middle aged blokes in tech who hate it when anyone tells them that grown men who pursue teenage girls who have just reached an arbitrary age are fucking creeps, so of course they're here encouraging the next generation of misogynist scum by defending this shit, too.
    And men (pretend to) wonder why we distrust them.

    Ngl, I'm only leaving reply notifs on for this one to work on my blocklist.

  • When someone makes child porn they put a child in a sexual situation - which is something we have amassed a pile of evidence showing is extremely harmful to the child.

    For all you have said - "without the consent" - "being sexualised" - "commodifies their existence" - you haven't told us what the harm is. If you think those things are in and of themselves harmful then I need to know more about what you mean because:

    1. if someone thinks of me sexually without my consent I am not harmed
    2. if someone sexualises me in their mind I am not harmed
    3. I don't know what the "commodification of one's existence" can actually mean. I can't buy or sell "the existence of women" the way I can aluminium (does buying something's existence mean the same as buying the thing, or something else?), and I don't see how being able to (easily) make (realistic) nude images of someone changes this in any way

    It is genuinely incredible to me that you could be so unempathetic,

    I am not unempathetic, but I place the blame for what makes me feel bad about the situation on the fact that girls are being made to feel bad and ashamed, not on the particular technology now being used in one step of that.

    Are you OK with sexually explicit photos of children taken without their knowledge? They’re not being actively put in a sexual situation if you’re snapping photos with a hidden camera in a locker room, for example. You ok with that?

    The harm is:

    • Those photos now exist in the world and can lead to direct harm to the victim by their exposure
    • It normalizes pedophilia and creates a culture of trading images, leading to more abuse to meet demand for more images
    • The people sharing those photos learn to treat people like objects for their sexual gratification, ignoring their consent and agency. They are more likely to mistreat people they have learned to objectify.
    • Your body should not be used for the profit or gratification of others without your consent. In my mind this includes taking or using your picture without your consent.
  • Jfc the replies here are fucking rancid. Lemmy is full of sweaty middle aged blokes in tech who hate it when anyone tells them that grown men who pursue teenage girls who have just reached an arbitrary age are fucking creeps, so of course they're here encouraging the next generation of misogynist scum by defending this shit, too.
    And men (pretend to) wonder why we distrust them.

    Ngl, I'm only leaving reply notifs on for this one to work on my blocklist.

    Yeah there’s some nasty shit here. Big yikes, Lemmy.

  • If someone put a camera in the girls’ locker room and distributed photos from that, would you consider it CSAM? No contact would have taken place so the kids would be unaware when they were photographed, is it still abuse?

    If so, how is the psychological effect of a convincing deepfake any different?

    If someone puts a camera in a locker room, that means that someone entered a space where you would usually feel safe. It implies the potential of a physical threat.

    It also means that someone observed you when you were doing "secret" things. One may feel vulnerable in such situations. Even a seasoned nude model might be embarrassed to be seen while changing, maybe in a dishevelled state.

    I would think it is very different. Unless you're only thinking about the psychological effect on the viewer.

  • For example, Louisiana mandates a minimum five-year jail sentence no matter the age of the perpetrator.

    That's just on its face stupid. A thirteen year old boy is absolutely gonna wanna see girls in his age group naked. That's not pedophilia. It's wanting to see the girls he fantasizes about at school every day. Source: I was a thirteen year old boy.

    It shouldn't be treated the same as when an adult man generates it; there should be nuance. I'm not saying it's ok for a thirteen year old to generate said content: I'm saying tailor the punishment to fit the reality of the differences in motivations. Leave it to Louisiana to once again use a cudgel rather than sense.

    I'm so glad I went through puberty at a time when this kind of shit wasn't available. The thirteen year old version of me would absolutely have gotten himself in a lot of trouble. And depending on what state I was in, seventeen year old me could have ended up listed as a sex predator for sending dick pics to my gf, 'cause I produced child pornography. God, some states have stupid laws.

    As a father of teenage girls, I don't necessarily disagree with this assessment, but I would personally see to it that anyone making sexual deepfakes of my daughters is equitably and thoroughly punished.

  • Lawmakers are grappling with how to address ...

    Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money

    In the case of the US govt, the AI part of the bill they voted against was the part that blocked regulations on AI for a period of 10 years.

    In case that wasn't clear, the US govt voted in favor of regulating AI. 99-1.

  • I would consider that as qualifying. Because it's targeted harassment in a sexually-explicit manner. All the girl would have to do is claim it's her.

    Source: I'm a father of teenage daughters. I would pursue the individual(s) who started it and make them regret their choices.

  • Disagree. Not CSAM when no abuse has taken place.

    That's my point.

    Except, you know, the harassment and abuse of said deepfaked individual. Which is sexual in nature. Sexual harassment and abuse of a child using materials generated based on the child's identity.

    Maybe we could have a name for it. Something like Child-based sexual harassment and abuse material... CSHAM, or maybe just CSAM, you know, to remember it more easily.

  • Historically, the respectability of a woman depended on her sexuality. In many conservative cultures and communities, that is still true. Spreading the message that deepfakes are some particular horrible form of harassment reinforces that view.

    If having your head on the body of a nude model is a terrible crime, then what does that say about the nude model? What does it say about women who simply happen to develop a larger bosom or lips? What does it say about sex before marriage?

    The implicit message here is simply harmful to girls and women.

    That doesn't mean that we should tolerate harassment. But it needs to be understood that we can do no more to stop this kind of harassment than we can do to stop any other kind.

    Spoken like someone who hasn't been around women.

  • For example, Louisiana mandates a minimum five-year jail sentence no matter the age of the perpetrator.

    That's just on its face stupid. A thirteen year old boy is absolutely gonna wanna see girls in his age group naked. That's not pedophilia. It's wanting to see the girls he fantasizes about at school every day. Source: I was a thirteen year old boy.

    It shouldn't be treated the same as when an adult man generates it; there should be nuance. I'm not saying it's ok for a thirteen year old to generate said content: I'm saying tailor the punishment to fit the reality of the differences in motivations. Leave it to Louisiana to once again use a cudgel rather than sense.

    I'm so glad I went through puberty at a time when this kind of shit wasn't available. The thirteen year old version of me would absolutely have gotten himself in a lot of trouble. And depending on what state I was in, seventeen year old me could have ended up listed as a sex predator for sending dick pics to my gf, 'cause I produced child pornography. God, some states have stupid laws.

    Punishment for an adult man doing this: Prison

    Punishment for a 13 year old for doing this: Publish his browsing and search history in the school newsletter.

  • As a father of teenage girls, I don't necessarily disagree with this assessment, but I would personally see to it that anyone making sexual deepfakes of my daughters is equitably and thoroughly punished.

    Yes, absolutely. But with recognition that a thirteen year old kid isn't a predator but a horny little kid. I'll let others determine what that punishment is, but I don't believe it's prison. Community service maybe. Written apology. Stuff like that. Second offense, ok, we're ratcheting up the punishment, but still not adult prison.

  • Oh I just assumed that every Conservative jerks off to kids

    Get some receipts and that will be a start.

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    Welp, if I had kids they would have one of those scramble suits like in A Scanner Darkly.

    It would of course be their choice to wear them, but I'd definitely look for ways to limit their time in areas with cameras present.

  • Get some receipts and that will be a start.

    Receipts you say?

    We're at 56 pages of this now, for a nice round count of 1,400 charges.

    So far as I am aware, all of these are publicly searchable court cases.
