
Kids are making deepfakes of each other, and laws aren’t keeping up

Technology
  • For example, Louisiana mandates a minimum five-year jail sentence no matter the age of the perpetrator.

    That's just, on its face, stupid. A thirteen-year-old boy is absolutely gonna wanna see girls in his age group naked. That's not pedophilia. It's wanting to see the girls he fantasizes about at school every day. Source: I was a thirteen-year-old boy.

    It shouldn't be treated the same as when an adult man generates it; there should be nuance. I'm not saying it's OK for a thirteen-year-old to generate said content; I'm saying tailor the punishment to fit the reality of the differences in motivation. Leave it to Louisiana to once again use a cudgel rather than sense.

    I'm so glad I went through puberty at a time when this kind of shit wasn't available. The thirteen-year-old version of me would absolutely have gotten himself into a lot of trouble. And depending on what state I was in, seventeen-year-old me could have ended up listed as a sex predator for sending dick pics to my gf, because I produced child pornography. God, some states have stupid laws.

    In general, even up here in woke-ville, punishments have gotten a lot more strict for kids. There’s a lot more involvement of police, courts, jail. As a parent it causes me a lot of anxiety - whatever happened to school being a “sandbox” where a kid can make mistakes without adult consequences, without ruining their lives? Did that ever exist?

  • Parents are responsible for their kids. The punishment, with the full force of the law (and maybe something extra for good measure), should fall upon the parents, since they should have made sure their kids knew how despicable and illegal doing this is.

    Yeah, I agree, we shouldn't ruin the boy's life; we should ruin his whole family to many times the extent something like this ruins a teen girl's life.

    Yeah, I agree, we shouldn’t ruin the boy’s life; we should ruin his whole family to many times the extent something like this ruins a teen girl’s life.

    You're a fucking asshole. This isn't like prosecuting parents who let a school shooter have access to guns. The internet is everywhere. Parents are responsible for bringing up their children to be socially responsible. A thirteen-year-old kid is anything but responsible (I mean their mentality/maturity; I'm not giving them a pass).

    Go hang out with conservatives who want more policing. Over here, we'll talk about social programs you fucking prick.

  • Cheers for the explanation, had no idea that's how it works.

    So it's even worse than @danciestlobster@lemmy.zip thinks: the person creating the deepfake would then have to have access to CP if they want to deepfake it!

    There are adults with bodies that resemble underage people that could be used to train models. Kitty Yung has a body that would qualify. You don't necessarily need to train on illegal material to get illegal output.

  • That's just called the outside now. Assume you are on camera at all times the moment you step out the front door. To be safe in the surveillance state we live in today, you'd best act as though you are being recorded in your own home as well.

    You can make areas safe from cameras. No, you can't make everywhere camera-free, but you can minimize your time in those areas. I'm not saying it's a good system; it would just be adjusting to the times.

    If the floor was lava and all that...

  • That's what Muslims do with niqabs.

    Don't trivialize the scramble suit, ok

  • Probably because there's a rapist in the White House.

    To add to that: I live in a red area, and since the election I’ve been catcalled much more. And it’s weird too, ’cause I’m middle-aged… I thought I’d finally disappear…

  • To add to that: I live in a red area, and since the election I’ve been catcalled much more. And it’s weird too, ’cause I’m middle-aged… I thought I’d finally disappear…

    The toxic manosphere/blogosphere/whatever it's called has done so much lifelong damage.

  • Punishment for an adult man doing this: Prison

    Punishment for a 13-year-old for doing this: publish his browsing and search history in the school newsletter.

    13 year old: “I'll just take the death penalty, thanks."

  • Hm. I wasn’t expecting the pro-child porn argument. All I can say is that’s absolutely legally and morally CSAM, and you’re fuckin nasty. Oof. Not really gonna bother with the rest because, well, yikes.

    Hey, it's OK to say you just don't have any counter-argument instead of making blatantly false characterisations.

  • Cheers for the explanation, had no idea that's how it works.

    So it's even worse than @danciestlobster@lemmy.zip thinks: the person creating the deepfake would then have to have access to CP if they want to deepfake it!

    AI can generate images of things that don't even exist. If it knows what porn looks like and what a child looks like, it can combine those concepts.

  • I'm fairly well versed in tech and home labbing, but I've never heard of tools that do this, generate images, etc. Not good ones, anyhow. I could use that type of generation for business marketing, to develop business cards, marketing materials. NOT FOR PEOPLE GENERATION. Anyone have a list of the best tools? GPT sucks at this, I've tried.

    Take a look at InvokeAI.
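
    To expand on that: if you'd rather script it than use a GUI like InvokeAI, the Hugging Face diffusers library is a common starting point for local generation. A minimal sketch follows; the model checkpoint, prompt, and CUDA assumption are illustrative choices of mine, not anything the thread specifies:

```python
# Minimal local text-to-image sketch using Hugging Face diffusers.
# Assumes a CUDA GPU and the stabilityai/stable-diffusion-2-1 checkpoint;
# swap in whatever model suits your hardware and license needs.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",
    torch_dtype=torch.float16,
).to("cuda")

# Marketing-style placeholder prompt, per the business-card use case above.
prompt = "flat-lay product photo of a minimalist business card, studio lighting"
image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
image.save("business_card_concept.png")
```

    InvokeAI itself wraps this same class of models behind a web UI, so the GUI route and the script route produce comparable results.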

  • Spoken like someone who hasn't been around women.

    You mean like a nerd who reads too much?

  • Suppose I'm a teenager attracted to people my age. Or suppose I'm medically a pedophile, which is not a crime; then I would need that.

    In any case, for legal and moral purposes "why would you want" should be answered only with "not your concern, go eat shit and die".

    I feel like you didn't read my comment thoroughly enough. I said it can constitute CSAM. There is a surprising amount of leeway for teenagers, of course.

    But no, I'm not gonna let you get away that easily. I want to know why you think it's morally okay for an adult to draw sexually explicit images of children. Please, tell me how that's okay?

  • I feel like you didn't read my comment thoroughly enough. I said it can constitute CSAM. There is a surprising amount of leeway for teenagers, of course.

    But no, I'm not gonna let you get away that easily. I want to know why you think it's morally okay for an adult to draw sexually explicit images of children. Please, tell me how that's okay?

    Because morally it's not your fucking concern what others are doing in the supposed privacy of their personal spaces.

    It seems to be a very obvious thing: your nose doesn't belong there, and you shouldn't stick it there.

    But no, I’m not gonna let you get away that easily.

    I don't need any getting away from you, you're nothing.

  • Yes, absolutely. But with recognition that a thirteen-year-old kid isn't a predator but a horny little kid. I'll let others determine what that punishment is, but I don't believe it's prison. Community service, maybe. A written apology. Stuff like that. Second offense, OK, we're ratcheting up the punishment, but still not adult prison.

    In a properly functioning world, this could easily be coupled with targeted education on power dynamics and a lesson on consent, giving proper attention to why this might be more harmful to her than to him.

    Of course – so long as we're in this hypothetical world – you'd just have that kind of education be part of sex ed or the like for all students to begin with, but, as we're in this world and that's Louisiana…

  • It's not a matter of feeling ashamed; it's a matter of literally feeling like your value to the world is dictated by your role in the sexualities of heterosexual boys and men. It is feeling like your own body doesn't belong to you but can be freely claimed by others. It is losing trust in all your male friends and peers, because it feels like, without you knowing, they've already decided that you're a sexual experience for them.

    Why is it these things? Why does someone doing something with something which is not your body make it feel like your body doesn't belong to you? Why does it not instead make it feel like images of your body don't belong to you? Several of these things could equally be used to describe the situation when someone is fantasised about without their knowledge - why is that different?
    In Germany there's a legal concept called "right to one's own image" but there isn't in many other countries, and besides, what you're describing goes beyond this.

    My thinking behind these questions is that I cannot see anything inherent, anything necessary, about the creation of fake sexual images of someone which leads to these harms, and that instead there is an aspect of our society which very explicitly punishes and shames people - women far more so than men - for being in this situation, and that without that, we would be having a very different conversation.

    Starting from the position that the harm is in the creation of the images is like starting from the position that the harm of rape is in "defiling" the person raped. Rape isn't wrong because it makes you worthless to society - society is wrong for devaluing rape victims. Society is wrong for devaluing and shaming those who have fake images made of them.

    We do know the harm of this kind of sexualization. Women and girls have been talking about it for generations. This isn't new, just a new, streamlined way to spread it. It should be illegal.

    Can you be more explicit about what it's the same as?

    The sexualization of women and girls is pervasive across literally every level of western culture. What do you think the purpose is of the victim's head and face being in the image? Do you believe that it plays an incidental and unrelated role? Do you believe that finding out that there is an entire group of people who you thought were your friends but are in actuality taking pictures of your head and masturbating to the idea of you performing sex acts for them, using algorithmically derived likenesses of your naked body, has no psychological consequences for you whatsoever? I'm just talking about it and it makes me want to throw up. It is a fucking nightmare. This is not normal. This is not creating a healthy relationship with sexuality, and it is enforcing a view of women and their bodies as existing for the gratification of men.

    You continuously attempt to extrapolate some very bizarre metaphors about this that are not at all applicable. This scenario is horrifying. Teenage girls should not be subject to scenarios like this. It is sexual exploitation. It is dehumanization. It promotes misogynistic views of women. This is NOT a matter of sexual liberation. You're essentially saying that men and boys can't be expected to treat girls and women as actual people and instead must be allowed to turn their friends and peers into fetishized media content they can share amongst each other. That's fucking disgusting. The longer you talk, the more you start to sound like an incel. I'm not saying you are one, but this is the kind of behavior that they defend.

  • Because morally it's not your fucking concern what others are doing in the supposed privacy of their personal spaces.

    It seems to be a very obvious thing: your nose doesn't belong there, and you shouldn't stick it there.

    But no, I’m not gonna let you get away that easily.

    I don't need any getting away from you, you're nothing.

    No. That's not a good enough excuse to potentially be abusing children.

    I can't think of a single good reason to draw those kinds of things. Like at all. Please, give me a single good reason.

  • The sexualization of women and girls is pervasive across literally every level of western culture. What do you think the purpose is of the victim's head and face being in the image? Do you believe that it plays an incidental and unrelated role? Do you believe that finding out that there is an entire group of people who you thought were your friends but are in actuality taking pictures of your head and masturbating to the idea of you performing sex acts for them, using algorithmically derived likenesses of your naked body, has no psychological consequences for you whatsoever? I'm just talking about it and it makes me want to throw up. It is a fucking nightmare. This is not normal. This is not creating a healthy relationship with sexuality, and it is enforcing a view of women and their bodies as existing for the gratification of men.

    You continuously attempt to extrapolate some very bizarre metaphors about this that are not at all applicable. This scenario is horrifying. Teenage girls should not be subject to scenarios like this. It is sexual exploitation. It is dehumanization. It promotes misogynistic views of women. This is NOT a matter of sexual liberation. You're essentially saying that men and boys can't be expected to treat girls and women as actual people and instead must be allowed to turn their friends and peers into fetishized media content they can share amongst each other. That's fucking disgusting. The longer you talk, the more you start to sound like an incel. I'm not saying you are one, but this is the kind of behavior that they defend.

    Do you believe that finding out that there is an entire group of people who you thought were your friends but are in actuality taking pictures of your head and masturbating to the idea of you performing sex acts for them, using algorithmically derived likenesses of your naked body, has no psychological consequences for you whatsoever?

    Do you think the consequences of finding out are significantly different than finding out they're doing it in their imagination? If so, why?

    You're essentially saying that men and boys can't be expected to treat girls and women as actual people and instead must be allowed to turn their friends and peers into fetishized media content they can share amongst each other.

    And, just to be clear, by this you mean the stuff with pictures, not talking or thinking about them? Because, again, the words "media content" just don't seem to be key to any harm being done.

    Your approach is consistently to say "this is harmful, this is disgusting", but not to say why. Likewise, you say that the "metaphors are not at all applicable", but you don't say at all what the important difference is between "people who you thought were your friends but are in actuality taking pictures of your head and masturbating to the idea of you performing sex acts for them using algorithmically derived likenesses of your naked body" and "people who you thought were your friends but are in actuality imagining your head and masturbating to the idea of you performing sex acts for them using imagined likenesses of your naked body". Both acts are sexualisation, both are done without consent, and both could cause poor treatment by the people doing it.

    I see two possibilities: either you see this as so obviously and fundamentally wrong that you don't have a way of describing why, or you know that the two scenarios are fundamentally similar but know that the idea of thought-crime is unsustainable.

    Finally, it's necessary to address the gendered way you're talking about this. While obviously there is a huge discrepancy between male perpetrators and female victims of sexual abuse and crimes, you make it sound like you think this is only a problem because, or when, it affects women and girls. You should probably think about that, because for years we've been making deserved progress at making things gender-neutral, and I doubt you'd accept this kind of thing in other areas.

  • No. That's not a good enough excuse to potentially be abusing children.

    I can't think of a single good reason to draw those kinds of things. Like at all. Please, give me a single good reason.

    No. That’s not a good enough excuse to potentially be abusing children.

    It's good enough for the person whose opinion counts; yours doesn't. And there's no such potential.

    I can’t think of a single good reason to draw those kinds of things. Like at all.

    Too bad.

    Please, give me a single good reason.

    To reinforce that your opinion doesn't count is in itself a good reason. The best of them all really.

  • In general, even up here in woke-ville, punishments have gotten a lot more strict for kids. There’s a lot more involvement of police, courts, jail. As a parent it causes me a lot of anxiety - whatever happened to school being a “sandbox” where a kid can make mistakes without adult consequences, without ruining their lives? Did that ever exist?

    it existed if society liked you enough.

    fascists just have a habit of tightening that belt smaller and smaller, is what’s going on.

  • Sure, the internet is more practical, but consider the odds of being caught during the time required to execute a decent strike plan, even one as vague as: "we're going to Amerika and we're going to hit 50 high-profile targets on July 4th, one in every state" (dear NSA analyst, this is entirely hypothetical). Your agents spread to the field and start assessing from the ground the highest-impact targets attainable with their resources, with extensive back and forth between the field and central command daily for 90 days of prep, all carried out on 270 different active social media channels as innocuous-looking photo exchanges, with 540 pre-arranged algorithms hiding the messages in the noise of the image bits. Chances of security agencies picking this up from the communication itself? About 100x less than them noticing 50 teams of activists deployed to 50 states at roughly the same time, even if they never communicate anything.

    HF (more often called shortwave) is well suited for the numbers game: a deep-cover agent lying in wait, potentially for years, whose only "tell" is their odd habit of listening to the radio most nights. All they're waiting for is a binary message: if you hear the sequence 3 17 22, you are to make contact for further instructions. That message may come at any time, or may not come for a decade. These days you would make your contact for further instructions via the internet, and sure, it would be more practical to hide the "make contact" signal in the internet too, but shortwave is a longstanding tech with known operating parameters.
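
    Incidentally, "hiding the messages in the noise of the image bits" is classic least-significant-bit (LSB) steganography. The comment doesn't specify any concrete scheme, so this is just a minimal textbook sketch in Python with Pillow; the red-channel choice and length-prefix framing are my own assumptions:

```python
# Least-significant-bit (LSB) steganography sketch using Pillow.
# Hides a UTF-8 message in the lowest bit of each pixel's red channel --
# the "noise" bits that a casual viewer won't notice.
from PIL import Image

def hide(cover_path: str, message: str, out_path: str) -> None:
    img = Image.open(cover_path).convert("RGB")
    data = message.encode("utf-8")
    # Length-prefix the payload so the extractor knows where to stop.
    payload = len(data).to_bytes(4, "big") + data
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    pixels = list(img.getdata())
    if len(bits) > len(pixels):
        raise ValueError("message too large for this cover image")
    for i, bit in enumerate(bits):
        r, g, b = pixels[i]
        pixels[i] = ((r & ~1) | bit, g, b)  # overwrite the red LSB
    img.putdata(pixels)
    img.save(out_path, "PNG")  # must be lossless, or the bits get crushed

def reveal(stego_path: str) -> str:
    pixels = list(Image.open(stego_path).convert("RGB").getdata())
    bits = [r & 1 for r, _, _ in pixels]
    raw = bytes(
        int("".join(map(str, bits[i:i + 8])), 2)
        for i in range(0, len(bits) - 7, 8)
    )
    length = int.from_bytes(raw[:4], "big")
    return raw[4:4 + length].decode("utf-8")
```

    Real-world schemes spread the payload pseudo-randomly across pixels and channels (presumably what the "540 pre-arranged algorithms" would vary), but the principle is the same.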
  • New Orleans debates real-time facial recognition legislation

    Palantir had a contract with New Orleans starting around 2012 to create their predictive policing tech, which scans surveillance cameras for very vague details and still misidentifies people. It's very similar to Lavender, the tech they use to identify members of Hamas and attack them with drones; that system misidentifies targets ~10% of the time according to the IDF (and the real misidentification rate is likely much higher than 10%). Palantir picked Louisiana over somewhere like San Francisco because they knew it would be a lot easier to violate rights and privacy here and get away with it.

    Whatever they decide in New Orleans on Thursday, during this Council meeting that nobody cares about, will likely be the first on-the-books legal basis of its kind to track civilians in the U.S., and it would allow the federal government to take control over that ability whenever they want. This could also set a precedent for use in other states. Guess who's running the entire country right now, and who just gave high-ranking army contracts to Palantir employees for "no reason" while Palantir is also receiving a multimillion-dollar federal contract to create an insane database on every American, as giant data centers are being built all across the country.
  • Acute Leukemia Burden Trends and Future Predictions

    Looks like the delay in 2011 was so big the data became available after the 2017 one
  • Catbox.moe got screwed 😿

    archrecord@lemm.ee:
    I'll gladly give you a reason. I'm actually happy to articulate my stance on this, considering how much I tend to care about digital rights.

    Services that host files should not be held responsible for what users upload, unless:

      • The service explicitly caters to illegal content by definition or practice (i.e. if the website is literally titled uploadyourcsamhere[.]com, then it's safe to assume they deliberately want to host illegal content), or
      • The service has a very easy mechanism to remove illegal content, either when asked or through simple monitoring systems, but chooses not to do so (catbox does this, and quite quickly too).

    Because holding services responsible creates a whole host of negative effects. Here are some examples:

      • Someone starts a CDN and some users upload CSAM. The creator of the CDN goes to jail now. Nobody ever wants to create a CDN because of the legal risk, and thus the only providers of CDNs become shady, expensive, anonymously run services with no compliance mechanisms.
      • You run a site that hosts images, and someone decides they want to harm you. They upload CSAM, then report the site to law enforcement. You go to jail. Anybody in the future who wants to run an image-sharing site must now self-censor to try not to upset any human being that could be willing to harm them via their site.
      • A social media site is hosting the posts and content of users. In order to be compliant and not go to jail, they must engage in extremely strict filtering, otherwise even one mistake could land them in jail. All users of the site are prohibited from posting any NSFW or even suggestive content (including newsworthy media, such as an image of bodies in a warzone), and any violation leads to an instant ban, because any of those things could lead to a chance of actually illegal content being attached.

    This isn't just my opinion either. Digital rights organizations such as the Electronic Frontier Foundation have talked at length about similar policies before. To quote them: "When social media platforms adopt heavy-handed moderation policies, the unintended consequences can be hard to predict. For example, Twitter's policies on sexual material have resulted in posts on sexual health and condoms being taken down. YouTube's bans on violent content have resulted in journalism on the Syrian war being pulled from the site. It can be tempting to attempt to 'fix' certain attitudes and behaviors online by placing increased restrictions on users' speech, but in practice, web platforms have had more success at silencing innocent people than at making online communities healthier."

    Now, to address the rest of your comment, since I don't just want to focus on the beginning:

        I think you have to actively moderate what is uploaded

    Catbox does, and as previously mentioned, often at a much higher rate than other services, and at a comparable rate to many services that have millions, if not billions, of dollars in annual profits that could otherwise be spent on further moderation.

        there has to be swifter and stricter punishment for those that do upload things that are against TOS and/or illegal.

    The problem isn't necessarily the speed at which people can be reported and punished, but rather that the internet is fundamentally harder to track people on than real life. It's easy for cops to sit around at a spot where they know someone will be physically distributing illegal content, but digitally, even if you can see the feed of all the information passing through the service, a VPN or Tor connection will anonymize your IP address in a manner that most police departments won't be able to track, and that most three-letter agencies will simply have a relatively low success rate with. There's no good solution to this problem of identifying perpetrators, which is why platforms so frequently focus on moderation over legal enforcement actions against users. Moderation accomplishes the goal of preventing and removing the content without having to, for example, require every single user of the internet to scan an ID (and also magically prevent people from just stealing other people's access tokens and impersonating their ID).

    I do agree, however, that we should probably provide larger amounts of funding, training, and resources to divisions whose sole goal is to go after online distribution of various illegal content, primarily that which harms children, because it's certainly still an issue of there being too many reports to go through, even if many of them will still lead to dead ends.

    I hope that explains why making file hosting services liable for user-uploaded content probably isn't the best strategy. I hate to see people with good intentions support ideas that sound good in theory but in the end just cause more untold harms, and I hope you can understand why I believe this to be the case.
  • The topic is more nuanced: all the logs indicate email/password combos that were compromised. While it's possible this is due to a malware infection, it could be something as simple as a phishing website; in that case, credentials were entered but no "malware" was installed. The point being, it doesn't look great that someone has ANY compromises... but then again, anyone who's used the internet a bit has some compromised credentials. For example, a password manager (especially the one on iPhone) will often notify you of all your potentially compromised accounts.
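
    As an aside, those password-manager warnings are typically powered by the Have I Been Pwned "Pwned Passwords" range API, which uses k-anonymity: only the first five hex characters of the password's SHA-1 hash ever leave your machine. A minimal sketch; the endpoint is HIBP's real one, everything else is illustrative:

```python
# Check a password against the Have I Been Pwned range API.
# Only a 5-character hash prefix is sent, so the service never
# sees the password or even its full SHA-1 hash (k-anonymity).
import hashlib
import urllib.request

def pwned_count(password: str) -> int:
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    req = urllib.request.Request(
        f"https://api.pwnedpasswords.com/range/{prefix}",
        headers={"User-Agent": "pwned-check-sketch"},
    )
    with urllib.request.urlopen(req) as resp:
        body = resp.read().decode("utf-8")
    # Each response line is "HASH_SUFFIX:COUNT"; match ours locally.
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate.strip() == suffix:
            return int(count)
    return 0  # not present in the breach corpus

if __name__ == "__main__":
    n = pwned_count("password123")  # demo only; never hardcode real secrets
    print(f"seen {n:,} times in breaches" if n else "not found")
```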