
Kids are making deepfakes of each other, and laws aren’t keeping up

Technology
  • "thought crime"? And you have the balls to talk about using words "by their meaning"?

    This is a concrete action with a product to show for it, not a thought, and it impacts someone's life negatively without their consent, with potentially devastating consequences for the victim. So, can you please use words by their meaning?

    Edit: I jumped the gun when I read "thought crime", effectively disregarding the context. As such, I'm scratching the parts of my comment that don't apply, and leaving the ones that do apply (not necessarily to the post I was replying to, but to the whole thread).

    The author of those comments wrote, several times, about what they believe goes on in other people's heads and how that should be prevented, or something along those lines.

    Can you please stop interpreting my words exactly the way you like? That's not worth a gram of horse shit.

  • Normally yeah, but why would you want to draw sexual pictures of children?

    Suppose I'm a teenager attracted to people my age. Or suppose I'm clinically a pedophile, which in itself is not a crime, and then I would need that.

    In any case, for legal and moral purposes "why would you want" should be answered only with "not your concern, go eat shit and die".

  • ruining the life of a 13 year old boy for the rest of his life with no recourse

    And what about the life of the girl this boy would have ruined?

    This is not "boys will be boys" shit. Girls have killed themselves over this kind of thing (I have personal experience with suicidal teenage girls, both as a past friend and as a father).

    I don't think it's unreasonable to expect an equivalent punishment that has the potential to ruin his life.

    Fake pictures do not ruin your life… sorry…

    Our puritanical / 100% sex culture is the problem, not fake pictures…

  • For example, Louisiana mandates a minimum five-year jail sentence no matter the age of the perpetrator.

    That's just on its face stupid. A thirteen-year-old boy is absolutely gonna wanna see girls in his age group naked. That's not pedophilia. It's wanting to see the girls he fantasizes about at school every day. Source: I was a thirteen-year-old boy.

    It shouldn't be treated the same as when an adult man generates it; there should be nuance. I'm not saying it's ok for a thirteen year old to generate said content: I'm saying tailor the punishment to fit the reality of the differences in motivations. Leave it to Louisiana to once again use a cudgel rather than sense.

    I'm so glad I went through puberty at a time when this kind of shit wasn't available. The thirteen-year-old version of me would absolutely have gotten himself into a lot of trouble. And depending on what state I was in, seventeen-year-old me could have ended up listed as a sex predator for sending dick pics to my gf because I produced child pornography. God, some states have stupid laws.

    In general, even up here in woke-ville, punishments have gotten a lot more strict for kids. There’s a lot more involvement of police, courts, jail. As a parent it causes me a lot of anxiety - whatever happened to school being a “sandbox” where a kid can make mistakes without adult consequences, without ruining their lives? Did that ever exist?

  • Parents are responsible for their kids. The punishment, with the full force of the law (and maybe something extra for good measure), should fall upon the parents, since they should have made sure their kids knew how despicable and illegal doing this is.

    Yeah, I agree, we shouldn't ruin the boy's life, we should ruin his whole family to many times the extent something like this ruins a teen girl's life.

    You're a fucking asshole. This isn't like prosecuting parents who let a school shooter have access to guns. The internet is everywhere. Parents are responsible for bringing up their children to be socially responsible. A thirteen-year-old kid is anything but responsible (I mean their mentality / maturity, I'm not giving them a pass).

    Go hang out with conservatives who want more policing. Over here, we'll talk about social programs you fucking prick.

  • Cheers for the explanation, had no idea that's how it works.

    So it's even worse than @danciestlobster@lemmy.zip thinks, the person creating the deep fake has to have access to CP then if they want to deepfake it!

    There are adults with bodies that resemble underage people that could be used to train models. Kitty Yung has a body that would qualify. You don't necessarily need to use illegal material to train to get illegal output.

  • That's just called the outside now. Assume you are on camera at all times the moment you step out the front door. To be safe in the surveillance we live in today, best act as though you are being recorded in your own home as well.

    You can make areas safe from cameras. No, you can't make everywhere camera-free, but you can minimize your time in those areas. I'm not saying it's a good system; it would just be adjusting to the times.

    If the floor was lava and all that...

  • That's what Muslims do with niqabs.

    Don't trivialize the scramble suit, ok

  • probably because there's a rapist in the white house.

    To add to that. I live in a red area and since the election I’ve been cat called much more. And it’s weird too, cus I’m middle aged…. I thought I’d finally disappear…

  • To add to that. I live in a red area and since the election I’ve been cat called much more. And it’s weird too, cus I’m middle aged…. I thought I’d finally disappear…

    the toxic manosphere/blogosphere/whatever it's called has done so much lifelong damage

  • Punishment for an adult man doing this: Prison

    Punishment for a 13 year old boy doing this: Publish his browsing and search history in the school newsletter.

    13 year old: “I'll just take the death penalty, thanks.”

  • Hm. I wasn’t expecting the pro-child porn argument. All I can say is that’s absolutely legally and morally CSAM, and you’re fuckin nasty. Oof. Not really gonna bother with the rest because, well, yikes.

    Hey, it's OK to say you just don't have any counter-argument instead of making blatantly false characterisations.

  • Cheers for the explanation, had no idea that's how it works.

    So it's even worse than @danciestlobster@lemmy.zip thinks, the person creating the deep fake has to have access to CP then if they want to deepfake it!

    AI can generate images of things that don't even exist. If it knows what porn looks like and what a child looks like, it can combine those concepts.

  • I'm fairly well versed in tech and home labbing. I've never heard of tools that do this, generate images, etc. Not good ones anyhow. I could use that type of generation for business marketing, to develop business cards and marketing materials. NOT FOR PEOPLE GENERATION. Anyone have a list of the best tools? GPT sucks at doing this; I've tried.

    Take a look at InvokeAI.

  • Spoken like someone who hasn't been around women.

    You mean like a nerd who reads too much?

  • Suppose I'm a teenager attracted to people my age. Or suppose I'm medically a pedophile, which is not a crime, and then I would need that.

    In any case, for legal and moral purposes "why would you want" should be answered only with "not your concern, go eat shit and die".

    I feel like you didn't read my comment thoroughly enough. I said it can constitute CSAM. There is a surprising amount of leeway for teenagers, of course.

    But no, I'm not gonna let you get away that easily. I want to know why you think it's morally okay for an adult to draw sexually explicit images of children. Please, tell me how that's okay?

  • I feel like you didn't read my comment thoroughly enough. I said it can constitute CSAM. There is a surprising amount of leeway for teenagers, of course.

    But no, I'm not gonna let you get away that easily. I want to know why you think it's morally okay for an adult to draw sexually explicit images of children. Please, tell me how that's okay?

    Because morally it's not your fucking concern what others are doing in the supposed privacy of their personal spaces.

    It seems to be a very obvious thing: your nose doesn't belong there, and you shouldn't stick it there.

    But no, I’m not gonna let you get away that easily.

    I don't need any getting away from you, you're nothing.

  • Yes, absolutely. But with recognition that a thirteen year old kid isn't a predator but a horny little kid. I'll let others determine what that punishment is, but I don't believe it's prison. Community service maybe. Written apology. Stuff like that. Second offense, ok, we're ratcheting up the punishment, but still not adult prison.

    In a properly functioning world, this could easily be coupled with particular education on power dynamics and a lesson on consent, giving proper attention to why this might be more harmful to her than to him.

    Of course, so long as we're in this hypothetical world, you'd just have that kind of education be a part of sex ed or the like for all students to begin with, but, as we're in this world and that's Louisiana…

  • It's not a matter of feeling ashamed, it's a matter of literally feeling like your value to the world is dictated by your role in the sexualities of heterosexual boys and men. It is feeling like your own body doesn't belong to you but can be freely claimed by others. It is losing trust in all your male friends and peers, because it feels like without you knowing they've already decided that you're a sexual experience for them.

    Why is it these things? Why does someone doing something with something which is not your body make it feel like your body doesn't belong to you? Why does it not instead make it feel like images of your body don't belong to you? Several of these things could equally be used to describe the situation when someone is fantasised about without their knowledge - why is that different?
    In Germany there's a legal concept called "right to one's own image" but there isn't in many other countries, and besides, what you're describing goes beyond this.

    My thinking behind these questions is that I cannot see anything inherent, anything necessary about the creation of fake sexual images of someone which leads to these harms, and that instead there is an aspect of our society which very explicitly punishes and shames people - women far more so than men - for being in this situation, and that without that, we would be having a very different conversation.

    Starting from the position that the harm is in the creation of the images is like starting from the position that the harm of rape is in "defiling" the person raped. Rape isn't wrong because it makes you worthless to society - society is wrong for devaluing rape victims. Society is wrong for devaluing and shaming those who have fake images made of them.

    We do know the harm of this kind of sexualization. Women and girls have been talking about it for generations. This isn't new, just a new, streamlined way to spread it. It should be illegal.

    Can you be more explicit about what it's the same as?

    The sexualization of women and girls is pervasive across literally every level of western culture. What do you think the purpose is of the victim's head and face being in the image? Do you believe that it plays an incidental and unrelated role? Do you believe that finding out that there is an entire group of people who you thought were your friends but are in actuality taking pictures of your head and masturbating to the idea of you performing sex acts for them using algorithmically derived likenesses of your naked body, has no psychological consequences for you whatsoever? I'm just talking about it and it makes me want to throw up. It is a fucking nightmare. This is not normal. This is not creating a healthy relationship with sexuality and it is enforcing a view of women and their bodies existing for the gratification of men.

    You continuously attempt to extrapolate some very bizarre metaphors about this that are not at all applicable. This scenario is horrifying. Teenage girls should not be subject to scenarios like this. It is sexual exploitation. It is dehumanization. It promotes misogynistic views of women. This is NOT a matter of sexual liberation. You're essentially saying that men and boys can't be expected to treat girls and women as actual people and instead must be allowed to turn their friends and peers into fetishized media content they can share amongst each other. That's fucking disgusting. The longer you talk the more you start to sound like an incel. I'm not saying you are one, but this is the kind of behavior that they defend.

  • Because morally it's not your fucking concern what others are doing in the supposed privacy of their personal spaces.

    It seems to be a very obvious thing: your nose doesn't belong there, and you shouldn't stick it there.

    But no, I’m not gonna let you get away that easily.

    I don't need any getting away from you, you're nothing.

    No. That's not a good enough excuse to potentially be abusing children.

    I can't think of a single good reason to draw those kinds of things. Like at all. Please, give me a single good reason.
