Kids are making deepfakes of each other, and laws aren’t keeping up

Technology
  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    Deepfakes might end up being the modern version of the bikini. In the olden days, people wore these to the beach; wearing less was scandalous and a sign of moral decay. Yet now we wear much less.

    Our grandchildren might simply not give a damn about their nudity, because it is assumed that everyone is deepfaking everyone.

  • anyone using any kind of AI either doesn't know how consent works-- or they don't care about it.

    a horrifying development in the intersection of technofascism and rape culture

    Any AI? Every application? What kind of statement is this?

    Deepfakes might end up being the modern version of the bikini. In the olden days, people wore these to the beach; wearing less was scandalous and a sign of moral decay. Yet now we wear much less.

    Our grandchildren might simply not give a damn about their nudity, because it is assumed that everyone is deepfaking everyone.

    These are all worn voluntarily. This issue isn’t about the equivalent of scandalously clad young girls; it’s as if girls were being involuntarily stripped of their clothing by their classmates. It’s not about modesty standards; it’s about sexual abuse.

  • Any AI? Every application? What kind of statement is this?

    AI models (unless you're training your own) are usually trained on data the companies do not have a licence to use. The companies training these models are also notorious for ignoring robots.txt and other measures websites use to stop bots from trawling their data.

    Like in crypto, most people in AI are not nerds, just criminal scum.
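
Aside: robots.txt is purely advisory, which is part of why crawlers can simply ignore it. A minimal sketch of what a well-behaved crawler is supposed to do before fetching a page, using Python's standard-library `urllib.robotparser` (the bot names and rules here are made up for illustration):

```python
from urllib import robotparser

# Hypothetical robots.txt: block a crawler called "DataBot" entirely,
# and keep all other bots out of /private/ only.
ROBOTS_TXT = """\
User-agent: DataBot
Disallow: /

User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A compliant crawler asks before every fetch:
print(rp.can_fetch("DataBot", "https://example.com/article"))        # False
print(rp.can_fetch("OtherBot", "https://example.com/article"))       # True
print(rp.can_fetch("OtherBot", "https://example.com/private/page"))  # False
```

Nothing enforces these rules: a crawler that never calls `can_fetch`, or ignores its answer, can fetch everything anyway, which is the behavior the comment above is complaining about.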

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    Burkas for the win?

  • Can't afford this much cheese today to find just the right slice for every bikini photo...

  • Teenagers are old enough to understand consequences.

    In fact, my neighborhood nearly burned down last week because a teenager, despite being told "no" and "stop" multiple times - including by neighbors - decided to light off fireworks on the mountainside right behind the neighborhood.

    Red arrow is my house. We were damn lucky the wind was blowing the right direction. If this had happened the day before, the neighborhood would be gone.

    some day I hope to be brave enough to post pictures of my house on the internet

  • Drawing a sexy cartoon that looks like an adult, with a caption that says "I'm 12", counts. So yeah, probably.

    This is actually quite fuzzy; it depends on your country, and even on the jurisdiction within your country.

    AI models (unless you're training your own) are usually trained on data the companies do not have a licence to use. The companies training these models are also notorious for ignoring robots.txt and other measures websites use to stop bots from trawling their data.

    Like in crypto, most people in AI are not nerds, just criminal scum.

    You are thinking of LLMs, not AI in general.

  • You are thinking of LLMs, not AI in general.

    I am. And so is OC. Neural networks are a different beast, although neither is actual AI. Just a marketing term at this point.

    These are all worn voluntarily. This issue isn’t about the equivalent of scandalously clad young girls; it’s as if girls were being involuntarily stripped of their clothing by their classmates. It’s not about modesty standards; it’s about sexual abuse.

    It can be both. The cornerstone of why nudity can be abused is that society makes it shameful to be bare. If, some generations from now, people can just shrug and not care, that is one less tool an abuser can use against people.

    In any case, I am of the mind that people of my generation might be doing their own version of the Satanic Panic, or the reaction against rap music. For better or worse, older people cannot relate to the younger.

  • There is an institution in society specifically designed to strip women of their autonomy, reduce them to their sexual appeal to men, and proliferate the notion of their inherent submission to men. This simply does not exist the other way. This will not be a major problem for boys; teenage girls are not creating fucking AI porn rings with pictures of boys from their classes. That isn't happening. Will someone do it? Almost certainly. Is it a systemic issue? No. Men's bodies are not attacked institutionally in this way.

    And you're still trying to equate imagination with physical, tangible media. And to be clear, if several of my friends said they were collectively beating off to the idea of me naked, I would be horrified and disgusted. The overwhelming majority of people would. Again, they've taken you, an actual person they know and are friends with, and turned you into a sexual goal to be attained. It is invasive, exploitative, and above all else dehumanizing.

    Yeah, if even one of my friends told me he jerked off to the thought of me naked, I would never see him the same way again and would stop being friends with him. If I was a teenager, it would probably fuck me up pretty bad to know that someone who I thought was my friend just saw me as a collection of sexual body parts with a face attached. If I found that a whole group of boys, some of whom I might not even know, were sharing AI-generated porn with my face, it would be severely psychologically traumatizing and would probably shake my trust in men and boys for the rest of my life.

    This isn't a fucking game. You're acting like this is normal; it's NOT FUCKING NORMAL. Photoshopping a girl in your class's face onto a nude body and sharing it with a group of boys is NOT NORMAL. That is severely disturbed behavior. That shows a complete malfunction in your empathy. It does if that's your imagination too. And finding that out, that somebody has done that, is absolutely repulsive.

    And no, I find it perfectly sustainable. We have no means by which to detect pedophiles by their thoughts. But pedophilic thoughts are still wrong and are not something we tolerate people expressing. Creating CSAM is still illegal, whether or not the child is aware such content is being created of them. They can't consent to that, as they are children. This is the same. No, we can't fucking read people's thoughts and punish them for them. Having thoughts like that is absolutely a sign of some obsessive tendencies and an already forming devaluation of women and girls and reduction of them to their bodies, but the correct thing is for them to receive counseling and proper education about sex and relationships.

    Creating, sharing and distributing AI-generated porn of someone is so fundamentally different from that, I have to think you have a fundamental misunderstanding about what an image is. This isn't a fucking thought. These boys and men can do whatever they want with this pornography they've made of you, can send it to whoever they want and share it as far and wide as they want. They have literally created porn of you without your consent. And for teenage girls this is a whole other level of fucked up. This is being used to produce CSAM. They cannot consent to this. It is a provable act of violation of women and girls. This should be illegal and should be treated extremely seriously when teenage boys are found to have done it.

    You all say you're feminists until someone comes after your fucked-up sexualities and your porn addictions. Always the same.

    And you're still trying to equate imagination with physical, tangible media. And to be clear, if several of my friends said they were collectively beating off to the idea of me naked, I would be horrified and disgusted [...]

    So the fundamental reality is that imagination and physical tangible media are very similar in this regard. That's what you just said.

    a whole group of boys, some of whom I might not even know, were sharing AI-generated porn with my face

    And if they were just talking about a shared fantasy - with your face? You still have the "ring" aspect, the stranger aspect, the dehumanising aspect, etc.

    This is why there's the connection that I keep getting at: there are many similarities, and you even say you'd feel similarly in both circumstances. So, the question is: do we go down the route of thought crime and criminalise the similar act? Or do we use this similarity to realise that it is not the act that is the problem, but the effects it can have on the victim?

    If I was a teenager it would probably fuck me up pretty bad to know that someone who I thought was my friend just saw me as a collection of sexual body parts with a face attached.

    Why do you think doing either thing (imagined or with pictures) means that someone just sees the person as a "collection of sexual body parts with a face attached"? Why can't someone see you as an ordinary human being? While you might not believe that either thing is normal, I can assure you it is prevalent. I'm sure that you and I have both been the subject of masturbatory fantasies without our knowledge. I don't say that to make you feel uncomfortable (and am sorry if it does) but to get you to think about how those acts have affected you, or not.

    You talk again about how an image can be shared - but so can a fantasy (by talking about it). You talk again about how it's created without consent - but so is a fantasy.

    Another thought experiment: someone on the other side of the world draws an erotic image, and it happens by pure chance to resemble a real person. Has that person been victimised, and abused? Does that image need to be destroyed by the authorities? If not, why not? The circumstances of the image are the same as if it were created as fake porn. If it reached that person's real circle of acquaintances, it could very well have the same effects - being shared, causing them shame, ridicule, abuse. It's another example that shows how the problematic part is not the creation of an image, but the use of that image to abuse someone.

    But pedophilic thoughts are still wrong and are not something we tolerate people expressing.

    It's my view that paedophilia, un-acted upon, is not wrong, as it harms no-one. A culture in which people are shamed, dehumanised and abused for the way their mind works is one in which those people won't seek help before they act on those thoughts.

    Having thoughts like that is absolutely a sign of some obsessive tendencies and an already forming devaluation of women and girls

    It's kind of shocking to see you again erase male victims of (child) sexual abuse. For child abuse specifically, rates of victimisation are much closer than for adults.

    You all say you're feminists until someone comes after your fucked-up sexualities and your porn addictions. Always the same.

    Luckily I know you're not representative of all of any group of people.

  • In general, even up here in woke-ville, punishments have gotten a lot more strict for kids. There’s a lot more involvement of police, courts, jail. As a parent it causes me a lot of anxiety - whatever happened to school being a “sandbox” where a kid can make mistakes without adult consequences, without ruining their lives? Did that ever exist?

    I can already picture that as an Onion headline:

    New York Renames State to 'WokeVille'. NYC to follow.

  • Maybe let's assume all digital images are fake and go back to painting. Wait... what if children start painting deepfakes?

    Or pasting someone's photo over porn...in their minds...

    These are all worn voluntarily. This issue isn’t about the equivalent of scandalously clad young girls; it’s as if girls were being involuntarily stripped of their clothing by their classmates. It’s not about modesty standards; it’s about sexual abuse.

    Unless it is used to pretend that it is a real video and circulated for denigration or blackmail, it is very much not at all like assault. And also, deepfakes do not have the special features hidden under your clothes, so it is possible to debunk those if you really have to.

  • So the fundamental reality is that imagination and physical tangible media are very similar in this regard [...]

    Your thought experiment is moot, as these are real people. You're still not getting it. You're still seemingly fundamentally confused about why having porn made of you without your consent is wrong.

    I don't think pedophilic thoughts should ever be tolerated outside a counselor's office. If I found out one of my friends was a pedophile, I would never speak with them again. End statement. You are in a very, very, very small minority of people if you disagree.

    You skipped over the section where I said that a group of boys collectively sharing in a fantasy of one of their female peers and using that fantasy to sexually gratify themselves would be severely psychologically traumatizing for the victim.

    Don't make porn of people without their consent. You should face legal consequences for making porn of someone without their consent. The difference between fantasy and porn is that porn is media content; it is a real image or video, not an imagination in someone's mind. If the fantasy is being written down and then shared, then it's kind of erotica, isn't it, and I also think it's extremely fucked up to write erotica about someone you know. Don't do that either. Wild.

  • Schools can already do that though. You can get in trouble for bullying outside of school, and when I was a student athlete I had pretty strict restrictions on what I was allowed to do because I was an "ambassador" for the school.

    And you think these are positive things?

  • I don't fully understand how this technology works, but if people are using it to create sexual content of underage individuals, doesn't that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here.

    You know how when you look at a picture of someone and you cover up the clothed bits, they look naked. Your brain fills in the gaps with what it knows of general human anatomy.

    It's like that.

  • AI can do penises just fine though; there's just no market demand for it, so quick and easy deepfake sites are focused on female bodies.

    But I disagree with this anyway, this will be the "bullied kid brings a knife to class" of AI.

    there's just no market demand for it

    Oh.... There's demand for it for sure.

    The image generation models that exist are unquestionable proof of demand for penises. I think what's missing is the kahunas required to make a business around it. There are places even pornographers fear to tread.
