
Kids are making deepfakes of each other, and laws aren’t keeping up

Technology
  • Lawmakers are grappling with how to address ...

    Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money

    Even in countries a lot less corrupt than the US this is an issue.

    Especially because the US government and companies don't do jack shit for people.

  • Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?

    Drawing a sexy cartoon that looks like an adult, with a caption that says "I'm 12", counts. So yeah, probably.

  • Yes, finding out that your peers have been sharing deep fake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photo realistic images of.

    If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.

    Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

    Thank you. Focusing on the harm to the victims is the right way to understand this issue. Too many people in here are hunting for a semantic loophole.

  • If someone put a camera in the girls’ locker room and distributed photos from that, would you consider it CSAM? No contact would have taken place so the kids would be unaware when they were photographed, is it still abuse?

    If so, how is the psychological effect of a convincing deepfake any different?

    Taking secret nude pictures of someone is quite a bit different than....not taking nude pictures of them.

    It's not CSAM to put a picture of someone's face on an adult model and show it to your friend. It's certainly sexual harassment, but it isn't CSAM.

    That could be a socially healthy place to end up. I don't see it anytime soon though. Just look at the other response I got.

    Anyone with half a brain will certainly claim as much. Even if people don't fully believe it, it will blunt the most serious of social consequences.

  • You're disagreeing with my unserious suggestion? I just... okay. No. Micropenises aren't a solution. I just don't think there is one.

    If you want to disagree with that, let's hear it. I have 15 and 13 year old daughters. Anyone can buy a $400 computer, install Linux, install an AI image generator, and undress people all day long. There is no legal restraint capable of stopping that, only punishing it.

    Shut down model distribution and it'll move to torrents. Put the kids in the legal system and they are going to face lifelong consequences for 12-year-old assholery. (To be fair, victims often face long repercussions for being targeted, but that's not imposed by the state, which demands a higher standard.) Hold parents accountable and it will disproportionately impact families who spend more hours working and can't supervise their kids 24/7.

    So I'm short on answers, but open to discussion.

    They may be little sociopaths, but they don’t run around murdering each other. Our culture hammers it into their brains that murder is wrong and they will be severely punished for it. We need to build a culture where little boys are afraid to distribute naked photos of their classmates. Where their friends will turn them in for fear of repercussions. You do that by treating it like a crime, not by saying “boys will be boys” and ignoring the issue.

    Treat it like a crime, and address separately the issue of children being tried as adults and facing lifelong consequences. The reforms needed to our juvenile justice system go beyond this particular crime.

  • Taking secret nude pictures of someone is quite a bit different than....not taking nude pictures of them.

    It's not CSAM to put a picture of someone's face on an adult model and show it to your friend. It's certainly sexual harassment, but it isn't CSAM.

    How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?

  • How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?

    It's absolutely sexual harassment.

    But, to your question: you can't just say something has underage nudity when the nudity is of an adult model. It's not CSAM.

  • Lawmakers are grappling with how to address ...

    Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money

    Oh I just assumed that every Conservative jerks off to kids

  • Yes, finding out that your peers have been sharing deep fake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photo realistic images of.

    If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.

    Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

    It's bullying with a sexual element. The fact that it uses AI or deepfakes is secondary, just as it was secondary when it was Photoshop, just as it was secondary when it was cutting out photos. It's always about using it to bully someone.

    This is different because it's easier. It's not really different because it can be more realistic, because it was never about being realistic; otherwise blatantly unrealistic images wouldn't have been used to do it. Indeed, the fact that it can be realistic will help blunt the impact of the leaking of real nudes.

  • It's absolutely sexual harassment.

    But, to your question: you can't just say something has underage nudity when the nudity is of an adult model. It's not CSAM.

    Yes, it's sexual abuse of a child, the same way taking surreptitious locker room photos would be. There's nothing magical about a photograph of real skin vs a fake. The impact on the victim is the same. The impact on the viewer of the image is the same. Arguing over the semantic definition of "abuse" is getting people tangled up here. If we used the older term, "child porn," people wouldn't be so hesitant to call this what it is.

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    So is this a way to take away rights by making it about kids?

    I mean, what the fuck. We did much less and got punished, right? It didn't matter if we were on school property. Schools can hold students accountable for conduct with other students.

    The leaded-gas adults of the time had no problem dealing with the emergence of cell phones. It was a distraction. They didn't need lawmakers to call it something specific. My Pokemon cards caused fights and were banned. No lawmakers needed.

    The problem is surely with the interaction between parents and schools. Or maybe it's just the old way of thinking. Maybe it's better to have police and courts start taking over discipline in schools.

  • I don't know personally. The admins of the fediverse likely do, considering it's something they've had to deal with from the start.
    So, they can likely answer much better than I might be able to.

    It's bullying with a sexual element. The fact that it uses AI or deepfakes is secondary, just as it was secondary when it was Photoshop, just as it was secondary when it was cutting out photos. It's always about using it to bully someone.

    This is different because it's easier. It's not really different because it can be more realistic, because it was never about being realistic; otherwise blatantly unrealistic images wouldn't have been used to do it. Indeed, the fact that it can be realistic will help blunt the impact of the leaking of real nudes.

    It's sexually objectifying the bodies of girls and turning them into shared sexual fantasies their male peers are engaging in. It is ABSOLUTELY different because it is more realistic. We are talking about entire deepfake pornography production and distribution groups IN THEIR OWN SCHOOLS. Teenage boys cutting pictures out and photoshopping them was nowhere near as common as this is fast becoming, and it was NOT the same as seeing a naked body algorithmically derived to appear as realistic as possible.

    Can you stop trying to find a silver lining in the sexual exploitation of teenage girls? You clearly don't understand the kinds of long-term psychological harm that are caused by being exploited in this way. It was also exploitative and fucked up when it was done in Photoshop; this is many orders of magnitude more sophisticated and accessible.

    You're also wrong that this is about bullying. It's an introduction to girls being tools for male sexual gratification. It's LITERALLY commodifying teenage girls as sexual experiences and then sharing them in groups together. It's criminal. The consent of the individual has been entirely erased. Dehumanization in its most direct form. It should be against the law and it should be prosecuted very seriously wherever it is found to occur.

  • Disagree. Not CSAM when no abuse has taken place.

    That's my point.

    There's a thing that was happening in the past. I'm not sure it's still happening, given the lack of news about it. It was something called "glamour modeling," I think, or an extension of it.

    Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive positions and sold them to interested parties.

    Nothing untoward directly happened to the children. They weren't physically abused. They were treated as regular fashion models. And yet, it's still CSAM. Why? Because of the intention behind making those pictures.

    The intention to exploit.

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    If kids want to be protected they need to get some better lobbyists. /s

  • Disagree. Not CSAM when no abuse has taken place.

    That's my point.

    I think generating and sharing sexually explicit images of a person without their consent is abuse.

    That's distinct from generating an image that looks like CSAM without the involvement of any real child. While I find that disturbing, I'm morally uncomfortable criminalizing an act that has no victim.

  • Yes, finding out that your peers have been sharing deep fake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photo realistic images of.

    If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.

    Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

    This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photo realistic images of.

    Sexual attraction doesn't necessarily involve dehumanization. Unlike most other kinds of interest in a human being, it doesn't require interest in their personality, but the two are logically not the same.

    In general, you are using emotional arguments for things that work not through emotion but through literal interpretation. That's like using metric calculations for a system that expects imperial. Utterly useless.

    If the person in the image is underage, then it should be classified as child pornography.

    No, it's not. It's literally a photorealistic drawing based on a photo (and a dataset used to train the generative model). No children have been abused to produce it. Laws are interpreted literally.

    If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.

    No, because the woman is not literally being sexually exploited. Her photo being used without consent is, I think, already the subject of some laws. There are no new fundamental legal entities involved.

    Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

    I think I agree. But it's neither child pornography nor sexual exploitation and can't be equated to them.

    There are already existing laws for such actions, the same ones that would apply to using a photo of the victim, a pornographic photo, paper, scissors, pencils, and glue. Or, if you think the situation is radically different, there should be new punishable crimes introduced.

    Otherwise it's like punishing everyone caught driving while drunk for non-premeditated murder. One is not the other.

    It's sexually objectifying the bodies of girls and turning them into shared sexual fantasies their male peers are engaging in. It is ABSOLUTELY different because it is more realistic. We are talking about entire deepfake pornography production and distribution groups IN THEIR OWN SCHOOLS. Teenage boys cutting pictures out and photoshopping them was nowhere near as common as this is fast becoming, and it was NOT the same as seeing a naked body algorithmically derived to appear as realistic as possible.

    Can you stop trying to find a silver lining in the sexual exploitation of teenage girls? You clearly don't understand the kinds of long-term psychological harm that are caused by being exploited in this way. It was also exploitative and fucked up when it was done in Photoshop; this is many orders of magnitude more sophisticated and accessible.

    You're also wrong that this is about bullying. It's an introduction to girls being tools for male sexual gratification. It's LITERALLY commodifying teenage girls as sexual experiences and then sharing them in groups together. It's criminal. The consent of the individual has been entirely erased. Dehumanization in its most direct form. It should be against the law and it should be prosecuted very seriously wherever it is found to occur.

    Can you stop trying to find a silver lining in the sexual exploitation of teenage girls?

    Can you please use words by their meaning?

    Also, I'll have to be blunt: every human has their own sexuality, with their own level of "drive," so to speak, and their own dreams.

    And it's absolutely normal to dream about other people. Including sexually. Including people who don't like you. It's not only men who do that. There are no thought crimes.

    So by talking about it being easier or harder, you are not making any argument at all.

    However, as I said elsewhere, the actions that really harm people should be legally classified and addressed, like sharing such material. But not as making child pornography, because it isn't, and not as sexual exploitation, because it isn't.

    It's just that the few posts of yours I've seen in this thread seem to say that certain kinds of thought should be illegal, and that's absolute bullshit. Laws shouldn't be made based on such emotions.

  • Yes, finding out that your peers have been sharing deep fake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photo realistic images of.

    If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.

    Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

    Historically, the respectability of a woman depended on her sexuality. In many conservative cultures and communities, that is still true. Spreading the message that deepfakes are some particularly horrible form of harassment reinforces that view.

    If having your head put on the body of a nude model is a terrible crime, then what does that say about the nude model? What does it say about women who simply happen to develop a larger bosom or lips? What does it say about sex before marriage?

    The implicit message here is simply harmful to girls and women.

    That doesn't mean that we should tolerate harassment. But it needs to be understood that we can do no more to stop this kind of harassment than we can do to stop any other kind.
