Kids are making deepfakes of each other, and laws aren’t keeping up

Technology
  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    Maybe let's assume all digital images are fake and go back to painting. Wait... what if children start painting deepfakes?

  • No. That’s not a good enough excuse to potentially be abusing children.

    It's good enough for the person whose opinion counts, yours doesn't. And there's no such potential.

    I can’t think of a single good reason to draw those kinds of things. Like at all.

    Too bad.

    Please, give me a single good reason.

    To reinforce that your opinion doesn't count is in itself a good reason. The best of them all really.

    Okay so you have no reason. Which is because having sexually explicit images, drawn or otherwise, is gross and weird and disturbing. And the fact that you are continually doubling down shows me that you likely need your hard drives and notebooks checked.

    Please don't respond again unless you are telling me what country you are from so I can report you to the appropriate authorities.

  • Okay so you have no reason. Which is because having sexually explicit images, drawn or otherwise, is gross and weird and disturbing. And the fact that you are continually doubling down shows me that you likely need your hard drives and notebooks checked.

    Please don't respond again unless you are telling me what country you are from so I can report you to the appropriate authorities.

    People don't need reasons to do things that are gross or disturbing or whatever to you in their own space.

    And the fact that you are continually doubling down shows me that you likely need your hard drives and notebooks checked.

    Thankfully that's not your concern, and would get you in jail if you tried to do that yourself. Also I'm too lazy for my porn habits to be secret enough, LOL.

    Please don’t respond again unless you are telling me what country you are from so I can report you to the appropriate authorities.

    I don't think you understand. You're the fiend here. The kind of obnoxious shit that thinks it's within their right to police others' morality.

    I wonder: if, hypothetically, I tried to report you and someone actually followed through (unlikely, of course, without anything specific to report), which instances of stalking and privacy violations would they find?

    You really seem the kind.

  • Do you believe that finding out that there is an entire group of people who you thought were your friends but are in actuality taking pictures of your head and masturbating to the idea of you performing sex acts for them, using algorithmically derived likenesses of your naked body, has no psychological consequences for you whatsoever?

    Do you think the consequences of finding out are significantly different than finding out they're doing it in their imagination? If so, why?

    You're essentially saying that men and boys can’t be expected to treat girls and women as actual people and instead must be allowed to turn their friends and peers into fetishized media content they can share amongst each other.

    And, just to be clear, by this you mean the stuff with pictures, not talking or thinking about them? Because, again, the words "media content" just don't seem to be key to any harm being done.

    Your approach is consistently to say that "this is harmful, this is disgusting", but not to say why. Likewise you say that the "metaphors are not at all applicable" but you don't say at all what the important difference is between "people who you thought were your friends but are in actuality taking pictures of your head and masturbating to the idea of you performing sex acts for them using algorithmically derived likenesses of your naked body" and "people who you thought were your friends but are in actuality imagining your head and masturbating to the idea of you performing sex acts for them using imagined likenesses of your naked body". Both acts are sexualisation, both are done without consent, and both could cause poor treatment by the people doing it.

    I see two possibilities: either you see this as so obviously and fundamentally wrong that you don't have a way of describing why, or you know that the two scenarios are fundamentally similar but also know that the idea of thought-crime is unsustainable.

    Finally it's necessary to address the gendered way you're talking about this. While obviously there is a huge discrepancy in male perpetrators and female victims of sexual abuse and crimes, it makes it sound like you think this is only a problem because, or when, it affects women and girls. You should probably think about that, because for years we've been making deserved progress at making things gender-neutral and I doubt you'd accept this kind of thing in other areas.

    There is an institution in society specifically designed to strip women of their autonomy, reduce them down to their sexual appeal to men, and proliferate the notion of their inherent submission to men. This simply does not exist the other way. This will not be a major problem for boys; teenage girls are not creating fucking AI porn rings with pictures of boys from their classes. That isn't happening. Will someone do it? Almost certainly. Is it a systemic issue? No. Men's bodies are not attacked institutionally in this way.

    And you're still trying to equate imagination with physical, tangible media. And to be clear, if several of my friends said they were collectively beating off to the idea of me naked, I would be horrified and disgusted. The overwhelming majority of people would. Again, they've taken you, an actual person they know and are friends with, and have turned you into a sexual goal to be attained. It is invasive, exploitative, and above all else dehumanizing.

    Yeah, if even one of my friends told me he jerked off to the thought of me naked I would never see him the same way again and would stop being friends with him. If I was a teenager it would probably fuck me up pretty bad to know that someone who I thought was my friend just saw me as a collection of sexual body parts with a face attached. If I found out that a whole group of boys, some of whom I might not even know, were sharing AI-generated porn with my face, it would be severely psychologically traumatizing and would probably shake my trust in men and boys for the rest of my life.

    This isn't a fucking game. You're acting like this is normal; it's NOT FUCKING NORMAL. Photoshopping a girl in your class's face onto a nude body and sharing it with a group of boys is NOT NORMAL. That is severely disturbed behavior. That shows a complete malfunction in your empathy. It does if that's your imagination too. And finding that out, that somebody has done that, is absolutely repulsive.

    And no, I find it perfectly sustainable. We have no means by which to detect pedophiles by their thoughts. But pedophilic thoughts are still wrong and are not something we tolerate people expressing. Creating CSAM is still illegal, whether or not the child is aware such content is being created of them. They can't consent to that as they are children. This is the same. No, we can't fucking read people's thoughts and punish them for them. Having thoughts like that is absolutely a sign of obsessive tendencies and an already-forming devaluation of women and girls, a reduction of them to their bodies, but the correct thing is for them to receive counseling and proper education about sex and relationships.

    Creating, sharing and distributing AI-generated porn of someone is so fundamentally different from that, I have to think you have a fundamental misunderstanding of what an image is. This isn't a fucking thought. These boys and men can do whatever they want with this pornography they've made of you, can send it to whoever they want and share it as far and wide as they want. They have literally created porn of you without your consent. And for teenage girls this is a whole other level of fucked up. This is being used to produce CSAM. They cannot consent to this. It is a provable act of violation of women and girls. This should be illegal and should be treated extremely seriously when teenage boys are found to have done it.

    You all say you're feminists until someone comes after your fucked-up sexualities and your porn addictions. Always the same.

  • The author of those comments wrote several times about what, in their opinion, happens in other people's heads and how that should be prevented, or something.

    Can you please stop interpreting my words exactly the way you like? That's not worth a gram of horse shit.

    Yes I can, moreso after your clarification. I must have misread it the first time. Sorry.

  • Take a look at InvokeAI.

    Thanks. Rather than everyone downvoting for no real reason, someone finally was at least trying to be helpful. I will use it for super simple business card proofs or basic brochures, that's about it. Magnets, etc.

  • Yeah, I agree, we shouldn’t ruin the boy’s life, we should ruin his whole family to many times the extent something like this ruins a teen girl’s life.

    You're a fucking asshole. This isn't like prosecuting parents who let a school shooter have access to guns. The internet is everywhere. Parents are responsible for bringing up their children to be socially responsible. A thirteen-year-old kid is anything but responsible (I mean their mentality / maturity, I'm not giving them a pass).

    Go hang out with conservatives who want more policing. Over here, we'll talk about social programs you fucking prick.

    I am an asshole, that's never been in question, and I fully own it. Having said that, no amount of "social programs" is going to have any effect if fucking parents don't raise their kids right.

    I'm entirely against surveillance, except when it comes to parents and keeping a close eye on everything their kids watch, browse or otherwise access (evidently making it known to the kids that "I can see EVERYTHING you see and do").

    So, yeah, hang the imbecile parents who should not have had kids in the first place if they expected a fucking social program or school to raise them instead. Fuck off.

  • probably because there's a rapist in the white house.

    At least they've learned a skill?

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    anyone using any kind of AI either doesn't know how consent works-- or they don't care about it.

    a horrifying development in the intersection of technofascism and rape culture

  • Parents are responsible for their kids. The punishment, with the full force of the law (and maybe something extra for good measure), should fall upon the parents, since they should have made sure their kids knew how despicable and illegal doing this is.

    Yeah, I agree, we shouldn't ruin the boy's life, we should ruin his whole family to many times the extent something like this ruins a teen girl's life.

    Teenagers are old enough to understand consequences.

    In fact, my neighborhood nearly burned down last week because a teenager, despite being told "no" and "stop" multiple times - including by neighbors - decided to light off fireworks on the mountainside right behind the neighborhood.

    Red arrow is my house. We were damn lucky the wind was blowing the right direction. If this had happened the day before, the neighborhood would be gone.

  • I am an asshole, that's never been in question, and I fully own it. Having said that, no amount of "social programs" is going to have any effect if fucking parents don't raise their kids right.

    I'm entirely against surveillance, except when it comes to parents and keeping a close eye on everything their kids watch, browse or otherwise access (evidently making it known to the kids that "I can see EVERYTHING you see and do").

    So, yeah, hang the imbecile parents who should not have had kids in the first place if they expected a fucking social program or school to raise them instead. Fuck off.

    social program

    And thanks to the assholes in Congress who just passed the Big Betrayal Bill, those are all going away.

  • That's just called the outside now. Assume you are on camera at all times the moment you step out the front door. To be safe in the surveillance state we live in today, best act as though you are being recorded in your own home as well.

    best act as though you are being recorded in your own home as well.

    If you don't know, don't try? Seems a bit defeatist.

    There's also the matter of "you" the NPC and well... "You".

    You can rest easy knowing Trump knows you're at work, but not the contents of the monologue you gave on Palestine in a political XMPP chatroom.

  • Yes I can, moreso after your clarification. I must have misread it the first time. Sorry.

    Sorry for my tone too, I get dysphoric-defensive very easily (as has been illustrated).

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    Deepfakes might end up being the modern version of a bikini. In the olden days, people wore these to the beach. Wearing less was considered scandalous and moral decay. Yet now we wear much less.

    Our grandchildren might simply not give a damn about their nudity, because it is assumed that everyone is deepfaking everyone.

  • anyone using any kind of AI either doesn't know how consent works-- or they don't care about it.

    a horrifying development in the intersection of technofascism and rape culture

    Any AI? Every application? What kind of statement is this?

  • Deepfakes might end up being the modern version of a bikini. In the olden days, people wore these to the beach. Wearing less was considered scandalous and moral decay. Yet now we wear much less.

    Our grandchildren might simply not give a damn about their nudity, because it is assumed that everyone is deepfaking everyone.

    These are all worn voluntarily. This issue isn’t about the equivalent of scandalously clad young girls, it’s like if girls were being involuntarily stripped of their clothing by their classmates. It’s not about modesty standards, it’s about sexual abuse.

  • Any AI? Every application? What kind of statement is this?

    AI models (unless you're training your own) are usually trained on data their creators do not have a licence to use. The companies training these models are also notorious for ignoring robots.txt and other measures websites use to stop bots from trawling their data.

    Like in crypto, most people in AI are not nerds, just criminal scum.
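For context on the comment above: robots.txt is purely advisory, which is why ignoring it is trivial. A minimal sketch of what a well-behaved crawler does before fetching a page, using Python's standard library (the bot name and URLs here are illustrative, not from the article):

```python
# Sketch: how a polite crawler consults a site's robots.txt policy.
# Nothing enforces this check; a scraper can simply skip it, which is
# the complaint in the comment above.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Parse an example policy directly instead of fetching it over the network.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("ExampleBot", "https://example.com/public/page"))   # True
print(rp.can_fetch("ExampleBot", "https://example.com/private/data"))  # False
```

A crawler that wants the disallowed data just never calls `can_fetch`; the file is a request, not an access control.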

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    Burkas for the win?

  • Can't afford this much cheese today to find just the right slice for every bikini photo...
