
Kids are making deepfakes of each other, and laws aren’t keeping up

Technology
  • Lawmakers are grappling with how to address ...

    Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money

    They want to be regulated so they can finally have their moat. Cutting out the states' power does mean they will only have to buy one group of politicians in Washington, and those are some relatively cheap hoes.

  • That could be a socially healthy place to end up. I don't see it happening anytime soon, though. Just look at the other response I got.

    Sure. That might end up being a socially healthy place for adults.

    But it will never work that way for young teens. Their brains aren't done baking yet. They don't have the emotional maturity to understand that enough to be "okay with it because it's just a fake".

    That's why we protect kids rather than just telling them "hey it's okay...it's only a fake."

  • Lawmakers are grappling with how to address ...

    Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money

    A 99-1 vote to drop the anti-AI-regulation provision is hardly the government voting against regulating AI. The Senate smashed that shit hard and fast.

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    My mama always told me, that if someone makes a deepfake of you, then you make a deepfake of them right back!

  • This is just apologia for the sexual commodification and exploitation of girls and women. There literally is no girl being sexually liberated here; she has literally had the choice taken from her. Sexual liberation does NOT mean "boys and men can turn all women into personal masturbation aids". This ENFORCES patriarchy and subjugation of women. It literally teaches girls that their bodies do not belong to them, that it's totally understandable for boys to strip them of humanity itself and turn them into sex dolls.

    The most deepfaked women are certainly actresses or musicians; attractive people that appear on screens and are known by much of the population.

    In some countries, they do not allow people to appear on-screen exactly because of that. Or at least, that's one justification. If the honor or humanity of a woman depends on sexual feelings that she might or might not arouse in men, then women cannot be free. And men probably can't be free either.

    At no point have I claimed that anyone is being liberated here. I do not know what will happen. I'm just pointing out how your message is harmful.

  • I'm not even going to begin describing all the ways that what you just said is fucked up.

    I'll just point out that online deepfake technology is FAR more accessible to the average 13 year old to use on their peers than "porno mags" were in our day.

    You want to compare taking your 13-year-old classmate's photo off of Facebook, running it through an AI, and in five seconds creating photo-realistic adult content featuring them, to getting your dad's skin-mag from under his mattress when he's not home, cutting your classmate's face out of a yearbook, taping it on, sneaking THAT into the computer lab at school so that you can photocopy it and pass it around in home room, and then putting the skin-mag BACK under the mattress before your dad finds out.

    Is that right...is THAT what you're trying to say? Are those the two things that you're trying to say are equivalent?

    Yes, we all know it's fucked up. The point is that we don't need a new class of laws just because it's harassment and bullying ✨with AI✨.

  • It's sexually objectifying the bodies of girls and turning them into shared sexual fantasies their male peers are engaging in. It is ABSOLUTELY different because it is more realistic. We are talking about entire deepfake pornography production and distribution groups IN THEIR OWN SCHOOLS. Teenage boys cutting pictures out and photoshopping them was nowhere near as common as this is fast becoming, and it was NOT the same as seeing a naked body algorithmically derived to appear as realistic as possible.

    Can you stop trying to find a silver lining in the sexual exploitation of teenage girls? You clearly don't understand the kinds of long-term psychological harm that are caused by being exploited in this way. It was also exploitative and also fucked up when it was done in Photoshop; this is many orders of magnitude more sophisticated and accessible.

    You're also wrong that this is about bullying. It's an introduction to girls being tools for male sexual gratification. It's LITERALLY commodifying teenage girls as sexual experiences and then sharing them in groups together. It's criminal. The consent of the individual has been entirely erased. Dehumanization in its most direct form. It should be against the law and it should be prosecuted very seriously wherever it is found to occur.

    If a boy fantasises sexually about a girl, is that harmful to her? If he tells his friends about it? No, this is not harmful - these actions do not affect her in any way. What affects the girl is how the boys might then treat her differently than they would someone they don't find sexually attractive.

    The solution, in both cases, has to be to address the harmful behaviour. The only arguments for criminalising deepfakes themselves are also arguments for criminalising sexual fantasies. That is why people are talking about thought crime, because once you criminalise things that are harmless on their own, but which might down the line lead to directly harmful behaviour, there is no other distinction.

    The consent of the individual has been entirely erased. Dehumanization in its most direct form.

    Both of these, for example, apply just as readily to discussing a shared sexual fantasy about someone who didn't agree to it.

    No distinction, that is, other than this is new and icky. I don't want government policy to be dictated by fear of the new and by what people find icky, though. I do lots of stuff people find icky.

  • How is a school going to regulate what kids do outside of school property? They could ban cell phones on campus but that's not going to change what happens after hours.

    Schools can already do that, though. You can get in trouble for bullying outside of school, and when I was a student athlete I had pretty strict restrictions on what I was allowed to do because I was an "ambassador" for the school.

  • This definitely will not add in any way to the way women and girls are made to feel entirely, disgustingly dehumanized by every man or boy in their lives: groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photo-realistic images of.

    Sexual attraction doesn't necessarily involve dehumanization. Unlike most other kinds of interest in a human being, it doesn't require interest in their personality, but these are logically not the same.

    In general you are using emotional arguments for things that work not through emotion, but through literal interpretation. That's like using metric calculations for a system that expects imperial. Utterly useless.

    If the person in the image is underage, then it should be classified as child pornography.

    No, it's not. It's literally a photorealistic drawing based on a photo (and a dataset to make the generative model). No children have been abused to produce it. Laws work literally.

    If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.

    No, because the woman is not being literally sexually exploited. Her photo being used without consent is, I think, the subject of some laws. There are no new fundamental legal entities involved.

    Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

    I think I agree. But it's neither child pornography nor sexual exploitation and can't be equated to them.

    There are already existing laws for such actions, similar to using a photo of the victim and a pornographic photo, paper, scissors, pencils and glue. Or, if you think the situation is radically different, there should be new punishable crimes introduced.

    Otherwise it's like punishing everyone caught driving while drunk for non-premeditated murder. One is not the other.

    Hey so, at least in the US, drawings can absolutely be considered CSAM

  • If a boy fantasises sexually about a girl, is that harmful to her? ...

    No, an image that is shared and distributed is not the same as a fantasy in someone's head. That is deranged. Should CSAM also be legal because making it illegal is like criminalizing the fantasies of pedophiles? Absolutely insane logical framework you have there.

    This isn't fantasy. It is content. It is media. It is material. It is produced without the consent of the girls and women being sexualized, and it commodifies their existence, literally transforming the idea of them into sexual media consumed for the gratification of boys and men.

    It is genuinely incredible to me that you could be so unempathetic, so impassive, so detached from the real world and the consequences of this, that you could even make this comparison. You have seemingly no idea what you're talking about if you believe that pornography is the same thing as mental fantasies.

    And even in the case of mental fantasies, are those all good? Is it really a good thing that boys see the mere existence of the girls around them as inherently some kind of sexual availability?

  • No, an image that is shared and distributed is not the same as a fantasy in someone's head. ...

    When someone makes child porn they put a child in a sexual situation - which is something that we have amassed a pile of evidence is extremely harmful to the child.

    For all you have said - "without the consent" - "being sexualised" - "commodifies their existence" - you haven't told us what the harm is. If you think those things are in and of themselves harmful then I need to know more about what you mean because:

    1. if someone thinks of me sexually without my consent I am not harmed
    2. if someone sexualises me in their mind I am not harmed
    3. I don't know what the "commodification of one's existence" can actually mean - I can't buy or sell "the existence of women" (does buying something's existence mean the same as buying the thing, or something else?) the same way I can aluminium, and I don't see how being able to (easily) make (realistic) nude images of someone changes this in any way

    It is genuinely incredible to me that you could be so unempathetic,

    I am not unempathetic, but I place the blame for what makes me feel bad about the situation on the fact that girls are being made to feel bad and ashamed, not on the particular technology that is now being used in one step of that.

  • So is this a way to take away rights by making it about kids?

    I mean, what the fuck. We did much less and got punished, right? It didn't matter if we were on the property. Schools can hold students accountable for conduct with other students.

    The leaded-gas adults of the time had no problem dealing with the emergence of cell phones. It was a distraction. They didn't need lawmakers to call it something specific. My Pokemon cards caused fights and were banned. No lawmakers needed.

    The problem is surely with the interaction between parents and schools. Or maybe it's just the old way of thinking. Maybe it's better to have police and courts start taking over discipline in schools.

    All your examples are of things that were stopped while at school, so your argument doesn't really carry over. You still had your Pokemon cards everywhere else.

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    Instead of laws keeping up, it might also turn out to be a case where culture keeps up.

  • My mama always told me, that if someone makes a deepfake of you, then you make a deepfake of them right back!

    Thanks, cap'n.

  • A 99-1 vote to drop the anti-AI-regulation provision is hardly the government voting against regulating AI. The Senate smashed that shit hard and fast.

    Expecting people to know about that 99-1 vote might be misplaced optimism, since it hasn't been made into a meme yet.

  • My mama always told me, that if someone makes a deepfake of you, then you make a deepfake of them right back!

    this advice might get you locked up

  • My mama always told me, that if someone makes a deepfake of you, then you make a deepfake of them right back!

    In the Bible, it says, and I quote: "If a deepfake of you is made, you shall give the creator more material to create deepfakes"

  • Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    Aren't there already laws against making child porn?

  • Aren't there already laws against making child porn?

    I'd rather these laws be against abusing and exploiting children, as well as against ruining their lives. Not only would that be more helpful, it would also work in this case, since actual likenesses are involved.

    Alas, whether there's a law against that specific use case or not, it is somewhat difficult to police what people do in their homes without a third-party whistleblower. Making more impossible-to-apply laws for this specific case does not seem that useful.

  • this advice might get you locked up

    My mama also told me that if someone locks you up, then you just lock them up right back.
