Kids are making deepfakes of each other, and laws aren’t keeping up
-
This definitely will not add in any way to how women and girls are made to feel entirely, disgustingly dehumanized by every man or boy in their lives: groups of men and boys reducing them and their bodies to vivid sexual fantasies, of which they can now quickly generate photorealistic images.
Sexual attraction doesn't necessarily involve dehumanization. Unlike most other kinds of interest in a human being, it doesn't require interest in their personality, but that lack of interest is logically not the same thing as dehumanization.
In general, you are using emotional arguments about things that work not through emotion but through literal interpretation. That's like using metric calculations for a system that expects imperial: utterly useless.
If the person in the image is underage, then it should be classified as child pornography.
No, it's not. It's literally a photorealistic drawing based on a photo (and a dataset used to train the generative model). No children have been abused to produce it. Laws work literally.
If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.
No, because the woman is not literally being sexually exploited. Her photo being used without consent is, I think, the subject of some laws. There are no fundamentally new legal entities involved.
Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.
I think I agree. But it's neither child pornography nor sexual exploitation and can't be equated to them.
There are already laws covering such actions; this is similar to using a photo of the victim plus a pornographic photo, paper, scissors, pencils, and glue. Or, if you think the situation is radically different, new punishable offences should be introduced.
Otherwise it's like punishing everyone caught driving while drunk for non-premeditated murder. One is not the other.
Hey so, at least in the US, drawings can absolutely be considered CSAM
-
If a boy fantasises sexually about a girl, is that harmful to her? If he tells his friends about it? No, this is not harmful: these actions do not affect her in any way. What affects the girl is how the boys might then treat her differently than they would someone they don't find sexually attractive.
The solution, in both cases, has to be to address the harmful behaviour. The only arguments for criminalising deepfakes themselves are also arguments for criminalising sexual fantasies. That is why people are talking about thought crime: once you criminalise things that are harmless on their own, but which might down the line lead to directly harmful behaviour, there is no other distinction.
The consent of the individual has been entirely erased. Dehumanization in its most direct form.
Both of these, for example, apply just as readily to discussing a shared sexual fantasy about someone who didn't agree to it.
No distinction, that is, other than this is new and icky. I don't want government policy to be dictated by fear of the new and by what people find icky, though. I do lots of stuff people find icky.
No, an image that is shared and distributed is not the same as a fantasy in someone's head. That is deranged. Should CSAM also be legal because making it illegal is like criminalizing the fantasies of pedophiles? Absolutely insane logical framework you have there.
This isn't fantasy. It is content. It is media. It is material. It is produced without the consent of the girls and women being sexualized, and it commodifies their existence, literally transforming the idea of them into sexual media consumed for the gratification of boys and men.
It is genuinely incredible to me that you could be so unempathetic, so impassive, so detached from the real world and the consequences of this, that you could even make this comparison. You seemingly have no idea what you're talking about if you believe that pornography is the same thing as mental fantasies.
And even in the case of mental fantasies, are those all good? Is it really a good thing that boys see the mere existence of the girls around them as inherently some kind of sexual availability?
-
No, an image that is shared and distributed is not the same as a fantasy in someone's head. That is deranged. Should CSAM also be legal because making it illegal is like criminalizing the fantasies of pedophiles? Absolutely insane logical framework you have there.
This isn't fantasy. It is content. It is media. It is material. It is produced without the consent of the girls and women being sexualized, and it commodifies their existence, literally transforming the idea of them into sexual media consumed for the gratification of boys and men.
It is genuinely incredible to me that you could be so unempathetic, so impassive, so detached from the real world and the consequences of this, that you could even make this comparison. You seemingly have no idea what you're talking about if you believe that pornography is the same thing as mental fantasies.
And even in the case of mental fantasies, are those all good? Is it really a good thing that boys see the mere existence of the girls around them as inherently some kind of sexual availability?
When someone makes child porn, they put a child in a sexual situation - which is something we have amassed a pile of evidence is extremely harmful to the child.
For all you have said - "without the consent" - "being sexualised" - "commodifies their existence" - you haven't told us what the harm is. If you think those things are in and of themselves harmful, then I need to know more about what you mean, because:
- if someone thinks of me sexually without my consent I am not harmed
- if someone sexualises me in their mind I am not harmed
- I don't know what the "commodification of one's existence" can actually mean - I can't buy or sell "the existence of women" (does buying something's existence mean the same as buying the thing, or something else?) the same way I can aluminium - and I don't see how being able to (easily) make (realistic) nude images of someone changes this in any way
It is genuinely incredible to me that you could be so unempathetic,
I am not unempathetic, but I place the blame for what makes me feel bad about this situation on the fact that girls are being made to feel bad and ashamed, not on the fact that a particular technology is now being used in one step of that.
-
So is this a way to take away rights by making it about kids?
I mean, what the fuck. We did much less and got punished, right? It didn't matter whether we were on school property. Schools can hold students accountable for conduct with other students.
The leaded-gas adults of the time had no problem dealing with the emergence of cell phones. It was a distraction. They didn't need lawmakers to call it something specific. My Pokemon cards caused fights and were banned. No lawmakers needed.
The problem is surely with the interaction between parents and schools. Or maybe it's just the old way of thinking. Maybe it's better to have police and courts start taking over discipline in schools.
All your examples are of things that were stopped while at school, so your argument doesn't really carry over. You still had your pokemon cards everywhere else.
-
Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.
Instead of laws keeping up, it might also turn out to be a case where culture keeps up.
-
My mama always told me, that if someone makes a deepfake of you, then you make a deepfake of them right back!
Thanks, cap'n.
-
A 99-1 vote to drop the anti-AI-regulation provision is hardly the government voting against it. The Senate smashed that shit hard and fast.
Expecting people to know about that 99-1 vote might be misplaced optimism, since it hasn't been made into a meme yet.
-
My mama always told me, that if someone makes a deepfake of you, then you make a deepfake of them right back!
this advice might get you locked up
-
My mama always told me, that if someone makes a deepfake of you, then you make a deepfake of them right back!
In the Bible, it says, and I quote: "If a deepfake of you is made, you shall give the creator more material to create deepfakes"
-
Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.
Aren't there already laws against making child porn?
-
Aren't there already laws against making child porn?
I'd rather these laws be against abusing and exploiting children, as well as against ruining their lives. Not only would that be more helpful, it would also work in this case, since actual likenesses are involved.
Alas, whether there's a law against that specific use case or not, it is somewhat difficult to police what people do in their homes without a third-party whistleblower. Making more impossible-to-apply laws for this specific case does not seem that useful.
-
this advice might get you locked up
My mama also told me that if someone locks you up, then you just lock them up right back.
-
Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.
I don't understand fully how this technology works, but if people are using it to create sexual content of underage individuals, doesn't that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here
-
In the Bible, it says, and I quote: "If a deepfake of you is made, you shall give the creator more material to create deepfakes"
An eye for an eye, a tooth for a tooth, and a deepfake for a deepfake.
-
Expecting people to know about that 99-1 vote might be misplaced optimism, since it hasn't been made into a meme yet.
especially that
Abbot (edit: Ted Cruz), who brought this one up, voted against it in the end, which is pretty confusing for a European tbh
e: i mean that it's memeworthy lol
-
especially that
Abbot (edit: Ted Cruz), who brought this one up, voted against it in the end, which is pretty confusing for a European tbh
e: i mean that it's memeworthy lol
I'm confused - by Abbot do you mean Gov. Abbott of Texas, and are we talking about the same issue? Cuz the 99-1 vote was about a senate bill regarding AI. Greg Abbott can't vote on senate bills, and there's no senator named Abbot.
-
I'm confused - by Abbot do you mean Gov. Abbott of Texas, and are we talking about the same issue? Cuz the 99-1 vote was about a senate bill regarding AI. Greg Abbott can't vote on senate bills, and there's no senator named Abbot.
aaah i misremembered, it was Ted Cruz, oops
Ted Cruz plan to punish states that regulate AI shot down in 99-1 vote
The one vote backing moratorium on state AI laws came from Thom Tillis, not Cruz.
Ars Technica (arstechnica.com)
-
I think generating and sharing sexually explicit images of a person without their consent is abuse.
That's distinct from generating an image that looks like CSAM without the involvement of any real child. While I find that disturbing, I'm morally uncomfortable criminalizing an act that has no victim.
Harassment, sure, but not abuse.
-
I don't understand fully how this technology works, but if people are using it to create sexual content of underage individuals, doesn't that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here
not necessarily. image generation models work on a more fine-grained scale than that. they can seamlessly combine related concepts, like "photograph"+"person"+"small"+"pose" and generate plausible material due to the fact that all of those concepts have features in common.
you can also use small add-on models trained on very little data (tens to hundreds of images, as compared to millions to billions for a full model) to "steer" the output of a model towards a particular style.
you can make even a fully legal model output illegal data.
all that being said, the base dataset that most of the stable diffusion family of models started out with in 2021 (LAION) is scraped from the open web, so there could very well be bad shit in there. it's like 12 billion images so it's hard to check, and even back with stable diffusion 1.0 there was less than a single bit of data in the final model per image in the data.
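to make the "small add-on model" point concrete, here's a rough sketch of that steering mechanism using the Hugging Face diffusers library. the base checkpoint named below is a real public one, but the adapter path and prompt are placeholder assumptions for illustration, not anything from this thread:

```python
# sketch: a base model trained on billions of web images, plus a small
# LoRA adapter (trained on tens to hundreds of images) that steers the
# output toward a particular style.
# assumes: pip install torch diffusers transformers accelerate
import torch
from diffusers import StableDiffusionPipeline

# base model: already knows general concepts like "photograph",
# "person", "pose" and can combine them in one image.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# add-on model: a LoRA adapter, typically only a few megabytes.
# the path here is a placeholder, not a real checkpoint.
pipe.load_lora_weights("path/to/watercolor-style-lora")

# the adapter biases generation toward its style even though the base
# model never saw the adapter's training images.
image = pipe("a watercolor portrait of an astronaut",
             num_inference_steps=30).images[0]
image.save("out.png")
```

the point being: the heavy lifting lives in the base model, so a tiny adapter is enough to push outputs toward material the base dataset never explicitly contained.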
-
Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.
For example, Louisiana mandates a minimum five-year jail sentence no matter the age of the perpetrator.
That's just on its face stupid. A thirteen-year-old boy is absolutely gonna wanna see girls in his age group naked. That's not pedophilia. It's wanting to see the girls he fantasizes about at school every day. Source: I was a thirteen-year-old boy.
It shouldn't be treated the same as when an adult man generates it; there should be nuance. I'm not saying it's ok for a thirteen year old to generate said content: I'm saying tailor the punishment to fit the reality of the differences in motivations. Leave it to Louisiana to once again use a cudgel rather than sense.
I'm so glad I went through puberty at a time when this kind of shit wasn't available. The thirteen-year-old version of me would absolutely have gotten myself in a lot of trouble. And depending on what state I was in, seventeen-year-old me could have ended up listed as a sex predator for sending dick pics to my gf, because I produced child pornography. God, some states have stupid laws.