
FBI opens inquiry into 764, online group that sexually exploits and encourages minors to self-harm

Technology
15 votes
15 posts
140 views
  • 84 votes
    44 posts
    29 views
    T
    There are worse people who didn't go to jail: https://www.theguardian.com/environment/2023/jun/02/dupont-pfas-settlement-water-chemical-contamination These days all companies got to do is pay a fine.
  • 17 votes
    2 posts
    36 views
    T
    Yeah, sure. Like the police need extra help with racial profiling and "probable cause." Fuck this, and fuck the people who think this is a good idea. I'm sure the authoritarians in power right now will get right on those proposed "safeguards," right after they install backdoors into encryption, to which Only They Have The Key, to "protect" everyone from the scary "criminals."
  • An AI video ad is making a splash. Is it the future of advertising?

    Technology
    10 votes
    2 posts
    31 views
    apfelwoischoppen@lemmy.worldA
    Gobble that AI slop NPR. Reads like sponsored content.
  • 15 votes
    1 post
    16 views
    No one has replied
  • the illusion of human thinking

    Technology
    0 votes
    2 posts
    31 views
    H
    Can we get more than just a picture of an Abstract?
  • 43 votes
    1 post
    19 views
    No one has replied
  • 24 votes
    14 posts
    149 views
    S
    I think you're missing some key points. Any file hosting service, no matter what, will have to deal with CSAM as long as people are able to upload to it. No matter what. This is an inescapable fact of hosting and the internet in general. Because CSAM is so ubiquitous and constant, one can only do so much to moderate any service, whether it's a large corporation or someone with a server in their closet. All of the larger platforms like Meta, Google, etc., mostly outsource that moderation to workers in developing countries so they don't have to also provide mental health counselling, but that's another story. The reason they own their own hardware is because hosting services can and will disable your account and take down your servers if there's even a whiff of CSAM. Since it's a constant threat, it's better to own your own hardware and host everything from your closet so you don't have to eat the downtime and wait for some poor bastard in Nigeria to look through your logs and reinstate your account (not sure how that works exactly, though).
  • 477 votes
    81 posts
    1k views
    douglasg14b@lemmy.worldD
    Did I say that it did? No? Then why the rhetorical question about something I never stated? Now that we're past that, I'm not sure whether I think it's okay, but I at least recognize that it's normalized within society, and has been for 70+ years now. The problem happens with how the data is used, and particularly abused. If you walk into my store, you expect that I am monitoring you. You expect that you are on camera and that your shopping patterns, like all foot traffic, are probably being analyzed and aggregated. What you buy is tracked, at least in aggregate; by default, really, that's just volume tracking and prediction. Suffice it to say that broad customer behavior analysis has been a thing for a couple of generations now, at least. When you go to a website, why would you think that it is not keeping track of where you go and what you click on in the same manner? Having said that, I do want to say that the real problems we experience come in with how this data is misused beyond what its scope should be, and that we should have strong regulatory agencies forcing compliance with how this data is used and enforcing the right to privacy for people who want their data removed.