Say Hello to the World's Largest Hard Drive, a Massive 36TB Seagate

Technology
  • YouTube Comment Bots are out of control...

    Technology
    55 votes
    3 comments
    32 views
    D
    Youtube is just lazy. These bots are laughably easy to detect and block.
  • Teamviewer Terminates Perpetual Licenses

    Technology
    236 votes
    38 comments
    178 views
    C
    Right on, thanks for the info!
  • Trump social media site brought down by Iran hackers

    Technology
    1k votes
    174 comments
    739 views
    B
    That's the spirit
  • 466 votes
    24 comments
    130 views
    J
    Paging Ray Bradbury......... https://www.libraryofshortstories.com/storiespdf/the-veldt.pdf
  • 24 votes
    14 comments
    41 views
    S
    I think you're missing some key points. Any file hosting service, no matter what, will have to deal with CSAM as long as people are able to upload to it. This is an inescapable fact of hosting and of the internet in general. Because CSAM is so ubiquitous and constant, one can only do so much to moderate any service, whether it's a large corporation or someone with a server in their closet. The larger platforms like Meta, Google, etc. mostly outsource that moderation to workers in developing countries so they don't also have to provide mental health counselling, but that's another story. The reason they own their own hardware is that hosting services can and will disable your account and take down your servers if there's even a whiff of CSAM. Since that's a constant threat, it's better to own your own hardware and host everything from your closet, so you don't have to eat the downtime and wait for some poor bastard in Nigeria to look through your logs and reinstate your account (not sure how that works exactly, though).
  • The Internet of Consent

    Technology
    11 votes
    1 comment
    11 views
    No one has replied
  • 272 votes
    131 comments
    201 views
    eyedust@lemmy.dbzer0.com
    This is good to know. I hadn't read the fine print, because I abandoned Telegram and never looked back. I hope it's true, and I agree; I also wouldn't think they'd do this and then renege, risking a lawsuit.
  • 92 votes
    42 comments
    21 views
    G
    "You don't understand. The tracking and spying is the entire point of the maneuver. The 'children are accessing porn' thing is just a Trojan horse to justify the spying."

    I understand what you're saying; I simply don't consider checking whether a law is being applied to be a Trojan horse in itself. I would agree if the EU had told these sites "give us all the access logs, a list of your subscribers, every piece of data you gather, and a list of every IP that ever connected to your site", and even then, the IP alone doesn't tell you who the user is without asking the telecom company for help. So, is it a Trojan horse? Maybe; it heavily depends on how the EU wants to do it. If they just ask "show me how you try to prevent minors from accessing your material", which is normally the first step, I don't see how that could be a Trojan horse. It could become one, I agree.

    "As you pointed out, it's already illegal for them to access it, and parents are legally required to prevent their children from accessing it."

    No, parents are not legally required to prevent it. The seller (or provider) is legally required. It is a subtle but important difference.

    "But you don't lock down the entire population, or institute pre-crime surveillance policies, just because some parents are not going to follow the law."

    True. You simply impose laws that make it mandatory for the provider to check whether they can sell or serve something to someone. Asking the cashier at a mall to check that I'm an adult when I buy a bottle of wine is no different from asking Pornhub to check that the viewer is an adult. I agree that in one case this is really simple and in the other it's really hard (and getting harder by the day).

    "You then charge the guilty parents after the offense."

    OK, that would work, but then how do you catch the offending parents without checking what everyone does? Isn't it simpler to try to prevent it instead?