
Your public ChatGPT queries are getting indexed by Google and other search engines

Technology
  • 120 votes
    24 posts
    384 views
    A little background: Russia has been propping up one of its oligarchs' businesses by eliminating the competition. First, it throttled YouTube to unusable speeds to force people to switch to RuTube (they didn't). Now it's trying to force people to switch from WhatsApp (and potentially Telegram) to MAX, which it wants to be Russia's version of WeChat. Add the fact that our politicians are obsessed with controlling all of the media, and you get the gist of it.
  • 'I can't drink the water' - life next to a US data centre

    Technology
    262 votes
    21 posts
    288 views
    They use adiabatic coolers to minimize the electrical cost of cooling and maximize cooling capacity. The water isn't used directly as the cooling fluid; it just provides evaporative cooling to boost the efficiency of a conventional refrigeration system. I also suspect that many of them are starting to switch to CO2-based refrigeration systems, which benefit heavily from adiabatic gas coolers because of CO2's low critical temperature. Without an adiabatic cooler, the efficiency of a CO2-based system starts dropping sharply once the ambient temperature gets much above 80 °F. They could achieve the same results without using water, but their refrigeration systems would need larger gas coolers, which would increase their electricity usage.
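
    For intuition, here is a minimal sketch (mine, not the commenter's) of the standard effectiveness model for a direct evaporative cooler, T_out = T_dry - eps * (T_dry - T_wet). The effectiveness and weather numbers below are illustrative assumptions, not vendor data, but they show how adiabatic pre-cooling can pull gas-cooler inlet air back below CO2's critical temperature of roughly 88 °F (31 °C):

        # Illustrative sketch: how evaporative (adiabatic) pre-cooling lowers
        # the air temperature entering a CO2 gas cooler. Standard model:
        #   T_out = T_dry - eps * (T_dry - T_wet)
        # All numbers are assumptions chosen for illustration.

        CO2_CRITICAL_F = 87.8  # critical temperature of CO2 (~31 C)

        def adiabatic_outlet_temp(t_dry_f, t_wet_f, eps=0.85):
            """Dry-bulb temperature of air leaving an evaporative pad.

            eps is the pad's saturation effectiveness (0..1); 0.85 is a
            typical assumed value for wetted media.
            """
            return t_dry_f - eps * (t_dry_f - t_wet_f)

        # A hot, fairly dry afternoon: 95 F dry-bulb, 70 F wet-bulb (assumed).
        t_inlet = adiabatic_outlet_temp(95.0, 70.0)
        print(f"gas-cooler inlet air: {t_inlet:.1f} F")            # 73.8 F
        print(f"subcritical possible: {t_inlet < CO2_CRITICAL_F}")  # True

    Without the pad (eps = 0) the inlet air stays at 95 °F, above the critical point, and the system is forced into less efficient transcritical operation, which is the efficiency drop the comment describes.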
  • 238 votes
    20 posts
    176 views
    Unless you are a major corporation... you are not free to take anything.
  • How-to guide for MCP tools, resources, and prompts

    Technology
    8 votes
    1 post
    20 views
    No one has replied
  • How can websites verify unique (IRL) identities?

    Technology
    8 votes
    6 posts
    55 views
    Safe, yeah. Private, no. If you want to verify that a user is a real person, you need highly identifying personal information, and that is never going to be private. The best you could do, in theory, is have a government service that takes that PII and gives the user a signed cryptographic certificate they can use to verify their identity. Most people would either lose their private key or have it stolen, so even that system would have problems. The closest thing to reality you could do right now is use Apple's Face ID, and that's anything but private. Pretty safe, though: it's super illegal and quite hard to steal someone's face.
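
    As a rough illustration of the certificate scheme described above, here is a minimal sketch using Ed25519 signatures from Python's cryptography package. The "authority signs your public key" flow and every name in it are invented for illustration; this is not any real government system:

        # Hypothetical flow: a government authority signs a statement binding
        # "this public key belongs to a verified person"; a website later
        # checks that credential plus a fresh proof of key possession.
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives.asymmetric.ed25519 import (
            Ed25519PrivateKey,
            Ed25519PublicKey,
        )
        from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

        # Authority's long-term signing key; its public half would be baked
        # into verifying websites.
        authority_key = Ed25519PrivateKey.generate()
        authority_pub = authority_key.public_key()

        # Citizen generates a key pair; after checking their PII out-of-band,
        # the authority signs the citizen's public key.
        citizen_key = Ed25519PrivateKey.generate()
        citizen_pub = citizen_key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
        credential = authority_key.sign(b"verified-person:" + citizen_pub)

        # A website challenges the user to prove possession of the certified key.
        challenge = b"login-nonce-1234"
        response = citizen_key.sign(challenge)

        try:
            # 1. The credential really was issued by the authority.
            authority_pub.verify(credential, b"verified-person:" + citizen_pub)
            # 2. The user actually holds the private key the credential vouches for.
            Ed25519PublicKey.from_public_bytes(citizen_pub).verify(response, challenge)
            print("verified: credential and key possession check out")
        except InvalidSignature:
            print("rejected")

    As the comment notes, the weak point isn't the math: lose or leak citizen_key and the identity goes with it.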
  • 'We're done with Teams': German state hits uninstall on Microsoft

    Technology
    841 votes
    102 posts
    2k views
    You’ve been patient? Bye
  • 12 votes
    1 post
    18 views
    No one has replied
  • Catbox.moe got screwed 😿

    Technology
    55 votes
    40 posts
    399 views
    archrecord@lemm.ee
    I'll gladly give you a reason. I'm actually happy to articulate my stance on this, considering how much I tend to care about digital rights.

    Services that host files should not be held responsible for what users upload, unless:

    - The service explicitly caters to illegal content by definition or practice (i.e. if the website is literally titled uploadyourcsamhere[.]com, then it's safe to assume they deliberately want to host illegal content), or
    - The service has a very easy mechanism to remove illegal content, either when asked or through simple monitoring systems, but chooses not to use it (Catbox does use one, and quite quickly too).

    Holding services responsible creates a whole host of negative effects. Here are some examples:

    - Someone starts a CDN and some users upload CSAM. The creator of the CDN goes to jail. Nobody wants to create a CDN anymore because of the legal risk, so the only CDN providers become shady, expensive, anonymously run services with no compliance mechanisms.
    - You run a site that hosts images, and someone decides they want to harm you. They upload CSAM, then report the site to law enforcement. You go to jail. Anyone who wants to run an image-sharing site in the future must self-censor to avoid upsetting any human being who might be willing to harm them through their site.
    - A social media site hosts its users' posts and content. To stay compliant and out of jail, it must filter extremely strictly, because even one mistake could land it in jail. All users are prohibited from posting any NSFW or even suggestive content (including newsworthy media, such as an image of bodies in a warzone), and any violation leads to an instant ban, because any of those things could carry a chance of actually illegal content being attached.

    This isn't just my opinion, either. Digital rights organizations such as the Electronic Frontier Foundation have talked at length about similar policies before. To quote them: "When social media platforms adopt heavy-handed moderation policies, the unintended consequences can be hard to predict. For example, Twitter's policies on sexual material have resulted in posts on sexual health and condoms being taken down. YouTube's bans on violent content have resulted in journalism on the Syrian war being pulled from the site. It can be tempting to attempt to 'fix' certain attitudes and behaviors online by placing increased restrictions on users' speech, but in practice, web platforms have had more success at silencing innocent people than at making online communities healthier."

    Now, to address the rest of your comment, since I don't just want to focus on the beginning:

    "I think you have to actively moderate what is uploaded"

    Catbox does, and as previously mentioned, often at a much higher rate than other services, and at a rate comparable to services with millions, if not billions, of dollars in annual profits that could otherwise be spent on further moderation.

    "there has to be swifter and stricter punishment for those that do upload things that are against TOS and/or illegal."

    The problem isn't the speed at which people can be reported and punished; it's that the internet is fundamentally harder to track people on than real life. It's easy for cops to stake out a spot where they know someone will be physically distributing illegal content. Digitally, even if you can see all the traffic passing through a service, a VPN or Tor connection will anonymize an IP address in a way most police departments can't trace and even the three-letter agencies will have a relatively low success rate against. There's no good solution to the problem of identifying perpetrators, which is why platforms focus on moderation rather than legal action against users: it prevents and removes the content without, for example, requiring every single internet user to scan an ID (while also somehow preventing people from simply stealing other people's access tokens and impersonating them).

    I do agree, however, that we should provide more funding, training, and resources to divisions whose sole goal is to go after the online distribution of illegal content, primarily content that harms children, because there are certainly too many reports to get through, even if many of them lead to dead ends.

    I hope that explains why making file-hosting services liable for user-uploaded content probably isn't the best strategy. I hate to see people with good intentions support ideas that sound good on paper but in the end just cause more untold harm, and I hope you can understand why I believe this to be the case.