
Front Brake Lights Could Drastically Diminish Road Accident Rates

Technology
336 votes
164 posts
2 views
  • 45 votes
    7 posts
    0 views
    artocode404@lemmy.dbzer0.com
    Googlebot sad when disallowed access to 18+ videos
  • Pimax: one more brand exposed for promoting "positive reviews".

    Technology
    55 votes
    2 posts
    3 views
    moose@moose.best
    This doesn't really surprise me; I've gotten weird vibes from Pimax for years. Not so much to do with their hardware, but with how their sales/promo team operates. A while back, at my old workplace, we randomly got contacted by Pimax trying to have us carry their headset, which was weird since we didn't sell VR stuff or even computers, just other electronics. It was a very out-of-place request, and we basically said we wouldn't consider it until we could verify the quality of the headset, after which they never replied.
  • Catbox.moe got screwed 😿

    Technology
    55 votes
    40 posts
    2 views
    archrecord@lemm.ee
    I'll gladly give you a reason. I'm actually happy to articulate my stance on this, considering how much I tend to care about digital rights.

    Services that host files should not be held responsible for what users upload, unless:

    • The service explicitly caters to illegal content by definition or practice (i.e. if the website is literally titled uploadyourcsamhere[.]com, then it's safe to assume they deliberately want to host illegal content), or
    • The service has a very easy mechanism to remove illegal content, either when asked or through simple monitoring systems, but chooses not to use it (Catbox does remove such content, and quite quickly too).

    Holding services responsible creates a whole host of negative effects. Here are some examples:

    • Someone starts a CDN and some users upload CSAM. The creator of the CDN goes to jail. Nobody ever wants to create a CDN because of the legal risk, and thus the only providers of CDNs become shady, expensive, anonymously run services with no compliance mechanisms.
    • You run a site that hosts images, and someone decides they want to harm you. They upload CSAM, then report the site to law enforcement. You go to jail. Anybody in the future who wants to run an image-sharing site must now self-censor to avoid upsetting any human being who might be willing to harm them via their site.
    • A social media site hosts the posts and content of users. In order to be compliant and not go to jail, it must engage in extremely strict filtering; even one mistake could land it in jail. All users of the site are prohibited from posting any NSFW or even suggestive content (including newsworthy media, such as an image of bodies in a warzone), and any violation leads to an instant ban, because any of those things could raise the chance of actually illegal content being attached.

    This isn't just my opinion, either. Digital rights organizations such as the Electronic Frontier Foundation have talked at length about similar policies before. To quote them: "When social media platforms adopt heavy-handed moderation policies, the unintended consequences can be hard to predict. For example, Twitter's policies on sexual material have resulted in posts on sexual health and condoms being taken down. YouTube's bans on violent content have resulted in journalism on the Syrian war being pulled from the site. It can be tempting to attempt to 'fix' certain attitudes and behaviors online by placing increased restrictions on users' speech, but in practice, web platforms have had more success at silencing innocent people than at making online communities healthier."

    Now, to address the rest of your comment, since I don't just want to focus on the beginning. You say "I think you have to actively moderate what is uploaded": Catbox does, and as previously mentioned, often at a much higher rate than other services, and at a rate comparable to many services with millions, if not billions, of dollars in annual profits that could otherwise be spent on further moderation. You also say "there has to be swifter and stricter punishment for those that do upload things that are against TOS and/or illegal": the problem isn't necessarily the speed at which people can be reported and punished, but rather that the internet is fundamentally harder to track people on than real life.

    It's easy for cops to sit around at a spot where they know someone will be physically distributing illegal content in real life, but digitally, even if you can see all the information passing through the service, a VPN or Tor connection will anonymize a user's IP address in a manner that most police departments can't track, and that even most three-letter agencies will have a relatively low success rate with. There's no good solution to this problem of identifying perpetrators, which is why platforms so frequently focus on moderation over legal enforcement actions against users. It accomplishes the goal of preventing and removing the content without having to, for example, require every single user of the internet to scan an ID (and also magically prevent people from just stealing other people's access tokens and impersonating them).

    I do agree, however, that we should probably provide larger amounts of funding, training, and resources to divisions whose sole goal is to go after the online distribution of various illegal content, primarily that which harms children, because there are certainly still too many reports to get through, even if many of them will lead to dead ends.

    I hope that explains why making file hosting services liable for user-uploaded content probably isn't the best strategy. I hate to see people with good intentions support ideas that sound good in theory but in the end just cause more untold harm, and I hope you can understand why I believe this to be the case.
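    The "simple monitoring systems" mentioned above usually start with hash matching: compute a digest of each upload and compare it against a list of digests of known illegal files, flagging hits for removal or review. Below is a minimal Python sketch of that idea; the blocklist file known_bad_sha256.txt and the upload path are hypothetical stand-ins, and real services rely on shared industry hash lists and perceptual hashing (e.g. PhotoDNA) rather than exact digests, which break under any re-encoding.

    ```python
    import hashlib
    from pathlib import Path

    # Hypothetical local blocklist: one lowercase SHA-256 hex digest per line.
    # Real deployments use vetted industry hash lists, not a flat text file.
    BLOCKLIST_PATH = Path("known_bad_sha256.txt")

    def load_blocklist(path: Path) -> set[str]:
        """Read the digest list into a set for O(1) membership checks."""
        return {line.strip().lower() for line in path.read_text().splitlines() if line.strip()}

    def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
        """Hash the file in 1 MiB chunks so large uploads don't sit in memory."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    def screen_upload(path: Path, blocklist: set[str]) -> bool:
        """Return True if the upload matches the blocklist and should be flagged."""
        return sha256_of(path) in blocklist

    if __name__ == "__main__":
        blocklist = load_blocklist(BLOCKLIST_PATH)
        upload = Path("incoming/user_upload.bin")  # hypothetical upload
        if screen_upload(upload, blocklist):
            print("flagged: matches known-bad hash list")
        else:
            print("clean: no hash match")
    ```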
  • 158 votes
    79 posts
    0 views
    M
    But they did give! They did not choose to deny and not have pizza.
  • WhatsApp is working on video and voice calls on the web

    Technology
    6 votes
    10 posts
    2 views
    A
    Worked well for me, although all the people I care about already had Signal, Element, or Threema installed, so I am not a great pull factor. And those everyday moms from child care or from wherever can reach me via SMS, for the two messages a year.
  • 0 votes
    6 posts
    0 views
    H
    Then that's changed since the last time I toyed with the idea, which, granted, was probably 20 years ago...
  • Microsoft pulls MS365 Business Premium from nonprofits

    Technology
    48 votes
    37 posts
    3 views
    S
    That's the thing: I wish we could just switch all enterprises to Linux, but Microsoft has developed a huge ecosystem that really does have good features. Unless something comparable comes up in the Linux world, I don't see Europe becoming independent of Microsoft any time soon.
  • 1 vote
    8 posts
    2 views
    L
    I made a PayPal account like 20 years ago in a third-world country. The only thing you needed then was an email and a password. I have no real name on there and no PII; technically my bank card is attached, but on PP itself there's no KYC. I think you could probably use some types of prepaid cards with it if you wanted to avoid using a bank altogether, but for me this wasn't an issue. I just didn't want my ID on any records; I don't have any serious OpSec concerns otherwise.

    I'm sure you could either buy PayPal accounts like this if you needed to, or somehow make one in a country that doesn't have KYC laws. From there I'd add money to my balance and send money as F&F. At no point did I need an ID, so in that sense there's no KYC. Some sellers on localmarket were fancy enough to list that they wanted an ID for KYC, but I'm sure you could just send them any random ID you made in Paint from the Republic of Dave and you'd be fine.