
Operation Narnia: Iran’s nuclear scientists reportedly killed simultaneously using special weapon

Technology
  • RUCKUS Networks

    Technology
    0 votes · 1 post · 0 views
    Nobody has replied
  • German police expand use of Palantir surveillance software

    Technology
    169 votes · 14 posts · 34 views
    M: A tale older than myth.
  • 86 votes · 31 posts · 282 views
    A: You don’t have the power to decarbonize all electricity. From the article: “Location also affects how carbon emissions are managed. Germany has the largest carbon footprint for video streaming at 76g CO₂e per hour of streaming, reflecting its continued reliance on coal and fossil fuels. In the UK, this figure is 48g CO₂e per hour, because its energy mix includes renewables and natural gas, increasingly with nuclear as central to the UK’s low-carbon future. France, with its reliance on nuclear, is the lowest at 10g CO₂e per hour.”
    This is a massive difference, and clearly doable; nothing that would be limited to the distant future. (A back-of-the-envelope comparison of these figures appears after this list.)
    So do I get this right? I’m naive for expecting government regulations to bring companies’ behaviour under control, whereas you’re realistic for expecting hundreds of millions of people to decide to systematically minimise their YouTube/TikTok/Spotify/Netflix/Zoom usage? Hmm, alright. And yet in another comment you also expect that Spotify shouldn’t introduce video streaming, without any external regulation, purely out of the goodness of their hearts?
  • What are the most in-demand Tech Skills? (besides AI)

    Technology
    10 votes · 5 posts · 66 views
    jordanlund@lemmy.world: AI is devaluing other skills. I got an email today from my own company telling me I wouldn't have to renew my professional certification for two years if I passed an unrelated test on AI. The "test" was 10 questions. Glad to know my professional certification is equivalent to a 10-question pop quiz on AI.
  • 83 votes · 13 posts · 133 views
    M: It's a bit of a sticking point in Australia, which is becoming more and more of a 'two-speed' society. Foxtel is for the rich classes and caters to the right wing; Sky News is on Foxtel. These eSafety directives killing access to YouTube won't affect those rich kids so much, but for everyone else it's going to be a nightmare.
    My only possible hope out of this is that maybe Parliament and ACMA (the Australian Communications and Media Authority, which sets TV standards) decide that, since kids need a broader media landscape and can't be allowed to have it online, more than three major broadcasters could be allowed. It's not a lack of will that stops anyone else from launching a new free-to-air network; it's legislation: only three commercial FTA broadcasters are permitted in any area.
    It's not that I love YouTube or the kids watching it; it's that the alternatives are almost objectively worse. 10 and 7 are garbage 24/7, and 9 is basically a right-wing hugbox too.
  • Palantir hits new highs amid Israel-Iran conflict

    Technology
    41 votes · 4 posts · 47 views
    W: I think both peace and war are profitable. But those who profit from war may be pushier than those who profit from peace, and so may get their way even as an unpopular minority. Unless, that is, the left (usually more pro-peace) learns a few lessons from the right and places good outcomes above holier-than-thou moral purity. "I've never made anyone uncomfortable" is not the merit badge that some think it is.
    Of course the left can never be a mirror copy of the right, because the left cannot afford to give as few fucks about anything as the right (who represent the already-haves, the economic incumbents; it's not called "fuck-you money" for nothing). But the left can be way tougher, nuancedly uncompromising, and even calculatingly and carefully militant.
    Might does not make right, but might DOES make POLICY. You need both right and might to live under a good policy. Lotta good it does anyone to be right and insightful on all the issues and have zero impact anywhere.
  • 4 votes · 1 post · 20 views
    Nobody has replied
  • Catbox.moe got screwed 😿

    Technology
    55 votes · 40 posts · 452 views
    archrecord@lemm.ee: I'll gladly give you a reason. I'm actually happy to articulate my stance on this, considering how much I tend to care about digital rights.
    Services that host files should not be held responsible for what users upload, unless:
    - The service explicitly caters to illegal content by definition or practice (i.e., if the website is literally titled uploadyourcsamhere[.]com, it's safe to assume they deliberately want to host illegal content), or
    - The service has a very easy mechanism to remove illegal content, either when asked or through simple monitoring systems, but chooses not to use it. (Catbox does remove such content, and quite quickly too. A sketch of what a simple monitoring system can look like appears after this list.)
    Holding services responsible creates a whole host of negative effects. Here are some examples:
    - Someone starts a CDN and some users upload CSAM. The creator of the CDN goes to jail. Nobody ever wants to create a CDN because of the legal risk, and so the only providers of CDNs become shady, expensive, anonymously run services with no compliance mechanisms.
    - You run a site that hosts images, and someone decides they want to harm you. They upload CSAM, then report the site to law enforcement. You go to jail. Anybody who wants to run an image-sharing site in the future must self-censor to avoid upsetting any human being who might be willing to harm them via their site.
    - A social media site hosts the posts and content of users. To be compliant and stay out of jail, it must engage in extremely strict filtering, because even one mistake could land the operators in prison. All users of the site are prohibited from posting any NSFW or even suggestive content (including newsworthy media, such as an image of bodies in a warzone), and any violation leads to an instant ban, because any of those things could raise the chance of actually illegal content being attached.
    This isn't just my opinion, either. Digital rights organizations such as the Electronic Frontier Foundation have talked at length about similar policies before. To quote them: "When social media platforms adopt heavy-handed moderation policies, the unintended consequences can be hard to predict. For example, Twitter's policies on sexual material have resulted in posts on sexual health and condoms being taken down. YouTube's bans on violent content have resulted in journalism on the Syrian war being pulled from the site. It can be tempting to attempt to 'fix' certain attitudes and behaviors online by placing increased restrictions on users' speech, but in practice, web platforms have had more success at silencing innocent people than at making online communities healthier."
    Now, to address the rest of your comment, since I don't just want to focus on the beginning:
    "I think you have to actively moderate what is uploaded" - Catbox does, and as previously mentioned, often at a much higher rate than other services, and at a rate comparable to services with millions, if not billions, of dollars in annual profits that could otherwise be spent on further moderation.
    "there has to be swifter and stricter punishment for those that do upload things that are against TOS and/or illegal" - The problem isn't necessarily the speed at which people can be reported and punished, but that the internet is fundamentally harder to track people on than real life. It's easy for cops to stake out a spot where they know someone will be physically distributing illegal content; digitally, even if you can see the feed of all the information passing through the service, a VPN or Tor connection will anonymize an IP address in a way that most police departments can't trace and that most three-letter agencies will have a relatively low success rate against. There's no good solution to this problem of identifying perpetrators, which is why platforms so often focus on moderation rather than legal enforcement actions against users: it accomplishes the goal of preventing and removing the content without, for example, requiring every single user of the internet to scan an ID (and somehow magically preventing people from just stealing other people's access tokens and impersonating them).
    I do agree, however, that we should probably provide more funding, training, and resources to divisions whose sole goal is to go after the online distribution of various illegal content, primarily content that harms children, because there are certainly still too many reports to work through, even if many of them will lead to dead ends.
    I hope that explains why making file-hosting services liable for user-uploaded content probably isn't the best strategy. I hate to see people with good intentions support ideas that sound good in theory but in the end just cause more untold harm, and I hope you can understand why I believe this to be the case.
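For scale, here is a minimal back-of-the-envelope comparison of the streaming-carbon figures quoted in the comment above. Only the grams-per-hour values come from the cited article; the two-hours-per-day viewing habit is an assumption chosen purely for illustration.

```python
# Annualize the per-hour streaming footprints quoted in the comment above.
# The g/hour figures are from the cited article; HOURS_PER_DAY is an
# illustrative assumption, not a sourced statistic.
GRAMS_CO2E_PER_HOUR = {"Germany": 76, "UK": 48, "France": 10}

HOURS_PER_DAY = 2      # assumed daily streaming time
DAYS_PER_YEAR = 365

for country, grams in GRAMS_CO2E_PER_HOUR.items():
    kg_per_year = grams * HOURS_PER_DAY * DAYS_PER_YEAR / 1000
    print(f"{country}: {kg_per_year:5.1f} kg CO2e/year")

# Germany:  55.5 kg CO2e/year
# UK:       35.0 kg CO2e/year
# France:    7.3 kg CO2e/year
```

At those rates the same viewer emits roughly 7.6 times more per year on the German grid than on the French one, which is the commenter's point: the gap comes from the electricity mix, not from the viewer's behaviour.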
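The Catbox comment above also leans on the existence of "simple monitoring systems" without saying what one looks like. One common approach, offered here only as an assumed illustration of what the commenter may mean, is hash-matching uploads against a blocklist of known-bad files; production systems typically use perceptual hashes (e.g., PhotoDNA) supplied by vetted clearinghouses rather than plain SHA-256, and every name below is hypothetical.

```python
# Minimal sketch of a hash-based upload filter, one plausible form of the
# "simple monitoring systems" mentioned above. All names are hypothetical;
# real deployments use perceptual hashes from vetted hash lists, not SHA-256.
import hashlib

# Hypothetical blocklist of hex digests of known-bad files. The entry below
# is the SHA-256 of the empty byte string, included only so the demo matches.
BLOCKED_SHA256 = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def screen_upload(data: bytes) -> bool:
    """Return True if the upload may be stored, False if it hits the blocklist."""
    return hashlib.sha256(data).hexdigest() not in BLOCKED_SHA256

if __name__ == "__main__":
    print(screen_upload(b""))             # False: digest is on the blocklist
    print(screen_upload(b"cat picture"))  # True: not on the blocklist
```

The point of the sketch is the asymmetry the commenter describes: checking a hash at upload time is cheap and automatic, while identifying an uploader behind a VPN or Tor is not, so moderating content scales in a way that enforcement against people does not.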