
Fairphone announces the €599 Fairphone 6, with a 6.31" 120Hz LTPO OLED display, a Snapdragon 7s Gen 3 chip, and enhanced modularity with 12 swappable parts

Technology
  • How to use ChatGPT: the complete guide - BDM

    Technology
    1
    2
    0 votes
    1 post
    8 views
    Nobody has replied
  • 2 votes
    3 posts
    10 views
    vanth@reddthat.com
    I only vacation in countries that have trained their LLMs to use line breaks.
  • You are Already On "The List"

    Technology
    2
    47 votes
    2 posts
    12 views
    M
    Even if they're wrong. It's too late. You're already on the list. .... The only option is to destroy the list and those who will use it
  • 832 votes
    96 posts
    31 views
    J
    Because there is profit in child exploitation.
  • Catbox.moe got screwed 😿

    Technology
    40
    55 votes
    40 posts
    44 views
    archrecord@lemm.ee
    I'll gladly give you a reason. I'm actually happy to articulate my stance on this, considering how much I tend to care about digital rights.

    Services that host files should not be held responsible for what users upload, unless:

    - The service explicitly caters to illegal content by definition or practice (i.e. if the website is literally titled uploadyourcsamhere[.]com, it's safe to assume they deliberately want to host illegal content), or
    - The service has a very easy mechanism to remove illegal content, either when asked or through simple monitoring systems, but chooses not to do so (Catbox does this, and quite quickly too).

    Holding services responsible creates a whole host of negative effects. Here are some examples:

    - Someone starts a CDN and some users upload CSAM. The creator of the CDN now goes to jail. Nobody ever wants to create a CDN because of the legal risk, and thus the only providers of CDNs become shady, expensive, anonymously run services with no compliance mechanisms.
    - You run a site that hosts images, and someone decides they want to harm you. They upload CSAM, then report the site to law enforcement. You go to jail. Anybody who wants to run an image-sharing site in the future must now self-censor to avoid upsetting any human being who might be willing to harm them via their site.
    - A social media site hosts the posts and content of users. To stay compliant and out of jail, it must engage in extremely strict filtering, because even one mistake could land it in jail. All users of the site are prohibited from posting any NSFW or even suggestive content (including newsworthy media, such as an image of bodies in a warzone), and any violation leads to an instant ban, because any of those things could raise the chance of actually illegal content being attached.

    This isn't just my opinion either. Digital rights organizations such as the Electronic Frontier Foundation have talked at length about similar policies before. To quote them: "When social media platforms adopt heavy-handed moderation policies, the unintended consequences can be hard to predict. For example, Twitter's policies on sexual material have resulted in posts on sexual health and condoms being taken down. YouTube's bans on violent content have resulted in journalism on the Syrian war being pulled from the site. It can be tempting to attempt to 'fix' certain attitudes and behaviors online by placing increased restrictions on users' speech, but in practice, web platforms have had more success at silencing innocent people than at making online communities healthier."

    Now, to address the rest of your comment, since I don't just want to focus on the beginning:

    > I think you have to actively moderate what is uploaded

    Catbox does, and as previously mentioned, often at a much higher rate than other services, and at a rate comparable to many services with millions, if not billions, of dollars in annual profits that could otherwise be spent on further moderation.

    > there has to be swifter and stricter punishment for those that do upload things that are against TOS and/or illegal.

    The problem isn't necessarily the speed at which people can be reported and punished, but rather that the internet is fundamentally harder to track people on than real life. It's easy for cops to stake out a spot where they know someone will be physically distributing illegal content, but digitally, even if you can see the feed of all the information passing through the service, a VPN or Tor connection will anonymize an IP address in a way that most police departments won't be able to track and most three-letter agencies will only crack with a relatively low success rate.

    There's no good solution to this problem of identifying perpetrators, which is why platforms so often focus on moderation rather than legal enforcement actions against users. Moderation accomplishes the goal of preventing and removing the content without, for example, requiring every single user of the internet to scan an ID (and somehow also magically preventing people from stealing other people's access tokens and impersonating them).

    I do agree, however, that we should probably provide more funding, training, and resources to divisions whose sole goal is to go after the online distribution of illegal content, primarily content that harms children, because there is certainly still an issue of too many reports to go through, even if many of them lead to dead ends.

    I hope that explains why making file-hosting services liable for user-uploaded content probably isn't the best strategy. I hate to see people with good intentions support ideas that sound good on paper but in the end just cause untold harm, and I hope you can understand why I believe this to be the case.
  • 104 votes
    168 posts
    65 views
    smartmanapps@programming.dev
    > At least that's not how I've been taught in school

    If you had a bad teacher, that doesn't mean everyone else had a bad teacher.

    > You're not teaching kids how to prove the quadratic formula, do you?

    We teach them how to do proofs, including several specific ones.

    > No, you teach them how to use it instead.

    We teach them how to use everything, and how to do proofs as well. Your whole argument is just one big strawman.

    > Again, with the order of operations

    Happens to be the topic of the post.

    > It's not a thing

    Yes it is!

    > I've given you two examples that don't follow any

    So you could not do the brackets first and still get the right answer? Nope! 2×2×(2-2)/2 = 0, but 2×2×2-2/2 = 7.

    > That's kinda random, but sure?

    Not random at all, given you were talking about students understanding how Maths works.

    > 2+3×4 then it's not an order of operation that plays the role here

    Yes it is! If I have 1 2-litre bottle of milk and 4 3-litre bottles of milk, there's only 1 correct answer for how many litres of milk I have, and it ain't 20! Even elementary school kids know how to work it out just by counting up.

    > They all derive from each other

    No they don't. The proof of the order of operations has got nothing to do with any of the properties you mentioned.

    > For example, commutation is used to prove identity

    And neither is used to prove the order of operations.

    > 2 operators, no order followed

    Again with a cherry-picked example that only includes operators of the same precedence.

    > You have no property that would allow for (2+3)×4 to be equal to 2+3×4

    And yet we have a proof of why 14 is the only correct answer to 2+3×4, why you have to do the multiplication first.

    > Is that not correct?

    Of course it is. So what?

    > It literally has subtraction and distribution

    No it didn't. It had Brackets (with subtraction inside) and Multiplication and Division.

    > I thought you taught math, no?

    Yep, and I just pointed out that what you just said is wrong.

    > 2-2(1+2) has Subtraction and Distribution. 2-2 is 2 being, hear me out, subtracted from 2

    Which was done first because you had it inside Brackets, and is therefore not handled in the Subtraction step of the order of operations but in the Brackets step.

    > Also, can you explain how is that cherry-picking?

    You already know - you know which operations to pick to make it look like there's no such thing as order of operations. If I tell you to look up at the sky at midnight and say "look - there's no such thing as the sun", that doesn't mean there's no such thing as the sun.
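    The arithmetic in this exchange can be verified directly. Below is a minimal sketch in Python (chosen only because it applies the same precedence rules discussed above: brackets, then multiplication and division, then addition and subtraction); the expressions simply mirror the examples quoted in the comment.

        # Check the order-of-operations examples from the comment above.
        # Python evaluates brackets first, then * and /, then + and -.
        assert 2 * 2 * (2 - 2) / 2 == 0    # brackets first: 4 * 0 / 2 = 0
        assert 2 * 2 * 2 - 2 / 2 == 7      # no brackets: 8 - 1 = 7
        assert 2 + 3 * 4 == 14             # multiplication before addition
        assert (2 + 3) * 4 == 20           # brackets change the result
        assert 1 * 2 + 4 * 3 == 14         # one 2-litre bottle plus four 3-litre bottles = 14 litres
        print("all checks pass")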
  • Apple’s Smart Glasses Expected to Hit the Market by Late Next Year!

    Technology
    14
    6 votes
    14 posts
    18 views
    L
    great, another worthless tech product that no one asked for. I can hardly wait.
  • 121 votes
    58 posts
    38 views
    D
    I bet every company has at least one employee with right-wing political views. Choosing a product based on some random quotes by employees is stupid.