
Software is evolving backwards

Technology
  • Unlocking the Legacy of the Honda Acty Across Four Generations

    Technology technology
    0 votes
    1 post
    10 views
    No one has replied
  • China is rushing to develop its AI-powered censorship system

    Technology technology
    39 votes
    2 posts
    22 views
    why0y@lemmy.ml
    This concept is the enemy of a centuries-old idealistic societal pillar of the West: Liberté, Libertas... this has blessed so many of us in the West, and I beg that it doesn't leave. Something as beautiful and sacred as the freedom from forced labor and the freedom to choose your trade is the concept of the free and unbounded innocence of voices asking their leaders and each other these questions, to determine amongst ourselves what is fair and not, for our own betterment and the beauty of free enterprise. It's not so much that the Chinese state is an awful power to behold (it is, and fuck Poohhead)... but this same politic is on the rise in the West and it leads to war. It always leads to war. And now the most automated form of state and corporate propaganda the world has ever seen is in the hands of a ruthless ruling class that can, has, and will steal bread from children's hands, and literally take the medicine from the sick to pad their pockets. Such is the twisted fate of society and likely always will be. We need to fight, and not with prayers; this moment is God forsaking us to behold how the spirit breaks and what the people want to fight for as ruthlessly as the others do to steal our bread.
  • 9 votes
    6 posts
    33 views
    You said it yourself: extra places that need human attention ... those need ... humans, right? It's easy to say "let AI find the mistakes". But that tells us nothing at all. There's no substance. It's just a sales pitch for snake oil. In reality, there are various ways one can leverage technology to identify various errors, but that only happens through the focused actions of people who actually understand the details of what's happening. And think about it here. We already have computer systems that monitor patients' real-time data when they're hospitalized. We already have systems that check for allergies in prescribed medication. We already have systems for all kinds of safety mechanisms. We're already using safety tech in hospitals, so what can be inferred from a vague headline about AI doing something that's ... checks notes ... already being done? ... Yeah, the safe money is that it's just a scam.
  • 439 votes
    351 posts
    1k views
    "I hate it when misandry pops up on my feed" Word for word. I posted that 5 weeks ago and I'm still getting hate for it.
  • Mudita Kompakt

    Technology technology
    62 votes
    17 posts
    82 views
    anunusualrelic@lemmy.world
    There you go then. It's 80 €.
  • 195 votes
    31 posts
    96 views
    isveryloud@lemmy.ca
    It's a loaded term that should be replaced with a more nimble definition. A dog whistle is the name for a loaded term that is used to tag a specific target with a large baggage of information, but in a way that only people who are part of the "in group" can understand the baggage of the word, hence "dog whistle": only heard by dogs. In the case of the word "degeneracy", it's a vague word that has often been used to attack, among other things, LGBTQ people and their allies as well as non-religious people. The term is vague enough that the user can easily weasel their way out of criticism for its usage, but the target audience gets the message loud and clear: "[target] should be attacked for being [thing]." Another example of such a word would be "woke".
  • Digg founder Kevin Rose offers to buy Pocket from Mozilla

    Technology technology
    1 vote
    7 posts
    41 views
    IMO it was already shitty.
  • 0 votes
    4 posts
    20 views
    redfox@infosec.pub
    Yeah, damn, I always forget about that... just like they want...