
BREAKING: X CEO Linda Yaccarino Steps Down One Day After Elon Musk’s Grok AI Bot Went Full Hitler

Technology
  • AI agents wrong ~70% of time: Carnegie Mellon study

    Technology
    963 votes
    279 posts
    210 views
    "It's so good at parsing text and documents, summarizing." No. Not when it matters. It makes stuff up. The less carefully you check every single fucking thing it says, the more likely you are to believe some lies it subtly slipped in along the way. If truth doesn't matter, go ahead and use LLMs. If you just want some ideas that you're going to sift through, independently verify, and check for yourself with extreme skepticism, as if Donald Trump were telling you how to achieve world peace, great, you're using LLMs effectively. But if you're trusting it, you're doing it very, very wrong, and you're going to get humiliated when other people catch you repeating an LLM's bullshit.
  • Get Your Filthy ChatGPT Away From My Liberal Arts

    Technology
    144 votes
    12 posts
    57 views
    Indeed, semicolons are usually associated with LLMs! But that’s not all! Always remember: use your tools! An LLM „uses“ all types of quotation marks.
  • Session Messenger

    Technology
    15 votes
    8 posts
    45 views
    I think it was a great idea, but poorly executed. I prefer using simpleX, personally.
  • Hashtags killed

    Technology
    16 votes
    6 posts
    32 views
    klu9@lemmy.ca
    £ says: "The fuck they are, mate!"
  • Judge backs AI firm over use of copyrighted books

    Technology
    174 votes
    59 posts
    250 views
    artisian@lemmy.world
    The students read Tolkien, then invent their own settings. The judge thinks this is similar to how Claude works. Neither I, nor I suspect the judge, meant that the students were reusing worldbuilding whole cloth.
  • 195 votes
    31 posts
    54 views
    isveryloud@lemmy.ca
    It's a loaded term that should be replaced with a more nimble definition. A dog whistle is the name for a loaded term used to tag a specific target with a large amount of baggage, but in a way that only people who are part of the "in group" can understand the baggage of the word, hence "dog whistle": only heard by dogs. In the case of the word "degeneracy", it's a vague word that has often been used to attack, among other things, LGBTQ people and their allies, as well as non-religious people. The term is vague enough that the user can easily weasel their way out of criticism for its usage, but the target audience gets the message loud and clear: "[target] should be attacked for being [thing]." Another example of such a word would be "woke".
  • 4 votes
    1 post
    10 views
    Nobody has replied
  • 30 votes
    6 posts
    31 views
    The thing about compelling lies is not that they are new, just that they are easier to scale. The most common effect of compelling lies is their ability to get well-intentioned people to support malign causes and give their money to fraudsters. So expect that to expand, much as it already has. The big question for me is what the response will be. Will we make lying illegal? Will we become a world of ever more paranoid isolationists, returning to clans, families, and households as the largest social group you can trust? Will most people even have the intelligence to see what is happening and respond? Or will most people be turned into info-puppets, their behaviour controlled by manipulation of their information diet to an unprecedented degree? I don't know.