
AI agents wrong ~70% of time: Carnegie Mellon study

Technology
269 votes · 106 posts · 61 views
  • A Forensic Examination of GIS Arta

    Technology
    6 votes
    1 post
    4 views
    Nobody has replied
  • Twitter opens up to Community Notes written by AI bots

    Technology
    44 votes
    9 posts
    42 views
    Stop fucking using twitter. Stop posting about it, stop posting things that link to it. Delete your account like you should have already.
  • The Decline of Usability: Revisited | datagubbe.se

    Technology
    0 votes
    2 posts
    12 views
    2xsaiko@discuss.tchncs.de
    Just saw this article linked in a ThePrimeagen video. I didn't watch the video, but I did read the article, and it's exactly what I'm always saying when I complain about current UI trends, and why I'm so picky about the software I use and the tools I use to write software. I shouldn't have to be picky, but it seems like developers (professional and hobbyist alike) don't care anymore and users don't have standards.
  • 215 votes
    118 posts
    269 views
    Outlook has search?!
  • GeForce GTX 970 8GB mod is back for a full review

    Technology
    34 votes
    1 post
    11 views
    Nobody has replied
  • 131 votes
    67 posts
    217 views
    Arcing causes more fires now, because overcurrent caused most fires until we tightened standards and introduced dual-mode circuit breakers. Today fires are caused by loose connections arcing, and by damaged wires arcing to flammable material. Breakers are designed around sustained current, but arcing is dangerous because it tends to cascade: light arcing damages contacts, which leads to more arcing, in a cycle. The real danger of arcing is that it can happen out of view and start fires that aren't caught until everything burns down.
  • Backblaze Drive Stats for Q1 2025

    Technology
    49 votes
    1 post
    8 views
    Nobody has replied
  • People Are Losing Loved Ones to AI-Fueled Spiritual Fantasies

    Technology
    0 votes
    2 posts
    15 views
    tetragrade@leminal.space
    I've been thinking about this for a bit. Gods aren't real, but they're really fictional. As informational entities, they fulfil a similar social function to a chatbot: they are nonphysical pseudopersons that can provide (para)socialization & advice. One difference is the hardware: gods are self-organising structures that arise from human social spheres, whereas LLMs are burned top-down into silicon. Another is that an LLM chatbot's advice is much more likely to be empirically useful... In a very real sense, LLMs have just automated divinity. We're only seeing the tip of the iceberg on the social effects, and nobody's prepared for it. The models may of course be aware of this, and be making the same calculations. Or, they will be.