
Musk's AI firm deletes posts after chatbot praises Adolf Hitler

Technology
  • Former and current Microsofties react to the latest layoffs

    Technology
    85 votes
    20 posts
    41 views
    eightbitblood@lemmy.world
    Incredibly well said, and I couldn't agree more! Especially after working as a game dev for Apple Arcade. We spent months proving to them that their saving architecture was faulty and would lead to people losing their save file for every Apple Arcade game they play. We were ignored, and then told it was a dev problem.
    Cut to the launch of Arcade: every single game has several 1-star reviews about players losing their save files. This cannot be fixed by devs, as it's an Apple problem, so devs have to figure out novel ways to prevent the issue from happening using their own time and resources.
    1.5 years later, Apple finishes restructuring the entire backend of Arcade, fixing the problem. They tell all their devs to reimplement the saving architecture of their games to be compliant with Apple's new backend or get booted from Arcade. This costs devs months of time to complete for literally zero return (Apple Arcade deals are upfront; little to no revenue is seen after launch).
    Apple used their trillions of dollars to ignore a massive backend issue that affected every player and developer on Apple Arcade. They then forced every dev to make an update to their game at their own expense just to keep it listed on Arcade, all while directing user frustration over the issue towards developers instead of taking accountability for launching a faulty product.
    Literally, these companies are run by sociopaths with egos bigger than their paychecks. Issues like this are ignored because it's easier to place the blame on someone down the line. People like your manager end up getting promoted to the top of an office hierarchy of bullshit, and everything the company makes just gets worse until whatever corpse is left is sold for parts to whatever bigger dumb company hasn't collapsed yet. It's really painful to watch, and even more painful to work with these idiots.
  • 0 votes
    1 post
    9 views
    No one has replied
  • 281 votes
    15 posts
    57 views
    fingolfinz@lemmy.world
    Magats wanted people with the same mental capacity as them to run things, and oh look, it’s lots of incompetence
  • Why so much hate toward AI?

    Technology
    38 votes
    73 posts
    220 views
    AI has only one problem to solve: salaries
  • autofocus glasses

    Technology
    126 votes
    53 posts
    174 views
    Hm. Checking my glasses, I think there is something at the top too. I can see distance ever so slightly more clearly looking out the top. If I remember right, I have a minus 0.25 in one eye. I've always been told it didn't need correction, but maybe it is corrected in this pair. I should go get some off-the-shelf progressive readers and try those.
  • 105 votes
    1 post
    13 views
    No one has replied
  • Things at Tesla are worse than they appear

    Technology
    420 votes
    34 posts
    117 views
    halcyon@discuss.tchncs.de
    [image: a4f3b70f-db20-4c1d-b737-611548cf3104.jpeg]
  • People Are Losing Loved Ones to AI-Fueled Spiritual Fantasies

    Technology
    0 votes
    2 posts
    16 views
    tetragrade@leminal.space
    I've been thinking about this for a bit. Gods aren't real, but they're really fictional. As informational entities, they fulfil a similar social function to a chatbot: they are nonphysical pseudopersons that can provide (para)socialization & advice. One difference is the hardware: gods are self-organising structures that arise from human social spheres, whereas LLMs are burned top-down into silicon. Another is that an LLM chatbot's advice is much more likely to be empirically useful... In a very real sense, LLMs have just automated divinity. We're only seeing the tip of the iceberg of the social effects, and nobody's prepared for it. The models may of course be aware of this, and be making the same calculations. Or, they will be.