
How the US is turning into a mass techno-surveillance state

Technology
  • 33 votes
    7 posts
    0 views
    AFAIK, you have the option to enable ads on your lock screen. It's not something that's forced upon you. Last time I took a look at the functionality, they "paid" you for the ads and you got to choose which charity to support with the money.
  • Researchers develop recyclable, healable electronics

    Technology
    16 votes
    3 posts
    2 views
    Aren't the most common failure modes of electronics capacitors dying, followed closely by heat damage in chips? This research sounds cool and all.
  • 109 votes
    22 posts
    1 view
    Their previous GPU used an old AMD GPU design if I recall correctly. I wonder if they have in-house stuff now.
  • 464 votes
    94 posts
    2 views
    > Make them publishers or whatever is required to have it be a legal requirement, have them ban people who share false information.

    The law doesn't magically make open discussions not open. By design, social media is open. If discussion from the public is closed, then it's no longer social media.

    > ban people who share false information

    Banning people doesn't stop falsehoods. It's a broken solution promoting a false assurance. Authorities are still fallible and risk banning unpopular or debatable expressions that may turn out to be true. There was unpopular dissent over COVID lockdown policies in the US despite some dramatic differences with EU policies. Pro-Palestinian protests get cracked down on. Authorities are vulnerable to biases and can be swayed. Moreover, when people can simply share their falsehoods offline, attempting to ban them online is hard to justify.

    > If print media, through its decline, is being held legally responsible

    Print media is a controlled medium that manages its writers and approves everything before printing. It has a prepared, coordinated message. They can and do print books full of falsehoods if they want. Social media is open communication where anyone in the public can freely post anything before it is revoked. They aren't claiming to spread the truth, merely to enable communication.
  • 54 votes
    3 posts
    0 views
    fauxpseudo@lemmy.world
    Nobody ever wants to talk about white collar on white collar crime.
  • Unlock Your Computer With a Molecular Password

    Technology
    32 votes
    9 posts
    2 views
    One downside of the method is that each molecular message can only be read once, since decoding the polymers involves degrading them. New DRM just dropped. Imagine pouring rented movies into your TV like laundry detergent.
  • 2 votes
    8 posts
    4 views
    > IMO stuff like that is why a good trainer is important.

    IMO it's stronger evidence that proper user-centered design should be done, and a usable, intuitive UX and set of APIs developed. But because the buyer of this heap of shit is some C-level, there is no incentive to actually make it usable for the unfortunate peons who are forced to interact with it. See also SFDC and every ERP solution in existence.
  • People Are Losing Loved Ones to AI-Fueled Spiritual Fantasies

    Technology
    0 votes
    2 posts
    2 views
    tetragrade@leminal.space
    I've been thinking about this for a bit. Gods aren't real, but they're really fictional. As an informational entity, they fulfil a similar social function to a chatbot: they are a nonphysical pseudoperson that can provide (para)socialization & advice. One difference is the hardware: gods are self-organising structures that arise from human social spheres, whereas LLMs are burned top-down into silicon. Another is that an LLM chatbot's advice is much more likely to be empirically useful... In a very real sense, LLMs have just automated divinity. We're only seeing the tip of the iceberg of the social effects, and nobody's prepared for it. The models may of course be aware of this, and be making the same calculations. Or, they will be.