
Minnesota Shooting Suspect Allegedly Used Data Broker Sites to Find Targets’ Addresses

Technology
  • Iran asks its people to delete WhatsApp

    178 votes · 24 posts · 0 views
    Yeah, that happens sometimes.
  • 45 votes · 9 posts · 4 views
    This will be a privacy nightmare.
  • 817 votes · 41 posts · 2 views
    And then price us out
  • 85K – The Best Option for Those Seeking Fun and Rewards

    0 votes · 1 post · 2 views
    No one has replied.
  • 219 votes · 119 posts · 10 views
    Okay, I'd be interested to hear what you think is wrong with this, because I'm pretty sure it's more or less correct. Some sources to help you understand these concepts a bit better:
    • What DLSS is and how it works, as a starter: https://en.wikipedia.org/wiki/Deep_Learning_Super_Sampling
    • Issues with modern "optimization", including DLSS: https://www.youtube.com/watch?v=lJu_DgCHfx4
    • TAA comparisons (yes, biased, but accurate): https://old.reddit.com/r/FuckTAA/comments/1e7ozv0/rfucktaa_resource/
  • 479 votes · 81 posts · 4 views
    douglasg14b@lemmy.world
    Did I say that it did? No? Then why the rhetorical question for something that I never stated?

    Now that we're past that, I'm not sure if I think it's okay, but I at least recognize that it's normalized within society, and has been for 70+ years now. The problem happens with how the data is used, and particularly abused.

    If you walk into my store, you expect that I am monitoring you. You expect that you are on camera and that your shopping patterns, like all foot traffic, are probably being analyzed and aggregated. What you buy is tracked, at least in aggregate, by default; that's just volume tracking and prediction. Suffice it to say that broad customer behavior analysis has been a thing for a couple of generations now, at least. When you go to a website, why would you think it is not keeping track of where you go and what you click on in the same manner?

    That said, the real problems we experience come in with how this data is misused outside of what its scope should be. We should have strong regulatory agencies forcing compliance in how this data is used and enforcing the right to privacy for people who want it removed.
  • 56 votes · 4 posts · 4 views
    cupcakezealot@lemmy.blahaj.zone
    !upliftingnews@lemmy.world
  • People Are Losing Loved Ones to AI-Fueled Spiritual Fantasies

    0 votes · 2 posts · 4 views
    tetragrade@leminal.space
    I've been thinking about this for a bit. Gods aren't real, but they're really fictional. As an informational entity, they fulfil a similar social function to a chatbot: they are a nonphysical pseudoperson that can provide (para)socialization & advice. One difference is the hardware: gods are self-organising structures that arise from human social spheres, whereas LLMs are burned top-down into silicon. Another is that an LLM chatbot's advice is much more likely to be empirically useful...

    In a very real sense, LLMs have just automated divinity. We're only seeing the tip of the iceberg of the social effects, and nobody's prepared for it. The models may of course be aware of this, and be making the same calculations. Or, they will be.