
Your TV Is Spying On You

Technology
  • 0 Votes
    1 Post
    0 Views
    Nobody has replied
  • 309 Votes
    37 Posts
    0 Views
    S
    Same, especially when searching technical or niche topics. Since there aren't a ton of results specific to the topic, mostly semi-related results will appear in the first page or two of a regular (non-Gemini) Google search, just due to the higher popularity of those webpages compared to the relevant ones. Even the relevant webpages will have lots of non-relevant or semi-relevant information surrounding the answer I'm looking for.

    I don't know enough about it to be sure, but Gemini is probably just scraping a handful of websites on the first page, and since most of those are only semi-related, the resulting summary is a classic example of garbage in, garbage out. I also think there's probably something in the code that looks for information shared across multiple sources and prioritizes it over something that's only on one particular page (possibly the sole result with the information you need); see the sketch after this list. Then it phrases the summary as a direct answer to your query, misrepresenting the actual information on the pages it scraped. At least Gemini gives sources, I guess.

    The thing that gets on my nerves the most is how often I see people quote the summary as proof of something without checking the sources. It was bad before the rollout of Gemini, but at least back then Google was mostly scraping text and presenting it with little modification, along with a direct link to the webpage. Now it's an LLM generating text phrased as a direct answer to a question (that was also AI-generated from your search query) using AI-summarized data points scraped from multiple webpages.

    It obfuscates the source material further, but I also can't help but feel like it exposes a little of the behind-the-scenes fuckery Google has been doing for years before Gemini: how it bastardizes your query by interpreting it into a question, and then prioritizes homogeneous results that agree on the "answer" to your "question". They've been doing this to a certain extent for years; they just didn't share how they interpreted your query.
  • 782 Votes
    231 Posts
    4 Views
    D
    Haha I'm kidding, it's good that you share your solution here.
  • Dyson Has Killed Its Bizarre Zone Air-Purifying Headphones

    Technology
    228 Votes
    45 Posts
    5 Views
    rob_t_firefly@lemmy.world
    I have been chuckling like a dork at this particular patent since such things first became searchable online, and I have never found any evidence of it being manufactured or marketed at all. The "non-adhesive adherence" is illustrated in the diagrams on the patent, which you can see at the link. The inventor proposes "a facing of fluffy fibrous material" to provide the filtration and the adherence; basically this thing is the softer side of a Velcro strip, bent in half with the fluff facing outward so it sticks to the inside of your buttcrack to hold itself in place in front of your anus and filter your farts through it.
  • 100 Votes
    60 Posts
    5 Views
    jimmydoreisalefty@lemmy.world
    We all get emotional on certain topics; it is understandable. All is well, peace.
  • 180 Votes
    13 Posts
    3 Views
    D
    There is a huge difference between an algorithm that uses real-world data to produce a score a panel of experts then uses to make a determination, and using an LLM to screen candidates. One has verifiable, reproducible results that can be checked and debated; the other does not (see the second sketch after this list). The final call doesn't matter if a computer program using an unknown and unreproducible algorithm screens you out before it. This is what we are facing: pre-determined decisions that no human being is held accountable for.

    Is this happening right now? Yes it is, without a doubt. People are no longer making a lot of the healthcare decisions that determine insurance coverage; computers that are not accountable are. You may have some ability to disagree, but for how long? Soon there will be no way to reach a human about an insurance decision. This is already happening, and people should be very anxious. Hearing that UnitedHealthcare has been forging DNRs and denying things like stroke treatment for elders is disgusting. We have major issues that are not going away, and we are blatantly ignoring them.
  • 848 Votes
    133 Posts
    10 Views
    A
    Reminds me of the time when that thing with Amazon turned out to be Indian employees.
  • Cloudflare built an OAuth provider with Claude

    Technology
    34 Votes
    23 Posts
    13 Views
    A
    I have to say that you just made something up.
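
For illustration only: below is a minimal sketch of the cross-source "consensus" ranking speculated about in the Gemini comment above. Every function name, threshold, and data value is hypothetical; this is not how Google or Gemini actually works, just a toy showing how favoring information repeated across many pages can bury a single-source answer.

```python
def _tokens(text: str) -> set[str]:
    # Crude normalization: lowercase words longer than 3 chars, punctuation stripped.
    return {w.lower().strip(".,!?\"'") for w in text.split() if len(w) > 3}

def consensus_rank(snippets_by_source: dict[str, list[str]]) -> list[tuple[str, int]]:
    """Score each snippet by how many *other* sources contain a similar snippet.

    A claim that (approximately) appears on many pages outranks one that appears
    on a single page, even if that single page is the most relevant result.
    """
    all_snippets = [(src, s) for src, snips in snippets_by_source.items() for s in snips]
    scored = []
    for src, snippet in all_snippets:
        toks = _tokens(snippet)
        support = 0
        for other_src, other in all_snippets:
            if other_src == src:
                continue
            overlap = len(toks & _tokens(other))
            if toks and overlap / len(toks) > 0.5:  # arbitrary similarity cutoff
                support += 1
        scored.append((snippet, support))
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Toy data: one niche page has the actual answer, two pages repeat generic advice.
ranked = consensus_rank({
    "blog.example": ["The niche fix is to set FOO_MODE=legacy before building."],
    "forum.example": ["Reinstalling usually fixes this error."],
    "docs.example": ["Reinstalling the package usually fixes this error."],
})
print(ranked[0])  # the generic "reinstall" claim wins; the niche fix scores zero support
```

On the toy data the generic "reinstall it" claim outranks the one page with the actual fix, which is the garbage-in, garbage-out failure mode the comment describes.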
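Also for illustration, a tiny sketch of the distinction drawn in the insurance comment above: a deterministic scoring rule whose arithmetic can be audited versus an opaque model call. The rule, its weights, and the stubbed model are all made up for the example; no real insurer's system is being described.

```python
import random
from dataclasses import dataclass

@dataclass
class Claim:
    age: int
    prior_denials: int
    treatment_urgency: int  # e.g. 1 (routine) to 5 (emergency)

def rule_based_score(c: Claim) -> int:
    """Deterministic and auditable: the same claim always gets the same score,
    and every term can be printed out and debated by a panel of experts."""
    return 2 * c.treatment_urgency - c.prior_denials + (1 if c.age >= 65 else 0)

def llm_screen(c: Claim) -> bool:
    """Opaque: stands in for an external model call whose reasoning can't be
    inspected and whose output may not be reproducible from run to run."""
    # Modeled as a coin flip purely to make the non-reproducibility concrete;
    # a real system would send the claim to a model endpoint here.
    return random.choice(["approve", "deny"]) == "approve"

claim = Claim(age=71, prior_denials=0, treatment_urgency=5)
print(rule_based_score(claim))  # 11, every single time
print(llm_screen(claim))        # may differ on every run
```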