
Google hit with $314m fine for collecting data from idle Android phones without permission

Technology
  • Most Common PIN Codes

    Technology
    181 votes
    50 posts
    2 views
    Came here for this comment. Did not disappoint!
  • AI Pressure from the Top: CEOs Urge Workers to Adapt

    Technology
    1 vote
    1 post
    5 views
    No one has replied
  • 90 votes
    5 posts
    2 views
    lupusblackfur@lemmy.world
    Zuck can't be too excited to be suddenly and harshly cut out of the Oval Office Data Pipeline...
  • life trip

    Technology
    0 votes
    1 post
    4 views
    No one has replied
  • 311 votes
    37 posts
    20 views
    Same, especially when searching technical or niche topics. Since there aren't a ton of results specific to the topic, mostly semi-related results appear in the first page or two of a regular (non-Gemini) Google search, simply because those pages are more popular than the relevant ones. Even the relevant pages have lots of non-relevant or semi-relevant information surrounding the answer I'm looking for.

    I don't know enough about it to be sure, but Gemini is probably just scraping a handful of websites from the first page, and since most of those are only semi-related, the resulting summary is a classic example of garbage in, garbage out. I also suspect there's something in the code that looks for information shared across multiple sources and prioritizes it over something that appears on only one particular page (possibly the sole result with the information you need). Then it phrases the summary as a direct answer to your query, misrepresenting the actual information on the pages it scraped. At least Gemini gives sources, I guess.

    The thing that gets on my nerves the most is how often I see people quote the summary as proof of something without checking the sources. It was bad before the rollout of Gemini, but at least back then Google was mostly scraping text and presenting it with little modification, along with a direct link to the webpage. Now it's an LLM generating text phrased as a direct answer to a question (one that was itself AI-generated from your search query), using AI-summarized data points scraped from multiple webpages. That obfuscates the source material further, but I also can't help feeling it exposes a little of the behind-the-scenes fuckery Google was doing for years before Gemini: how it bastardizes your query by interpreting it into a question, and then prioritizes homogeneous results that agree on the "answer" to your "question". They've been doing this to some extent for years; they just didn't share how they interpreted your query.
  • 92 votes
    5 posts
    6 views
    This is interesting to me, as I like to say that LLMs are basically another abstraction of search. Initially it was unweighted links that had to be gone through one by one; then various algorithms weighted the results; then the results started including a small blurb so you didn't have to follow every link; and now you're basically getting a report, which should come with references to its sources. I would like to see this kind of study look at how people engage with an LLM. My guess is that someone who treats the LLM as a helper and collaborates with it to create the product will remember more than someone who treats it as a servant, just instructs it to do the work, and takes the output as is.
  • What was Radiant AI, anyway?

    Technology
    20 votes
    6 posts
    14 views
    In fact Daggerfall was almost nothing but quests and other content like that.
  • WhatsApp provides no cryptographic management for group messages

    Technology
    17 votes
    3 posts
    14 views
    Just be sure to add only the people you want to be there. I've heard some people add others, and it gets a bit messy.