
BREAKING: X CEO Linda Yaccarino Steps Down One Day After Elon Musk’s Grok AI Bot Went Full Hitler

Technology
  • 169 votes
    5 posts
    9 views
    K
    But but we need to power our virtual idiot with more energy than entire countries use :((
  • The BBC is launching a paywall in the US

    Technology
    283 votes
    67 posts
    257 views
    C
    Yeah, back in the day we made sure that no matter who you were and what was going on, you had the opportunity to hear our take on it. Mind you, I suppose that still happens thanks to us being a very loud and online people, but having an "America says x" channel in a time when people liked us sure was a good idea.
  • 0 votes
    1 post
    11 views
    No one has replied
  • Hacker Tactic: ESD Diodes

    Technology
    24 votes
    1 post
    11 views
    No one has replied
  • 311 votes
    37 posts
    122 views
    S
    Same, especially when searching technical or niche topics. Since there aren't a ton of results specific to the topic, mostly semi-related results will appear in the first page or two of a regular (non-Gemini) Google search, just due to the higher popularity of those webpages compared to the relevant ones. Even the relevant webpages will have lots of non-relevant or semi-relevant information surrounding the answer I'm looking for. I don't know enough about it to be sure, but Gemini is probably just scraping a handful of websites on the first page, and since most of those are only semi-related, the resulting summary is a classic example of garbage in, garbage out.

    I also think there's probably something in the code that looks for information shared across multiple sources and prioritizes that over something that appears on only one particular page (possibly the sole result with the information you need). Then it phrases the summary as a direct answer to your query, misrepresenting the actual information on the pages it scraped. At least Gemini gives sources, I guess.

    The thing that gets on my nerves the most is how often I see people quote the summary as proof of something without checking the sources. It was bad before the rollout of Gemini, but at least back then Google was mostly scraping text and presenting it with little modification, along with a direct link to the webpage. Now it's an LLM generating text phrased as a direct answer to a question (that was also AI-generated from your search query), using AI-summarized data points scraped from multiple webpages.

    It's obfuscating the source material further, but I also can't help but feel like it exposes a little of the behind-the-scenes fuckery Google has been doing for years before Gemini: how it bastardizes your query by interpreting it into a question, and then prioritizes homogeneous results that agree on the "answer" to your "question". They've been doing this to a certain extent for years; they just didn't share how they interpreted your query.
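The cross-source prioritization the comment above is guessing at can be sketched in a few lines. This is purely a toy illustration of that hypothesis, not Google's actual pipeline; the snippet extraction, scoring, and example data are all made up for the sake of the example.

```python
# Toy sketch of the "consensus beats relevance" hypothesis from the
# comment above -- NOT Google's actual pipeline. It scores candidate
# snippets by how many distinct sources repeat them, so a fact found
# on a single (possibly the only correct) page loses to a claim that
# several semi-related pages happen to share.
from collections import defaultdict

def rank_by_consensus(pages: dict[str, list[str]]) -> list[tuple[str, int]]:
    """pages maps a source URL to the snippets extracted from it."""
    sources_per_snippet: defaultdict[str, set[str]] = defaultdict(set)
    for url, snippets in pages.items():
        for snippet in snippets:
            # Crude normalization stands in for real deduplication.
            sources_per_snippet[snippet.lower().strip()].add(url)
    return sorted(
        ((s, len(urls)) for s, urls in sources_per_snippet.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )

pages = {
    "https://example.com/a": ["The API limit is 100 requests/min."],
    "https://example.com/b": ["The API limit is 100 requests/min."],
    "https://example.com/c": ["The limit was raised to 500 requests/min."],
}
# The outdated-but-popular claim wins, 2 sources to 1.
print(rank_by_consensus(pages)[0])
```

Under this (hypothetical) scoring, the lone page carrying the correct, updated answer is exactly the one that gets buried, which is the garbage-in-garbage-out failure mode the commenter describes.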
  • 692 votes
    140 posts
    335 views
    H
    Maybe I don't want you to stop, big boy.
  • 559 votes
    99 posts
    336 views
    N
    In this year of 2025? No. But it still is basically setting oneself up for failure from the perspective of Graphene, IMO. Like, the strongest protection in the world (assuming Graphene even is that, which is quite a tall claim) is useless if it only works on the mornings of a Tuesday that falls on a prime-number day with a blue moon and no ATP tennis matches going on. Everyone else is, like, living in the real world, and the uniqueness of your scenario is going to go down the drain once your users get presented with a $5 wrench, or even cheaper: a waterboard. Because cops, let alone ICE, are not going to stop to ask whether they can make you more comfortable while your privacy is being violated.
  • 119 votes
    10 posts
    48 views
    S
    Active ISA would be a disaster. My fairly modern car is unable to reliably detect posted or implied speed limits. Sometimes it overshoots by more than double, and sometimes it mandates a limit more than 3/4 below the posted one. The problem is that the way it is, and will have to be, done is by means of optical detection. GPS speed measurement can also be surprisingly unreliable, especially in underground settings like long underpasses and tunnels. If the system were based on something reliable, like local wireless communication from the speed limit postings themselves, it would be a different issue - though that would also come with a significant risk of abuse. Also, the passive ISA was the first thing I disabled. And I do abide by posted speed limits.
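A minimal sketch of why the commenter distrusts purely optical/GPS-based active ISA: before a system is allowed to enforce a limit, independent sources should corroborate each other, with a safe fallback when they disagree. Everything here (type names, thresholds, the fusion rule) is hypothetical for illustration, not any real vehicle API.

```python
# Hypothetical sketch of the cross-checking an active ISA system would
# need -- illustrating the comment above, not any real vehicle stack.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LimitReading:
    kmh: float
    confidence: float  # 0.0 .. 1.0, the source's self-reported confidence

def effective_limit(camera: Optional[LimitReading],
                    map_data: Optional[LimitReading]) -> Optional[float]:
    """Return an enforceable limit only when the sources corroborate.

    A sign misread as 120 in a 50 zone (the "overshoots by more than
    double" case) is rejected because the map disagrees; in a tunnel
    with no GPS fix, a lone camera reading is not acted upon either.
    """
    readings = [r for r in (camera, map_data) if r and r.confidence >= 0.5]
    if len(readings) < 2:
        return None  # single unreliable source: fall back to the driver
    lo, hi = sorted(r.kmh for r in readings)
    if hi > lo * 1.25:  # sources disagree too much to act on
        return None
    return lo  # conservative: enforce the lower of the agreeing limits

# Camera misreads 120 where the map says 50: no enforcement.
print(effective_limit(LimitReading(120, 0.9), LimitReading(50, 0.8)))  # None
# Both agree near 80: enforce 80.
print(effective_limit(LimitReading(80, 0.9), LimitReading(82, 0.8)))   # 80.0
```

Note that a design like this degrades to "do nothing" whenever the inputs are unreliable, which is exactly why optical-only detection makes mandatory active ISA so fragile in practice.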