
Companies That Tried to Save Money With AI Are Now Spending a Fortune Hiring People to Fix Its Mistakes

Technology
  • 285 votes
    28 comments
    0 views
    S
    Hm, I guess an encyclopedia article is more relevant than a dictionary definition, so sure. I was using the looser secondary definition... in this case an elision that references a dialect in order to call up regional relevance to the opinion expressed.
  • The Really Dark Truth About Bots

    Technology
    84 votes
    5 comments
    24 views
    plutoniumacid@lemmy.world
    "Engineers" a.k.a. uneducated cubicle slaves
  • Big Brother Trump Is Watching You

    Technology
    1 vote
    1 comment
    8 views
    Nobody has replied
  • 241 votes
    77 comments
    277 views
    jacksonlamb@lemmy.world
    "bizarre, dismal" What's bizarre and dismal is that someone is so starved for dopamine and attention from corporations that this is how they perceive what life looks like when you are not being targeted. This is my normal view, and it is far better.
  • New Orleans debates real-time facial recognition legislation

    Technology
    150 votes
    12 comments
    53 views
    A
    Palantir had a contract with New Orleans starting around 2012 to create its predictive-policing tech, which scans surveillance cameras for very vague details and still misidentifies people. It's very similar to Lavender, the tech they use to identify members of Hamas and attack them with drones; that system misidentifies targets roughly 10% of the time according to the IDF, and the real rate is likely much higher. Palantir picked Louisiana over somewhere like San Francisco because they knew it would be a lot easier to violate rights and privacy here and get away with it. Whatever New Orleans decides on Thursday, during a Council meeting that nobody cares about, will likely be the first on-the-books legal basis of its kind for tracking civilians in the U.S., and it would let the federal government take control of that capability whenever it wants. It could also set a precedent for other states. Guess who's running the entire country right now and just gave high-ranking Army contracts to Palantir employees for "no reason", while Palantir also receives a multimillion-dollar federal contract to build an insane database on every American and giant data centers go up all across the country.
  • 311 votes
    37 comments
    34 views
    S
    Same, especially when searching technical or niche topics. Since there aren't a ton of results specific to the topic, mostly semi-related results will appear in the first page or two of a regular (non-Gemini) Google search, just due to the higher popularity of those webpages compared to the relevant ones. Even the relevant webpages will have lots of non-relevant or semi-relevant information surrounding the answer I'm looking for. I don't know enough about it to be sure, but Gemini is probably just scraping a handful of websites on the first page, and since most of those are only semi-related, the resulting summary is a classic example of garbage in, garbage out. I also suspect there's something in the code that looks for information shared across multiple sources and prioritizes it over something that appears on only one page (possibly the sole result with the information you need); a toy sketch of this consensus idea follows this list. Then it phrases the summary as a direct answer to your query, misrepresenting the actual information on the pages it scraped. At least Gemini gives sources, I guess. The thing that gets on my nerves the most is how often I see people quote the summary as proof of something without checking the sources. It was bad before the rollout of Gemini, but at least back then Google was mostly scraping text and presenting it with little modification, along with a direct link to the webpage. Now it's an LLM generating text phrased as a direct answer to a question (itself AI-generated from your search query), using AI-summarized data points scraped from multiple webpages. It obfuscates the source material further, but I also can't help feeling it exposes a little of the behind-the-scenes fuckery Google has been doing for years before Gemini: how it bastardizes your query by interpreting it as a question, then prioritizes homogeneous results that agree on the "answer" to your "question". They've been doing this to some extent for years; they just didn't share how they interpreted your query.
  • 13 votes
    22 comments
    88 views
    T
    You might enjoy this blog post someone linked in another thread earlier today: https://www.wheresyoured.at/the-era-of-the-business-idiot/
  • 121 votes
    58 comments
    71 views
    D
    I bet every company has at least one employee with right-wing political views. Choosing a product based on some random quotes by employees is stupid.
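
The Gemini comment above speculates that the summary engine scrapes a handful of first-page results and privileges whatever claim recurs across them. Below is a minimal, purely hypothetical sketch of that cross-source consensus heuristic, written only to illustrate the failure mode the commenter describes; the snippets, the extract_claim helper, and the scoring are all invented and say nothing about how Google actually builds Gemini summaries.

from collections import Counter

# Hypothetical snippets "scraped" from the first page of results for a query.
# Per the comment above, most are semi-related pages that repeat each other,
# while the one genuinely relevant page disagrees.
snippets = {
    "site-a.example": "The default timeout is 30 seconds.",
    "site-b.example": "Most guides say the default timeout is 30 seconds.",
    "site-c.example": "The default timeout is 30 seconds unless overridden.",
    "site-d.example": "In version 2.x the default timeout is 60 seconds.",  # the relevant page
}

def extract_claim(snippet: str) -> str:
    """Toy 'claim extraction': reduce a snippet to the first number it mentions."""
    numbers = [token for token in snippet.split() if token.isdigit()]
    return numbers[0] if numbers else "unknown"

def consensus_answer(pages: dict) -> tuple:
    """Keep the claim repeated by the most sources and cite those sources."""
    claims = {site: extract_claim(text) for site, text in pages.items()}
    winner, _ = Counter(claims.values()).most_common(1)[0]
    cited = [site for site, claim in claims.items() if claim == winner]
    return winner, cited

answer, sources = consensus_answer(snippets)
# The majority claim wins even when the lone dissenting page is the accurate one,
# which is the garbage-in, garbage-out behaviour the commenter complains about.
print(f"Direct answer: {answer} seconds (sources: {', '.join(sources)})")

Running the sketch prints "30 seconds" and cites the three look-alike pages, discarding the dissenting and possibly correct result, which is exactly the obfuscation of source material the comment describes.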