
Half of companies planning to replace customer service with AI are reversing course

Technology
  • 255 votes
    42 posts
    0 views
    The feds also like to criminally prosecute people if they do transactions that can be construed to be avoiding the reporting limit.
  • 42 votes
    9 posts
    26 views
    I imagine not, though I haven't looked into it.
  • 0 votes
    1 post
    8 views
    No one has replied
  • How LLMs could be insider threats

    Technology
    105 votes
    12 posts
    41 views
    patatahooligan@lemmy.world
    Of course they're not "three laws safe". They're black boxes that spit out text. We don't have enough understanding and control over how they work to force them to comply with the three laws of robotics, and the LLMs themselves do not have the reasoning capability or the consistency to enforce them even if we prompt them to.
  • 311 votes
    37 posts
    20 views
    Same, especially when searching technical or niche topics. Since there aren't a ton of results specific to the topic, mostly semi-related results will appear in the first page or two of a regular (non-Gemini) Google search, just due to the higher popularity of those webpages compared to the relevant webpages. Even the relevant webpages will have lots of non-relevant or semi-relevant information surrounding the answer I'm looking for.

    I don't know enough about it to be sure, but Gemini is probably just scraping a handful of websites on the first page, and since most of those are only semi-related, the resulting summary is a classic example of garbage in, garbage out. I also think there's probably something in the code that looks for information shared across multiple sources and prioritizes that over something that's only on one particular page (possibly the sole result with the information you need). Then it phrases the summary as a direct answer to your query, misrepresenting the actual information on the pages it scraped. At least Gemini gives sources, I guess.

    The thing that gets on my nerves the most is how often I see people quote the summary as proof of something without checking the sources. It was bad before the rollout of Gemini, but at least back then Google was mostly scraping text and presenting it with little modification, along with a direct link to the webpage. Now it's an LLM generating text phrased as a direct answer to a question (that was also AI-generated from your search query), using AI-summarized data points scraped from multiple webpages.

    It's obfuscating the source material further, but I also can't help but feel like it exposes a little of the behind-the-scenes fuckery Google has been doing for years before Gemini: how it bastardizes your query by interpreting it into a question, and then prioritizes homogeneous results that agree on the "answer" to your "question". For years they've been doing this to a certain extent; they just didn't share how they interpreted your query.
  • 48 votes
    5 posts
    12 views
    Arguably we should be imposing a 25% DST on digital products to counter the 25% tariff on aluminium and steel, and then 10% on everything else. The US started it by imposing blanket tariffs in spite of our free trade agreement.
  • Large Language Models Are More Persuasive Than Humans

    Technology
    11 votes
    3 posts
    15 views
    aka psychopathy is a natural advantage for managers.
  • Bill Gates to give away 99% of his wealth in the next 20 years

    Technology
    150 votes
    21 posts
    63 views
    Hehehe. You know, it's hilarious that you say that. Nobody ever realizes they're talking to a starving homeless person on the internet when they meet one, do they? Believe it or not, quite a few of us do have jobs. Not all of us are disabled or addicted. That is the problem with the society we live in: we're invisible until we talk to you.