
eSafety boss wants YouTube included in the social media ban. But AI raises even more concerns for kids

Technology
  • 91 votes
    8 posts
    9 views
    People wonder how the Nazis were able to do the things they did. This is how. Cowards fell in line and obeyed. We're repeating history, and Trump and his allies have only just begun. We're not even seven months into this nightmare. The only thing necessary for the triumph of evil is for good men to do nothing.
  • 237 votes
    37 posts
    230 views
    AI has some uses, but it always needs human oversight, and the final decision must be made by a human professional. If you use AI to speed up tasks, you can tell whether its output is valid, and you keep the final decision, then you can use it safely. But if you let AI decide on and execute important tasks more or less autonomously, you have a recipe for disaster. Fully autonomous, mistake-free AI is a naive pipe dream that I don't see on the horizon at all.
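
    A minimal sketch of the human-in-the-loop pattern this comment describes, where the model only proposes and a person makes the final call. The function and field names below are illustrative placeholders, not any real API.

    # The model drafts a proposal; nothing runs without explicit human sign-off.
    from dataclasses import dataclass

    @dataclass
    class Proposal:
        description: str  # what the model claims this step does
        command: str      # the action it wants to take

    def propose_action(task: str) -> Proposal:
        # Placeholder for a model call that drafts a step for the given task.
        return Proposal(description=f"Draft step for: {task}",
                        command="echo 'do the thing'")

    def human_approves(proposal: Proposal) -> bool:
        # The person who can judge the output keeps the final decision.
        answer = input(f"Run '{proposal.command}'? ({proposal.description}) [y/N] ")
        return answer.strip().lower() == "y"

    if __name__ == "__main__":
        p = propose_action("summarise this week's error logs")
        if human_approves(p):
            print(f"Executing: {p.command}")  # only reached after approval
        else:
            print("Rejected; nothing was executed.")
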
  • On Demand App Development Company

    Technology
    0 votes
    1 post
    19 views
    No one has replied
  • Software is evolving backwards

    Technology
    341 votes
    64 posts
    631 views
    Came here looking for this
  • 310 votes
    37 posts
    360 views
    Same, especially when searching technical or niche topics. Since there aren't a ton of results specific to the topic, mostly semi-related results will appear in the first page or two of a regular (non-Gemini) Google search, simply because those webpages are more popular than the relevant ones. Even the relevant webpages bury the answer I'm looking for in lots of non-relevant or semi-relevant information.

    I don't know enough about it to be sure, but Gemini is probably just scraping a handful of websites on the first page, and since most of those are only semi-related, the resulting summary is a classic example of garbage in, garbage out. I also suspect there's something in the code that looks for information shared across multiple sources and prioritizes it over something that appears on only one particular page (possibly the sole result with the information you need). Then it phrases the summary as a direct answer to your query, misrepresenting the actual information on the pages it scraped. At least Gemini gives sources, I guess.

    The thing that gets on my nerves the most is how often I see people quote the summary as proof of something without checking the sources. It was bad before the rollout of Gemini, but at least back then Google was mostly scraping text and presenting it with little modification, along with a direct link to the webpage. Now it's an LLM generating text phrased as a direct answer to a question (one that was itself AI-generated from your search query), using AI-summarized data points scraped from multiple webpages.

    That obfuscates the source material further, but I also can't help feeling it exposes a little of the behind-the-scenes fuckery Google has been doing for years before Gemini: how it bastardizes your query by interpreting it as a question, then prioritizes homogeneous results that agree on the "answer" to your "question". They've been doing this to a certain extent for years; they just didn't share how they interpreted your query.
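
    As a rough illustration of the pipeline this commenter is guessing at (explicitly not how Google or Gemini actually work), a toy consensus-based summariser could look like the sketch below; the function name, claim lists, and everything else here are made up.

    # Keep only claims repeated across several first-page results and phrase the
    # survivors as a direct "answer"; anything found on a single page is dropped.
    from collections import Counter

    def summarise_first_page(results: list[list[str]], min_sources: int = 2) -> str:
        """results: one list of extracted claims per scraped page."""
        counts = Counter()
        for page_claims in results:
            counts.update(set(page_claims))  # count each claim once per page

        kept = [claim for claim, n in counts.most_common() if n >= min_sources]
        return "Answer: " + "; ".join(kept) if kept else "Answer: (no consensus found)"

    # The one relevant claim appears on a single page, so it is discarded:
    pages = [
        ["X is popular", "X released in 2020"],
        ["X is popular", "X has a plugin system"],
        ["X is popular", "the niche fix you actually wanted"],
    ]
    print(summarise_first_page(pages))  # prints: Answer: X is popular
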
  • 40K IoT cameras worldwide stream secrets to anyone with a browser.

    Technology
    118 votes
    18 posts
    162 views
    For the Emperor!
  • 88 votes
    21 posts
    302 views
    The self-hosted model has hard-coded censored content.
  • 272 votes
    131 posts
    1k views
    eyedust@lemmy.dbzer0.com
    This is good to know. I hadn't read the fine print, because I abandoned Telegram and never looked back. I hope it's true, and I agree; I also wouldn't think they'd do this and then renege, inviting a possible lawsuit.