
'We're done with Teams': German state hits uninstall on Microsoft

Technology
102 votes, 64 posts, 435 views
  • 52 votes
    12 posts
    5 views
    I can already speak to my cat. It's not really an enlightening conversation; it's basically him demanding food, wanting to go outside, or wanting better food than the food provided. That's about the extent of his conversational skills. I mean, he's a cat; he's not exactly going to talk politics with me, is he, even if we could translate between our "languages"?
  • 172 votes
    8 posts
    4 views
    I wouldn't go quite that far. This is just breadcrumbs falling off the corporate table.
  • 139 votes
    28 posts
    162 views
    Lmao, it hasn't even been a year under Trump. Calm your titties.
  • 0 votes
    1 post
    12 views
    Nobody has replied
  • Remote MCP servers for VSCode

    Technology
    0 votes
    1 post
    13 views
    Nobody has replied
  • 311 votes
    37 posts
    152 views
    Same, especially when searching technical or niche topics. Since there aren't a ton of results specific to the topic, mostly semi-related results will appear in the first page or two of a regular (non-Gemini) Google search, just due to the higher popularity of those webpages compared to the relevant ones. Even the relevant webpages will have lots of non-relevant or semi-relevant information surrounding the answer I'm looking for.

    I don't know enough about it to be sure, but Gemini is probably just scraping a handful of websites on the first page, and since most of those are only semi-related, the resulting summary is a classic example of garbage in, garbage out. I also think there's probably something in the code that looks for information shared across multiple sources and prioritizes it over something that appears on only one particular page (possibly the sole result with the information you need). Then it phrases the summary as a direct answer to your query, misrepresenting the actual information on the pages it scraped. At least Gemini gives sources, I guess.

    The thing that gets on my nerves the most is how often I see people quote the summary as proof of something without checking the sources. It was bad before the rollout of Gemini, but at least back then Google was mostly scraping text and presenting it with little modification, along with a direct link to the webpage. Now it's an LLM generating text phrased as a direct answer to a question (that was also AI-generated from your search query) using AI-summarized data points scraped from multiple webpages.

    It's obfuscating the source material further, but I also can't help but feel like it exposes a little of the behind-the-scenes fuckery Google has been doing for years before Gemini: how it bastardizes your query by interpreting it into a question, and then prioritizes homogeneous results that agree on the "answer" to your "question". They've been doing this to a certain extent for years; they just didn't share how they interpreted your query.
  • 215 votes
    118 posts
    388 views
    Outlook has search?!
  • 1 vote
    8 posts
    40 views
    I think the principle could be applied to scan outside of the machine. It is making requests to 127.0.0.1:{port}, effectively using your computer as a "server" in a sort of reverse-SSRF attack. There's no reason it can't make requests to 10.10.10.1:{port} as well. Of course, you'd need to guess the netmask of the network address range first, but this isn't that hard.

    In fact, if you consider that, at least as far as the desktop site goes, most people will be browsing the web behind a standard consumer router left on defaults, where it will be the first device in the DHCP range (e.g. 192.168.0.1 or 10.10.10.1) and tends to have a web UI on the LAN interface (port 8080, 80 or 443), then you'd only realistically need to scan a few addresses to determine the network address range. If you want to keep noise even lower, I'd wager that just 192.168.0.1:80 and 192.168.1.1:80 would cover 99% of consumer routers. From there you could assume a /24 netmask and scan IPs to your heart's content. You could do top-10-most-common-ports type scans and go in-depth on anything you get a result on.

    I haven't tested this, but I don't see why it wouldn't work. When I was testing 13ft.io, a self-hosted 12ft.io paywall remover, an SSRF flaw like this absolutely let you perform any network request to any LAN address in range.
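    For illustration only, here is a rough TypeScript sketch of what that browser-side probing could look like. The hosts, ports, timeout, and timing heuristic are assumptions of mine, not details from the comment or from any published exploit.

    // Rough, untested sketch of the browser-side probing idea described above.
    // Host addresses, ports, and the timing threshold are illustrative guesses.

    // Probe host:port with an opaque (no-cors) fetch and watch how it settles.
    // A quick resolution (success or connection refused) suggests something is
    // there; only hitting the abort timeout suggests the address is silent.
    async function probe(host: string, port: number, timeoutMs = 1500): Promise<boolean> {
      const controller = new AbortController();
      const timer = setTimeout(() => controller.abort(), timeoutMs);
      const started = performance.now();
      try {
        await fetch(`http://${host}:${port}/`, {
          mode: "no-cors",          // response is opaque; we only observe timing/errors
          signal: controller.signal,
        });
        return true;                // something answered with an HTTP response
      } catch {
        // A fast network error (e.g. connection refused) settles well before the
        // abort fires, which hints the host exists even if the port is closed.
        return performance.now() - started < timeoutMs - 100;
      } finally {
        clearTimeout(timer);
      }
    }

    // Try the two gateway addresses the comment singles out, then assume a /24.
    async function findGatewaySubnet(): Promise<string | null> {
      for (const gateway of ["192.168.0.1", "192.168.1.1"]) {
        if (await probe(gateway, 80)) {
          return gateway.split(".").slice(0, 3).join(".");  // e.g. "192.168.0"
        }
      }
      return null;                  // fall back to guessing other private ranges
    }

    In practice, mixed-content rules and the newer private-network-access restrictions in browsers can block requests like these, which is part of why this stays a sketch of the idea rather than a claim that it works everywhere.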