
Microsoft’s Recall feature is still a threat to privacy despite recent tweaks

Technology
  • Part of why I knew so-called "digital rights management" was fucking bullshit was that very little software ever came out that empowered me to manage MY OWN rights in the digital space.

    I need there to be FOSS applications that allow me to root-level BLOCK applications from perceiving what I'm doing, to just fucking SANDBOX ABSOLUTELY EVERYTHING BY DEFAULT and let me whitelist what specific things are allowed to directly access the hardware.

    Sadly I am not as tech savvy as I used to think I was. I might've been technologically clever twenty years ago, but I haven't managed to keep up... I think what I've described might be referred to as a "hypervisor"? And I'm told it's an overbearing, clumsy, heavy-handed overkill measure that would be difficult to implement and would make everything a pain in the ass to do. So... shit, man, I dunno... I'm just so damn tired of my hardware being bossed around by people I didn't authorize.

    Write your own OS and software then. Your hardware is running someone else’s software otherwise, so no you don’t get to control every aspect of what it does.

  • OK, so... where the hell is Recall?

    I have a Copilot+ device; I'm typing this on one, in fact. Recall is nowhere to be seen. They added a deployable Google Lens-style "highlight a thing for us to review" feature. It was so intrusive and easy to trigger by accident that I got a pretty good notification that I should go turn it off. Maybe that was part of the Recall rollout?

    Incidentally, this piece is... a bit weird. Not only is it an ad, but the concerns it flags as still existing (presumably to sell you their security subscription) seem to be that there is no biometric unlock, just the system PIN, and that they don't trust Microsoft on principle. The second is up to you, but the first doesn't really work for me. Not only is the PIN a valid fallback for biometrics across the board (Windows defaults to it when biometrics fail), but it's more secure on principle, since it can't be entered by accident or by force.

    I just don't think the feature is particularly useful for how much potential it has for accidental misuse (even if they never see the data and they keep it entirely secure). It's not the only one of its class, or even Microsoft's first attempt at this (a similar feature shipped with Windows 8). It's certainly become more of a meme than anything else at this point.

    Yeh the entire article is just an ad for AdGuard, using FUD to sell their product.

  • Well:

    1. MacOS is not malware
    2. Apple doesn't make a habit of blatantly lying about their security
    3. As you said, it doesn't actually exist
  • Ah, so Apple just happens to be one of the good massive megacorps routinely deploying anti-consumer practices. Gotcha.

    See, it's that gap in perception I'm interested in. Microsoft wants nothing more than having the closed ecosystem Apple has. From their Surface line to their much maligned store to their subscription-forward, always signed-in account environment.

    Why they suck so much at selling that where Apple can get away with murder is much more interesting to me than the perceived differences between the implementations, which I would argue are in a number of cases worked backwards from the brand perception anyway. Part of it is the implementation, and the execution rakes Apple chooses not to step on, but certainly not all of it, and that's fascinating.

  • M$ is trying to take an open system and forcibly close it, after driving their user base by force onto an unstable OS

    Apple were smart enough to start locking their shit down before home computers became an absolute necessity ...and do it with a functional OS

  • so Apple just happens to be one of the good massive megacorps

    No they're just a different type of shitty.

  • Programs run through Flatpak can only access the resources and directories they have explicit permission for. This is perfect for a very small program that only does one thing, but it can get rather awkward when you need access to multiple storage volumes. For example, I wanted my Steam games stored on different hard drives, but they were never visible through Steam. I had to override the Flatpak permissions to give it access to my mounted disks for it to work.

    The fact that we can choose to enhance the permissions beyond their default scope on a case by case basis is powerful.
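    For what it's worth, the kind of override described above can be done with the `flatpak override` command. A sketch, assuming Steam's Flatpak ID (`com.valvesoftware.Steam`) and a hypothetical mount point of `/mnt/games`:

    ```shell
    # Show the permissions Steam's sandbox currently grants
    flatpak info --show-permissions com.valvesoftware.Steam

    # Grant the app read/write access to a second drive
    # (/mnt/games is an example path; substitute your own mount point)
    flatpak override --user --filesystem=/mnt/games com.valvesoftware.Steam

    # List the active overrides, and reset them if something breaks
    flatpak override --user --show com.valvesoftware.Steam
    flatpak override --user --reset com.valvesoftware.Steam
    ```

    Per-user overrides live under ~/.local/share/flatpak/overrides, so they survive app updates and can be removed without touching the app's own manifest.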

  • Apple locked down their shit way after home computers were a necessity. I'd argue it was the rollout of handheld devices that needed a home computer to fully work that made their walled garden viable.

    And Windows is the main player in home computer OSs. You can take issue with their choices, but it's certainly functional. I'd argue Win11 is annoying, but not even in the top 3 least functional versions of Windows. I mean, I was there for Me, 8.0 and Vista.

    But yes, Apple successfully deployed a locked-down, closed space, and I'm curious about why people are ok with it. That they did it early is... a solid hypothesis, I suppose.

  • Right. But the reaction they get to their shittiness is very different, which is the thing I keep wondering about. Everybody keeps telling me why Microsoft is shitty and how Apple isn't shitty in those ways specifically while conceding they are in others.

    I want to know why Apple's shitty doesn't make them the poster boy for shittiness but MS's shitty does. And it does. As far back as Windows 95, Windows is the thing you use that you hate to use and love to hate. That takes work and luck. I want to know how you can dig that hole so effectively while your competition can be just as overtly crappy and still come across as sleek and all the way above good and evil. There's a fundamental truth about branding and squishy human brains buried in that phenomenon.

  • This post did not contain any content.

    Another great reason to switch to Linux. Fuck this shit

  • I want to know why Apple's shitty doesn't make them the poster boy for shittiness but MS's shitty does

    It doesn't. They're both shitty.

  • This is just a thinly veiled ad for AdGuard.

    If only we could have a response from an independent security researcher instead of a product, that would be great.

  • Nah, that shit started to creep in with the iMacs, when System 7 became Mac OS.

    Win 11 really isn't functional. There is a serious brain-drain problem at Microsoft, and as a consequence they've broken some seriously fundamental shit (see: the alt-tab debacle), made some seriously stupid staffing decisions (see: the guy responsible for the Win11 Start menu and how it's coded), and somehow even managed to break their own Print Spooler.

    Vista at least had the excuse that it was forced onto hardware packages that weren't powerful enough to handle it; Win 11 is just a steaming pile of garbage code.

  • So you’ve never wanted to find an article/headline that you vaguely remember seeing? Or a product that you looked at? Or a picture that you looked at?

    There absolutely is a use case for full reachability of everything you’ve done on your computer. Git commits and terminal history and “recent” files list don’t even come close to providing the same thing lol

    It's true that there's some usefulness in recollection, but geez, I find myself digging through my browser history and being absolutely lost... whether it's an article, video, online store product, anything. Then I usually just re-search for whatever it was from scratch 🤷‍♂️

  • Jack Dorsey’s New App Just Hit a Very Embarrassing Security Snag

    Briar is Android only. Bitchat is an iOS app (may have an Android port in the future though, I think).
  • Even with pirated Spotify, the worsening of recommendations pushed me to pirate another service. Which is a win for Spotify, I guess.
  • Uploading The Human Mind Could Become a Reality, Expert Says

    what mustard is best for the human body?
  • Why doesn't Nvidia have more competition?

    It's funny how the article asks the question but completely fails to answer it. About 15 years ago, Nvidia discovered there was demand for compute in datacenters that could be met with powerful GPUs, and they were quick to respond to it. They had the resources to focus on it strongly because of their huge success and high profitability in the GPU market.

    AMD also saw the market and wanted to pursue it, but just over a decade ago, when the high potential for profitability was becoming clear, AMD was near bankrupt and very hard pressed to finance development of GPU compute for datacenters. AMD really tried the best they could and was moderately successful from a technology perspective, but Nvidia already had a head start, and its proprietary development system CUDA was already an established standard that was very hard to penetrate.

    Intel simply fumbled the ball from start to finish. After a decade of trying to knock ARM off the mobile crown, investing billions (roughly the equivalent of ARM's total revenue), they never managed to catch up despite having the better production process at the time. That was Intel's main focus, and Intel believed the GPU would never be more than a niche product. So when Intel tried to compete on datacenter compute, they tried to do it with x86 chips; one of their boldest efforts was a monstrosity of a cluster of Celeron chips, which of course performed laughably badly compared to Nvidia. Because as it turns out, the way forward, at least for now, is indeed the massively parallel compute capability of a GPU, which Nvidia has refined for decades, with only (inferior) competition from AMD. But despite the lack of competition, Nvidia did not slow down; with increased profits, they only grew bolder in their efforts, making it even harder to catch up.

    Now AMD has had more money to compete for a while, and they do have some decent compute units, but Nvidia remains ahead and the CUDA problem is still there, so for AMD to really compete with Nvidia, they have to be better to attract customers. That's a very tall order against an Nvidia that simply never seems to stop progressing. So the only other option for AMD is to sell a bit cheaper, which I suppose they have to.

    AMD and Intel were the obvious competitors; everybody else is coming from even further behind. But if I had to make a bet, it would be on Huawei. Huawei has some crazy good developers, and Trump is basically forcing them to figure it out themselves, because he is blocking Huawei and China in general from using both AMD and Nvidia AI chips. And the chips will probably be made by Chinese SMIC, because they are also prevented from using advanced production in the West, most notably TSMC. China will prevail, because it has become a national project of both prestige and necessity, and they have a massive talent pool and resources, so nothing can stop it now. IMO the USA would clearly have been better off allowing China to use American chips; now China will soon compete directly on both production and design too.