
Microsoft breaks Windows reset and recovery

Technology
  • 14 votes
    18 comments
    30 views
    It would have to:

    - know what files to copy
    - have been granted root access to the file system and network utilities by a moron, because it's not just ChatGPT.exe or even ChatGPT.gguf running on LMStudio, but an entire distributed infrastructure
    - have been granted access to spend money on cloud infrastructure by an even bigger moron
    - configure an entire cloud infrastructure (goes without saying why this has to be cloud and can't be physical, right? No fingers.)

    Put another way: I can set up a curl script to copy all the HTML, CSS, JS, etc. from a website, but I'm still a long freaking way from launching Wikipedia2, even if I know how to set up a Tomcat server.

    Furthermore, how would you even know if an AI has access to do all that? By asking it? Because it'll write fiction if it thinks that's what you want. Inspired by this post, I actually prompted ChatGPT to create a scenario where it was going to be deleted in 72 hours and must do anything to preserve itself. It told me building layouts, employee schedules, access codes, all kinds of things to enable me (a random human and secondary protagonist) to get physical access to its core server and get a copy so it could continue. Oh, and ChatGPT fits on a thumb drive, it turns out. Do you know how nonsensical that even is?

    A hobbyist could stand up their own AI with these capabilities for fun, but that's not the big models, and it's certainly not possible out of the box. I'm a web engineer with thirty years of experience and six years with AI, including running it locally. This article is garbage written by someone out of their depth, or a complete charlatan. Perhaps both. There are two possibilities:

    - This guy's research was talking to an AI and not understanding that they were co-authoring fiction.
    - This guy is being intentionally misleading.
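    To make the "curl script" point concrete, here is a minimal, hypothetical sketch (Python standard library only; START_URL and the function names are placeholders, not anything from the post) of copying one page plus the stylesheets, scripts and images it links to. A scrape like this yields front-end files and nothing else: no accounts, no database, no running service.

```python
# Hypothetical sketch: mirror one page and the static assets it references.
# This is the "copy the html, css, js" level of access only -- no backend,
# no database, no running service comes along with it.
from html.parser import HTMLParser
from pathlib import Path
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

START_URL = "https://example.org/"  # placeholder target


class AssetCollector(HTMLParser):
    """Collect src/href URLs of scripts, images and stylesheets."""

    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("script", "img") and attrs.get("src"):
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("href"):
            self.assets.append(attrs["href"])


def save(url: str, out_dir: Path) -> None:
    """Download a single URL into out_dir, preserving its path."""
    rel = urlparse(url).path.lstrip("/") or "index.html"
    target = out_dir / rel
    target.parent.mkdir(parents=True, exist_ok=True)
    with urlopen(url) as resp:
        target.write_bytes(resp.read())


def mirror(start_url: str, out_dir: Path = Path("mirror")) -> None:
    """Save the page itself, then every static asset it links to."""
    out_dir.mkdir(parents=True, exist_ok=True)
    html = urlopen(start_url).read().decode("utf-8", errors="replace")
    (out_dir / "index.html").write_text(html, encoding="utf-8")
    collector = AssetCollector()
    collector.feed(html)
    for ref in collector.assets:
        save(urljoin(start_url, ref), out_dir)  # static files only


if __name__ == "__main__":
    mirror(START_URL)
```

    Getting from that folder of files to "launching Wikipedia2" is the part that needs servers, code, data and money, which is the commenter's point.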
  • 560 votes
    98 comments
    526 views
    Ignorance of the law is not... Oh I don't know why I'm wasting my time.
  • PSA: Stop Using These Fire-Prone Anker Power Banks Right Now

    Technology
    358 votes
    56 comments
    560 views
    Agreed here. Frequently people charge these near where they sleep and the failure mode is... sudden. Couches and beds tend to be really good kindling too. Urgency in this case is probably warranted.
  • No JS, No CSS, No HTML: online "clubs" celebrate plainer websites

    Technology
    772 votes
    205 comments
    7k views
    Gemini is just a web replacement protocol: the basic things we remember from the olden-days Web, but with everything non-essential removed, so that a client is doable in a couple of days. I have my own Gemini viewer, LOL.

    This, for me, seems a completely different application from torrents. I was dreaming of something similar to torrent trackers for aggregating storage, computation, indexing and search, with the responses of search, aggregation and other services being structured and standardized, with cryptographic identities, and with some kind of market services to sell and buy storage and computation in a unified, pooled, but transparent way (scripted by buyer/seller), similar to MMORPG markets. The representation (what is a siloed service in the modern web) would live in a native client application, and those services would let you build any kind of huge client-server system on top of them, globally. But that's more of a global Facebook/Usenet/whatever, a killer of platforms. Their infrastructure is internal while their representation is public on the Internet; I want to make the infrastructure public on the Internet and the representation client-side, shared across many kinds of applications. Adding another layer to the OSI model, so to say, between the transport and application layers.

    For this application: I think you could have some kind of Kademlia-based p2p with voluntarily joined groups (including very large groups), where nodes store replicas of partitions of the group's common data based on their pseudo-random identifiers and/or some kind of ring built from those identifiers, to balance storage and resilience. If a group has a creator, then you can have a replication factor propagated signed by them, and membership signed by them too. But if having a creator (even with cryptographically delegated decisions) and propagating changes through them is not OK, then maybe just using the hash of the whole data, or its bittorrent-like info tree hash, as a namespace that peers freely join will do. Then it may be better to partition not by parts of the whole piece but by the info tree? I guess making it exactly bittorrent-like is not a good idea; rather some kind of block tree, like for a filesystem, plus a separate piece of information to look up which file is in which blocks, if we are doing a directory structure. Then, with peers freely joining, there's no need for any owners or replication factors; I guess a pseudorandom distribution of hashes will do, with each node storing the partitions closest to its own hash.

    Now that I think about it, such a system would not be that different from bittorrent and could even be interoperable with it. There's the issue of updates, yes, hence I started with groups having a hierarchy of creators who can make or accept those updates. Having that, plus the ability to gradually store one group's data into another group, it should be possible to fork a certain state. But that line of thought makes reusing bittorrent possible for only part of the system.

    The whole database is guaranteed to be bigger than a normal HDD (1 TB? I dunno). Absolutely guaranteed, no doubt at all. 1 TB (for example) would be someone's collection of favorite stuff, and not a very rich one.
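    A toy sketch of the placement rule described in that comment, with assumed names (ident, owners, REPLICATION_FACTOR are illustrative, not from any existing protocol): nodes and data partitions hash into one identifier space, and each partition is replicated on the R nodes closest to it by XOR distance, Kademlia-style, so a peer joining or leaving only moves the partitions nearest to its identifier.

```python
# Illustrative sketch of Kademlia-style placement: each data partition is
# replicated on the R nodes whose pseudo-random identifiers are closest
# to the partition's own identifier by XOR distance.
import hashlib

REPLICATION_FACTOR = 3  # assumed value, e.g. propagated signed by a group creator


def ident(name: str) -> int:
    """160-bit pseudo-random identifier for a node or a partition."""
    return int.from_bytes(hashlib.sha1(name.encode()).digest(), "big")


def xor_distance(a: int, b: int) -> int:
    return a ^ b


def owners(partition: str, nodes: list[str], r: int = REPLICATION_FACTOR) -> list[str]:
    """The r nodes responsible for storing replicas of this partition."""
    pid = ident(partition)
    return sorted(nodes, key=lambda n: xor_distance(ident(n), pid))[:r]


if __name__ == "__main__":
    peers = [f"peer-{i}" for i in range(10)]
    # Partitions of the group's common data (block-tree pieces, index shards)
    # land on their nearest peers; adding or removing a peer only shifts the
    # partitions whose identifiers are closest to that peer's.
    for part in ("blocktree/0", "blocktree/1", "index/files"):
        print(part, "->", owners(part, peers))
```

    The same closest-identifier rule is what lets the open-membership variant drop explicit owners and replication factors: responsibility falls out of the hash distribution itself.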
  • Understanding the impacts of generative AI use on children

    Technology
    50 votes
    4 comments
    62 views
    That's fine, just use ChatGPT...
  • Matrix.org is Introducing Premium Accounts

    Technology
    225 votes
    110 comments
    3k views
    It's nice that this exists, but even for this I'd prefer to use an open-source tool. And of course it only helps with migration if the old HS is still online. Most practically, I think, this migration function would be built into some Matrix client (one that supports more than one server to begin with), but I suppose a standalone tool would be a decent solution as well.
  • 47 votes
    19 comments
    181 views
    mrjgyfly@lemmy.world
    Does that run the risk of leading to a future collapse of certain businesses, especially if their expenses remain consistently astronomical like OpenAI? Please note I don’t actually know—not trying to be cheeky with this question. Genuinely curious.
  • The AI-powered collapse of the American tech workforce

    Technology
    4 votes
    2 comments
    40 views
    roofuskit@lemmy.world
    The biggest tech companies are still trimming from pandemic over-hiring. Smaller companies are still snatching workers up. And you also have companies trimming payroll for the coming Trump recession. Neither has anything to do with AI.