You can still enable uBlock Origin in Chrome, here is how

Technology
127 votes
95 posts
11 views
  • 0 votes
    1 post
    0 views
    Nobody has replied
  • We’re Not Innovating, We’re Just Forgetting Slower

    Technology
    258 votes
    37 posts
    0 views
> The author’s take is detached from reality, filled with hypocrisy and gatekeeping.

"Opinionated" would be another way to put it, as opposed to friendly and neutral. And complaining about reality means a degree of intentional detachment from it.

> When was the last time, Mr author, you had to replace a failed DIMM in your modern computer?

When was the last time, Mr commenter, you had to make your own furniture because it was harder to find a piece of the right dimensions to buy? When that was more common, it was also easier to get the materials and the tools, because ordering things over the Internet and getting them delivered the next day was less common. In terms of managing my home, I feel the 00s were nicer than now. If today's centralized "silk road" lost TSMC (to a nuke, suppose, or a political change), would you prefer less efficient yet more distributed production of electronics? That would leave less room for the various things hidden from users that happen in modern RAM. Possibly much less.

> If there was no technological or production cost improvement, we’d just use the old version.

I think their point was that there's no architectural innovation in some things. Yes, there is a regular shift in computing philosophy, but it is driven by new technologies and usually by computing performance descending to be accessible at commodity pricing. The Raspberry Pi wasn’t a revolutionarily fast computer, but it changed the world because it was enough computing power and it was dirt cheap. Maybe those shifts are in market philosophies in tech.

> I agree, there is something appealing about it to you and me, but most people don’t care… and that's okay! To them it's a tool to get something done. They are not in love with the tool, nor do they need to be. There's a screwdriver.

I can imagine there's a fitting baseline amount of attention a piece of knowledge deserves. I can imagine that a person not knowing how to use a screwdriver (substitute something better) falls below that baseline, and some people are far above it, maybe. I think the majority of humans are below the level of knowledge that computers in our reality require. That's not the level you or the author possess; it's about the level I possessed in my childhood, nothing impressive.

> Mr. author, no one is stopping you from using your TI-99 today, but in fact you didn’t use it to write your article either. Why is that? Because the TI-99 is a tiny fraction of the function and complexity of a modern computer. Creating something close to a modern computer from discrete components with “part numbers you can look up” would be massively expensive, incredibly slow, and comparatively consume massive amounts of electricity vs today’s modern computers.

It would seem, then, that we are getting a better deal for the same amount of energy spent with modern computers. Does this seem right to you? It's philosophy and not logic, but I think you know that to get something you pay something; there's no energy out of nowhere. Discrete components may not make sense. But maybe the insane efficiency we have is paid for with our future: it's made possible by centralization of economy, society, and geopolitics, which wasn't needed to make the TI-99.

> Do you think a surgeon understands how the CCD electronic camera attached to their laparoscope works? Is the surgeon uneducated because they aren’t fluent in the circuit theory that allows the camera to display the guts of the patient they’re operating on?

A surgeon has another specialist nearby, and that specialist doesn't just know these things, but also a lot of other knowledge necessary for them and the surgeon to communicate unambiguously, avoiding fatal mistakes. A bit more expense is incurred here than just handing a device to a surgeon who doesn't understand how it works. A fair bit.

> Such gatekeeping! So unless you know the actual engineering principles behind a device you’re using, you shouldn’t be allowed to use it?

Why not put it this way: such respect! In truth, why wouldn't we trust students to make good use of an understanding of their tools and the universe around them, since every human's corpus of knowledge is unique and wonderful, rather than intentionally limiting them?

> Innovation isn’t just creating new features or functionality. In fact, I’d argue most of it is taking existing features or functions and delivering them for substantially less cost/effort.

Is a change of policy innovation? In our world I see a lot of that, driven by social, commercial, and political interests, naturally.

> As I’m reading this article, I am thinking about a farmer watching Mr. author eat a sandwich made with bread.

A basic acquaintance with the things you mention is supposed to be part of the school curriculum in many countries.

> Perhaps, but these simple solutions also can frequently only offer simple functionality. Additionally, “the best engineering solutions” are often some of the most expensive. You don’t always need the best, and if best is the only option, then that may mean going without, which is worse than a mediocre solution, and what we frequently had in the past.

Does more complex functionality justify this? Who decides what we need? Who decides what is better and what is worse? This comes down to policy decisions again. Authority. I think modern authority is misplaced, and were it not, we'd have an environment more similar to what the author wants.

> The reason your TI-99 and my C64 don’t require constant updates is that they were born before the concept of cybersecurity existed. If you’re going to have internet-connected devices, then it's a near requirement that they receive updates for security.

Not all updates are for security, and an insecure device can still work year after year.

> If you don’t want internet-connected devices, you can get those too, but they may be extremely expensive, so pony up the cash and put your money where your mouth is.

Willpower is a tremendous limitation which people usually ignore. It's very hard to do this when everyone around you doesn't; it would be very easy if you were choosing for yourself, without network effects and interoperability requirements. So, looked at closely, your argument doesn't work in your favor. (Similar to "if you disagree with this law, you can explain it at the police station.")

> I don’t think even a DEC PDP-11 minicomputer sold in the same era was entirely known by a handful of people, and even that is a tiny fraction of the functionality of today’s cheap commodity PCs.

There's a graphical 2D space-shooter game for the PDP-11. Just saying. Also, Soviet clones of its architecture were made in the form factor of PCs. With networking capabilities, they were used as command machines for other kinds of simpler PCs, or for production lines, and could be used as file shares, IIRC. I don't remember what they were called, but the absolutely weirdest part was seeing people in comments remembering using them in university computer labs and even school computer labs, so that actually existed in the USSR. Kinda expensive, though, even without Soviet inefficiency.

> It was made as a consumer electronics product with the least cost they thought they could get away with and have it still sell.

Yes, which leads to different requirements today. That doesn't stop the discussion; it leads to the question of what changed. We are not obligated to accept the perpetual centralization of economies and societies like some divine judgement.

> We don’t need most of these consumer electronics to last.

Who's "we"? Are you deciding what Intel R&D will focus on, or what Microsoft will change in their OS and applications, or what Apple will produce? Authority, again.

> If it still works, why isn’t he using one? Could it be he wants the new features and functionality like the rest of us?

Yes. It still works for offline purposes; it doesn't work where the modern web is not operable with it. This, in my opinion, reinforces their idea, not yours.

These are my replies. I'll add my own principal opinion: a civilization can only be as tall as the humans forming it. Abstractions leak, and our world is continuous, so all abstractions leak. To know which do and which don't for a particular purpose, you need to know principles. You can use abstractions without looking inside them to build a system inside an architecture, but you can't build an architecture and pick real-world solutions for those abstractions without understanding those real-world solutions. Also, horizontal connections between abstractions are much more tolerant of leaks than vertical ones. And there's no moral law forbidding us to look above our current environment to understand in which directions it may change.
  • 1k votes
    126 posts
    393 views
    AI now offers to post my ads for me on Kijiji. I provide pictures, and it has been accurate on price, condition, category, and description. I have a lot of shit to sell and was dreading it, but this removes the biggest barrier to getting it done. It even helped me figure out some things I was struggling to find online for reference. Saved me at least an hour of tedium yesterday. Excellent use case.
  • 624 votes
    73 posts
    196 views
    swelter_spark@reddthat.com
    Swappa is good for tech.
  • 53 votes
    3 posts
    27 views
    There is nothing open about OpenAI, and that was obvious well before they released ChatGPT.
  • Why doesn't Nvidia have more competition?

    Technology
    33 votes
    22 posts
    87 views
    It’s funny how the article asks the question but completely fails to answer it.

    About 15 years ago, Nvidia discovered there was a demand for compute in datacenters that could be met with powerful GPUs, and they were quick to respond to it. They had the resources to focus on it strongly because of their huge success and high profitability in the GPU market.

    AMD also saw the market and wanted to pursue it, but just over a decade ago, when its high potential for profitability began to show clearly, AMD was near bankruptcy and very hard pressed to finance development of GPUs and datacenter compute. AMD tried the best they could and were moderately successful from a technology perspective, but Nvidia already had a head start, and its proprietary development system, CUDA, was already an established standard that was very hard to break into.

    Intel simply fumbled the ball from start to finish. They spent a decade trying to knock ARM off the mobile crown, investing billions (roughly the equivalent of ARM’s total revenue), and never managed to catch up to ARM despite having the better production process at the time. That was Intel’s main focus, and Intel believed the GPU would never be more than a niche product. So when Intel tried to compete on datacenter compute, they tried to do it with x86 chips. One of their boldest efforts was a monstrosity of a cluster of Celeron chips, which of course performed laughably badly compared to Nvidia. Because, as it turns out, the way forward, at least for now, is indeed the massively parallel compute capability of a GPU, which Nvidia has refined for decades, with only (inferior) competition from AMD.

    But despite the lack of competition, Nvidia did not slow down; with increased profits, they only grew bolder in their efforts, making it even harder to catch up. AMD has had more money to compete for a while now, and they do have some decent compute units, but Nvidia remains ahead and the CUDA problem is still there. For AMD to really compete with Nvidia, they have to be better to attract customers, and that’s a very tall order against an Nvidia that simply never seems to stop progressing. So the only other option for AMD is to sell a bit cheaper, which I suppose they have to.

    AMD and Intel were the obvious competitors; everybody else is coming from even further behind. But if I had to make a bet, it would be on Huawei. Huawei has some crazy good developers, and Trump is basically forcing them to figure it out themselves, because he is blocking Huawei, and China in general, from using both AMD and Nvidia AI chips. And the chips will probably be made by China’s SMIC, because they are also prevented from using advanced production in the West, most notably TSMC. China will prevail, because this has become a national project of both prestige and necessity, and they have a massive talent pool and resources, so nothing can stop it now. IMO the USA would clearly have been better off allowing China to use American chips; now China will soon compete directly on both production and design too.
  • 11 votes
    1 post
    14 views
    Nobody has replied
  • 1 vote
    14 posts
    75 views
    ...is this some sort of joke my Nordic brain can't understand? I need to go hug a councilman.