
We're Not Innovating, We're Just Forgetting Slower

Technology
  • VHS player

    VCR.

    It stands for Video Cassette Recorder. There is no such thing as a "player". They all have recording capability.

    I've programmed the Intel 8051. I made a firmware update to get it working on 4G/LTE modems. I must say the debug tools weren't the greatest; there was a lot of logging involved.
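
    For anyone curious what "logging" as a debug tool looks like on that family: you typically push strings out the on-chip UART and print your way to the bug. A minimal sketch, assuming the SDCC toolchain and a classic 8051 clocked at 11.0592 MHz (this is standard-issue register setup, not anything from the actual modem firmware):

    ```c
    /* Minimal UART logging sketch for a classic 8051 (SDCC toolchain).
       Assumes an 11.0592 MHz crystal; timer 1 in mode 2 generates 9600 baud. */
    #include <8051.h>

    static void uart_init(void)
    {
        SCON = 0x50;   /* mode 1: 8-bit UART, receive enabled */
        TMOD |= 0x20;  /* timer 1, mode 2: 8-bit auto-reload  */
        TH1  = 0xFD;   /* reload value for 9600 baud @ 11.0592 MHz */
        TR1  = 1;      /* start timer 1 */
        TI   = 1;      /* mark transmit buffer as free */
    }

    static void uart_putc(char c)
    {
        while (!TI)    /* wait until the previous byte has left SBUF */
            ;
        TI = 0;
        SBUF = c;
    }

    static void log_str(const char *s)
    {
        while (*s)
            uart_putc(*s++);
    }

    void main(void)
    {
        uart_init();
        log_str("fw: init ok\r\n");  /* debugging by print statement, 8051 style */
        while (1)
            ;
    }
    ```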

    A lot of modern tech is garbage. You just need to practice the purchasing habits of Richard Stallman. There are literally hundreds of routers on the market that you can install your own custom OS on. This is the case with many phones, and almost every PC.

    "VCR" vs "VHS Player":

    Insane hill to die on but you do you.

  • Meanwhile, my Wi-Fi router requires a PhD in reverse engineering just to figure out why it won’t connect to the internet.

    I do think people in general could benefit from maybe $100 in tools and a healthy dose of YouTube when it comes to this point. My PC of 10 years wouldn't boot one morning because my SSD died. There wasn't anything too important on it that I hadn't backed up, but it was still a bummer. I took it apart, and started poking around. Found a short across a capacitor, so I started cycling capacitors. Sure enough, one was bad. Replaced it. Boots just fine. (Moved everything to a new SSD just in case.)

    All I needed for this job was a multimeter and a soldering iron (though a hot air gun made it slightly easier).

    I think the "black box" nature of electronics is mostly illusory due to how we treat our devices. A friend bought a walking treadmill that wouldn't turn on out of the box. She contacted the company, they told her to trash it and just shipped her a new one.

    She gave it to me, and I took it apart. One of the headers that connects the power switch to the mainboard was just unplugged. It took literally 10 minutes to "fix", including disassembly and reassembly, and all I needed was a screwdriver.

    Yet there's zero expectation of user maintenance. If it doesn't work, trash it.

    Scroll through maker TikTok

    This guy might be looking in the wrong places.

    Except with Internet shit, it's usually some dumbass at your ISP who is only trained to answer the phone and parrot from 3 different prompts. Actually getting someone who can flip the switch or register your device in the proper region, so shit actually works on their end, is the real battle.

    Apparently I've angered some call center workers.

  • While I 100% agree with the fact that even modern things can be fixed with some knowhow and troubleshooting (and spare capacitors or the like), there’s a few things at play:

    • people generally don’t have this skill set
    • electronics tend to be made cheaper; this means they may fail faster, but it also means they can be replaced more cheaply
    • it costs real money for tech support that can fix said issues, often many times more money than the thing costs to replace

    As a retro enthusiast, I've fixed my share of electronics that only needed an hour and a $2 capacitor. But there was also $7 shipping for the cap, and 30-60 min of labor, and my know-how in troubleshooting and experience. If the company had to send someone out, they'd likely spend well over $200 for time, gas, labor, parts, etc., not including a vehicle for the tech and the facility nearby and all that good stuff. Even in the retro sphere, the math starts to lean toward fixing because of rarity, but it's not always clear.

    As a retro enthusiast, I've fixed my share of electronics that only needed an hour and a $2 capacitor. But there was also $7 shipping for the cap, and 30-60 min of labor, and my know-how in troubleshooting and experience. If the company had to send someone out, they'd likely spend well over $200 for time, gas, labor, parts, etc., not including a vehicle for the tech and the facility nearby and all that good stuff.

    This is exactly it. I used to work for a manufacturer that made devices they would often need to repair. They would bill non-warranty labor at $100/hour, plus the cost of parts. Their products were primarily used by professionals, so that was fine when it was being done to repair something that cost between $700-$4,000 new, especially for people who were making money using the product.

    When they launched a product at a $500 MSRP, though, it started to get harder, and even more so when competition forced them to lower the price to $400. When I left they were about to launch a product targeted at amateurs, originally aiming for a $200 price. It was actually being built by a Chinese competitor, with our software guys contributing to the system and putting our logo on it. Spending $100 labor to repair a $200 device was going to be a tough sell, and when I left the plan for warranty "repairs" was to just give the customer a replacement unit and scrap the defective one.

    And I'm sure the repair labor rate was going up; they had a hard time hiring qualified technicians at the rate they wanted to pay, and most of the department had quit or moved to new roles when I left, so they were surely having to increase pay and the rate they billed.

    When something’s being built on an assembly line mostly by machine and/or low-cost Asian labor, it’s harder for a company to justify paying a skilled technician’s labor in a western country when that makes the cost of repair close to the cost of a new unit.
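
    The break-even arithmetic both of these comments describe is simple enough to sketch in code. The dollar figures below are the ones quoted in this thread; the one-hour repair time and the "repair if under half the replacement cost" threshold are my own illustrative assumptions:

    ```c
    /* Back-of-the-envelope repair-vs-replace math from the comments above. */
    #include <stdio.h>

    /* Repair is "worth it" here if it costs less than half the replacement.
       (Assumed threshold, not anyone's stated policy.) */
    static int worth_repairing(double labor_rate, double hours,
                               double parts, double shipping,
                               double replacement_cost)
    {
        double repair_cost = labor_rate * hours + parts + shipping;
        return repair_cost < 0.5 * replacement_cost;
    }

    int main(void)
    {
        /* Hobbyist fixing retro gear: own labor is free, $2 cap, $7 shipping. */
        printf("retro cap job:  %s\n",
               worth_repairing(0.0, 1.0, 2.0, 7.0, 400.0) ? "repair" : "replace");

        /* Manufacturer: $100/hr technician on a $200 amateur device. */
        printf("amateur device: %s\n",
               worth_repairing(100.0, 1.0, 2.0, 0.0, 200.0) ? "repair" : "replace");

        return 0;
    }
    ```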

  • This article is so weirdly written

    One of his points is that a VHS player is easily fixable while a Wi-Fi router isn't. These things aren't even remotely the same: they don't serve the same function and they don't have the same complexity, so comparing their repairability makes no sense. Just because I know how to repair a keyboard doesn't mean I know how to fix a TV.

    Most of his complaints are about the capitalization of modern technology, which is not a problem of innovation and knowledge; it's an economics and politics problem.

    Fire good. Angry gods strike ground. Man take fire. Place food on top. Simple.
    E-e-elic-tri-s-i-ty bad. Complicated. Not know who volt is. Sparks scary. Place food on top not know how.

  • VHS player

    VCR.

    It stands for Video Cassette Recorder. There is no such thing as a "player". They all have recording capability.

    I've programmed the Intel 8051. I made a firmware update to get it working on 4G/LTE modems. I must say the debug tools weren't the greatest; there was a lot of logging involved.

    A lot of modern tech is garbage. You just need to practice the purchasing habits of Richard Stallman. There are literally hundreds of routers on the market that you can install your own custom OS on. This is the case with many phones, and almost every PC.

    "VCR" vs "VHS Player":

    Conjuring up a frequency graph from 2004-present doesn't help your argument, as the VCR format wars were pretty much over a good 15 years beforehand.

    "VCR" could have meant either VHS or Betamax to a consumer in the early '80s.

    At least VHS specifies a particular standard, and "player" in that context has a loose connection with record player or tape player, being the thing you play your purchased records / tapes / videos on.

  • This post did not contain any content.

    A year or two ago I read about some guy who is still managing the trailer park he inherited from his dad with a TRS-80 (I think), using an app he wrote way back when. If it works, it works!

  • You don't have to fix everything, but just doing stuff like replacing connectors and capacitors could probably save 10% of the shit that we throw away, and it's not that hard to try.

    I do agree with that completely and I'd like to add to it with an additional point.

    When things break it sucks, but this does present you with an opportunity. If it's already not working, there's no harm in taking it apart and taking a look around. Maybe you'll see something obviously at fault, maybe you won't. But there's literally no harm in trying to fix it, especially if otherwise you were planning to toss it out.

    And I really can't tell you the number of times I've seen a device stop working, and upon closer inspection the entire problem was something very simple, like an old wire that broke at the solder point, and with it disconnected, the power switch doesn't work. When I was a kid and didn't know how to solder, I would fix issues like that with some aluminum foil, and often it worked. Just start with a screwdriver, open things up, take a look around. We owe it to ourselves and to the planet to just give it a shot.

  • This post did not contain any content.

    USB-C. We are clearly progressing. I never want to go back to the world where every phone had a proprietary power brick and connector.

    Except with Internet shit, it's usually some dumbass at your ISP who is only trained to answer the phone and parrot from 3 different prompts. Actually getting someone who can flip the switch or register your device in the proper region, so shit actually works on their end, is the real battle.

    Apparently I've angered some call center workers.

    Ah yes, the shibboleet

  • The author's take is detached from reality, filled with hypocrisy and gatekeeping.

    This isn't nostalgia talking — it's a recognition that we’ve traded reliability and understanding for the illusion of progress.

    It absolutely is nostalgia talking. Yes, your TI-99 fires up immediately when plugged in, and it's old. However, my Commodore 64 of the same era risks being fried because its 5V regulator doesn't age well and, when it fails, dumps higher voltage right into the RAM and CPU. Oh, and C64 machines were never built with overvoltage protection, because of cost savings. So don't confuse age with some idea of golden-era reliability. RAM ICs also failed regularly in computers of that age; this is why you had RAM testing programs and socketed ICs. When was the last time, Mr. author, you had to replace a failed DIMM in your modern computer?
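
    (For anyone who never ran one: a RAM testing program of that era was conceptually tiny. Below is a minimal sketch of a walking-ones test in C; it's illustrative only, and real tests such as the March algorithms catch more fault classes.)

    ```c
    #include <stdint.h>
    #include <stddef.h>

    /* Walk a 1 (and its complement) through every bit of every cell,
       reading each write back. Returns the first failing address, or
       NULL if the whole range passes. */
    volatile uint8_t *ram_test(volatile uint8_t *base, size_t len)
    {
        for (size_t i = 0; i < len; i++) {
            uint8_t saved = base[i];              /* preserve contents  */
            for (int bit = 0; bit < 8; bit++) {
                uint8_t pattern = (uint8_t)(1u << bit);
                base[i] = pattern;                /* write walking 1    */
                if (base[i] != pattern)
                    return &base[i];              /* stuck/shorted bit  */
                base[i] = (uint8_t)~pattern;      /* and its complement */
                if (base[i] != (uint8_t)~pattern)
                    return &base[i];
            }
            base[i] = saved;                      /* restore, move on   */
        }
        return NULL;
    }

    int main(void)
    {
        static uint8_t buf[1024];                 /* stand-in for a real RAM range */
        return ram_test(buf, sizeof buf) != NULL; /* exit 0 = all cells passed     */
    }
    ```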

    Today’s innovation cycle has become a kind of collective amnesia, where every few years we rediscover fundamental concepts, slap a new acronym on them, and pretend we’ve revolutionized computing. Edge computing? That’s just distributed processing with better marketing. Microservices? Welcome to the return of modular programming, now with 300% more YAML configuration files. Serverless? Congratulations, you’ve rediscovered time-sharing, except now you pay by the millisecond.

    By that logic, even the TI-99 he's loving on is just a fancier ENIAC or UNIVAC. All technology is built upon the era before it. If there was no technological or production cost improvement, we'd just use the old version. Yes, there is a regular shift in computing philosophy, but this is driven by new technologies and usually by computing performance descending to be accessible at commodity pricing. The Raspberry Pi wasn't a revolutionary fast computer, but it changed the world because it was enough computing power and it was dirt cheap.

    There’s something deeply humbling about opening a 40-year-old piece of electronics and finding components you can actually identify. Resistors, capacitors, integrated circuits with part numbers you can look up. Compare that to today’s black-box system-on-chip designs, where a single failure means the entire device becomes e-waste.

    I agree, there is something appealing about it to you and me, but most people don't care... and that's okay! To them it's a tool to get something done. They are not in love with the tool, nor do they need to be. There were absolutely users of TI-99 and C64 computers in the '80s who didn't give two shits about the shift register ICs or the UART that made the modem work, but they loved that they could get invoices from their loading dock sent electronically instead of on a piece of paper carried (and lost!) through multiple hands.

    Mr. author, no one is stopping you from using your TI-99 today, yet you didn't use it to write your article either. Why is that? Because the TI-99 has a tiny fraction of the function and complexity of a modern computer. Creating something close to a modern computer from discrete components with "part numbers you can look up" would be massively expensive, incredibly slow, and would consume massive amounts of electricity compared to today's modern computers.

    This isn't their fault — it's a systemic problem. Our education and industry reward breadth over depth, familiarity over fluency. We’ve optimized for shipping features quickly rather than understanding systems thoroughly. The result is a kind of technical learned helplessness, where practitioners become dependent on abstractions they can’t peer beneath.

    Ugh, this is frustrating. Do you think a surgeon understands how the CCD electronic camera attached to their laparoscope works? Is the surgeon uneducated because they aren't fluent in the circuit theory that allows the camera to display the guts of the patient they're operating on? No, of course not. We want that surgeon to keep studying new surgical techniques, not trying to use Ohm's Law to calculate the current draw of the device they're using. Mr. author, you and I hobby at electronics (and vintage computing), but just because it's an interest of ours doesn't mean it has to be everyone's.

    What We Need Now: We need editors who know what a Bode plot is. We need technical writing that assumes intelligence rather than ignorance. We need educational systems that teach principles alongside tools, theory alongside practice.

    Such gatekeeping! So unless you know the actual engineering principles behind a device you're using, you shouldn't be allowed to use it?

    Most importantly, we need to stop mistaking novelty for innovation and complexity for progress.

    Innovation isn't just creating new features or functionality. In fact, I'd argue most innovation is taking existing features or functions and delivering them at substantially less cost/effort.

    As I'm reading this article, I am thinking about a farmer watching Mr. author eat a sandwich made with bread. Does Mr. author know when to till soil or plant seed? How about the amount of irrigation durum wheat needs during the hot season? How about when to harvest? What moisture level should the resulting harvest have before being taken to market or put into long-term storage? Yet there he sits, eating the sandwich, blissfully unaware of all the steps and effort needed just to make the wheat that goes into the bread. The farmer sits and wonders if Mr. author's next article will deride the public for just eating bread and how we've forgotten how to grow wheat. Will Mr. author say we need fewer people ordering sandwiches and more people consulting US GIS maps for rainfall statistics and studying nitrogen-fixing techniques for soil health? No, probably not.

    The best engineering solutions are often elegantly simple. They work reliably, fail predictably, and can be understood by the people who use them.

    Perhaps, but these simple solutions also can frequently only offer simple functionality. Additionally, "the best engineering solutions" are often some of the most expensive. You don't always need the best, and if the best is the only option, that may mean going without, which is worse than a mediocre solution, and is what we frequently had in the past.

    They don't require constant updates or cloud connectivity or subscription services. They just work, year after year, doing exactly what they were designed to do.

    The reason your TI-99 and my C64 don't require constant updates is because they were born before the concept of cybersecurity existed. If you're going to have internet-connected devices, then it's nearly a requirement that they receive security updates.

    If you don't want internet-connected devices, you can get those too, but they may be extremely expensive, so pony up the cash and put your money where your mouth is.

    That TI-99/4A still boots because it was designed by people who understood every component, every circuit, every line of code.

    It is a machine of extremely limited functionality with a comparably simple design and construction. I don't think even a DEC PDP-11 minicomputer sold in the same era was entirely understood by a handful of people, and even that had a tiny fraction of the functionality of today's cheap commodity PCs.

    It works because it was built to work, not to generate quarterly revenue or collect user data or enable some elaborate software-as-a-service business model.

    Take off the rose-colored glasses. It was made as a consumer electronics product at the least cost they thought they could get away with and still have it sell. Sales of it absolutely served quarterly revenue numbers even back in the 1980s.

    We used to build things that lasted.

    We don't need most of these consumer electronics to last. Proof positive is that the computer Mr. author is writing his article on is unlikely to be an Intel-based 486 running at 33 MHz from the mid-'90s (or a 68030 Mac). If it still works, why isn't he using one? Could it be he wants the new features and functionality like the rest of us? Over-engineering is a thing, and it sounds like what the author is preaching.

    Apologies if my post turned into a rant.

    The author’s take is detached from reality, filled with hypocrisy and gatekeeping.

    "Opinionated" is another term - for friendliness and neutrality. Complaining about reality means a degree of detachment from it by intention.

    When was the last time, Mr. author, you had to replace a failed DIMM in your modern computer?

    When was the last time, Mr. commenter, you had to make your own furniture because it's harder to find a thing of the right dimensions to buy? But when that was more common, it was also easier to get the materials and the tools, because ordering things over the Internet and getting them delivered the next day was less common. In terms of managing my home, I feel that the '00s were nicer than now.

    If today's centralized "silk road" through TSMC were knocked out (by a nuke, suppose, or a political change), would you prefer less efficient yet more distributed production of electronics? That would leave less allowance for the various things hidden from users that happen in modern RAM. Possibly much less.

    If there was no technological or production cost improvement, we’d just use the old version.

    I think their point was that there's no architectural innovation in some things.

    Yes, there is a regular shift in computing philosophy, but this is driven by new technologies and usually by computing performance descending to be accessible at commodity pricing. The Raspberry Pi wasn't a revolutionary fast computer, but it changed the world because it was enough computing power and it was dirt cheap.

    Maybe those shifts are in market philosophies, not computing ones.

    I agree, there is something appealing about it to you and me, but most people don't care... and that's okay! To them it's a tool to get something done. They are not in love with the tool, nor do they need to be.

    Take a screwdriver: I can imagine there's a fitting baseline amount of attention a piece of knowledge deserves. I can imagine a person not knowing how to use a screwdriver (substitute something better) being below that baseline. And some people are far above it, maybe.

    I think the majority of humans are below the level of knowledge that computers, as they exist in our reality, require. That's not the level you or the author possess; it's about the level I possessed in my childhood, nothing impressive.

    Mr. author, no one is stopping you from using your TI-99 today, yet you didn't use it to write your article either. Why is that? Because the TI-99 has a tiny fraction of the function and complexity of a modern computer. Creating something close to a modern computer from discrete components with "part numbers you can look up" would be massively expensive, incredibly slow, and would consume massive amounts of electricity compared to today's modern computers.

    It would seem we are getting a better deal from the same amount of energy spent with modern computers, then. Does this seem right to you?

    It's philosophy and not logic, but I think you know that for getting something, you pay something. There's no energy out of nowhere.

    Discrete components may not make sense. But maybe the insane efficiency we have is paid for with our future. It's made possible by a centralization of economy, society, and geopolitics that wasn't needed to make the TI-99.

    Do you think a surgeon understands how the CCD electronic camera attached to their laparoscope works? Is the surgeon uneducated because they aren't fluent in the circuit theory that allows the camera to display the guts of the patient they're operating on?

    A surgeon has another specialist nearby, and that specialist doesn't just know these things, but also has a lot of other knowledge necessary for them and the surgeon to communicate unambiguously, avoiding fatal mistakes. A bit more expense is spent here than just throwing a device at a surgeon who doesn't understand how it works. A fair bit.

    Such gatekeeping! So unless you know the actual engineering principles behind a device you’re using, you shouldn’t be allowed to use it?

    Why not:

    Such respect! In truth, why wouldn't we trust students to make good use of an understanding of their tools and the universe around them, rather than intentionally limiting them, since every human's corpus of knowledge is unique and wonderful?

    Innovation isn't just creating new features or functionality. In fact, I'd argue most innovation is taking existing features or functions and delivering them at substantially less cost/effort.

    Is a change of policy innovation? In our world I see a lot of that, driven by social, commercial, and political interests, naturally.

    As I’m reading this article, I am thinking about a farmer watching Mr. author eat a sandwich made with bread.

    A basic introduction to the things you go on to describe is supposed to be part of the school curriculum in many countries.

    Perhaps, but these simple solutions also can frequently only offer simple functionality. Additionally, "the best engineering solutions" are often some of the most expensive. You don't always need the best, and if the best is the only option, that may mean going without, which is worse than a mediocre solution, and is what we frequently had in the past.

    Does more complex functionality justify this? Who decides what we need? Who decides what is better and what is worse?

    This comes down to policy decisions again. Authority. I think modern authority is misplaced, and were it not, we'd have an environment more similar to what the author wants.

    The reason your TI-99 and my C64 don't require constant updates is because they were born before the concept of cybersecurity existed. If you're going to have internet-connected devices, then it's nearly a requirement that they receive security updates.

    Not all updates are for security. And an insecure device can still work year after year.

    If you don't want internet-connected devices, you can get those too, but they may be extremely expensive, so pony up the cash and put your money where your mouth is.

    Willpower is a tremendous limitation which people usually ignore. It's very hard to do this when everyone around you doesn't. It would be very easy if you were choosing only for yourself, without network effects and interoperability requirements.

    So your argument, when looked at closely, doesn't work in your favor. (Similar to "if you disagree with this law, you can explain it at the police station".)

    I don't think even a DEC PDP-11 minicomputer sold in the same era was entirely understood by a handful of people, and even that had a tiny fraction of the functionality of today's cheap commodity PCs.

    There's a graphical 2D space shooter game for the PDP-11. Just saying.

    Also, Soviet clones of its architecture were made in the form factor of PCs. With networking capabilities, they were used as command machines for other, simpler kinds of PCs or for production lines, and could be used as file shares, IIRC. I don't remember what they were called, but the absolutely weirdest part was seeing people in the comments remember using them in university computer labs and even school computer labs, so that actually existed in the USSR.

    Kinda expensive though, even without Soviet inefficiency.

    It was made as a consumer electronics product at the least cost they thought they could get away with and still have it sell.

    Yes, which leads to different requirements today. This doesn't stop the discussion; it leads to the question of what changed. We are not obligated to accept the perpetual centralization of economies and societies as some divine judgement.

    We don’t need most of these consumer electronics to last.

    Who's "we"? Are you deciding what Intel R&D will focus on, or what Microsoft will change in their OS and applications, or what Apple will produce?

    Authority, again.

    If it still works, why isn’t he using one? Could it be he wants the new features and functionality like the rest of us?

    Yes. It still works for offline purposes. It doesn't work where the modern web won't operate with it. This, in my opinion, reinforces their idea, not yours.

    These are my replies. I'll add my own principal opinion: a civilization can only be as tall as a human forming it. Abstractions leak, and our world is continuous, so all abstractions leak. To know which do and don't leak for a particular purpose, you need to know principles. You can use abstractions without looking inside them to build a system inside an architecture, but you can't build an architecture and pick real-world solutions for those abstractions without understanding those real-world solutions. Also, horizontal connections between abstractions are much more tolerant of leaks than vertical ones.

    And there's no moral law forbidding us to look above our current environment to understand in which directions it may change.