We're Not Innovating, We're Just Forgetting Slower

Technology
  • Meanwhile, my Wi-Fi router requires a PhD in reverse engineering just to figure out why it won’t connect to the internet.

    I do think people in general could benefit from maybe $100 in tools and a healthy dose of YouTube on this front. My PC of 10 years wouldn't boot one morning because my SSD died. There wasn't anything too important on it that I hadn't backed up, but it was still a bummer. I took it apart and started poking around. Found a short across a capacitor, so I started cycling capacitors. Sure enough, one was bad. Replaced it. Boots just fine. (Moved everything to a new SSD just in case.)

    All I needed for this job was a multimeter and a soldering iron (though a hot air gun made it slightly easier).
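
    (Why "cycling capacitors" is hard to avoid: decoupling capacitors sit in parallel on a power rail, so once any one of them fails short, the whole rail measures as a short and an in-circuit reading can't tell you which part is the culprit. Here's a toy sketch of the arithmetic; all resistance values are invented for illustration:)

    ```c
    #include <stdio.h>

    /* N capacitors in parallel on one rail: if a single one fails short
     * (~0.05 ohm) while the rest only leak megohms, the rail as a whole
     * reads as a short. All values are illustrative assumptions. */
    int main(void) {
        double r[] = { 2e6, 2e6, 0.05, 2e6, 2e6 };  /* one failed-short cap */
        int n = sizeof r / sizeof r[0];

        double g = 0.0;                  /* total conductance of the rail */
        for (int i = 0; i < n; i++)
            g += 1.0 / r[i];             /* conductances add in parallel */

        printf("rail reads %.3f ohms\n", 1.0 / g);  /* ~0.050: looks shorted */
        return 0;
    }
    ```

    (So you lift or swap caps one at a time until the short disappears.)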

    I think the "black box" nature of electronics is mostly illusory due to how we treat our devices. A friend bought a walking treadmill that wouldn't turn on out of the box. She contacted the company; they told her to trash it and just shipped her a new one.

    She gave it to me, I took it apart. One of the headers that connects the power switch to the mainboard was just unplugged. It took literally 10 minutes to "fix", including disassembly and reassembly, and all I needed was a screwdriver.

    Yet there's zero expectation of user maintenance. If it doesn't work, trash it.

    Scroll through maker TikTok

    This guy might be looking in the wrong places.

    She gave it to me, I took it apart. […]

    My buddy has a $6000 projector. He found it in the trash. The only thing wrong with it was a cracked solder joint on the power supply.

    Similarly, I have a $5000 audio console that I got for ~$100 in parts; it had a bad power supply. Honestly, probably just a bad capacitor on the power supply, but I didn’t feel like desoldering every capacitor to check their capacitance. Diagnosing the power supply took about 5 minutes, and most of that was just finding all of the screws that were holding the case together. A quick read with a multimeter told me everything I needed to know. Swapped out the supply, and it has been working fine ever since.

  • This post did not contain any content.

    I believe we are also looking at survivorship bias.

    A vast majority of small devices fail sometime in their first 10 years. Some are designed that way, some are used heavily and break after a while, and some will stick around until their battery becomes a spicy pillow. Lithium cells eventually stop working, so no matter what, a small device whose battery you can't replace WILL die.

    But some devices have parts that are repairable and they tend to stick around.

  • There are real limits to repairability in modern devices, some placed there just in order to force you to pay the manufacturer more money. But you're right that there's a lot we could do that we're just not bothering to do.

    You don't have to fix everything, but just doing stuff like replacing connectors and capacitors could probably save 10% of the shit that we throw away, and it's not that hard to try.

  • Similarly, I have a $5000 audio console that I got for ~$100 in parts; it had a bad power supply. […]

    I can top that. I got a broken $100 Blue Yeti microphone for $10 on eBay. The USB cable they shipped it with was bad.

  • pay to have the unit returned, spend valuable technician time diagnosing and fixing an issue and then pay to ship the repaired unit back.

    My point is that in a better world, people could fix this kind of thing themselves. Like offer a discount for their trouble and have them or their mechanic aunt come by and fix it.

    Oh, I fully agree.

    I really want to go back to electronics and appliances being both more robust and more repairable. It's just that the vast majority of the population disagrees with that once they learn that it will make things cost more initially.

  • I think the "black box" nature of electronics is mostly illusory due to how we treat our devices. […]

    eh. give me schematics. i can't fix anything beyond trivial issues without them.

    then it won't be as much of a black box.

  • eh. give me schematics. i can't fix anything beyond trivial issues without them.

    then it won't be as much of a black box.

    beyond trivial issues

    I'd argue that 10-15% of issues are trivial and worth investigating even without a schematic, if the alternative is just throwing something away.

  • beyond trivial issues

    I'd argue that 10-15% of issues are trivial and worth investigating even without a schematic, if the alternative is just throwing something away.

    and i do because i don't want to throw away this expensive piece of tech. but like, manufacturers in the early 2000s were still sharing this very valuable information with me. i hate planned obsolescence with a passion.

  • A friend bought a walking treadmill that wouldn't turn on out of the box. She contacted the company; they told her to trash it and just shipped her a new one. […]

    This is a symptom of industry switching to cheap "disposable" electronics, rather than more expensive, robust, and repairable ones.

    From the treadmill company's point of view, it's cheaper to just lose one unit and pay shipping one way rather than pay to have the unit returned, spend valuable technician time diagnosing and fixing an issue and then pay to ship the repaired unit back.

    About 50 years ago, you could find appliance repair shops that would fix your broken toaster or TV, and parts for stuff like that were easily available. Now, with the advanced automation in building these, combined with the increased difficulty of repair (fine-pitch soldering, firmware debugging, and the like), it makes way more sense to just replace the whole thing.

    Agreed, it definitely depends on what you buy. I inherited a stereo amp from my uncle, who always buys really nice gear. I have had it repaired, or been able to repair it myself, any time a component failed, and it is now 30 years old. But it was built to last that long, not to be disposed of in five.

    Right to repair is not just for nerds and tinkerers. We all deserve repairable products.

  • This post did not contain any content.

    The author's take is detached from reality, filled with hypocrisy and gatekeeping.

    This isn't nostalgia talking — it's a recognition that we’ve traded reliability and understanding for the illusion of progress.

    It absolutely is nostalgia talking. Yes, your TI-99 fires up immediately when plugged in, and it's old. However, my Commodore 64 of the same era risks being fried, because the 5V regulator doesn't age well and when it fails it dumps higher voltage right into the RAM and CPU. Oh, and C64 machines were never built with overvoltage protection, because of cost savings. So don't confuse age with some idea of golden-era reliability. RAM ICs also failed regularly in computers of that age; this is why you had RAM-testing programs and socketed ICs. When was the last time, Mr. Author, that you had to replace a failed DIMM in your modern computer?

    Today’s innovation cycle has become a kind of collective amnesia, where every few years we rediscover fundamental concepts, slap a new acronym on them, and pretend we’ve revolutionized computing. Edge computing? That’s just distributed processing with better marketing. Microservices? Welcome to the return of modular programming, now with 300% more YAML configuration files. Serverless? Congratulations, you’ve rediscovered time-sharing, except now you pay by the millisecond.

    By that logic, even the TI-99 he's loving on is just a fancier ENIAC or UNIVAC. All technology is built upon the era before it. If there were no technological or production-cost improvement, we'd just use the old version. Yes, there is a regular shift in computing philosophy, but it is driven by new technologies and usually by computing performance descending to be accessible at commodity pricing. The Raspberry Pi wasn't a revolutionarily fast computer, but it changed the world because it offered enough computing power and it was dirt cheap.

    There’s something deeply humbling about opening a 40-year-old piece of electronics and finding components you can actually identify. Resistors, capacitors, integrated circuits with part numbers you can look up. Compare that to today’s black-box system-on-chip designs, where a single failure means the entire device becomes e-waste.

    I agree, there is something appealing about it to you and me, but most people don't care... and that's okay! To them it's a tool to get something done. They are not in love with the tool, nor do they need to be. There were absolutely users of TI-99 and C64 computers in the 80s who didn't give two shits about the shift-register ICs or the UART that made the modem work, but they loved that they could get invoices from their loading dock sent electronically instead of on a piece of paper carried (and lost!) through multiple hands.

    Mr. Author, no one is stopping you from using your TI-99 today, but in fact you didn't use it to write your article either. Why is that? Because the TI-99 has a tiny fraction of the function and complexity of a modern computer. Creating something close to a modern computer from discrete components with "part numbers you can look up" would be massively expensive, incredibly slow, and would consume massive amounts of electricity compared with today's modern computers.

    This isn't their fault — it's a systemic problem. Our education and industry reward breadth over depth, familiarity over fluency. We’ve optimized for shipping features quickly rather than understanding systems thoroughly. The result is a kind of technical learned helplessness, where practitioners become dependent on abstractions they can’t peer beneath.

    Ugh, this is frustrating. Do you think a surgeon understands how the CCD camera attached to their laparoscope works? Is the surgeon uneducated because they aren't fluent in the circuit theory that lets the camera display the guts of the patient they're operating on? No, of course not. We want that surgeon to keep studying new surgical techniques, not trying to use Ohm's Law to calculate the current draw of the device they're using. Mr. Author, you and I hobby at electronics (and vintage computing), but just because it's an interest of ours doesn't mean it has to be everyone's.

    What We Need Now: We need editors who know what a Bode plot is. We need technical writing that assumes intelligence rather than ignorance. We need educational systems that teach principles alongside tools, theory alongside practice.

    Such gatekeeping! So unless you know the actual engineering principles behind a device you're using, you shouldn't be allowed to use it?

    Most importantly, we need to stop mistaking novelty for innovation and complexity for progress.

    Innovation isn't just creating new features or functionality. In fact, I'd argue most of it is taking existing features or functions and delivering them for substantially less cost/effort.

    As I'm reading this article, I am thinking about a farmer watching Mr. Author eat a sandwich made with bread. Does Mr. Author know when to till soil or plant seed? How about the amount of irrigation durum wheat needs during the hot season? How about when to harvest? What moisture level should the resulting harvest have before being taken to market or put in long-term storage? Yet there he sits, eating the sandwich, blissfully unaware of all the steps and effort needed just to make the wheat that goes into the bread. The farmer sits and wonders whether Mr. Author's next article will deride the public for just eating bread, having forgotten how to grow wheat. Will Mr. Author say we need fewer people ordering sandwiches and more people consulting US GIS maps for rainfall statistics and studying nitrogen-fixing techniques for soil health? No, probably not.

    The best engineering solutions are often elegantly simple. They work reliably, fail predictably, and can be understood by the people who use them.

    Perhaps, but these simple solutions also frequently offer only simple functionality. Additionally, "the best engineering solutions" are often some of the most expensive. You don't always need the best, and if the best is the only option, that may mean going without, which is worse than a mediocre solution, and which is what we frequently had in the past.

    They don't require constant updates or cloud connectivity or subscription services. They just work, year after year, doing exactly what they were designed to do.

    The reason your TI-99 and my C64 don't require constant updates is that they were born before the concept of cybersecurity existed. If you're going to have internet-connected devices, then it's nearly a requirement that they receive security updates.

    If you don't want internet-connected devices, you can get those too, but they may be extremely expensive, so pony up the cash and put your money where your mouth is.

    That TI-99/4A still boots because it was designed by people who understood every component, every circuit, every line of code.

    It is a machine of extremely limited functionality, with a comparably simple design and construction. I don't think even a DEC PDP-11 minicomputer sold in the same era was entirely known by a handful of people, and even that had a tiny fraction of the functionality of today's cheap commodity PCs.

    It works because it was built to work, not to generate quarterly revenue or collect user data or enable some elaborate software-as-a-service business model.

    Take off the rose-colored glasses. It was made as a consumer electronics product at the lowest cost they thought they could get away with and still have it sell. Sales of it absolutely served quarterly revenue numbers, even back in the 1980s.

    We used to build things that lasted.

    We don't need most of these consumer electronics to last. Proof positive: the computer Mr. Author is writing his article on is unlikely to be an Intel-based 486 running at 33 MHz from the mid-90s (or a 68030 Mac). If those still work, why isn't he using one? Could it be that he wants the new features and functionality like the rest of us? Over-engineering is a thing, and it sounds like what the author is preaching.

    Apologies if my post turned into a rant.

  • The author's take is detached from reality, filled with hypocrisy and gatekeeping. […]

    There's also the fact that things have shrunk enough that repair takes more precision and specialized tools. And some people, myself included, have absolutely no business messing around with capacitors and the like. It's just not everyone's skill set, and that's okay; we live with other people whose skills cover what ours don't. Getting it wrong can also have very lethal consequences: electricity does not care who you are, and if it's not respected it can and will kill you, start a fire, etc. That's another reason companies don't like people messing with the insides of electronics: the bad PR and lawsuits if someone gets hurt.

  • The author's take is detached from reality, filled with hypocrisy and gatekeeping. […]

    I came here to post a screed a bit like this, but you did it so eloquently I don't have to, so thanks! A perfect take, imo.

  • The author's take is detached from reality, filled with hypocrisy and gatekeeping. […]

    I like a lot of your responses, and I agree that nostalgia is a main driver of his article. However, I think the bits about how a doctor needs to know how a medical tool functions, etc., are a little misplaced. I think the author was referring to the makers of the device not understanding what they're making, not so much the end user. I ALSO think the author would prefer more broad technical literacy, but his core argument seemed to be that those making things don't understand the tech they're built upon, and that unintended consequences can occur when that happens. Worse, if the current technology has been abstracted enough times, eventually no one will know enough to fix it.

  • This post did not contain any content.

    This article is so weirdly written

    One of his points is that a VHS player is easily fixable while a Wi-Fi router isn't. These things aren't even remotely the same: they don't serve the same function, and they don't have the same complexity, so comparing their repairability makes no sense. Just because I know how to repair a keyboard doesn't mean I know how to fix a TV.

    Most of his complaints are about the commercialization of modern technology, which is not a problem of innovation and knowledge; it's an economic and political problem.

  • I like a lot of your responses. I agree about nostalgia being a main driver of his article. […]

    I think the author was referring to the makers of the device not understanding what they're making, not so much the end user.

    Just to make sure I'm following your thread of thought, are you referring to this part of the author's opinion piece or something else in his text?

    "This wouldn’t matter if it were just marketing hyperbole, but the misunderstanding has real consequences. Companies are making billion-dollar bets on technologies they don’t understand, while actual researchers struggle to separate legitimate progress from venture capital fever dreams. We’re drowning in noise generated by people who mistake familiarity with terminology for comprehension of the underlying principles."

  • This is a symptom of industry switching to cheap "disposable" electronics, rather than more expensive, robust, and repairable ones. […]

    Now, with the advanced automation in building these, combined with the increased difficulty of repair (fine-pitch soldering, firmware debugging, and the like), it makes way more sense to just replace the whole thing.

    The other valid component of your argument is the cost of labor now. It is more expensive to maintain a staff of people to perform repairs and manage the logistics of transporting units to service than it is to simply lose 100% of the wholesale value of the handful of items that fail within the warranty period. Labor, especially skilled labor, is really, really expensive in the Western world.
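
    To put rough numbers on that trade-off, here's a toy back-of-the-envelope model. Every figure in it (failure rate, wholesale cost, shipping, technician rate, repair time) is an invented assumption for illustration, not data from any actual warranty program:

    ```c
    #include <stdio.h>

    /* Toy warranty economics: replace-on-failure vs. repair-by-technician.
     * Every number below is an invented assumption for illustration. */
    int main(void) {
        double units_sold     = 100000.0;
        double failure_rate   = 0.02;  /* 2% fail within warranty (assumed) */
        double wholesale_cost = 60.0;  /* $ lost per scrapped unit (assumed) */
        double ship_one_way   = 12.0;  /* $ per shipment (assumed) */
        double tech_hourly    = 75.0;  /* $ fully loaded tech rate (assumed) */
        double hours_per_fix  = 0.75;  /* diagnosis + repair time (assumed) */

        double failures = units_sold * failure_rate;

        /* Replace: write off the unit, ship a new one out, done. */
        double replace = failures * (wholesale_cost + ship_one_way);

        /* Repair: ship both ways, plus technician time on each unit. */
        double repair = failures * (2.0 * ship_one_way + tech_hourly * hours_per_fix);

        printf("replace: $%.0f   repair: $%.0f\n", replace, repair);
        return 0;
    }
    ```

    With these made-up inputs, the repair path already costs more per failed unit than scrapping it, and that's before counting the fixed overhead of keeping repair staff and reverse logistics running at all.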

  • This post did not contain any content.

    VHS player

    VCR.

    It stands for Video Cassette Recorder. There is no such thing as a "player". They all have recording capability.

    I've programmed the Intel 8051. I made a firmware update to get it working on 4G/LTE modems. I must say the debug tools weren't the greatest; there was a lot of logging involved (a sketch of what that looks like is below).
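
    For anyone curious what "a lot of logging" means at that level, here is a minimal sketch of polled UART output on a classic 8051. It assumes SDCC with its 8051.h register names and the traditional 11.0592 MHz crystal; it illustrates the technique, not the actual modem firmware:

    ```c
    #include <8051.h>  /* SDCC's register definitions for the classic 8051 */

    /* Configure the on-chip UART for 9600 baud (11.0592 MHz crystal assumed). */
    static void uart_init(void)
    {
        SCON  = 0x50;  /* mode 1: 8-bit UART, receiver enabled */
        TMOD |= 0x20;  /* timer 1 in mode 2 (8-bit auto-reload) */
        TH1   = 0xFD;  /* reload value for 9600 baud at 11.0592 MHz */
        TR1   = 1;     /* start timer 1 as the baud-rate generator */
        TI    = 1;     /* mark the transmitter ready for the first byte */
    }

    static void uart_putc(char c)
    {
        while (!TI)    /* wait for the previous byte to leave SBUF */
            ;
        TI   = 0;
        SBUF = c;
    }

    static void log_str(const char *s)
    {
        while (*s)
            uart_putc(*s++);
    }

    void main(void)
    {
        uart_init();
        log_str("boot\r\n");   /* breadcrumb logging over the serial line */
        for (;;) {
            /* modem state machine would run here, logging as it goes */
        }
    }
    ```

    When there's no debugger attached, printing breadcrumbs over a serial line like this is basically the whole toolbox, hence all the logging.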

    A lot of modern tech is garbage. You just need to practice the purchasing habits of Richard Stallman. There are literally hundreds of routers on the market that you can install your own custom OS on. This is the case with many phones, and almost every PC.

    "VCR" vs "VHS Player":

  • VCR. It stands for Video Cassette Recorder. There is no such thing as a "player"; they all have recording capability. […]

    My memory was that there were exclusive players for TV stations. But all the consumer ones could record.

  • VCR. It stands for Video Cassette Recorder. There is no such thing as a "player"; they all have recording capability. […]

    What is this, Perl? It's not write-only. VCR may stand for video cassette recorder but it's also a VCP.

  • This post did not contain any content.

    This really made me want to find a BASIC emulator and a collection of old programs and introduce them to my kids. I feel like Lua is the modern BASIC. I just picked up Replicube and plan to use it to introduce my daughter to Lua.
