
We're Not Innovating, We're Just Forgetting Slower

Technology
  • beyond trivial issues

    I'd argue that 10-15% of issues are trivial and worth investigating even without a schematic, if the alternative is just throwing something away.

    And I do, because I don't want to throw away this expensive piece of tech. But manufacturers in the early 2000s were still sharing this very valuable information with me. I hate planned obsolescence with a passion.

  • I think the "black box" nature of electronics is mostly illusory due to how we treat our devices. A friend bought a walking treadmill that wouldn't turn on out of the box. She contacted the company, they told her to trash it and just shipped her a new one.

    She gave it to me, I took it apart. One of the headers that connects the power switch to the mainboard was just unplugged. It took literally 10 minutes to "fix" including disassembly and assembly, and all I needed was a screwdriver.

    This is a symptom of industry switching to cheap "disposable" electronics, rather than more expensive, robust, and repairable ones.

    From the treadmill company's point of view, it's cheaper to just lose one unit and pay shipping one way rather than pay to have the unit returned, spend valuable technician time diagnosing and fixing an issue and then pay to ship the repaired unit back.

    About 50 years ago, you could find appliance repair shops that would fix your broken toaster or TV, and parts for appliances like those were easily available. Now, with the advanced automation used to build these devices, combined with the increased difficulty of repair (fine-pitch soldering, firmware debugging and the like), it makes far more sense to just replace the whole thing.

    Agreed, it definitely depends on what you buy. I inherited a stereo amp from my uncle, who always buys really nice gear. I have had it repaired, or been able to repair it myself, any time a component failed, and it is now 30 years old. But it was built to last that long, not to be disposed of in five.

    Right to repair is not just for nerds and tinkerers. We all deserve repairable products.

  • This post did not contain any content.

    The author's take is detached from reality, filled with hypocrisy and gatekeeping.

    This isn't nostalgia talking — it's a recognition that we’ve traded reliability and understanding for the illusion of progress.

    It absolutely is nostalgia talking. Yes, your TI-99 fires up immediately when plugged in, and it's old. However, my Commodore 64 of the same era risks being fried because its 5V regulator doesn't age well and, when it fails, dumps higher voltage right into the RAM and CPU. Oh, and C64 machines were never built with overvoltage protection, because of cost savings. So don't confuse age with some idea of golden-era reliability. RAM ICs also regularly failed in computers of that age; this is why you had RAM-testing programs and socketed ICs. When was the last time, Mr. Author, you had to replace a failed DIMM in your modern computer?
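    Those RAM-testing programs worked on a simple idea: write a known pattern to every cell, read it back, and flag mismatches. Here's a minimal sketch of a walking-ones test in Python; the function name and the `bytearray` standing in for RAM are my own illustration, not any period-accurate tool:

    ```python
    def walking_ones_test(mem: bytearray) -> list[int]:
        """Return addresses whose cells fail a walking-ones pattern test.

        Writes each single-bit pattern (0x01, 0x02, ..., 0x80) to every
        cell, reads it all back, and records any mismatch -- the same
        basic idea the old 8-bit RAM testers used to find a dead chip.
        """
        bad = set()
        for bit in range(8):
            pattern = 1 << bit
            for addr in range(len(mem)):
                mem[addr] = pattern
            for addr in range(len(mem)):
                if mem[addr] != pattern:
                    bad.add(addr)
        return sorted(bad)

    # A healthy "RAM" passes with no bad addresses:
    print(walking_ones_test(bytearray(64)))  # []
    ```

    On real hardware with socketed ICs, a failing address range told you which physical chip to pull and replace.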

    Today’s innovation cycle has become a kind of collective amnesia, where every few years we rediscover fundamental concepts, slap a new acronym on them, and pretend we’ve revolutionized computing. Edge computing? That’s just distributed processing with better marketing. Microservices? Welcome to the return of modular programming, now with 300% more YAML configuration files. Serverless? Congratulations, you’ve rediscovered time-sharing, except now you pay by the millisecond.

    By that logic, even the TI-99 he's loving on is just a fancier ENIAC or UNIVAC. All technology is built upon the era before it. If there were no technological or production-cost improvement, we'd just use the old version. Yes, there is a regular shift in computing philosophy, but it is driven by new technologies, and usually by computing performance descending to become accessible at commodity pricing. The Raspberry Pi wasn't a revolutionarily fast computer, but it changed the world because it was enough computing power and it was dirt cheap.

    There’s something deeply humbling about opening a 40-year-old piece of electronics and finding components you can actually identify. Resistors, capacitors, integrated circuits with part numbers you can look up. Compare that to today’s black-box system-on-chip designs, where a single failure means the entire device becomes e-waste.

    I agree, there is something appealing about it to you and me, but most people don't care... and that's okay! To them it's a tool to get something done. They are not in love with the tool, nor do they need to be. There were absolutely users of TI-99 and C64 computers in the 80s who didn't give two shits about the shift-register ICs or the UART that made the modem work, but they loved that they could get invoices from their loading dock sent electronically instead of on a piece of paper carried (and lost!) through multiple hands.

    Mr. Author, no one is stopping you from using your TI-99 today, yet you didn't use it to write your article either. Why is that? Because the TI-99 has a tiny fraction of the function and complexity of a modern computer. Creating something close to a modern computer from discrete components with "part numbers you can look up" would be massively expensive, incredibly slow, and would consume massive amounts of electricity compared with today's machines.

    This isn't their fault — it's a systemic problem. Our education and industry reward breadth over depth, familiarity over fluency. We’ve optimized for shipping features quickly rather than understanding systems thoroughly. The result is a kind of technical learned helplessness, where practitioners become dependent on abstractions they can’t peer beneath.

    Ugh, this is frustrating. Do you think a surgeon understands how the CCD camera attached to their laparoscope works? Is the surgeon uneducated because they aren't fluent in the circuit theory that lets the camera display the guts of the patient they're operating on? No, of course not. We want that surgeon to keep studying new surgical techniques, not trying to use Ohm's Law to calculate the current draw of the device they're using. Mr. Author, you and I hobby at electronics (and vintage computing), but just because it's an interest of ours doesn't mean it has to be everyone's.
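    For what it's worth, the Ohm's Law arithmetic being poked fun at really is a one-liner. The 5 V and 250 Ω figures below are made-up illustration values, not anything from a real laparoscope:

    ```python
    def current_draw_amps(voltage_v: float, resistance_ohms: float) -> float:
        # Ohm's law: I = V / R
        return voltage_v / resistance_ohms

    # e.g. a 250-ohm load on a 5 V supply draws 0.02 A (20 mA)
    print(current_draw_amps(5.0, 250.0))
    ```

    Which rather proves the point: the math is trivial, and the surgeon still has no reason to do it.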

    What We Need Now: We need editors who know what a Bode plot is. We need technical writing that assumes intelligence rather than ignorance. We need educational systems that teach principles alongside tools, theory alongside practice.

    Such gatekeeping! So unless you know the actual engineering principles behind a device you're using, you shouldn't be allowed to use it?
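    (For anyone who doesn't know: a Bode plot graphs a circuit's gain and phase against frequency. A minimal sketch of the magnitude side for a first-order RC low-pass filter follows; the 1 kΩ / 159 nF values are just example numbers chosen to put the corner frequency near 1 kHz.)

    ```python
    import math

    def rc_lowpass_gain_db(f_hz: float, r_ohms: float, c_farads: float) -> float:
        """Gain in dB of a first-order RC low-pass filter at frequency f."""
        fc = 1.0 / (2.0 * math.pi * r_ohms * c_farads)  # corner frequency
        return 20.0 * math.log10(1.0 / math.sqrt(1.0 + (f_hz / fc) ** 2))

    # Near the corner frequency the gain is about -3 dB, and it falls
    # roughly 20 dB per decade above it -- the classic Bode-plot shape.
    print(round(rc_lowpass_gain_db(1000.0, 1e3, 159e-9), 1))
    ```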

    Most importantly, we need to stop mistaking novelty for innovation and complexity for progress.

    Innovation isn't just creating new features or functionality. In fact, I'd argue most of it is taking existing features or functions and delivering them for substantially less cost or effort.

    As I'm reading this article, I am thinking about a farmer watching Mr. Author eat a sandwich made with bread. Does Mr. Author know when to till soil or plant seed? How about the amount of irrigation durum wheat needs during the hot season? How about when to harvest? What moisture level should the resulting harvest have before being taken to market or put into long-term storage? Yet there he sits, eating the sandwich, blissfully unaware of all the steps and effort needed just to make the wheat that goes into the bread. The farmer sits and wonders whether Mr. Author's next article will deride the public for just eating bread and forgetting how to grow wheat. Will Mr. Author say we need fewer people ordering sandwiches and more people consulting US GIS maps for rainfall statistics and studying nitrogen-fixing techniques for soil health? No, probably not.

    The best engineering solutions are often elegantly simple. They work reliably, fail predictably, and can be understood by the people who use them.

    Perhaps, but these simple solutions can also frequently offer only simple functionality. Additionally, "the best engineering solutions" are often some of the most expensive. You don't always need the best, and if the best is the only option, that may mean going without, which is worse than a mediocre solution, and is what we frequently had in the past.

    They don't require constant updates or cloud connectivity or subscription services. They just work, year after year, doing exactly what they were designed to do.

    The reason your TI-99 and my C64 don't require constant updates is that they were born before the concept of cybersecurity existed. If you're going to have internet-connected devices, then it's a near requirement that they receive security updates.

    If you don't want internet-connected devices, you can get those too, but they may be extremely expensive, so pony up the cash and put your money where your mouth is.

    That TI-99/4A still boots because it was designed by people who understood every component, every circuit, every line of code.

    It is a machine of extremely limited functionality, with a comparably simple design and construction. I don't think even a DEC PDP-11 minicomputer sold in the same era was entirely understood by any handful of people, and even that machine offers a tiny fraction of the functionality of today's cheap commodity PCs.

    It works because it was built to work, not to generate quarterly revenue or collect user data or enable some elaborate software-as-a-service business model.

    Take off the rose-colored glasses. It was made as a consumer-electronics product at the least cost they thought they could get away with and still have it sell. Its sales absolutely served quarterly revenue numbers, even back in the 1980s.

    We used to build things that lasted.

    We don't need most of these consumer electronics to last. Proof positive: the computer Mr. Author wrote his article on is unlikely to be an Intel-based 486 running at 33 MHz from the mid-90s (or a 68030 Mac). If it still works, why isn't he using one? Could it be he wants the new features and functionality like the rest of us? Over-engineering is a thing, and it sounds like what the author is preaching.

    Apologies if my post turned into a rant.

  • There's also the fact that things have shrunk enough that repair now takes more precision and specialized tools. And some people, myself included, have absolutely no business messing around with capacitors and the like. It is just not everyone's skill set, and that's okay; we live among other people whose skills cover what ours don't, and this work carries very lethal consequences if messed up. That is also why companies don't like people tinkering with the insides of things: electricity does not care who you are, and if not respected it can and will kill you, start a fire, etc. Bad PR and lawsuits follow if someone gets hurt.

  • I came here to post a screed a bit like this, but you did it so eloquently I don't have to, so thanks! A perfect take, imo.

  • The author's take is detached from reality, filled with hypocrisy and gatekeeping.

    This isn't nostalgia talking — it's a recognition that we’ve traded reliability and understanding for the illusion of progress.

    It absolutely is nostalgia talking. Yes your TI-99 fires up immediately when plugged in, and its old. However my Commodore 64 of the same era risk being fried because the 5v regulator doesn't age well and when fails dumps higher voltage right into the RAM and CPU. Oh, and c64 machines were never built with overvoltage protection because of cost savings. So don't confuse age with some idea of golden era reliability. RAM ICs were also regularly failed in those age of computers. This is why you had RAM testing programs and socketed ICs. When was the last time, Mr author, you had to replace a failed DIMM in your modern computer?

    Today’s innovation cycle has become a kind of collective amnesia, where every few years we rediscover fundamental concepts, slap a new acronym on them, and pretend we’ve revolutionized computing. Edge computing? That’s just distributed processing with better marketing. Microservices? Welcome to the return of modular programming, now with 300% more YAML configuration files. Serverless? Congratulations, you’ve rediscovered time-sharing, except now you pay by the millisecond.

    By that logic, even the TI-99 he's loving on is just a fancier ENIAC or UNIVAC. All technology is built upon the era before it. If there was no technological or production cost improvement, we'd just use the old version. Yes, there is a regular shift in computing philosophy, but this is driving by new technologies and usually computing performance descending to be accessibly at commodity pricing. The Raspberry Pi wasn't a revolutionary fast computer, but it changed the world because it was enough computing power and it was dirt cheap.

    There’s something deeply humbling about opening a 40-year-old piece of electronics and finding components you can actually identify. Resistors, capacitors, integrated circuits with part numbers you can look up. Compare that to today’s black-box system-on-chip designs, where a single failure means the entire device becomes e-waste.

    I agree, there is something appealing about it to you and me, but most people don't care....and thats okay! To them its a tool to get something done. They are not in love with the tool, nor do they need to be. There were absolutely users of TI-99 and C64 computers in the 80s that didn't give two shits about the shift register ICs or the UART that made the modem work, but they loved that they could get invoices from their loading dock sent electronically instead of a piece of paper carried (and lost!) through multiple hands.

    Mr. author, no one is stopping you from using your TI-99 today, but in fact you didn't use it to write your article either. Why is that? Because the TI-99 is a tiny fraction of the function and complexity of a modern computer. Creating something close to a modern computer from discrete components with "part numbers you can look up" would be massively expensive, incredibly slow, and comparatively consume massive amounts of electricity vs today's modern computers.

    This isn't their fault — it's a systemic problem. Our education and industry reward breadth over depth, familiarity over fluency. We’ve optimized for shipping features quickly rather than understanding systems thoroughly. The result is a kind of technical learned helplessness, where practitioners become dependent on abstractions they can’t peer beneath.

    Ugh, this is frustrating. Do you think a surgeon understands how a CCD electronic camera works that is attached to their laparoscope? Is the surgeon un-educated that they aren't fluent in circuit theory that allows the camera to display the guts of the patient they're operating on? No, of course not. We want that surgeon to keep studying new surgical technics, not trying to use Ohm's Law to calculate the current draw of the device he's using. Mr author, you and I hobby at electronics (and vintage computing) but just because its an interest of ours, doesn't mean it has to be of everyone.

    What We Need Now: We need editors who know what a Bode plot is. We need technical writing that assumes intelligence rather than ignorance. We need educational systems that teach principles alongside tools, theory alongside practice.

    Such gatekeeping! So unless you know the actual engineering principles behind a device you're using, you shouldn't be allowed to use it?

    Most importantly, we need to stop mistaking novelty for innovation and complexity for progress.

    Innovation isn't just creating new features or functionality. In fact, most I'd argue is taking existing features or functions and delivering them for substantially less cost/effort.

    As I'm reading this article, I am thinking about a farmer watching Mr. author eat a sandwich made with bread. Does Mr. author know when to till soil or plant seed? How about the amount of irrigation Durum wheat needs during the hot season? How about when to harvest? What moisture level should the resulting harvest have before being taken to market or put in long term storage? Yet there he sits, eating the sandwich blissfully unaware of all the steps and effort needed to just make the wheat that goes into the bread. The farmer sits and wonders if Mr. author's next article will be deriding the public on just eating bread and how we've forgotten how to grow wheat. Will Mr. author say we need fewer people ordering sandwiches and more people consulting US GIS maps for rainfall statistics and studying nitrogen fixing techniques for soil health? No, probably not.

    The best engineering solutions are often elegantly simple. They work reliably, fail predictably, and can be understood by the people who use them.

    Perhaps, but these simple solutions also can frequently only offer simple functionality. Additionally, "the best engineering solutions" are often some of the most expensive. You don't always need the best, and if the best is the only option, that may mean going without, which is worse than a mediocre solution, and that is what we frequently had in the past.

    They don't require constant updates or cloud connectivity or subscription services. They just work, year after year, doing exactly what they were designed to do.

    The reason your TI-99 and my C64 don't require constant updates is that they were born before the concept of cybersecurity existed. If you're going to have internet-connected devices, then it's a near requirement that they receive security updates.

    If you don't want internet connected devices, you can get those too, but they may be extremely expensive, so pony up the cash and put your money where your mouth is.

    That TI-99/4A still boots because it was designed by people who understood every component, every circuit, every line of code.

    It is a machine of extremely limited functionality with a comparably simple design and construction. I don't think even a DEC PDP-11 minicomputer sold in the same era was entirely understood by a handful of people, and even that is a tiny fraction of the functionality of today's cheap commodity PCs.

    It works because it was built to work, not to generate quarterly revenue or collect user data or enable some elaborate software-as-a-service business model.

    Take off the rose colored glasses. It was made as a consumer electronics product with the least cost they thought they could get away with and have it still sell. Sales of it absolutely served quarterly revenue numbers even back in the 1980s.

    We used to build things that lasted.

    We don't need most of these consumer electronics to last. Proof positive is that the computer Mr. author is writing his article on is unlikely to be an Intel 486 running at 33 MHz from the mid 90s (or a 68030 Mac). If it still works, why isn't he using one? Could it be he wants the new features and functionality like the rest of us? Over-engineering is a thing, and it sounds like what the author is preaching.

    Apologies if my post turned into a rant.

    I like a lot of your responses. I agree about nostalgia being a main driver of his article. However, I think the bits about how a doctor needs to know how a medical tool functions, etc., are a little misplaced. I think the author was referring to the makers of the device not understanding what they're making, not so much the end user. I ALSO think the author would prefer more broad technical literacy, but his core argument seemed to be that those making things don't understand the tech they're built upon and that unintended consequences can occur when that happens. Worse, if the current technology has been abstracted enough times, eventually no one will know enough to fix it.

  • This post did not contain any content.

    This article is so weirdly written

    One of his points is that a vhs player is easily fixable while a wifi router isn't. These things aren't even remotely the same. They don't serve the same function, they don't have the same complexity. Comparing their repairability makes no sense because they serve different functions. Just because I know how to repair a keyboard doesn't mean I know how to fix a tv.

    Most of his complaints are on the capitalization of modern technology, which is not a problem of innovation and knowledge, it's an economics and political problem.

  • I like a lot of your responses. I agree about nostalgia being a main driver of his article. However, I think the bits about how a doctor needs to know how a medical tool functions, etc., are a little misplaced. I think the author was referring to the makers of the device not understanding what they're making, not so much the end user. I ALSO think the author would prefer more broad technical literacy, but his core argument seemed to be that those making things don't understand the tech they're built upon and that unintended consequences can occur when that happens. Worse, if the current technology has been abstracted enough times, eventually no one will know enough to fix it.

    I think the author was referring to the makers of the device not understanding what they're making, not so much the end user.

    Just to make sure I'm following your thread of thought, are you referring to this part of the author's opinion piece or something else in his text?

    "This wouldn’t matter if it were just marketing hyperbole, but the misunderstanding has real consequences. Companies are making billion-dollar bets on technologies they don’t understand, while actual researchers struggle to separate legitimate progress from venture capital fever dreams. We’re drowning in noise generated by people who mistake familiarity with terminology for comprehension of the underlying principles."

  • I think the "black box" nature of electronics is mostly illusory due to how we treat our devices. A friend bought a walking treadmill that wouldn't turn on out of the box. She contacted the company, they told her to trash it and just shipped her a new one.

    She gave it to me, I took it apart. One of the headers that connects the power switch to the mainboard was just unplugged. It took literally 10 minutes to "fix" including disassembly and assembly, and all I needed was a screwdriver.

    This is a symptom of industry switching to cheap "disposable" electronics, rather than more expensive, robust, and repairable ones.

    From the treadmill company's point of view, it's cheaper to just lose one unit and pay shipping one way rather than pay to have the unit returned, spend valuable technician time diagnosing and fixing an issue and then pay to ship the repaired unit back.

    About 50 years ago, you could find appliance repair shops that would fix your broken toaster or TV, and parts for stuff like that were easily available. Now, with the advanced automation in building these, combined with the increased difficulty of repair (fine-work soldering, firmware debugging and the like), it makes way more sense to just replace the whole thing.

    Now, with the advanced automation in building these, combined with the increased difficulty of repair (fine-work soldering, firmware debugging and the like), it makes way more sense to just replace the whole thing.

    The other valid component to your argument is the cost of labor now. It is more expensive to maintain a staff of people to perform repairs and manage the logistics of transporting units to service than it is to simply lose 100% of the wholesale value of the handful of items that fail within the warranty period. Labor, especially skilled labor, is really really expensive in the western world.

  • This post did not contain any content.

    VHS player

    VCR.

    It stands for Video Cassette Recorder. There is no such thing as a "player". They all have recording capability.

    I've programmed the Intel 8051. I made a firmware update to get it working on 4G/LTE modems. I must say the debug tools weren't the greatest. There was a lot of logging involved.

    A lot of modern tech is garbage. You just need to practice the purchasing habits of Richard Stallman. There are literally hundreds of routers on the market that you can install your own custom OS on. This is the case with many phones, and almost every PC.

    "VCR" vs "VHS Player":

  • VHS player

    VCR.

    It stands for Video Cassette Recorder. There is no such thing as a "player". They all have recording capability.


    My memory was that there were exclusive players for TV stations. But all the consumer ones could record.

  • VHS player

    VCR.

    It stands for Video Cassette Recorder. There is no such thing as a "player". They all have recording capability.


    What is this, Perl? It's not write-only. VCR may stand for video cassette recorder but it's also a VCP.

  • This post did not contain any content.

    This really made me want to find a BASIC emulator and a collection of old programs and introduce them to my kids. I feel like Lua is the modern BASIC. I just picked up Replicube and plan to use it to introduce my daughter to Lua.

  • VHS player

    VCR.

    It stands for Video Cassette Recorder. There is no such thing as a "player". They all have recording capability.


    Insane hill to die on but you do you.

  • Meanwhile, my Wi-Fi router requires a PhD in reverse engineering just to figure out why it won’t connect to the internet.

    I do think people in general could benefit from maybe $100 in tools and a healthy dose of Youtube when it comes to this point. My PC of 10 years wouldn't boot one morning because my SSD died. There wasn't anything too important on it that I hadn't backed up, but it was still a bummer. I took it apart, and started poking around. Found a short across a capacitor, so I started cycling capacitors. Sure enough, one was bad. Replaced it. Boots just fine. (Moved everything to a new SSD just in case).

    All I needed for this job was a multimeter and a soldering iron (though hot air gun made it slightly easier).

    I think the "black box" nature of electronics is mostly illusory due to how we treat our devices. A friend bought a walking treadmill that wouldn't turn on out of the box. She contacted the company, they told her to trash it and just shipped her a new one.

    She gave it to me, I took it apart. One of the headers that connects the power switch to the mainboard was just unplugged. It took literally 10 minutes to "fix" including disassembly and assembly, and all I needed was a screwdriver.

    Yet there's zero expectation of user maintenance. If it doesn't work, trash it.

    Scroll through maker TikTok

    This guy might be looking in the wrong places.

    Except with Internet shit, it's usually some dumbass at your ISP who is only trained to answer the phone and parrot from 3 different prompts. Actually getting someone who can flip the switch/register your device in the proper region to make shit actually work on their end is the hard part.

  • While I 100% agree with the fact that even modern things can be fixed with some knowhow and troubleshooting (and spare capacitors or the like), there’s a few things at play:

    • people generally don’t have this skill set
    • electronics tend to be made cheaper, this means they may fail faster but also means they can be replaced cheaper
    • it costs real money for tech support that can fix said issues, often many times more money than the thing costs to replace

    As a retro enthusiast, I’ve fixed my share of electronics that only needed an hour and a $2 capacitor. But there was also $7 shipping for the cap, and 30-60min of labor, and my knowhow in troubleshooting and experience. If the company had to send someone out, they’d likely spend well over $200 for time, gas, labor, parts, etc. not including a vehicle for the tech and the facility nearby and all that good stuff. Even in the retro sphere, the math starts to side towards fix because of the rarity, but it’s not always clear.

    As a retro enthusiast, I’ve fixed my share of electronics that only needed an hour and a $2 capacitor. But there was also $7 shipping for the cap, and 30-60min of labor, and my knowhow in troubleshooting and experience. If the company had to send someone out, they’d likely spend well over $200 for time, gas, labor, parts, etc. not including a vehicle for the tech and the facility nearby and all that good stuff.

    This is exactly it. I used to work for a manufacturer that made devices they would often need to repair. They would bill non-warranty labor at $100/hour, plus the cost of parts. Their products were primarily used by professionals, so that was fine when it was being done to repair something that cost between $700-$4,000 new, especially for people who were making money using the product.

    When they launched a product at a $500 MSRP, though, it started to get harder, and even more so when competition forced them to lower the price to $400. When I left they were about to launch a product targeted at amateurs, originally aiming for a $200 price. It was actually being built by a Chinese competitor, with our software guys contributing to the system and putting our logo on it. Spending $100 labor to repair a $200 device was going to be a tough sell, and when I left the plan for warranty “repairs” was to just give the customer a replacement unit and scrap the defective one.

    And I'm sure the repair labor rate was going up; they had a hard time hiring qualified technicians at the rate they wanted to pay, and most of the department had quit/moved to new roles when I left, so they were surely having to increase pay and the rate they billed.

    When something’s being built on an assembly line mostly by machine and/or low-cost Asian labor, it’s harder for a company to justify paying a skilled technician’s labor in a western country when that makes the cost of repair close to the cost of a new unit.
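    The repair-vs-replace arithmetic these comments describe can be sketched in a few lines. The $2 capacitor, $7 shipping, $100/hour labor rate, and $200 replacement price are the figures from the thread above; the $120 travel/vehicle overhead for an on-site service call is an assumed illustrative number, not a quoted rate.

```python
# Back-of-the-envelope repair-vs-replace math, using figures from the
# comments above. The $120 on-site overhead is an assumption.

def repair_cost(parts, shipping, labor_hours, labor_rate, overhead=0):
    """Total cost of a repair: parts, shipping, billed labor, plus any
    fixed overhead (travel, vehicle, facility)."""
    return parts + shipping + labor_hours * labor_rate + overhead

# DIY fix: $2 capacitor, $7 shipping, an hour of your own (free) labor
diy = repair_cost(parts=2, shipping=7, labor_hours=1, labor_rate=0)      # $9

# Bench repair at the $100/hour non-warranty rate mentioned above
bench = repair_cost(parts=2, shipping=7, labor_hours=1, labor_rate=100)  # $109

# On-site service call: same hour of labor, plus assumed travel overhead
onsite = repair_cost(parts=2, shipping=0, labor_hours=1, labor_rate=100,
                     overhead=120)                                       # $222

replacement = 200  # the amateur-priced device from the story above

for label, cost in [("DIY", diy), ("bench", bench), ("on-site", onsite)]:
    verdict = "repair" if cost < replacement else "replace"
    print(f"{label}: ${cost} -> {verdict}")
```

    With these numbers the DIY and bench repairs come in under the replacement price, but the on-site call does not, which is why companies ship a new unit rather than roll a truck.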

  • This article is so weirdly written

    One of his points is that a vhs player is easily fixable while a wifi router isn't. These things aren't even remotely the same. They don't serve the same function, they don't have the same complexity. Comparing their repairability makes no sense because they serve different functions. Just because I know how to repair a keyboard doesn't mean I know how to fix a tv.

    Most of his complaints are on the capitalization of modern technology, which is not a problem of innovation and knowledge, it's an economics and political problem.

    Fire good. Angry gods strike ground. Man take fire. Place food on top. Simple.
    E-e-elic-tri-s-i-ty bad. Complicated. Not know who volt is. Sparks scary. Place food on top not know how.

  • VHS player

    VCR.

    It stands for Video Cassette Recorder. There is no such thing as a "player". They all have recording capability.


    "VCR" vs "VHS Player":

    Conjuring up a frequency graph from 2004-present doesn't help your argument, as the VCR format wars were pretty much over a good 15 years beforehand.

    "VCR" could have meant either VHS or Betamax to a consumer in the early '80s.

    At least VHS specifies a particular standard, and "player" in that context has a loose connection with record player or tape player, being the thing you play your purchased records / tapes / videos on.

  • This post did not contain any content.

    A year or two ago I read about some guy who is still managing the trailer park he inherited from his dad with a TRS-80 (I think), using an app he wrote way back when. If it works it works!

  • You don't have to fix everything, but just doing stuff like replacing connectors and capacitors could probably save 10% of the shit that we throw away, and it's not that hard to try.

    I do agree with that completely and I'd like to add to it with an additional point.

    When things break it sucks, but this does present you with an opportunity. If it's already not working, there's no harm in taking it apart and taking a look around. Maybe you'll see something obviously at fault, maybe you won't. But there's literally no harm in trying to fix it, especially if otherwise you were planning to toss it out.

    And I really can't tell you the number of times I've seen a device stop working, and upon closer inspection the entire problem was something very simple, like an old wire broken at the solder point, and with it disconnected, the power switch didn't work. When I was a kid and didn't know how to solder, I would fix issues like that with some aluminum foil, and often it worked. Just start with a screwdriver, open things up, take a look around. We owe it to ourselves and to the planet to just give it a shot.