We're Not Innovating, We’re Just Forgetting Slower
-
VHS player
VCR.
It stands for Video Cassette Recorder. There is no such thing as a "player". They all have recording capability.
I've programmed the Intel 8051. I made a firmware update to get it working on 4G/LTE modems. I must say the debug tools weren't the greatest. There was a lot of logging involved.
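For anyone curious what "a lot of logging" tends to look like on that chip, here's a minimal sketch of polled UART logging on a classic 8051. It assumes a Keil C51 toolchain, an 11.0592 MHz crystal, and Timer 1 as the baud generator; the log message is a made-up example, not anything from the actual modem firmware.

```c
/* Minimal polled UART logging on a classic 8051 (Keil C51 style).
   Assumes an 11.0592 MHz crystal and Timer 1 in mode 2 as the baud generator. */
#include <reg51.h>   /* standard 8051 SFR declarations (SCON, SBUF, TMOD, ...) */

static void uart_init(void)
{
    SCON = 0x50;    /* mode 1: 8-bit UART, receive enabled */
    TMOD &= 0x0F;
    TMOD |= 0x20;   /* Timer 1, mode 2: 8-bit auto-reload */
    TH1  = 0xFD;    /* 9600 baud at 11.0592 MHz with SMOD = 0 */
    TR1  = 1;       /* start Timer 1 */
    TI   = 1;       /* mark the transmitter ready for the first byte */
}

static void uart_putc(char c)
{
    while (!TI);    /* wait until the previous byte has left the shift register */
    TI = 0;
    SBUF = c;
}

static void log_str(const char *s)
{
    while (*s) uart_putc(*s++);
    uart_putc('\r');
    uart_putc('\n');
}

void main(void)
{
    uart_init();
    log_str("modem: entering attach state");  /* made-up example message */
    for (;;);
}
```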
A lot of modern tech is garbage. You just need to practice the purchasing habits of Richard Stallman. There are literally hundreds of routers on the market that you can install your own custom OS on. This is the case with many phones, and almost every PC.
"VCR" vs "VHS Player":
Conjuring up a frequency graph from 2004-present doesn't help your argument, as the VCR format wars were pretty much over a good 15 years beforehand.
"VCR" could have meant either VHS or Betamax to a consumer in the early '80s.
At least VHS specifies a particular standard, and "player" in that context has a loose connection with record player or tape player, being the thing you play your purchased records / tapes / videos on.
-
A year or two ago I read about some guy who is still managing the trailer park he inherited from his dad with a TRS-80 (I think), using an app he wrote way back when. If it works, it works!
-
You don't have to fix everything, but just doing stuff like replacing connectors and capacitors could probably save 10% of the shit that we throw away, and it's not that hard to try.
I do agree with that completely, and I'd like to add an additional point.
When things break it sucks, but this does present you with an opportunity. If it's already not working, there's no harm in taking it apart and taking a look around. Maybe you'll see something obviously at fault, maybe you won't. But there's literally no harm in trying to fix it, especially if otherwise you were planning to toss it out.
And I really can't tell you the number of times I've seen a device stop working, and upon closer inspection the entire problem was something very simple, like an old wire that broke at its solder point, and with it disconnected, the power switch doesn't work. When I was a kid and didn't know how to solder, I would fix issues like that with some aluminum foil, and often it worked. Just start with a screwdriver, open things up, take a look around. We owe it to ourselves and to the planet to just give it a shot.
-
USB-C. We are clearly progressing. I never want to go back to the world where every phone had a proprietary power brick and connector.
-
Except with Internet shit, it's usually some dumbass at your ISP who is only trained to answer the phone and parrot from 3 different prompts. Actually getting someone who can flip the switch or register your device in the proper region so shit actually works on their end is another story.
Apparently I've angered some call center workers.
Ah yes, the shibboleet
-
The author's take is detached from reality, filled with hypocrisy and gatekeeping.
This isn't nostalgia talking — it's a recognition that we’ve traded reliability and understanding for the illusion of progress.
It absolutely is nostalgia talking. Yes, your TI-99 fires up immediately when plugged in, and it's old. However, my Commodore 64 of the same era risks being fried because the 5V regulator doesn't age well and, when it fails, dumps higher voltage right into the RAM and CPU. Oh, and C64 machines were never built with overvoltage protection because of cost savings. So don't confuse age with some idea of golden-era reliability. RAM ICs also regularly failed in computers of that age. This is why you had RAM testing programs and socketed ICs. When was the last time, Mr. author, you had to replace a failed DIMM in your modern computer?
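For anyone who never ran one, here's a rough sketch of the kind of walking-ones check those RAM testing programs did. It runs over a malloc'd buffer purely as an illustration; a real 1980s tester banged on the physical address space directly and usually added inverse and address-in-address patterns to catch shorted address lines, but the principle is the same.

```c
/* Walking-ones memory test sketch: walk a single set bit through every
   cell and verify it reads back. Illustration only, not a real hardware test. */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

static int walking_ones_test(volatile uint8_t *mem, size_t len)
{
    for (size_t i = 0; i < len; i++) {
        for (int bit = 0; bit < 8; bit++) {
            uint8_t pattern = (uint8_t)(1u << bit);
            mem[i] = pattern;
            if (mem[i] != pattern) {
                printf("stuck bit at offset %zu, pattern %02X\n", i, (unsigned)pattern);
                return 0;
            }
        }
        mem[i] = 0x00;           /* leave the cell cleared */
    }
    return 1;
}

int main(void)
{
    size_t len = 64 * 1024;      /* a C64-sized 64 KB block */
    uint8_t *buf = malloc(len);
    if (!buf) return 1;
    printf(walking_ones_test(buf, len) ? "RAM OK\n" : "RAM FAULT\n");
    free(buf);
    return 0;
}
```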
Today’s innovation cycle has become a kind of collective amnesia, where every few years we rediscover fundamental concepts, slap a new acronym on them, and pretend we’ve revolutionized computing. Edge computing? That’s just distributed processing with better marketing. Microservices? Welcome to the return of modular programming, now with 300% more YAML configuration files. Serverless? Congratulations, you’ve rediscovered time-sharing, except now you pay by the millisecond.
By that logic, even the TI-99 he's loving on is just a fancier ENIAC or UNIVAC. All technology is built upon the era before it. If there was no technological or production cost improvement, we'd just use the old version. Yes, there is a regular shift in computing philosophy, but this is driven by new technologies and usually by computing performance descending to commodity pricing. The Raspberry Pi wasn't revolutionary for its speed, but it changed the world because it offered enough computing power and it was dirt cheap.
There’s something deeply humbling about opening a 40-year-old piece of electronics and finding components you can actually identify. Resistors, capacitors, integrated circuits with part numbers you can look up. Compare that to today’s black-box system-on-chip designs, where a single failure means the entire device becomes e-waste.
I agree, there is something appealing about it to you and me, but most people don't care... and that's okay! To them it's a tool to get something done. They are not in love with the tool, nor do they need to be. There were absolutely users of TI-99 and C64 computers in the '80s who didn't give two shits about the shift-register ICs or the UART that made the modem work, but they loved that they could get invoices from their loading dock sent electronically instead of as a piece of paper carried (and lost!) through multiple hands.
Mr. author, no one is stopping you from using your TI-99 today, but in fact you didn't use it to write your article either. Why is that? Because the TI-99 is a tiny fraction of the function and complexity of a modern computer. Creating something close to a modern computer from discrete components with "part numbers you can look up" would be massively expensive, incredibly slow, and would consume massive amounts of electricity compared with today's computers.
This isn't their fault — it's a systemic problem. Our education and industry reward breadth over depth, familiarity over fluency. We’ve optimized for shipping features quickly rather than understanding systems thoroughly. The result is a kind of technical learned helplessness, where practitioners become dependent on abstractions they can’t peer beneath.
Ugh, this is frustrating. Do you think a surgeon understands how the CCD camera attached to their laparoscope works? Is the surgeon uneducated because they aren't fluent in the circuit theory that allows the camera to display the guts of the patient they're operating on? No, of course not. We want that surgeon to keep studying new surgical techniques, not trying to use Ohm's Law to calculate the current draw of the device he's using. Mr. author, you and I hobby at electronics (and vintage computing), but just because it's an interest of ours doesn't mean it has to be everyone's.
What We Need Now: We need editors who know what a Bode plot is. We need technical writing that assumes intelligence rather than ignorance. We need educational systems that teach principles alongside tools, theory alongside practice.
Such gatekeeping! So unless you know the actual engineering principles behind a device you're using, you shouldn't be allowed to use it?
Most importantly, we need to stop mistaking novelty for innovation and complexity for progress.
Innovation isn't just creating new features or functionality. In fact, I'd argue most of it is taking existing features or functions and delivering them at substantially less cost/effort.
As I'm reading this article, I am thinking about a farmer watching Mr. author eat a sandwich made with bread. Does Mr. author know when to till soil or plant seed? How about the amount of irrigation durum wheat needs during the hot season? How about when to harvest? What moisture level should the resulting harvest have before being taken to market or put in long-term storage? Yet there he sits, eating the sandwich, blissfully unaware of all the steps and effort needed just to make the wheat that goes into the bread. The farmer sits and wonders if Mr. author's next article will deride the public for just eating bread and for forgetting how to grow wheat. Will Mr. author say we need fewer people ordering sandwiches and more people consulting US GIS maps for rainfall statistics and studying nitrogen-fixing techniques for soil health? No, probably not.
The best engineering solutions are often elegantly simple. They work reliably, fail predictably, and can be understood by the people who use them.
Perhaps, but these simple solutions also can frequently only offer simple functionality. Additionally, "the best engineering solutions" are often some of the most expensive. You don't always need the best, and if the best is the only option, then that may mean going without, which is worse than a mediocre solution and is what we frequently had in the past.
They don't require constant updates or cloud connectivity or subscription services. They just work, year after year, doing exactly what they were designed to do.
The reason your TI-99 and my C64 don't require constant updates is that they were born before the concept of cybersecurity existed. If you're going to have internet-connected devices, then it's nearly a requirement that they receive security updates.
If you don't want internet connected devices, you can get those too, but they may be extremely expensive, so pony up the cash and put your money where your mouth is.
That TI-99/4A still boots because it was designed by people who understood every component, every circuit, every line of code.
It is a machine of extremely limited functionality with a comparably simple design and construction. I don't think even a DEC PDP-11 minicomputer sold in the same era was entirely known by a handful of people, and even that is a tiny fraction of the functionality of today's cheap commodity PCs.
It works because it was built to work, not to generate quarterly revenue or collect user data or enable some elaborate software-as-a-service business model.
Take off the rose-colored glasses. It was made as a consumer electronics product with the least cost they thought they could get away with and have it still sell. Sales of it absolutely served quarterly revenue numbers even back in the 1980s.
We used to build things that lasted.
We don't need most of these consumer electronics to last. Proof positive: the computer Mr. author is writing his article on is unlikely to be an Intel-based 486 running at 33 MHz from the mid-'90s (or a 68030 Mac). If it still works, why isn't he using one? Could it be he wants the new features and functionality like the rest of us? Over-engineering is a thing, and it sounds like that's what the author is preaching.
Apologies if my post turned into a rant.
The author’s take is detached from reality, filled with hypocrisy and gatekeeping.
"Opinionated" is another term - for friendliness and neutrality. Complaining about reality means a degree of detachment from it by intention.
When was the last time, Mr. author, you had to replace a failed DIMM in your modern computer?
When was the last time, Mr. commenter, you had to make your own furniture because it was harder to find a thing of the right dimensions to buy? Back when that was more common, it was also easier to get the materials and the tools, because ordering things over the Internet and getting them delivered the next day was less common. In terms of managing my home, I feel that the '00s were nicer than now.
If the centralized "silk road" of today were disrupted, with TSMC knocked out (by a nuke, suppose, or a political change), would you prefer less efficient yet more distributed production of electronics? That would leave less room for the various things hidden from users that happen in modern RAM. Possibly much less.
If there was no technological or production cost improvement, we’d just use the old version.
I think their point was that there's no architectural innovation in some things.
Yes, there is a regular shift in computing philosophy, but this is driven by new technologies and usually by computing performance descending to commodity pricing. The Raspberry Pi wasn't revolutionary for its speed, but it changed the world because it offered enough computing power and it was dirt cheap.
Maybe those shifts are in market philosophies in tech.
I agree, there is something appealing about it to you and me, but most people don't care... and that's okay! To them it's a tool to get something done. They are not in love with the tool, nor do they need to be.
Take a screwdriver, for example. I can imagine there's a fitting baseline amount of attention a given piece of knowledge deserves. I can imagine that a person who doesn't know how to use a screwdriver (substitute something better if you like) is below that. And some are far above it, maybe.
I think the majority of humans are below the level of knowledge that computers in our reality require. That's not the level you or the author possess. That's about the level I possessed in my childhood, nothing impressive.
Mr. author, no one is stopping you from using your TI-99 today, but in fact you didn't use it to write your article either. Why is that? Because the TI-99 is a tiny fraction of the function and complexity of a modern computer. Creating something close to a modern computer from discrete components with "part numbers you can look up" would be massively expensive, incredibly slow, and would consume massive amounts of electricity compared with today's computers.
It would seem, then, that we are getting a better deal for the same amount of energy spent with modern computers. Does this seem right to you?
It's philosophy and not logic, but I think you know that to get something you pay something. There's no energy out of nowhere.
Discrete components may not make sense. But maybe the insane efficiency we have is paid for with our future. It's made possible by the centralization of economy, society, and geopolitics, which wasn't needed to make the TI-99.
Do you think a surgeon understands how the CCD camera attached to their laparoscope works? Is the surgeon uneducated because they aren't fluent in the circuit theory that allows the camera to display the guts of the patient they're operating on?
A surgeon has another specialist nearby, and that specialist doesn't just know these things; they also have a lot of the other knowledge necessary for them and the surgeon to communicate unambiguously, avoiding fatal mistakes. A bit more expense is spent here than just throwing a device at a surgeon who doesn't understand how it works. A fair bit.
Such gatekeeping! So unless you know the actual engineering principles behind a device you’re using, you shouldn’t be allowed to use it?
Why not:
Such respect! In truth, why wouldn't we trust students to make good use of an understanding of their tools and the universe around them, since every human's corpus of knowledge is unique and wonderful, rather than intentionally limiting them?
Innovation isn't just creating new features or functionality. In fact, I'd argue most of it is taking existing features or functions and delivering them at substantially less cost/effort.
Is a change of policy innovation? In our world I see a lot of that, driven by social, commercial, and political interests, naturally.
As I’m reading this article, I am thinking about a farmer watching Mr. author eat a sandwich made with bread.
A basic touch on the things you describe further on is supposed to be part of the school program in many countries.
Perhaps, but these simple solutions also can frequently only offer simple functionality. Additionally, "the best engineering solutions" are often some of the most expensive. You don't always need the best, and if the best is the only option, then that may mean going without, which is worse than a mediocre solution and is what we frequently had in the past.
Does more complex functionality justify this? Who decides what we need? Who decides what is better and what is worse?
This comes down to policy decisions again. Authority. I think modern authority is misplaced, and were it not, we'd have an environment more similar to what the author wants.
The reason your TI-99 and my C64 don't require constant updates is that they were born before the concept of cybersecurity existed. If you're going to have internet-connected devices, then it's nearly a requirement that they receive security updates.
Not all updates are for security. And an insecure device can still work year after year.
If you don’t want internet connected devices, you can get those too, but they may be extremely expensive, so pony up the cash and put your money where your mouth is.
Willpower is a tremendous limitation which people usually ignore. It's very hard to do this when everyone around you doesn't. It would be very easy if you were choosing for yourself, without network effects and interoperability requirements.
So your argument, for me, doesn't work in your favor when looked at closely. (Similar to "if you disagree with this law, you can explain that at the police station".)
I don't think even a DEC PDP-11 minicomputer sold in the same era was entirely known by a handful of people, and even that is a tiny fraction of the functionality of today's cheap commodity PCs.
There's a graphical 2D space shooter game for the PDP-11. Just saying.
Also, Soviet clones were made on its architecture, in the form factor of PCs. With networking capabilities, they were used as command machines for other, simpler kinds of PCs or for production lines, and could be used as file shares, IIRC. I don't remember what that was called, but the absolutely weirdest part was seeing people in comments remembering using them in university computer labs and even in school computer labs, so that actually existed in the USSR.
Kinda expensive though, even without Soviet inefficiency.
It was made as a consumer electronics product with the least cost they thought they could get away with and have it still sell.
Yes, which leads to different requirements today. This doesn't stop the discussion; it leads to the question of what changed. We are not obligated to take the perpetual centralization of economies and societies as some divine judgement.
We don’t need most of these consumer electronics to last.
Who's "we"? Are you deciding what Intel R&D will focus on, or what Microsoft will change in their OS and applications, or what Apple will produce?
Authority, again.
If it still works, why isn’t he using one? Could it be he wants the new features and functionality like the rest of us?
Yes. It still works for offline purposes. It doesn't work where the modern web is not operable on it. This, in my opinion, reinforces their idea, not yours.
These are my replies. I'll add my own principal opinion: a civilization can only be as tall as the humans forming it. Abstractions leak, and our world is continuous, so all abstractions leak. To know which do and don't for a particular purpose, you need to know principles. You can use abstractions without looking inside them to build a system inside an architecture, but you can't build an architecture and pick real-world solutions for those abstractions without understanding those real-world solutions. Also, horizontal connections between abstractions are much more tolerant of leaks than vertical ones.
And there's no moral law forbidding us to look above our current environment to understand in which directions it may change.
-
I think the author was referring to the makers of the device not understanding what they're making, not so much the end user.
Just to make sure I'm following your thread of thought, are you referring to this part of the author's opinion piece or something else in his text?
"This wouldn’t matter if it were just marketing hyperbole, but the misunderstanding has real consequences. Companies are making billion-dollar bets on technologies they don’t understand, while actual researchers struggle to separate legitimate progress from venture capital fever dreams. We’re drowning in noise generated by people who mistake familiarity with terminology for comprehension of the underlying principles."
Yes, but also the bit about when someone creates an application without understanding the underlying way that it actually functions. Like, I can make a web app, but I don't need to understand memory allocation to do it. The maker of the app is a level or two of abstraction away from what the bare metal of the computer is being told to do.
-
Gotcha, thank you for the extra context so I understand your point. I'll respond to your original statement now that I understand it better:
I ALSO think the author would prefer broader technical literacy, but his core argument seemed to be that those making things don't understand the tech they're built upon and that unintended consequences can occur when that happens.
I think the author's argument on that is also not a great one.
Let's take your web app example. As you said, you can make the app, but you don't understand the memory allocation, and why? Because the high-level language or framework you wrote it in does memory management and garbage collection. However, there are many, many, MANY more layers of abstraction besides just your code and the interpreter. Do you know the web server front to back? Do you know which ring your app or the web server is operating in inside the OS (ring 3, BTW)? Do you know how the IP stack works in the server? Do you know how the networking works that resolves names to IP addresses or routes the traffic appropriately? Do you know how the firewalls work that the traffic is going over when it leaves the server? Back on the server, do you know how the operating system makes calls to the hardware via device drivers (ring 1) or how those calls are handled by the OS kernel (ring 0)? Do you know how the system bus works on the motherboard, or how the L1, L2, and L3 caches affect the operation and performance of the server overall? How about the fact that assembly language isn't even the bottom of the stack of abstractions? Below that, all of this data is merely an abstraction of binary, which is really just the presence or absence of voltage on a pin or in a bit register in ICs scattered across the system.
I'll say probably not. And that's just fine! Why? Because unless your web app is going to be loaded onto a spacecraft with a 20-to-40-year life span that you'll never be able to touch again, all of that extra knowledge and understanding has only a slight impact on the web app for its entire life. Once you get one or maybe two levels of abstraction down, the knowledge is a novelty, not a requirement. There are also exceptions to this if you're writing software for embedded systems where you have limited system resources, but again, that is an edge case that very, very few people will ever need to worry about. The people in those professions generally do have a deep understanding of the platforms they're responsible for.
Focus on your web app. Make sure it's solving the problem it was written to solve. Yes, you might need to dive a bit deeper to eke out some performance, but that comes with time and experience anyway. The author talks like even the most novice people need the deepest understanding through all layers of abstraction. I think that is too much of a burden, especially when it acts as a barrier to people being able to jump in and use the technology to solve problems.
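To make "one or two levels of abstraction down" concrete, here's a rough sketch in plain POSIX C of what sits underneath a single HTTP request once the framework is out of the way. The hostname is just a placeholder and the error handling is minimal.

```c
/* One layer down from the framework: resolve a hostname and speak a bare
   HTTP request over a TCP socket yourself. POSIX sockets, so assume
   Linux/macOS; "example.com" is just a placeholder host. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <netdb.h>
#include <sys/types.h>
#include <sys/socket.h>

int main(void)
{
    struct addrinfo hints = {0}, *res = NULL;
    hints.ai_family   = AF_UNSPEC;     /* let the resolver pick IPv4 or IPv6 */
    hints.ai_socktype = SOCK_STREAM;   /* TCP */

    /* Name resolution happens here: /etc/hosts, DNS, whatever the system uses. */
    if (getaddrinfo("example.com", "80", &hints, &res) != 0) {
        fprintf(stderr, "name resolution failed\n");
        return 1;
    }

    /* The IP stack, routing tables, and any firewalls sit below these calls. */
    int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) != 0) {
        perror("connect");
        freeaddrinfo(res);
        return 1;
    }
    freeaddrinfo(res);

    /* A minimal HTTP/1.1 request, no framework in sight. */
    const char *req = "GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n";
    write(fd, req, strlen(req));

    char buf[512];
    ssize_t n = read(fd, buf, sizeof(buf) - 1);
    if (n > 0) {
        buf[n] = '\0';
        printf("%s\n", buf);           /* first chunk of the response headers */
    }
    close(fd);
    return 0;
}
```

Worth having seen once, but as I said above, hardly a prerequisite for shipping a working web app.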
Perhaps the best example of the world I think the author wants would be the 1960s Apollo program. This was a time when the pinnacle of technology was being deployed in real time to solve world-moving problems. Humankind was trying to land on the moon! The most heroic optimization of machines and procedures had to be accomplished for even a chance of this going right. The best of the best had to know every. little. thing. about. everything. People's lives were at stake! National pride was at stake! Failure was NOT an option! All of that speaks more to what the author wants for everyone today.
However, that's trying to solve a problem that doesn't exist today. Compute power today is CHEAP!!! High-level programming languages and frameworks are so easy to understand that programming is accessible to everyone with a device and a desire to use it. We're not going to the moon with this. It's the kid down the block who figured out how to use If This Then That to make a light bulb turn on when he farts into a microphone. The beauty is the accessibility. The democratization of compute. We don't need gatekeepers demanding the deepest commitment to understanding before the primitive humans are allowed to use fire.
Are there going to be problems or things that don't work? Yes. Will the net benefit of cheap and readily available compute in the hands of everyone be greater than the detriments? I believe yes. It appears the author disagrees with me.
/sorry for the wall of text
-
As with your original comment, I like your argument.
Additionally, I dig the wall of text. WoT, written well, leaves little ambiguity and helps focus the conversation.
I don't disagree on any particular point. I agree that it's a net positive for programming to be approachable to more people, and that it can't be approachable to many while requiring Apollo-era genius and a deep understanding of technology. It would be a very different world if only PhDs could program computers.
To that, I believe the article author is overstating a subtle concern that I think is theoretically relevant and important to explore.
If, over the fullness of decades, programming becomes so approachable (i.e., you tell an AI in plain language what you want and it makes it flawlessly), people will have less incentive to learn the foundational concepts required to make the same program "from scratch". Extending that train of thought, we could reach a point where a fundamental "middle technology" fails and there simply isn't anyone who understands how to fix the problem. I suspect there will always be hobbyists and engineers who maintain esoteric knowledge for a variety of reasons. But with all the levels of abstraction and fail points inadvertently built into code over so much time, it's possible to imagine a situation where essentially no one understands the library of the language that a core dependency was written in decades before. Not only would it be a challenge to fix, it could be hard to find in the first place.
If the break happens in your favorite cocktail recipe app, it's inconvenient. If the break happens in a necessary system relied on by fintech to move people's money from purchase to vendor to bank to vendor to person, the scale and importance of the break is devastating to the world. Even if you can seek out and find the few who have enough knowledge to solve the problem, the time spent with such a necessary function of modern life unavailable would be catastrophic.
If a corporation, in an effort to save money, opts to hire a cheap 'vibe-coder' in the '20s and something they 'vibe' winds up in important stacks, it could build fault lines into future code that may be used for who-knows-what decades from now.
There are a lot of ifs in my examples. It may never happen, and we'll get the advantage of all the ideas that are able to be made reality through accessibility. However, it's better to think about it now rather than contend with it all at once when a catastrophe occurs.
You're right that doom and gloom isn't helpful, but I don't think the broader idea is without merit.
-
There are some actual real-life examples that match your theoreticals, but the piece missing is the scale of consequences. What has generally occurred is that the fallout from the old thing failing wasn't that big of a deal, or that a modern solution could be designed and built completely replacing the legacy solution even without full understanding of it.
A really, really small example of this is from my old 1980s Commodore 64 computer. At the time it used a very revolutionary sound chip to make music and sound effects, called the SID chip. Here's one of them, constructed in 1987.
It combined digital technologies (which are still used today) with analog technologies (that nobody makes anymore in the same way). Sadly, these chips also have a habit of dying over time because of how they were originally manufactured. With the supply of these continuously shrinking, there were efforts to come up with a modern replacement. Keep in mind these are hobbyists. What they came up with was this:
This is essentially a whole Raspberry Pi computer that fits into the same socket in the 1980s Commodore 64. It accepts the incoming music instructions from the computer and runs custom-written software to produce the same output as the legacy digital/analog SID chip designed in 1982. The computing power in this modern SID replacement is more than 30x that of the entire Commodore 64 from the '80s! It could be considered overkill to use so much computing power where the original didn't, but again, compute is dirt cheap today. This new part isn't expensive either; it's about $35 to buy.
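For a flavor of what that custom software has to do, here's a toy sketch (emphatically not the actual firmware) of one SID voice reduced to its phase-accumulating oscillator: the C64 writes a 16-bit frequency register, and software turns that into audio samples instead of an analog circuit doing it. A faithful replacement also has to model the other waveforms, the ADSR envelopes, and the analog filter, which is presumably where much of that extra computing power goes.

```c
/* Toy model of one SID voice: a 24-bit phase accumulator driven by the
   chip clock, with the top bit read out as a square wave. Illustration
   of the idea only. */
#include <stdio.h>
#include <stdint.h>

#define SAMPLE_RATE 44100u
#define SID_CLOCK   985248u          /* PAL C64 chip clock in Hz */

typedef struct {
    uint32_t accumulator;            /* 24-bit phase accumulator in the real chip */
    uint16_t freq;                   /* contents of the voice's FREQ LO/HI registers */
} voice_t;

/* Advance the oscillator by one audio sample's worth of chip clocks and
   return a crude square-wave sample (the real chip also offers triangle,
   sawtooth, pulse, and noise waveforms). */
static int16_t voice_sample(voice_t *v)
{
    uint32_t clocks = SID_CLOCK / SAMPLE_RATE;      /* ~22, integer approximation */
    v->accumulator = (v->accumulator + (uint32_t)v->freq * clocks) & 0xFFFFFFu;
    return (v->accumulator & 0x800000u) ? 12000 : -12000;
}

int main(void)
{
    voice_t a4 = { 0, 0x1D45 };      /* register value for roughly 440 Hz on a PAL machine */
    for (int i = 0; i < 200; i++)
        printf("%d\n", voice_sample(&a4));   /* dump raw samples; pipe into a plotter or DAC */
    return 0;
}
```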
This is what I think will happen when our legacy systems finally die without the knowledge to service or maintain them. Modern engineers using modern technologies will replace them providing the same function.
-
I certainly hope so! Human ingenuity has gotten us here. I'm interacting with you across who knows how much distance, using a handheld device that folds up.
...but just because we've gotten ahead of trouble and found solutions thus far doesn't mean that an unintended bit of code, a hardware fault, or a lack of imagination can't cause consequences further down the road.
I appreciate your optimism and pragmatic understanding. You seem to be a solution-driven person who believes in our ability to reason and fix things. We'll definitely need that type of attitude and approach when and if something goes sideways.
-
Absolutely true.
I guess my thought is that the benefits of our rapid growth outweigh the consequences of forgotten technology. I'll admit, though, I'm not unbiased; I have a vested interest. I do very well professionally being the bridge between some older technologies and modern ones myself.