AI Experts No Longer Saving for Retirement Because They Assume AI Will Kill Us All by Then
-
Because they're fucking idiots and hype men...
They want people to believe this is true, so that they panic buy in.
This is reminiscent of tobacco execs saying tobacco doesn't cause cancer. They're not "experts"; they're paid shills who will say anything to make the stock price go up.
-
I'd like to say this isn't true for my kid, but then he did buy that hot new Corvette...
-
Are these AI Experts in the room with us now?
-
AI doomsday marketing wank has the same vibe as preteens at a sleepover getting spooked by a ouija board.
-
AI doomsday marketing wank has the same vibe as preteens at a sleepover getting spooked by a ouija board.
Yes, but no different than AI competence wank. LLMs are a significant step forward, but not even remotely intelligent. It's all bullshit and hype.
One day it won't be. We aren't there yet.
-
AI doomsday marketing wank has the same vibe as preteens at a sleepover getting spooked by a ouija board.
It honestly reminds me of the Y2K doomers.
-
That is a terrible gamble
-
It honestly reminds me of the Y2K doomers.
No, Y2K would have been catastrophic.
But a shit ton of people put in a ridiculous amount of work and everything was updated.
The people who were concerned were called fucking "doomers", but they were the reason shit didn't go terribly.
But because rational people helped everyone avoid the consequences, idiots think we could have just ignored it and had the same result.
It's fucking ridiculous that you didn't learn this lesson from COVID, even if you're too young to have experienced Y2K.
-
Just ask it how many Rs are in blueberry, and run away while it has to think about it a while.
-
No, Y2K would have been catastrophic.
But a shit ton of people put in a ridiculous amount of work and everything was updated.
The people who were concerned were called fucking "doomers", but they were the reason shit didn't go terribly.
But because rational people helped everyone avoid the consequences, idiots think we could have just ignored it and had the same result.
It's fucking ridiculous that you didn't learn this lesson from COVID, even if you're too young to have experienced Y2K.
Had a friend who was hired by a temp agency to reprogram outdated mainframes in an ancient programming language. He was paid for training. There were ARMIES of people like him doing last-minute fixes. Y2K would have been a tremendous disaster if people had just ignored it.
See also: the environment, but without armies of people working on fixes.
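For anyone too young to have lived through it, here is a minimal illustrative sketch (my own addition, in Python rather than the mainframe languages actually involved) of the two-digit-year arithmetic those armies of programmers were patching, plus the "windowing" workaround many of the last-minute fixes used:

# The classic two-digit-year bug: years stored as "85" or "99" work fine
# for decades, then break at the rollover to "00".
def years_since(start_yy: int, current_yy: int) -> int:
    # Naive pre-Y2K-style calculation using two-digit years.
    return current_yy - start_yy

print(years_since(85, 99))  # 1985 -> 1999: 14, as expected
print(years_since(85, 0))   # 1985 -> 2000: -85, and billing/expiry logic goes haywire

# One common remediation, "windowing": interpret two-digit years below a
# pivot as 20xx and the rest as 19xx, without widening every stored field.
def expand_year(yy: int, pivot: int = 30) -> int:
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(0) - expand_year(85))  # 15, correct again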
-
No, Y2K would have been catastrophic.
But a shit ton of people put in a ridiculous amount of work and everything was updated.
The people who were concerned were called fucking "doomers", but they were the reason shit didn't go terribly.
But because rational people helped everyone avoid the consequences, idiots think we could have just ignored it and had the same result.
It's fucking ridiculous that you didn't learn this lesson from COVID, even if you're too young to have experienced Y2K.
It's like the problems around the ozone hole, or acid rain.
A lot of people scrambled and worked very hard to find alternatives that didn't cause those problems, and now it's almost like the problems never existed, and people think it was much ado about nothing.
-
Right. If I had dumped all my money into AI stonks and was overall deeply invested in Silicon Valley, I'd sell the doomer story, too. Keeps the bubble alive.
Meanwhile the REAL threat of the AI hype, i.e. overburdening ecosystems with ridiculously hungry data centers and putting children's sanity in the hands of a hallucinating, sycophantic autofill, can be entirely ignored because "AGI BAD SO WE MUST BUILD AGI".
-
No, Y2K would have been catastrophic.
But a shit ton of people put in a ridiculous amount of work and everything was updated.
The people who were concerned were called fucking "doomers", but they were the reason shit didn't go terribly.
But because rational people helped everyone avoid the consequences, idiots think we could have just ignored it and had the same result.
It's fucking ridiculous that you didn't learn this lesson from COVID, even if you're too young to have experienced Y2K.
But did they fill in their TPS reports?
-
AI doomsday marketing wank has the same vibe as preteens at a sleepover getting spooked by a ouija board.
It doesn't have to be good to be bad.
-
Right. If I had dumped all my money into AI stonks and was overall deeply invested in Silicon Valley, I'd sell the doomer story, too. Keeps the bubble alive.
Meanwhile the REAL threat of the AI hype, i.e. overburdening ecosystems with ridiculously hungry data centers and putting children's sanity in the hands of a hallucinating, sycophantic autofill, can be entirely ignored because "AGI BAD SO WE MUST BUILD AGI".
It's not about "AI stonks", really. If one genuinely believes that AGI will be the end of us, then any form of retirement savings is just a waste of time.
I really think that most investors aren't as hyped about AI stocks as the anti-AI crowd online wants us to believe. They may have increased the weight of the tech sector in their portfolios, but the vast majority of investors are aware of the risk of not diversifying, and anyone with actual wealth to invest probably isn't putting it all on OpenAI.
The recent drop in AI stocks that was in the news a week or two back doesn't even register in the value of my portfolio, even though nearly all of the top companies in it are tech companies.
-
Or useless old people will be turned into Soylent Green to feed the working meat slaves.
-
What's all that nonsense about what's-his-face tidying up his affairs because he thinks AI is going to kill us all?
If it is going to kill us all, I don't think updating your will is going to have any effect on anything. Such obvious hype; it's ridiculous.
"Our product is so amazingly brilliant it'll probably kill everyone everywhere" is exactly the kind of marketing that appeals to bosses.
-
That is a terrible gamble
It also doesn't seem to have much of an upside.
As the robots break down your defences and prepare to incinerate you, you look around smugly and say, "Thank God I don't have any retirement savings."
Meanwhile, in a parallel universe, you are living in 2050 on social care payments while everyone around you is taking trips to the moon.