
If AI takes most of our jobs, money as we know it will be over. What then?

Technology
  • AI isn't going to take anyone's job.

    We will fire a bunch of workers while delusional nepo babies try to figure out why an autocomplete bot can't think critically or do any complex tasks; then they will close their businesses or rehire people after a few years of failure, and it won't impact the owners' quality of life in any way, because they have more wealth than they will ever need.

    We should absolutely have a UBI that's funded by taxing 100% of wealth over a set number and redistributing it perpetually.

    Agreed for the most part, but I disagree about the 100% taxes thing. I think we should instead cap inheritance/gifts, not income. You can be as wealthy as you want, but once you die, it all goes back to the common pot.

    I don't care about rich people, I mostly just care about generational wealth.

  • When democratic governance withers what fills the power vacuum is feudalism.

    Technofeudalism is feudalism with computers.

    Ironically, to create a space that selects for and protects distributed decisionmaking (the desire of most sane anarchists), you need a strong government!

    Anarchism is a project. It's not just a matter of eliminating the state. That would just result in Mad Max.

    You need people to work together to meet each other's needs. I help you because I might need help someday, too. That builds a real community. And then maybe, just maybe, we solve each other's problems enough that the state is unnecessary.

    Is it a pipe dream? Maybe. But the steps towards that are worth doing, anyway.

  • AI can't do my job.

    I'm the guy they call when the machines go down.

    I fully suspect some billionaire will invent "vibe repairing".

  • But the billionaires won't need us as slaves once they have their fleets of robots.

    You can always repurpose an asset, use it in another way, or resell it. They won't need us for what we do today, but maybe they'll develop a liking for human flesh afterwards. Always useful.

  • "What then?"

    "Same as it ever was!"

    We all fight over resources that actually matter (like food, water, shelter, and security) instead of the previous thing (money), for the enjoyment of our overlords.

    Seriously, the people who have the power to change the outcome of the future seem to either be straight-up not planning for this future scenario, or planning for a horribly dystopian version of it.

    Oh, they've planned for it. They have their billionaire bunkers. Bezos has three that we know of.

  • The current tech/IT sector relies heavily on riding hype trains. It's a bit like the fashion industry that way. But the AI hype so far has only been somewhat useful.

    Current general LLMs are decent for prototyping or example output to jump-start you into the general direction of your destination, but their output always needs supervision and most often it needs fixing.
    If you apply unreliable and constantly changing AI to everything, and completely throw out humans just because it's cheaper, then you'll get vastly inferior results. You'll probably get faster results, but they will have tons of errors, which introduces tons of extra problems you never had before.
    I can see AI fully replacing some jobs in some specific areas where errors don't matter much. But that's about it. For all other jobs or purposes, AI will be an extra tool, nothing more, nothing less.

    AI has its uses within specific domains, when trained only on domain-specific and truthful data. You know, things like AlphaZero or AlphaGo. Or AIs revealing previously unknown methods for reaching the same goal. But these general AIs like ChatGPT, which are trained on basically the whole web with all the crap in it... they're never going to be truly great. And they're also getting worse over time, i.e. not improving much at all, because the web will be even fuller of AI-generated crap in the future. So the AIs slurp up all that crap too. The training data gets muddier over time.

    The promise of AIs getting ever more powerful as time goes on is just a marketing lie. There's most likely a saturation curve, and we're most likely very close to saturation already, where it won't really get any better. You could already see this by comparing the jump from GPT-3 to GPT-4 (big) with the one from GPT-4 to GPT-5 (much smaller). Or take a look at FSD cars. Also not really happening, unless you like crashes.

    Of course, the companies want to keep the illusion rolling, so they'll always claim the next big revolution is just around the corner. They profit from investments and monthly paying customers, and as long as they can keep that illusion up and profit from it, they don't even need to fulfill any more promises.

    Current general LLMs are decent for prototyping or example output to jump-start you into the general direction of your destination, but their output always needs supervision and most often it needs fixing.

    This.

    LLMs do not produce anything that can be relied upon confidently without human review, and after the bubble pops, that's only going to become more true.

    Hell, I'm glad that the first time I ever used it, it gave me a buggy, hallucinated, false reply. I asked it for a summary of the 2023 Super Bowl and learned that Patrick Mahomes kicked a field goal to win the game.

  • Anarchism is a project. It's not just a matter of eliminating the state. That would just result in Mad Max.

    You need people to work together to meet each other's needs. I help you because I might need help someday, too. That builds a real community. And then maybe, just maybe, we solve each other's problems enough that the state is unnecessary.

    Is it a pipe dream? Maybe. But the steps towards that are worth doing, anyway.

    Of course, power dynamics can never be eliminated from interpersonal relationships (whether by breeding or enculturation).

    Instead, power can be regulated and managed, to maximize distributed decisionmaking, and to protect those decisionmakers who could not or would not protect themselves.

    In a free-for-all, feudalism will always result. The strong and the willing will rule over the weak and the unwilling.

    There have to be limits to the power dynamics. Those limits will have to be enforced to protect the vulnerable, the gullible, and the unwilling (those who have the capability to exercise power, but refuse by choice), etc. This requires advanced democratic governance with a very strong government.

    Doing away with the government is just a speedrun toward technofeudalism.

    Working to create a protected space that selects for distributed decisionmaking is the actual project. That's an actually sane, worthwhile and achievable goal.

  • I fully suspect some billionaire will invent "vibe repairing".

    Followed quickly by "vibe bankruptcy proceedings"

  • I think money as we know it is already over. We don't need AI for that. Just look at prices, wages, and the economy.

  • Ever had an "AI" show up at 2AM on an emergency call to fix a gas leak? How about an "AI" to cook a breakfast sandwich? Maybe an "AI" is taking over babysitting while you're out of town...? No?

    "AI" doesn't do anything. But if your job primarily revolves around words or pictures on a screen, maybe "AI" can help you with that.

  • Agreed for the most part, but I disagree about the 100% taxes thing. I think we should instead cap inheritance/gifts, not income. You can be as wealthy as you want, but once you die, it all goes back to the common pot.

    I don't care about rich people, I mostly just care about generational wealth.

    I mean, those are kinda two sides of the same coin. Both ways to limit the compounding of wealth in few hands.

    I'm open to all these ideas, and more.

  • If an AI puts you out of work it should have to pay your salary.

    Or more likely it was a shitty job that shouldn't have been done by a human in the first place.

  • I mean, those are kinda two sides of the same coin. Both ways to limit the compounding of wealth in few hands.

    I'm open to all these ideas, and more.

    Wealth and income are two different things. We should tax wealth savagely, i.e., the ownership of assets, and we should also tax income, but to a lesser degree.

    Just to level set: income refers to the flow of money earned over a period, like a salary or wages, while wealth represents accumulated assets minus liabilities at a specific point in time. For example, a $60,000 salary is income, while a $300,000 house carrying a $200,000 mortgage is $100,000 of wealth.

  • Same as it ever was: money wasn't needed.

    Do you need money within your neighborhood or your family? Do you pay people for doing you a favor?

    A favor is just a form of debt, and debt is money. It does not matter whether it's written down on paper, or just remembered.

  • Ever had an "AI" show up at 2AM on an emergency call to fix a gas leak? How about an "AI" to cook a breakfast sandwich? Maybe an "AI" is taking over babysitting while you're out of town...? No?

    "AI" doesn't do anything. But if your job primarily revolves around words or pictures on a screen, maybe "AI" can help you with that.

    Imagine pretending that things pertaining to "words or pictures on a screen" don't count as doing anything. I get it, everyone hates AI, but there's no need to invalidate many real people's jobs.

  • AI isn't going to take anyone's job.

    We will fire a bunch of workers while delusional nepo babies try to figure out why an autocomplete bot can't think critically or do any complex tasks; then they will close their businesses or rehire people after a few years of failure, and it won't impact the owners' quality of life in any way, because they have more wealth than they will ever need.

    We should absolutely have a UBI that's funded by taxing 100% of wealth over a set number and redistributing it perpetually.

    Taxing anything at 100% is just an unrealistic goal, and not even needed. Going to 70-80% on income tax is enough. Well, that and fixing the loan structure countries like the US have: make the wealthy unable to use stocks as collateral for loans. Musk wants another loan? Make him increase his salary or pay out a dividend, and tax that at 70-80%.

    If something is taxed

  • Agreed for the most part, but I disagree about the 100% taxes thing. I think we should instead cap inheritance/gifts, not income. You can be as wealthy as you want, but once you die, it all goes back to the common pot.

    I don't care about rich people, I mostly just care about generational wealth.

    You can be as wealthy as you want, but once you die, it all goes back to the common pot.

    So that 18-year-old who loses both his parents will inherit nothing? So he would have to live on the street or something? Or that woman who loses her husband and can't inherit the other half of their combined assets, which would cause her to lose the house, etc.?

    Inheriting a couple hundred thousand or even a million isn't really the issue. Do tax it, yes, but a 100% tax on anything is unreasonable, at least if it happens all at once. That's why we have multiple different taxes.

  • Oh, they've planned for it. They have their billionaire bunkers. Bezos has three that we know of.

    Who is he going to staff it with? There is zero chance he will cook and clean for himself, and he will need a substantial army to protect himself from the collapse of the human race.

    If he goes with people, how long before the security guards realise they can dump his scrawny ass off a cliff and have a better quality of life as money becomes irrelevant? If he goes with robots, how long before they break beyond what they can self-repair? If he goes with robots plus human engineers, how long before the engineers realise the same thing as the security guards and program the robots to kill Bezos?

  • You joke, but post-scarcity anarchism is probably the only truly viable post-capitalist society where the state actually has a real chance of withering away. That means good praxis is anything that reduces scarcity, both in the form of technological developments and sustainability/ecology. And yes, harm-reduction measures which foster collaboration and social cohesion and create actualized humans with real agency and a real stake in their own communities.

    The problem with so much leftist thought is precisely that it denies agency to those it seeks to liberate. "Luxury gay space communism" is a meme, but it's based on a post-left idea which is actually far more rooted in reality than a lot of ML orthodoxy.

    The problem with so much leftist thought is precisely that it denies agency to those it seeks to liberate.

    Which is why at some point I decided that I'm fine with explaining my opinions through Trotskyism and not anarcho-capitalism. I didn't stop being ancap in essence (I recently went to an ancap group on TG and was glad to see that the main principles haven't been lost), but in Russia most people around me use communist terms and logic on politics without even realizing it. Even the right-wing and nationalist kind talk like that (the official "communist" party doesn't, though; it sounds like moderate Nazis with weird symbolism). And if I want to find a way to improve something, it very clearly doesn't lie in conceiving a structure and then trying to make it real through power or deceit.

  • When democratic governance withers what fills the power vacuum is feudalism.

    Technofeudalism is feudalism with computers.

    Ironically, to create a space that selects for and protects distributed decisionmaking (the desire of most sane anarchists), you need a strong government!

    It's not feudalism, it's the usual fascism in the making. Maybe with some capitalist mechanisms.