Why so much hate toward AI?

Technology
  • Because the goal of "AI" is to make the vast majority of us obsolete. The billion-dollar question AI is trying to solve is "why should we continue to pay wages?"
    That is bad for everyone who isn't part of the owner class. Even if you personally benefit from using it to make yourself more productive/creative/... the data you input can and WILL eventually be used against you.

    If you only self-host and know what you're doing, this might be somewhat different, but it still won't stop the big guys from trying to swallow all the others whole.

    Reads like a rant against the industrial revolution. "The industry is only concerned about replacing workers with steam engines!"

  • Not much to win with.

    A bubble of fake, broken technology that isn't capable of doing what is advertised; it's environmentally destructive, it's used for identification and genocide, it threatens and actually takes jobs, and it concentrates money and power with the already wealthy.

    It's either broken and not capable or takes jobs.

    You can't be both useless and destroying jobs at the same time

  • My main gripes are more philosophical in nature: should we automate away certain parts of the human experience? Should we automate art? Should we automate human connections?

    On top of these, there's also the concern of spam. AI is quick enough to flood the internet with low-effort garbage.

    The industrial revolution called, they want their argument against the use of automated looms back.

  • It's either broken and not capable or takes jobs.

    You can't be both useless and destroying jobs at the same time

    Have you never had a corporate job? A technology can be entirely useless while incompetent 'managers' who believe it can do better than humans WILL still buy it to get rid of those humans, even though that's a stupid thing to do, just to meet their yearly targets and other similarly idiotic measures of division/team 'productivity'.

  • Because of studies like https://arxiv.org/abs/2211.03622:

    Overall, we find that participants who had access to an AI assistant based on OpenAI's codex-davinci-002 model wrote significantly less secure code than those without access. Additionally, participants with access to an AI assistant were more likely to believe they wrote secure code than those without access to the AI assistant.

    Seems like this is a good argument for specialization. Have AI make bad but fast code, and pay specialists to improve it and make it secure when needed. My 2026 Furby with no connection to the outside world doesn't need secure code, it just needs to make kids smile.
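
    To make the study's point concrete, here is a hypothetical sketch (not taken from the paper) of the kind of insecure-versus-fixed pattern it is describing, in Python with the standard sqlite3 module; the table, column, and function names are made up for the example.

        import sqlite3

        # Throwaway in-memory database so the example is self-contained.
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
        conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

        def find_user_insecure(name):
            # The sort of thing an assistant will happily emit: building SQL by
            # string interpolation, which is wide open to SQL injection.
            return conn.execute(
                f"SELECT email FROM users WHERE name = '{name}'"
            ).fetchall()

        def find_user_safer(name):
            # The boring fix: a parameterized query, so the input is treated as
            # data rather than as SQL.
            return conn.execute(
                "SELECT email FROM users WHERE name = ?", (name,)
            ).fetchall()

        print(find_user_safer("alice"))
        # find_user_insecure("alice' OR '1'='1") would return every row.

    The fix isn't hard; the study's finding is that participants with the assistant shipped the first version more often while being more confident they hadn't.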

  • It's either broken and not capable or takes jobs.

    You can't be both useless and destroying jobs at the same time

    And yet AI pulls through and somehow does manage to do both

  • Seems like this is a good argument for specialization. Have AI make bad but fast code, and pay specialists to improve it and make it secure when needed. My 2026 Furby with no connection to the outside world doesn't need secure code, it just needs to make kids smile.

    They're called programmers, and it's faster and less expensive all around to just have humans do it better the first time.

  • There's also the issue of people now flooding the internet with AI-generated tutorials and documentation, making things even harder. I managed to botch the Linux install on my Raspberry Pi so badly I couldn't easily fix it, all thanks to a crappy AI-generated tutorial on adding to PATH that I didn't immediately spot.

    With art, it can't really be controlled well enough to be useful for much beyond being a spam machine, but spammers only care about social media clout and/or ad revenue.

    And also chatbot-generated bug reports (like the ones curl has been getting) and entire open-source projects (I guess for some stupid crypto scheme).

  • It's either broken and not capable or takes jobs.

    You can't be both useless and destroying jobs at the same time

    It's not AI taking your job, it's your boss. All they need to believe is that a language-shaped noise generator can do the work; it doesn't matter whether it actually can (it can't). Then the business either suffers greatly or hires people back (like Klarna).

  • I'm curious about the strong negative feelings towards AI and LLMs. While I don't defend them, I see their usefulness, especially in coding. Is the backlash due to media narratives about AI replacing software engineers? Or is it the theft of training material without attribution? I want to understand why this topic evokes such emotion and why discussions often focus on negativity rather than control, safety, or advancements.

    It's a massive new disruptive technology and people are scared of what changes it will bring. AI companies are putting out tons of propaganda, both claiming AI can do anything and fear-mongering that AI is going to surpass and subjugate us, to back up that same narrative.

    Also, there is so much focus on democratizing content creation, which is at best a very mixed bag, and little attention is given to collaborative uses (which I think is where AI shines) because it's so much harder to demonstrate, and it demands critical thinking skills and underlying knowledge.

    In short, everything AI is hyped as is a lie, and that's all most people see. When you're poking around with it, you're most likely to just ask it to do something for you: write a paper, create a picture, whatever, and the results won't impress anyone actually good at those things, and impress the fuck out of people who don't know any better.

    This simultaneously reinforces two things to two different groups: AI is utter garbage and AI is smarter than half the people you know and is going to take all the jobs.

  • But but, now the idea man can vibecode. This shit destroys the separation between management and the codebase, making it the perfect anti-productivity tool.

  • The industrial revolution called, they want their argument against the use of automated looms back.

    The capitalists owning the AI thanking you for fighting on their side.

  • Reads like a rant against the industrial revolution. "The industry is only concerned about replacing workers with steam engines!"

    You're probably not wrong. It's definitely along the same lines... although the repercussions of this particular one will be infinitely greater than those of the industrial revolution.

    Also, industrialization made for better products because of better manufacturing processes. I'm by no means sure we can say the same about AI. Maybe some day, but today it's just "an advanced dumbass" in most real-world scenarios.

  • They're called programmers, and it's faster and less expensive all around to just have humans do it better the first time.

    Have you talked to any programmers about this? I know several who, in the past 6 months alone, have completely changed their view on exactly how effective AI is in automating parts of their coding. Not only are they using it, they are paying to use it because it gives them a personal return on investment... but you know, you can keep using that push lawnmower, just don't complain when the kids next door run circles around you at a quarter the cost.

  • I'm curious about the strong negative feelings towards AI and LLMs. While I don't defend them, I see their usefulness, especially in coding. Is the backlash due to media narratives about AI replacing software engineers? Or is it the theft of training material without attribution? I want to understand why this topic evokes such emotion and why discussions often focus on negativity rather than control, safety, or advancements.

    It's not particularly accurate, and then there are the privacy concerns.

  • I'm curious about the strong negative feelings towards AI and LLMs. While I don't defend them, I see their usefulness, especially in coding. Is the backlash due to media narratives about AI replacing software engineers? Or is it the theft of training material without attribution? I want to understand why this topic evokes such emotion and why discussions often focus on negativity rather than control, safety, or advancements.

    • Useless fake spam content.
    • Posting AI slop ruins the "social" part of social media. You're not reading real human thoughts anymore, just statistically plausible words.
    • Same with machine-generated "art". What's the point?
    • AI companies are leeches; they steal work for the purpose of undercutting the original creators with derivative content.
    • Vibe coders produce utter garbage that nobody, especially not themselves, understands, and somehow they're smug about it.
    • A lot of AI stuff is a useless waste of resources.

    Most of the hate is justified IMO, but a couple weeks ago I died on the hill arguing that an LLM can be useful as a code documentation search engine. Once the train started, even a reply that thought software libraries contain books got upvotes.
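
    For what it's worth, the "documentation search engine" use is easy to demystify. Below is a minimal, hypothetical Python sketch of the non-LLM baseline: ranking a module's docstrings by naive keyword overlap with a query. The function name and scoring are invented for illustration; an LLM layered on top of this kind of lookup, answering in natural language, is roughly the thing being defended, with all the usual caveats about it confidently getting things wrong.

        import inspect
        import json  # any module with docstrings will do for the demo

        def search_docs(module, query, top_k=3):
            # Score each documented callable in the module by how many query
            # words appear in its docstring. Crude, but it shows the shape of
            # the problem an LLM-backed search is meant to solve better.
            terms = set(query.lower().split())
            scored = []
            for name, obj in inspect.getmembers(module, callable):
                doc = inspect.getdoc(obj) or ""
                score = sum(term in doc.lower() for term in terms)
                if score:
                    scored.append((score, name, doc.splitlines()[0]))
            return sorted(scored, reverse=True)[:top_k]

        # Example: "how do I turn an object into a JSON string?"
        for score, name, summary in search_docs(json, "serialize object to string"):
            print(f"{name}: {summary}")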

  • It's either broken and not capable or takes jobs.

    You can't be both useless and destroying jobs at the same time

    It can absolutely be both. Expensive competent people are replaced with inexpensive morons all the time.

    • Useless fake spam content.
    • Posting AI slop ruins the "social" part of social media. You're not reading real human thoughts anymore, just statistically plausible words.
    • Same with machine-generated "art". What's the point?
    • AI companies are leeches; they steal work for the purpose of undercutting the original creators with derivative content.
    • Vibe coders produce utter garbage that nobody, especially not themselves, understands, and somehow they're smug about it.
    • A lot of AI stuff is a useless waste of resources.

    Most of the hate is justified IMO, but a couple weeks ago I died on the hill arguing that an LLM can be useful as a code documentation search engine. Once the train started, even a reply that thought software libraries contain books got upvotes.

    Not to mention the environmental cost is literally astronomical. I would be very interested to know how many times out of ten AI-generated code is actually functional, because its success rate for every other type of generation is much lower.

  • The capitalists owning the AI thanking you for fighting on their side.

    Lots of assumptions there. In case you actually care, I don't think any one company should be allowed to own the base system that allows AI to function, especially if it's trained off of public content or content owned by other groups, but that's kind of immaterial here. It seems insane to villainize a technology because of who might make money off of it. These are two separate arguments (and frankly, they historically have the opposite beneficiaries from what you would expect).

    Prior to the industrial revolution, weaving was done by hand, making all cloth either expensive or the product of sweatshops (and it was still comparatively expensive next to today). Case in point: you can find many pieces of historical workers' clothing that were specifically cut to use every bit of a rectangular piece of fabric, because you did not want to waste any of it (today it's common for people to throw scraps away because they don't like that section of the pattern).

    With the advent of automated looms several things happened:

    • the skilled workers who operated the looms were quickly put out of a job because the machines could work much faster, although a few specialized operators were still needed to set up and repair the equipment.
    • the owners of the fabric mills that couldn't afford to upgrade either died out or specialized in fabrics that could not be made by the machines (which set up an arms race of sorts where the machine builders kept improving things)
    • the quality of fabric went down: where previously a different fabric structure could be had with a simple instruction to the worker, it took a while for machines to manage anything other than a simple weave (getting there took the Jacquard loom's punch cards and the above-mentioned arms race), and looms even today require a different range of threads than what can be hand woven, but...
    • the cost went down so much that accessibility went through the roof. Suddenly the average pauper COULD afford to clothe their entire family with a week's worth of clothes. New industries cropped up. Health and economic mobility soared.

    This is a huge oversimplification, but history is well known to repeat itself due to human nature. Follow the bullets above with today's arguments against AI and you will see an often-ignored end result: humanity can grow to have more time and resources to improve the health and wellness of our population IF we use the tools. You can choose to complain that the contract worker isn't going to get paid his equivalent of $5/hr for spending two weeks arguing back and forth about a dog logo for a new pet store, but I am going to celebrate the person who realizes they can automate a system to find new business filings and approach every new business in their area with a package of 20 AI-generated logos, each made with unique prompts drawn from their experience in logo design, all while reducing their workload and making more money.

  • On top of everything else people mentioned, it's so profoundly stupid to me that AI is being pushed to take my summary of a message and turn it into an email, only for AI to then take those emails and spit out a summary again.

    At that point just let me ditch the formality and send over the summary in the first place.

    But more generally, I don't have an issue with "AI", just generative AI. And I have a huge issue with it being touted as this oracle of knowledge when it isn't. It's dangerous to view it that way. Right now we're "okay" at differentiating real information from hallucinations, but so many people aren't, and it will only get worse as people get complacent and AI gets better at hiding them.

    Part of this is the natural evolution of technology, and I'm sure the situation will improve, but it's being pushed so hard in the meantime that it's making the problem worse.

    The first GPT models were initially withheld for being too dangerous, and they weren't even as "good" as the modern ones. I wish we could go back to those days.

    At that point just let me ditch the formality and send over the summary in the first place.

    A bit of a tangent, but so much this. Why haven't we normalized using fewer words already?
    Why do we keep writing (some blogs and all of content marketing) whole screens of text to convey just a sentence of real content?
    Why do we keep the useless hello and regards instead of just getting to the point already?
