AI slows down some experienced software developers, study finds

Technology
  • Was it really Russia's invasion, or just because interest rates went up to rein in inflation after the COVID stimulus packages? Hard to imagine Russia had that much demand for software compared to the rest of the world.

    Did you not read what I wrote?

    Inflation went up due to the knock-on effects of the sanctions. Specifically, prices for oil and gas skyrocketed.

    And since everything runs on oil and gas, all prices skyrocketed.

    COVID stimulus packages had nothing to do with that, especially in 2023, 2024, and 2025, when there were no COVID stimulus packages, yet inflation was much higher than at any time during COVID.

    Surely it is not too much to ask that people remember what year stuff happened in, especially if we are talking about things that happened just 2 years ago.

  • Experienced software developer, here. "AI" is useful to me in some contexts. Specifically when I want to scaffold out a completely new application (so I'm not worried about clobbering existing code) and I don't want to do it by hand, it saves me time.

    And... that's about it. It sucks at code review, and will break shit in your repo if you let it.

    Everyone on Lemmy is a software developer.

  • This post did not contain any content.

    "Explain this to me, AI." Reads back exactly what's on the screen, including comments, somehow with more words but less information.
    Ok....

    Ok, this is tricky. AI, can you do this refactoring so I don't have to keep track of everything? No... That's all wrong... Yeah, I know it's complicated, that's why I wanted it refactored. No, you can't do that... Fuck, now I can either toss all your changes and do it myself, or spend the next 3 hours rewriting it.

    Yeah, I struggle to see how anyone finds this garbage useful.

  • My fear for the software industry is that we'll end up replacing junior devs with AI assistance, and then in a decade or two, we'll see a lack of mid-level and senior devs, because they never had a chance to enter the industry.

    100% agreed. It should not be used as a replacement but rather as an augmentation to get the real benefits.

    "Explain this to me, AI." Reads back exactly what's on the screen, including comments, somehow with more words but less information.
    Ok....

    Ok, this is tricky. AI, can you do this refactoring so I don't have to keep track of everything? No... That's all wrong... Yeah, I know it's complicated, that's why I wanted it refactored. No, you can't do that... Fuck, now I can either toss all your changes and do it myself, or spend the next 3 hours rewriting it.

    Yeah, I struggle to see how anyone finds this garbage useful.

    I have asked questions, had conversations for company, and generated images for role-playing with AI.

    I've been happy with it, so far.

  • This post did not contain any content.

    "Using something that you're not experienced with and haven't yet worked out how to best integrate into your workflow slows some people down"

    Wow, what an insight! More at 8!

    As I said on this article when it was posted to another instance:

    AI is a tool to use. Like with all tools, there are right ways and wrong ways and inefficient ways and all other ways to use them. You can’t say that they slow people down as a whole just because some people get slowed down.

    "Explain this to me, AI." Reads back exactly what's on the screen, including comments, somehow with more words but less information.
    Ok....

    Ok, this is tricky. AI, can you do this refactoring so I don't have to keep track of everything? No... That's all wrong... Yeah, I know it's complicated, that's why I wanted it refactored. No, you can't do that... Fuck, now I can either toss all your changes and do it myself, or spend the next 3 hours rewriting it.

    Yeah, I struggle to see how anyone finds this garbage useful.

    Sounds like you just need to find a better way to use AI in your workflows.

    GitHub Copilot in Visual Studio, for example, is fantastic and offers suggestions, including entire functions, that often do exactly what you wanted them to do, because it has the context of all of your code (if you give it that, of course).

  • Experienced software developer, here. "AI" is useful to me in some contexts. Specifically when I want to scaffold out a completely new application (so I'm not worried about clobbering existing code) and I don't want to do it by hand, it saves me time.

    And... that's about it. It sucks at code review, and will break shit in your repo if you let it.

    I've found it to be great at writing unit tests too.

    I use GitHub Copilot in VS and it's fantastic. It just throws up suggestions for code completions, entire functions, etc., and is easily ignored if you just want to do it yourself, but in my experience it's very good.

    Like you said, using it to get the meat and bones of an application from scratch is fantastic. I've used it to make some awesome little command line programs for some of my less technical co-workers to use for frequent tasks, and then even got it to make a nice GUI over the top of it. Takes like 10% of the time it would have taken me to do it - you just need to know how to use it, like with any other tool.

  • Like I said, I do find it useful at times. But not only shouldn't it replace coders, it fundamentally can't. At least, not without a fundamental re-architecting of how they work.

    The reason it goes down a "really bad path" is that it's basically glorified autocomplete. It doesn't know anything.

    On top of that, spoken and written language are very imprecise, and there's no way for an LLM to derive what you really wanted from context clues such as your tone of voice.

    Take the phrase "fruit flies like a banana." Am I saying that a piece of fruit might fly in a manner akin to how another piece of fruit, a banana, flies if thrown? Or am I saying that the insect called the fruit fly might like to consume a banana?

    It's a humorous line, but my point is serious: We unintentionally speak in ambiguous ways like that all the time. And while we've got brains that can interpret unspoken signals to parse intended meaning from a word or phrase, LLMs don't.

    The reason it goes down a “really bad path” is that it’s basically glorified autocomplete. It doesn’t know anything.

    Not quite true - GitHub Copilot in VS for example can be given access to your entire repo/project/etc and it then "knows" how things tie together and work together, so it can get more context for its suggestions and created code.

  • I like the saying that LLMs are good at stuff you don’t know. That’s about it.

    They're also bad at that though, because if you don't know that stuff then you don't know if what it's telling you is right or wrong.

  • AI tools are way less useful than a junior engineer, and they aren't an investment that turns into a senior engineer either.

    They're tools that can help a junior engineer and a senior engineer with their job.

    Given a database, AI can probably write a data access layer in whatever language you want quicker than a junior developer could.
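
    As a rough illustration of what "a data access layer" amounts to, here is a minimal repository sketch in TypeScript. It uses a made-up `Product` type and an in-memory `Map` instead of a real database driver, purely to show the shape of the CRUD wrapper being discussed; none of these names come from the thread.

    ```typescript
    // Hypothetical record type; a real DAL would mirror a database table.
    interface Product {
      id: number;
      name: string;
      price: number;
    }

    // Minimal repository over an in-memory store; a real version
    // would issue SQL through a driver instead of touching a Map.
    class ProductRepository {
      private rows = new Map<number, Product>();

      insert(p: Product): void {
        this.rows.set(p.id, p);
      }

      findById(id: number): Product | undefined {
        return this.rows.get(id);
      }

      findAll(): Product[] {
        return Array.from(this.rows.values());
      }

      delete(id: number): boolean {
        return this.rows.delete(id);
      }
    }

    const repo = new ProductRepository();
    repo.insert({ id: 1, name: "widget", price: 9.99 });
    console.log(repo.findById(1)?.name); // "widget"
    ```

    The point of the comparison above is that this kind of mechanical, table-shaped code is exactly what code generators are fastest at, whether the generator is an LLM or a junior developer.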

  • They might become seniors with 99% more investment. Or they crash out as "not a great fit", which happens too. Juniors aren't just "senior seeds" to be planted.

    Interesting downvotes, especially how there are more than there are upvotes.

    Do people think "junior" and "senior" here just relate to age and/or time in the workplace? Someone could work in software dev for 20 years and still be a junior dev. It's knowledge and skill level based, not just time-in-industry based.

    "Explain this to me, AI." Reads back exactly what's on the screen, including comments, somehow with more words but less information.
    Ok....

    Ok, this is tricky. AI, can you do this refactoring so I don't have to keep track of everything? No... That's all wrong... Yeah, I know it's complicated, that's why I wanted it refactored. No, you can't do that... Fuck, now I can either toss all your changes and do it myself, or spend the next 3 hours rewriting it.

    Yeah, I struggle to see how anyone finds this garbage useful.

    You shouldn't think of "AI" as intelligent and ask it to do something tricky. The boring stuff that's mostly just typing, that's what you get the LLMs to do. "Make a DTO for this table <paste>" "Interface for this JSON <paste>"

    I just have a bunch of conversations going where I can paste stuff into and it will generate basic code. Then it's just connecting things up, but that's the fun part anyway.
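
    To make the "boring typing" use case concrete, here is the kind of thing described above: paste a JSON payload, get back a typed interface. The `User` shape below is a made-up example for illustration, not something from the thread.

    ```typescript
    // Sample JSON payload you might paste into the prompt:
    // { "id": 7, "name": "Ada", "email": "ada@example.com", "active": true }

    // The kind of interface an LLM typically generates from it:
    interface User {
      id: number;
      name: string;
      email: string;
      active: boolean;
    }

    // Using the generated type when parsing the payload:
    const payload = '{ "id": 7, "name": "Ada", "email": "ada@example.com", "active": true }';
    const user: User = JSON.parse(payload);
    console.log(user.name); // "Ada"
    ```

    Writing this by hand is trivial but tedious, which is why it is a good fit for generation: the output is easy to verify at a glance.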

  • The reason it goes down a “really bad path” is that it’s basically glorified autocomplete. It doesn’t know anything.

    Not quite true - GitHub Copilot in VS for example can be given access to your entire repo/project/etc and it then "knows" how things tie together and work together, so it can get more context for its suggestions and created code.

    That's still not actually knowing anything. It's just temporarily adding more context to its model.

    And it's always very temporary. I have a yarn project I'm working on right now, and I used Copilot in VS Code in agent mode to scaffold it as an experiment. One of the refinements I included in the prompt file was reminders throughout for things it wouldn't need reminding of if it actually "knew" the repo.

    • I had to constantly remind it that it's a yarn project, otherwise it would inevitably start trying to use NPM as it progressed through the prompt.
    • For some reason, when it's in agent mode and it makes a mistake, it wants to delete files it has fucked up, which always requires human intervention, so I peppered the prompt with reminders not to do that, but to blank the file out and start over in it.
    • The frontend of the project uses TailwindCSS. The agent kept trying to downgrade the configuration to an earlier version instead of using the current one, so I wrote the entire configuration by hand and inserted it into the prompt file. If I let it try to build the configuration itself, it would inevitably fuck it up and then say something completely false, like, "The version of TailwindCSS we're using is still in beta, let me try downgrading to the previous version."

    I'm not saying it wasn't helpful. It probably cut 20% off the time it would have taken me to scaffold out the app myself, which is significant. But it certainly couldn't keep track of the context provided by the repo, even though it was creating that context itself.

    Working with Copilot is like working with a very talented and fast junior developer whose methamphetamine addiction has been getting the better of it lately, and who has early onset dementia or a brain injury that destroyed their short-term memory.

  • 0 votes
    2 posts
    12 views
    Just to add — this survey is for literally anyone who's been through the project phase in college. We're trying to figure out: what stops students from building cool stuff? What actually helps students finish a project? How can mentors/teachers support students better? And whether buying/selling projects is something people genuinely do, and why. Super grateful to anyone who fills it out. And if you've had an experience (good or bad) with your project, feel free to share it here too.
  • 0 votes
    1 post
    11 views
    No one has replied
  • 1k votes
    95 posts
    17 views
    Obviously the law must be simple enough to follow, so that for Jim's furniture shop it is neither a problem nor too high a cost to respect it, but it must be clear that if you break it you can cease to exist as a company.

    I think this may be the root of our disagreement: I do not believe that any law-making body today is capable of an elegantly simple law. I could be too naive, but I think it is possible.

    We also definitely have a difference of opinion when it comes to the severity of the infraction. In my mind, while privacy is important, it should not carry the same level of punishment as something on the level of poisoning waterways; I think a privacy law should hurt but be something a company can learn from, while the poisoning case should result in the bankruptcy of the company.

    The severity is directly proportional to the number of people affected. If you violate the privacy of 200 million people, it is the same as poisoning the water of 10 people. And while in the poisoning scenario it could be better to jail the responsible people (for a very, very long time) and let the company survive to clean the water, once your privacy is violated there is no way back; a company could not fix it.

    The issue we find ourselves with today is that the aggregate of all privacy breaches makes it harmful to the people, but with a sizeable enough fine, I find it hard to believe that there would be major or lasting damage.

    So how much money is your privacy worth?

    For this reason I don't think it is wise to write laws that will bankrupt a company off of one infraction which was not directly or indirectly harmful to the physical well-being of the people; and I am using "indirectly" a little more strictly than I would like, since as I said before, the aggregate of all the information is harmful.

    The point is that the goal is not to bankrupt companies but to have them behave right. The penalty associated with every law IS the tool that makes you respect the law. And it must be so high that you don't want to break the law.

    I would have to look into the laws in question, but on a surface level I think that any company should be subject to the same baseline privacy laws, so if there isn't anything screwy within the law that Apple, Google, and Facebook are ignoring, I think it should apply to them.

    Trust me on this one (direct experience): payment processors have a lot more rules to follow to be able to work.

    I do not want jail time for the CEO by default, but he needs to know that he will pay personally if the company breaks the law; it is the only way to make him run the company while being sure that it follows the laws.

    For some reason I don't have my usual cynicism when it comes to this issue. I think that the magnitude of losses that vested interests have in these companies would make it so that companies would police themselves for fear of losing profits. That being said, I wouldn't be opposed to some form of personal accountability for corporate leadership, but I fear that they will just end up finding a way to create a scapegoat every time.

    It is not cynicism. I simply think that a huge fine to a single person (the CEO, for example) is useless, since it is too easy to avoid, and if it really is huge it would realistically never be paid anyway, since the net worth of these kinds of people is only on paper. So if you slap a 100 billion fine on Musk, he will never pay it, because he does not have the money to pay even if technically he is worth way more than that. Jail time instead is something that even Musk can experience.

    In general I like laws that are as objective as possible. I think that a privacy law should be written so that it is very objectively overbearing, but has a smaller fine associated with it. This way the law is very clear on right and wrong, while also giving businesses time and incentive to change their practices without having to sink large amounts of money into lawyers to review every minute detail, which is the logical conclusion of the one-infraction-bankruptcy system that you seem to be supporting.

    Then you write a law that explicitly states what you can do, and what is not allowed is forbidden by default.
  • 153 votes
    4 posts
    24 views
    Agreed - the end of the article does state that compiling untrusted repos is effectively the same as running an untrusted executable, and you should treat it with the same caution (especially if it's malware or gaming-cheat adjacent).
  • 39 votes
    15 posts
    60 views
    I believed they were doing such things against budding competitors long before the LLM era. My test is simple: replace it with China. Would the replies be the opposite of what you've received so far? The answer is yes. Absolutely, people would be frothing at the mouth about China being bad actors. Western tech bros are just as paranoid; they copy off others, they steal ideas. When we do it, it's called "innovation".
  • Researchers develop recyclable, healable electronics

    Technology
    15 votes
    3 posts
    25 views
    Aren't the most common failure modes of electronics capacitors dying, followed closely by heat damage in chips? This research sounds cool and all.
  • 163 votes
    15 posts
    60 views
    An online group started by a 15-year-old in Texas playing Minecraft and watching extreme gore, they said in this article. Were they also involved in the sexual exploitation of other kids, or was that just the spin-offs that came from other people/countries? It all sounds terrible, but I wonder if this was just a kid who did something for attention, and then other perpetrators got involved and kept taking it further and down other rabbit holes. Definitely seems like a know-what-your-kid-is-doing-online scenario, but also yikes on all the 18+ members who joined and participated.
  • 0 votes
    6 posts
    34 views
    I applaud this, but I still say it's not far enough. Adjusted, the amount might match, but 121,000 is still easier for a billionaire to cough up than 50 is for a single mother of two who can barely make ends meet.