Vibe coding service Replit deleted production database
-
He was vibe-coding in production. Am I reading that right? Sounds like an intern-level mistake.
You didn't read closely enough.
“Replit QA’s it itself (super cool), at least partially with some help from you … and … then you push it to production — all in one seamless flow.”
Replit is an agent that does stuff for you including deploying to production. If you don't want to use a tool like that, I don't blame you, but it was working as it's supposed to. It's a whole platform that doesn't cleanly separate development and production.
-
You didn't read closely enough.
“Replit QA’s it itself (super cool), at least partially with some help from you … and … then you push it to production — all in one seamless flow.”
Replit is an agent that does stuff for you including deploying to production. If you don't want to use a tool like that, I don't blame you, but it was working as it's supposed to. It's a whole platform that doesn't cleanly separate development and production.
Replit is an agent that does stuff for you including deploying to production.
Ahahahahahhahahahahhahahaha, these guys deserve a lost database for that, Jesus.
-
There are a lot of other expenses with an employee (payroll taxes, benefits, retirement plans, a health plan if they're in the USA, etc.), but you could hire a self-employed freelancer, for example.
Or just get an employee anyways because you'll still likely have a positive ROI. A good developer will take your abstract list of vague requirements and produce something useful and maintainable.
the employee also gets to eat and have a place to live
which is nice
-
Having read the entire thread, I can only assume this to be sarcasm.
-
My god….
-
All I see is people chatting with an LLM as if it were a person. “How bad is this on a scale of 1 to 100?” You're just doomed to get a random answer based solely on whatever context is being fed into the input, and you probably don't even know the full extent of it. Trying to make the LLM “see its mistakes” is a pointless exercise. Getting it to “promise” something is useless.
The issue with LLMs working in human language is that people eventually want to apply human traits to them, such as asking “why” as if the LLM knew its own decision process. It only takes an input and generates an output; it can't offer any “meta” explanation of why it output X and not Y in the previous prompt.
Yeah, those interactions are a pure waste of time, I agree. Making it write an apology letter? WTF! To me it looks like a fast-track way to learn environment segregation and secret segregation. The data is lost; learn from it. There are already tools in place for proper development, like git and Alembic.
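For anyone unfamiliar with what “environment segregation” means in practice, here's a minimal sketch (plain Python; the APP_ENV / DEV_DATABASE_URL / PROD_DATABASE_URL names are invented for illustration, not anything Replit or Alembic prescribes): dev and prod get separate connection strings from separate secrets, and destructive dev tooling flat-out refuses to run when pointed at prod.

```python
# Minimal sketch of environment segregation: the process only ever sees the
# connection string for the environment it was started in, and dev-only
# tooling (schema resets, seed scripts, an LLM agent...) refuses to touch prod.
# APP_ENV, DEV_DATABASE_URL and PROD_DATABASE_URL are made-up names for illustration.
import os
import sys


def database_url() -> str:
    env = os.environ.get("APP_ENV", "dev")      # default to the safe side
    if env == "prod":
        return os.environ["PROD_DATABASE_URL"]  # injected as a secret, never committed
    return os.environ.get("DEV_DATABASE_URL", "sqlite:///dev.db")  # local default


def run_destructive_dev_task(sql: str) -> None:
    """Hard guard: destructive dev tooling never runs against production."""
    if os.environ.get("APP_ENV", "dev") == "prod":
        sys.exit("refusing to run destructive dev tooling against prod")
    print(f"would run against {database_url()!r}:\n{sql}")


if __name__ == "__main__":
    run_destructive_dev_task("DROP TABLE users;  -- only ever hits the dev DB")
```

The point isn't the specific guard; it's that “which database am I talking to” is decided by the environment, not by whatever the agent (or the developer) happens to type.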
-
Most of those expenses are mitigated by the fact that companies buy them in bulk on huge plans.
There's no bulk rate on payroll taxes or retirement benefits (pensions or an employer 401k match). There can be some discounts on health insurance, but they're not very big, and they only kick in at order-of-magnitude differences in headcount. A company with 500 employees will pay the same rates as one with 900; you get partial discounts if you have something like 10,000 employees.
If you're earning $100k gross as an employee, your employer is spending $125k to $140k in total (your $100k gross pay is included in that number).
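To put rough numbers on that (every rate below is an assumption for illustration, not an authoritative figure), a quick back-of-the-envelope:

```python
# Back-of-the-envelope employer cost for a $100k US employee.
# All rates here are illustrative assumptions; actual costs vary by state, plan, and company.
gross_salary     = 100_000
employer_fica    = gross_salary * 0.0765  # employer share of Social Security + Medicare
unemployment_tax = 1_000                  # FUTA/SUTA, varies widely by state
retirement_match = gross_salary * 0.04    # assumed 4% 401(k) match
health_insurance = 9_000                  # assumed employer share of a single-coverage plan
other_overhead   = 8_000                  # equipment, software, office, admin, etc.

total = (gross_salary + employer_fica + unemployment_tax
         + retirement_match + health_insurance + other_overhead)
print(f"total employer cost: ${total:,.0f}")  # ~$129,650, squarely in that $125k-$140k range
```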
Large companies also make massive profits because of the scale they work on. Matching 401(k) contributions? It doesn’t need to be an order of magnitude larger for it to make a huge difference. Simply doubling my 401(k) is a big deal.
And of course they get a “bulk rate” on payroll taxes, especially companies with over 1,000 employees, or over 5,000, or over 10,000; in practice that works out to a lower effective tax rate for larger businesses.
Not to mention that they often pay more and pay a steady wage due to the fact they can afford it. Freelance contractors make less, and work isn’t guaranteed to be steady.
Businesses, particularly large businesses, operate on much bigger profit margins than almost any freelance contractor.
-
This post did not contain any content.
Title should be “User gives an LLM prod access to the database, which then deletes the DB; the user did not have any backup and used the same DB for prod and dev.” Less sexy, and less the LLM's fault.
This is weird; it's like the last 50 years of software development principles are being ignored.
-
This whole thread reads like slop.
-
He was vibe-coding in production. Am I reading that right? Sounds like an intern-level mistake.
He had one DB for prod and dev and no backup. The LLM went into override mode and deleted what it treated as the dev DB while developing, but oops, that was also the prod DB. And oops, no backup.
Yeah, it's the LLM's and Replit's fault. /s
-
Hahahahahahahahahahahaha AHAHAHAHAHAHAHhahahaH
-
This post did not contain any content.
They ran dev tools in prod.
This is so dumb there's an ISO about it.
-
This post did not contain any content.
in which the service admitted to “a catastrophic error of judgement”
It’s fancy text completion - it does not have judgement.
The way he talks about it shows he still doesn't understand that. It doesn't matter that you tell it something in ALL CAPS, because that is no different from any other text.
-
Title should be “User gives an LLM prod access to the database, which then deletes the DB; the user did not have any backup and used the same DB for prod and dev.” Less sexy, and less the LLM's fault.
This is weird; it's like the last 50 years of software development principles are being ignored.
LLMs allowed them to glide all the way to the point of failure without learning anything.
-
Corporations: "Employees are too expensive!"
Also, corporations: "$100k/yr for a bot? Sure."
Bots don't need healthcare
-
LLMs allowed them to glide all the way to the point of failure without learning anything.
Exactly. If you read their Twitter thread, they're learning about git, data segregation, etc.
The same article could have been written 20 years ago about someone doing shit with Excel macros, back when a lot of workflows were Excel-centric.
-
My god, that's a lot to process. A couple of things stand out:
Comments proposing to use GitHub as the database backup. This is Keyword Architecture, and these people deserve everything they get.
The Replit model can also send out communications? It's just a matter of time before some senior exec dies on the job but nobody notices because their personal LLM keeps emailing reports that nobody reads.
-
Corporations: "Employees are too expensive!"
Also, corporations: "$100k/yr for a bot? Sure."
It looked more like a one-time development expense rather than an ongoing salary.
-
in which the service admitted to “a catastrophic error of judgement”
It’s fancy text completion - it does not have judgement.
The way he talks about it shows he still doesn't understand that. It doesn't matter that you tell it something in ALL CAPS, because that is no different from any other text.
Well, there was a catastrophic error of judgement. It was made by whichever human thought it was okay to let an LLM work on a production codebase.
-
This post did not contain any content.
AI is good at doing a thing once.
Trying to get it to do the same thing a second time is janky and frustrating.
I understand the use of AI as a consulting tool (looking up references, making code examples) or for generating template/boilerplate code. You know, things you do once and then develop further on your own.
But using it for continuous development of an entire application? Yeah, it's not good enough for that.