Half of companies planning to replace customer service with AI are reversing course
-
Phone menu trees
I assume you mean IVR? It's okay not to be familiar with the term. I wasn't either until I worked in the industry. And people that are in charge of them are usually the dumbest people ever.
people that are in charge of them are usually the dumbest people ever.
I think that's actively encouraged by management in some areas: put the dumbest people in charge to make the most irritating, frustrating system possible. It's a feature of the system.
Some of the most irritating systems I have interacted with (government disability benefits administration) actually require "press 1 for X, press 2 for Y", and if you have your phone on speaker, the system won't recognize the touch tones; you have to enter them without speakerphone.
-
Yeah but these pesky workers cut into profits because you have to pay them.
They're unpredictable. Every employee is a potential future lawsuit: they can get injured, be sexually harassed, all kinds of things. AI doesn't file lawsuits against the company, yet.
-
It is important to understand that most of the job of software development is not making the code work. That's the easy part.
There are two hard parts:
- Making code that is easy to understand, modify as necessary, and repair when problems are found.
- Interpreting what customers are asking for. Customers usually don't have the vocabulary and knowledge of a program's internals that they would need to articulate exactly what they want.
In order for AI to replace programmers, customers will have to start accurately describing what they want the software to do, and AI will have to start making code that is easy for humans to read and modify.
This means that good programmers' jobs are generally safe from AI, and probably will be for a long time. Bad programmers and people who are around just to fill in boilerplate are probably not going to stick around, but the people who actually have skill in those tougher parts will be A-OK.
A good systems analyst can effectively translate user requirements into accurate statements and does not need to be a programmer. Good systems analysts are generally more adept at asking clarifying questions, challenging assumptions, and sussing out needs. Good programmers will still be needed, but their time is wasted gathering requirements.
-
My current conspiracy theory is that the people at the top are just as intelligent as everyday people we see in public.
Not that everyone is dumb, but more like the George Carlin joke: "Think of how stupid the average person is, and realize half of them are stupider than that."
That applies to politicians, CEOs, etc. Just cuz they got the job doesn't mean they're good at it, and most of them probably aren't.
Absolutely. Wealth isn't competence, and too much of it fundamentally leads to a physical and psychological disconnect with other humans. Generational wealth creates sheltered, twisted perspectives in youth who have enough money and influence to just fail upward their entire lives.
"New" wealth creates egocentric narcissists who believe they "earned" their position. "If everyone else just does what I did, they'd be wealthy like me. If they don't do what I did, they must not be as smart or hard-working as me."
Really all of meritocracy is just survivorship bias, and countless people are smarter and more hard-working, just significantly less lucky. Once someone has enough capital that it starts generating more wealth on its own - in excess of their living expenses even without a salary - life just becomes a game to them, and they start trying to figure out how to "earn" more points.
-
There's awesome AI out there too. AlphaFold completely revolutionized research on proteins, and the medical innovations it will lead to are astounding.
Determining the 3D structure of a protein took years until very recently. Folding at Home was a worldwide project linking millions of computers to work on it.
AlphaFold does it in under a second, and has revealed the structure of 200 million proteins. It's one of the most significant medical achievements in history. Since it essentially dates back to 2022, we're still a few years from feeling the direct impact, but it will be massive.
Sure. And AI that identifies objects in pictures and converts pictures of text into text. There are lots of good and amazing applications of AI. But that's not what we're complaining about.
We're complaining about all the people who are asking, "Is AI ready to tell me what to do so I don't have to think?" and "Can I replace everyone that works for me with AI so I don't have to think?" and "Can I replace my interaction with my employees with AI so I can still get paid for not doing the one thing I was hired to do?"
-
That's part of the problem, isn't it? "AI" is a blanket term that has recently been used to cover everything from LLMs to machine learning to RPA (robotic process automation). An algorithm isn't AI, even if it was written by another algorithm.
And at the end of the day, none of it is artificial intelligence. Not in the original meaning of the term. Now we have had to rebrand AI as AGI to avoid the association with this new trend.
“AI” is a blanket term that has recently been used to cover everything from LLMs to machine learning to RPA (robotic process automation).
Yup. That was very intentionally done by marketing wanks in order to muddy the water. "Look! This computer program, er, we mean 'AI', can convert speech to text. Now, let us install it into your bank account."
-
Defining contextual relationship between words sounds like predicting the next word in a set, mate.
Only because it is.
-
Defining contextual relationship between words sounds like predicting the next word in a set, mate.
Not at all. It's not "how likely is the next word to be X". That wouldn't be context.
I'm guessing you didn't watch the video.
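For what it's worth, here's a toy sketch of the distinction. A language model's next-word prediction is conditioned on the entire preceding context, not just the last word; the words and probabilities below are made up purely for illustration:

```python
# Toy "model": maps a full context to a made-up distribution over next words.
# A real LLM computes this over its whole vocabulary from the entire context window.
toy_model = {
    ("I", "deposited", "money", "at", "the", "bank"): {"teller": 0.4, "branch": 0.3, "today": 0.3},
    ("We", "fished", "from", "the", "river", "bank"): {"shore": 0.5, "all": 0.3, "until": 0.2},
}

def predict_next(context):
    """Return the most likely next word for a known context."""
    dist = toy_model[tuple(context)]
    return max(dist, key=dist.get)

# Same final word ("bank"), different contexts, different predictions.
print(predict_next(["I", "deposited", "money", "at", "the", "bank"]))  # teller
print(predict_next(["We", "fished", "from", "the", "river", "bank"]))  # shore
```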
-
There's awesome AI out there too. AlphaFold completely revolutionized research on proteins, and the medical innovations it will lead to are astounding.
Determining the 3D structure of a protein took years until very recently. Folding at Home was a worldwide project linking millions of computers to work on it.
AlphaFold does it in under a second, and has revealed the structure of 200 million proteins. It's one of the most significant medical achievements in history. Since it essentially dates back to 2022, we're still a few years from feeling the direct impact, but it will be massive.
Determining the 3D structure of a protein took years until very recently. Folding at Home was a worldwide project linking millions of computers to work on it.
AlphaFold does it in under a second, and has revealed the structure of 200 million proteins. It's one of the most significant medical achievements in history. Since it essentially dates back to 2022, we're still a few years from feeling the direct impact, but it will be massive.
You realize that's because the gigantic server farms powering all of this "AI" are orders of magnitude more powerful than the sum total of all of those idle home PCs, right?
Folding@Home could likely also do it in under a second if we threw 70+ TERAwatt hours of electricity at server farms full of specialized hardware just for that purpose, too.
-
If Bezos thinks people are just going to forget about not getting a $65 item they paid for and keep shopping at Amazon, instead of making sure they either get their item or reverse the charge and then reduce or stop shopping on Amazon because of these ridiculous hassles, he is an idiot.
The airline industry does this with hundreds of dollars worth of airplane tickets all the time.
-
But but but, Daddy CEO said that RTO combined with Gen AI would mean continued, infinite growth and that we would all prosper, whether corposerf or customer!
-
Man, if only someone could have predicted that this AI craze was just another load of marketing BS.
/s
This experience has taught me more about CEO competence than anything else.
Almost like those stupid monkey drawings that were "worth money." Lmao.
-
I was thinking about this the other day and don't think it would happen any time soon. The people who put the CEO in charge (usually the board members) want someone who will make decisions (that the board has a say in), but also someone to hold accountable when those decisions don't realize profits.
AI is unaccountable in any real sense of the word.
AI is unaccountable in any real sense of the word.
Doesn't stop companies from trying to deflect accountability onto AI. Citations Needed recently did an episode all about this: https://citationsneeded.medium.com/episode-217-a-i-mysticism-as-responsibility-evasion-pr-tactic-7bd7f56eeaaa
-
AI is unaccountable in any real sense of the word.
Doesn't stop companies from trying to deflect accountability onto AI. Citations Needed recently did an episode all about this: https://citationsneeded.medium.com/episode-217-a-i-mysticism-as-responsibility-evasion-pr-tactic-7bd7f56eeaaa
I suppose that makes perfect sense. A corporation is an accountability sink for owners, board members, and executives, so why not make the AI an accountability sink too?
I was thinking more along the lines of the "human in the loop" model for AI, where one human is responsible for all the stuff the AI gets wrong, despite it not being physically possible to review every line of code an AI produces.
-
A good systems analyst can effectively translate user requirements into accurate statements and does not need to be a programmer. Good systems analysts are generally more adept at asking clarifying questions, challenging assumptions, and sussing out needs. Good programmers will still be needed, but their time is wasted gathering requirements.
Most places don't have uniformly good systems analysts.
-
And a lot of burnt carbon to get there
Have you ever played a 3D game?
-
A good systems analyst can effectively translate user requirements into accurate statements and does not need to be a programmer. Good systems analysts are generally more adept at asking clarifying questions, challenging assumptions, and sussing out needs. Good programmers will still be needed, but their time is wasted gathering requirements.
What is a systems analyst?
I never worked in a big enough software team to have any distinction other than "works on code" and "does sales work".
The field I was in was all small places that were very specialized in what they worked on.
When I ran my own company, it was just me. I did everything that the company needed to take care of.
-
What is a systems analyst?
I never worked in a big enough software team to have any distinction other than "works on code" and "does sales work".
The field I was in was all small places that were very specialized in what they worked on.
When I ran my own company, it was just me. I did everything that the company needed to take care of.
A systems analyst is like a programmer analyst without the coding. I agree; in my experience, small shops were more likely to have just programmer analysts, who were often responsible for hardware as well.
If it's just you, I hope you didn't need a systems analyst to gather requirements and then work with the programmer to implement them. If you did, you might need another kind of analysis.
-
That can be accomplished with a basic if-else decision tree. You don't need the massive resource sink that is AI.
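For illustration, a minimal sketch of that kind of if-else routing; the keywords and departments here are hypothetical, not from any real system:

```python
# A minimal if-else "decision tree" for level-1 customer service routing.
# Keywords and department names are hypothetical, purely for illustration.
def route_request(message: str) -> str:
    text = message.lower()
    if "refund" in text or "charge" in text:
        return "billing"
    elif "password" in text or "login" in text:
        return "account recovery"
    elif "broken" in text or "not working" in text:
        return "technical support"
    else:
        return "human agent"  # anything unrecognized goes straight to a person

print(route_request("I was charged twice and want a refund"))  # billing
print(route_request("My router is not working"))               # technical support
```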
The kind of AI I mentioned isn’t a massive resource sink. I can run that sort of thing locally on my own computer. They don’t need supercomputers for level 1 material.
-
Whenever I call in to a service because it's not working and I get stuck talking to a computer, I'm fucking furious. Every single AI implementation I've worked with has been absolute trash. I spam-click zero and yell "operator" when it says it didn't hear me or asks for my problem, and 100% of the time I've made it through to a person. People also suck, but they at least understand what I'm saying and aren't as patronizing.
This was all via chat, so it was much faster than the painful voice prompts. I agree those are terrible.