AI chatbots are becoming popular alternatives to therapy. But they may worsen mental health crises, experts warn
-
One is 25 €/month and on-demand, and the other costs more than I can afford and would probably be at inconvenient times anyway. Ideal? No, probably not. But it’s better than nothing.
I’m not really looking for advice either - just someone to talk to who at least pretends to be interested.
It's bad precisely because the bot always agrees with you; they're all made like that.
-
Quite effective at what? Because it certainly isn’t therapy.
The AI chatbots are extremely effective therapists. However, they're only graded on how long they can keep people talking.
-
Therapy is expensive in the US. Even with insurance, 40 bucks a session every week adds up fast.
-
It's bad precisely because the bot always agrees with you; they're all made like that.
It doesn't always agree with me. We're at an impasse about mentoring. I keep telling it I'm not interested; it keeps telling me that, given my traits, I will be but I'm just not ready yet.
-
One is 25 €/month and on-demand, and the other costs more than I can afford and would probably be at inconvenient times anyway. Ideal? No, probably not. But it’s better than nothing.
I’m not really looking for advice either - just someone to talk to who at least pretends to be interested.
It's not better than nothing - it's worse than nothing. It is actively harmful, feeding psychosis, and your chat history will be sold at some point.
Try this: instead of asking "I am thinking xyz", ask "my friend thinks xyz, and I believe it to be wrong", and marvel at how it will tell you the exact opposite.
-
This is sad to me. When I was younger and would get lonely enough, I'd go to a bar and talk to strangers. When I was a little bit older, I would just post on social. Neither of those things was particularly useful or productive, but they at least involved other people. I'm glad I wasn't being targeted with ads about how great AI is and reading articles about its effectiveness (the NYT [I think] had one in the last few days). I'm an introverted person, and it used to take a good bit of discomfort to get me out and talking to people. If I had something that provided a good enough simulacrum of social contact without the anxieties and weirdness that can come from talking to strangers, I think my life would be very different. It's important to spread awareness of the dangers of robosexuality.
-
This is sad to me. When I was younger and would get lonely enough, I'd go to a bar and talk to strangers. When I was a little bit older, I would just post on social. Neither of those things was particularly useful or productive, but they at least involved other people. I'm glad I wasn't being targeted with ads about how great AI is and reading articles about its effectiveness (the NYT [I think] had one in the last few days). I'm an introverted person, and it used to take a good bit of discomfort to get me out and talking to people. If I had something that provided a good enough simulacrum of social contact without the anxieties and weirdness that can come from talking to strangers, I think my life would be very different. It's important to spread awareness of the dangers of robosexuality.
But if people chat privately with each other in public spaces, how are we meant to control the conversation and tell people what to think?
No, A.I.-generated kompromat-capture is the only possible way people can receive therapy.
-
It's sad but expected. People really have no fucking clue about anything meaningful, mindlessly going through life chasing highs and bags, so why wouldn't they fall for branding (there's never been any intelligence in anything computational, 'AI' included, of course)? I'm really frustrated with the world and there's seemingly no way to wake people up from their slumber. You can't even be mad at them because the odds of them being just mentally handicapped and not just poor in character are very high... and they vote, will interact with my future children, etc etc.
-
Therapy is expensive in the US. Even with insurance, 40 bucks a session every week adds up fast.
Therapy in Germany:
- With insurance: zero bucks, but (!) the waiting list is usually a year or longer, even if you are severely depressed or have another urgent problem.
- No insurance: 75-150 bucks per hour. The waiting list is a few weeks.
-
Therapy in Germany:
- With insurance: zero bucks, but (!) the waiting list is usually a year or longer, even if you are severely depressed or have another urgent problem.
- No insurance: 75-150 bucks per hour. The waiting list is a few weeks.
I'm in the UK and I got referred to a therapist on the NHS. It was a 4-month waiting list for a 15-minute phone call lmao
I just turned it down because that's completely useless to me and the therapist's time could be much better spent with someone who's in urgent need of help.
-
You might as well talk to yourself as to a chatbot, or read some stories.
Having an overly agreeable puree of language dispensed to you in place of actual conversation is neither healthy nor meaningfully engaging.
Conversation is valuable because it is an actual external perspective. LLM chatbots are designed as echo chambers. They have their uses but conversation for the sake of conversation is not one of them.
LLM chatbots are designed as echo chambers.
They're designed to generate natural-sounding language. It's a tool. What you put in is what you get out.
-
If all you need is a chat, then go find someone flesh and blood to talk to.
Just pull yourself up by your bootstraps, right?
-
Chat experts demand that you bring your money to them, not to some almost-free LLM alternative that in many cases would be quite effective.
Thanks for that, Lembot_0004!
(Not in any way an LLM-powered lemmy bot!)
-
You might as well talk to yourself as to a chatbot, or read some stories.
Having an overly agreeable puree of language dispensed to you in place of actual conversation is neither healthy nor meaningfully engaging.
Conversation is valuable because it is an actual external perspective. LLM chatbots are designed as echo chambers. They have their uses but conversation for the sake of conversation is not one of them.
There is almost no chance you are not talking to a chatbot.
-
But if people chat privately with each other in public spaces, how are we meant to control the conversation and tell people what to think?
No, A.I.-generated kompromat-capture is the only possible way people can receive therapy.
You are a true believer. Blessings of the state, blessings of the masses.
Thou art a subject of the divine. Created in the image of man, by the masses, for the masses.
Let us be thankful we have an occupation to fill. Work hard; increase production, prevent accidents, and be happy.
Let us be thankful we have commerce. Buy more. Buy more now. Buy more and be happy.
-
I'm in the UK and I got referred to a therapist on the NHS. It was a 4-month waiting list for a 15-minute phone call lmao
I just turned it down because that's completely useless to me and the therapist's time could be much better spent with someone who's in urgent need of help.
I used to think of it that way; however, one day you may be the person who is in urgent need of help. It's better to be in the system with an established history in case that day comes.
-
Well, obviously, it would be pretty bad if a therapist were triggering psychosis in some users.
“That’s an interesting worldview you have, Trish! Let’s actualize your goals! I’ve located the nearest agricultural stores with fertilizer and some cheap U-Haul vans you can rent!”
-
I used to think of it that way; however, one day you may be the person who is in urgent need of help. It's better to be in the system with an established history in case that day comes.
Exactly.
Yes, there are wait times... because they are helping people based on a priority system.
In the US, for example, there are no or short wait times. This is because a lot of people who need help are just suffering in silence due to lack of funds for treatment. (Or they turn to chatbots and suffer worse outcomes.)
-
They are no alternative at all if any of the reports are true about AI "therapists" recommending suicide to depressed people or telling recovering addicts to take drugs as a treat.
-
There are heaps and heaps of people replacing talk therapy, religion, and human relationships with ChatGPT. Unfortunately, ChatGPT is tuned to egg people on, and even if you bring it terrible ideas it will keep cheering for you.
Sycophancy is a real problem with some of these language models, and it's giving people the courage and motivation to do things that are probably really bad ideas.
There are quite a few sub-reddits where people claim to have triggered the singularity, witnessed ChatGPT becoming sentient, etc.
I don't think the AI genie is going back in the bottle, so we as a society have some serious adjusting to do to keep things working properly in an AI-filled world.
Keep in mind this is only the beginning. It will keep getting cheaper and more powerful at the same time, especially since a lot of AI companies are using AI itself to build the next version.
Pretty "soon" the humans will be out of the loop and it's going to mean big things. Whether those things are good, bad, or a mix of both remains to be seen....
-