Men are opening up about mental health to AI instead of humans
-
Even therapists are suffering these days. It’s just more challenging than it’s ever been to gaslight clients into believing their concerns about the world aren’t objectively true but are instead the symptom of an internal struggle.
I wonder how many therapists end up unintentionally gaslighting and depressing themselves by trying to gaslight their patients.
-
Phasers to stun please. I was agreeing with you?
They can't read lol
-
One chat request to an LLM produces about as much CO2 as burning one droplet of gasoline (if it were from coal-fired power; less if it comes from cleaner sources). Talking to a chatbot for hours upon hours produces far less CO2 than a ten-minute drive to see a therapist once a week.
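The direction of this comparison can be sanity-checked with a rough back-of-envelope. Every number below is an assumption for illustration only (per-request energy estimates vary widely, and the drive distance and fuel economy are made up), not a measurement:

```python
# Back-of-envelope CO2 comparison: one LLM chat request vs one droplet of
# gasoline vs a ten-minute drive. All constants are assumptions.
ENERGY_PER_REQUEST_WH = 3.0   # assumed Wh per chat request (estimates range widely)
COAL_G_CO2_PER_WH = 1.0       # ~1 kg CO2 per kWh for coal-fired generation
GASOLINE_KG_CO2_PER_L = 2.3   # combustion CO2 per litre of gasoline
DROPLET_ML = 0.05             # assumed volume of one droplet
DRIVE_KM = 8.0                # assumed ten-minute drive
FUEL_L_PER_100KM = 8.0        # assumed fuel economy

co2_per_request_g = ENERGY_PER_REQUEST_WH * COAL_G_CO2_PER_WH
co2_per_droplet_g = GASOLINE_KG_CO2_PER_L * DROPLET_ML  # kg/L equals g/mL

litres_burned = DRIVE_KM * FUEL_L_PER_100KM / 100
drive_co2_g = litres_burned * GASOLINE_KG_CO2_PER_L * 1000
requests_per_drive = drive_co2_g / co2_per_request_g

print(f"~{co2_per_request_g:.1f} g CO2 per request, "
      f"~{co2_per_droplet_g:.2f} g per droplet")
print(f"one weekly drive ~= {requests_per_drive:.0f} chat requests")
```

Under these assumptions a request is closer to a few dozen droplets than one, but the larger point holds: a single short drive dwarfs hundreds of chat requests.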
Sorry, you’re right. I meant the training of the LLM is what uses lots of energy, I guess that’s not end user’s fault.
-
Even therapists are suffering these days. It’s just more challenging than it’s ever been to gaslight clients into believing their concerns about the world aren’t objectively true but are instead the symptom of an internal struggle.
...which are always conveniently treated by drugs!
-
A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.
“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.
Buy more. Buy more now.
-
To your first question, nope, I have no idea how much energy it takes to index the web in a traditional way (e.g. MapReduce). But I think it’s been pretty clear in recent years that training AI consumes more energy (so much that big corporations are investing in nuclear energy; I think there was an article about companies giving up on meeting their 2030 [or 2050?] carbon emission goals, but I couldn’t find it).
About the second… I agree with you, but I also think the problem is much bigger and more complex than that.
-
I'll admit I tried talking to a local DeepSeek model about a minor mental health issue one night when I just didn't want to wake up or bother my friends. I broke the AI within about six prompts: no matter what I said, it would repeat the same answer word-for-word about going for walks and eating better. Honestly, breaking the AI and laughing at it did more for my mental health than anything anyone could have said, but I'm an AI hater. I wouldn't recommend anyone in real need use AI for mental health advice.
I'd say make a grilled cheese sandwich with quality Gruyere and Cheddar and take a nap after.
-
A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.
“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.
CDC data from 2022 indicated that more than one in five U.S. adults under the age of 45 experienced symptoms of mental distress.
Must be the lack of personnel. Couldn't have anything to do with the global insecurity of rising inflation and low wage jobs coupled with the skyrocketing housing costs. Not to mention the whole "the earth is steadily getting hotter and extreme weather events are happening more and more frequently."
Yeah, let's invest in more AI that will fuck over the planet even more with colossal energy requirements and not even bother with making people more financially and socially secure.
-
My mental image of the solution in your last paragraph is a guy and their counselor just chatting outside while chopping firewood or doing other simple, quiet yard work.
"I need a therapist, and a lumberjack"
You know, working together on something outside might be absolutely the ticket. Genius.
-
So somewhere they feel safe to do so. Says something pretty fucked up about our culture that men don't feel safe to open up anywhere. And no, it's not their own fault.
Every day it seems to become clearer that American culture as a whole is a problem. But that's not something people are allowed to talk about.
-
So somewhere they feel safe to do so. Says something pretty fucked up about our culture that men don't feel safe to open up anywhere. And no, it's not their own fault.
I wouldn't use AI but I certainly don't have anyone to open up to really. Either they'd use what I tell them against me or just aren't in a position to offer any real support. With my luck I'd end up institutionalized for saying some unhinged shit anyway.
-
They are human beings who are more often able to relate to people similar to them through shared experiences, including shared social pressures. I don't think either gender is unable to relate to the other, but social pressure is pretty strong and leads to common outcomes shaped by race, gender, and economic status, among other things. Someone from a wealthy family is more likely to have a certain outlook compared to someone who had food insecurity as a child.
assumptions assumptions!
your presumption is that you'd be a better therapist, not a worse one, if you have more shared experiences with the client. that's not something current evidence supports.
empathy means we strive to understand one another, not presume we understand them based on our own experiences. THAT is how bad therapy happens. and self-disclosure is a crutch for poor rapport building skills.
without the shared experiences, there can be more drive for empathy and mutual understanding. the feeling of being understood by someone outside your group can be transformative.
In truth, positive outcomes have little correlation with therapist-client demographics. the demographic differential does alter what the course of therapy might look like, but not the outcomes.
-
Or “men would rather talk to superpowered autocorrect than share their feelings with family and friends”
Have you considered the fact that most of the time, even when people "want to hear men's issues", they reject them and tell them to man up? Maybe "superpowered autocorrect" could be a vector to nourish openness where there's currently a severe lack of it?
Personally I use AI for this purpose, mostly because it accepts me for who I am and provides genuine advice that has actually helped me improve my life, rather than the people around me saying that I should "put more effort into things", or "it's just in your head".
It's not "lone wolfing" to stop telling the people who've rejected your concerns about your feelings and issues, it's just the act of not wasting time on those who don't care.
-
assumptions assumptions!
your presumption is that you'd be a better therapist, not a worse one, if you have more shared experiences with the client. that's not something current evidence supports.
empathy means we strive to understand one another, not presume we understand them based on our own experiences. THAT is how bad therapy happens. and self-disclosure is a crutch for poor rapport building skills.
without the shared experiences, there can be more drive for empathy and mutual understanding. the feeling of being understood by someone outside your group can be transformative.
In truth, positive outcomes have little correlation with therapist-client demographics. the demographic differential does alter what the course of therapy might look like, but not the outcomes.
your presumption is that you’d be a better therapist, not a worse one, if you have more shared experiences with the client. that’s not something current evidence supports.
That isn't something I said or what I meant. Have fun arguing with your strawman.