People Are Being Involuntarily Committed, Jailed After Spiraling Into "ChatGPT Psychosis"
-
Who are these people? This is ridiculous.
I guess with so many humans, there is bound to be a small number of people who have no ability to think for themselves and believe everything a chat bot is writing in their web browser.
People even have romantic relationships with these things.
I don't agree with the argument that ChatGPT should "push back". They have an example in the article where the guy asked for tall bridges to jump from, and ChatGPT listed them, of course.
Are we expecting the LLM to act like a psychologist, evaluating whether the user's state of mind is healthy before answering questions?
Very slippery slope if you ask me.
ffs, this isn't chatgpt causing psychosis. It's schizo people being attracted like moths to chatgpt because it's very good at conversing in schizo.
-
Who are these people? This is ridiculous.
I guess with so many humans, there is bound to be a small number of people who have no ability to think for themselves and believe everything a chat bot is writing in their web browser.
People even have romantic relationships with these things.
I don't agree with the argument that ChatGPT should "push back". They have an example in the article where the guy asked for tall bridges to jump from, and ChatGPT listed them, of course.
Are we expecting the LLM to act like a psychologist, evaluating whether the user's state of mind is healthy before answering questions?
Very slippery slope if you ask me.
hi, they're going to be in psychosis regardless of what LLMs do. they aren't therapists and mustn't be treated as such. that goes for you too
-
These are the same people who Google stuff then believe every conspiracy theory website they find telling them the 5G waves mind control the pilots to release the chemtrails to top off the mind control fluoride in the water supplies.
They honestly think the AI is a sentient super intelligence instead of the Google 2 electric gargling boogaloo.
being a sucker isn't the same as being in psychosis
-
ffs, this isn't chatgpt causing psychosis. It's schizo people being attracted like moths to chatgpt because it's very good at conversing in schizo.
indeed, though I could do without using disparaging language for one of the most vulnerable populations in the medical world.............
-
Like with every other thing there is: if you don't know how it basically works or what it even is, you maybe should not really use it.
And especially not voice an opinion about it.
Furthermore, every tool can be used for self-harm if used incorrectly. You shouldn't put a screwdriver in your eyes. Just knowing what a plane does won't make you an able pilot and will likely result in dire harm too. Not directed at you personally, though.
Agreed, for sure.
But if Costco modified their in-store sample booth policy and had their associates start offering free samples of bleach to children - when kids start drinking bleach we wouldn't blame the children; we wouldn't blame the bleach; we'd be mad at Costco.
-
Agreed, for sure.
But if Costco modified their in-store sample booth policy and had their associates start offering free samples of bleach to children - when kids start drinking bleach we wouldn't blame the children; we wouldn't blame the bleach; we'd be mad at Costco.
Yes, but also no. Unmonitored(!) children are a special case. Them being clueless and easy victims is inherent by design.
You can't lay any blame on them, so they kinda make for an unfair argument.
Can't blame a blind person for not seeing you.
-
ffs, this isn't chatgpt causing psychosis. It's schizo people being attracted like moths to chatgpt because it's very good at conversing in schizo.
CGPT literally never gives up. You can give it an impossible problem to solve, and tell it you need to solve it, and it will never, ever stop trying. This is very dangerous for people who need to be told when to stop, or need to be disengaged with. CGPT will never disengage.
-
We were warned years ago!
-
Yes, but also no. Unmonitored(!) children are a special case. Them being clueless and easy victims is inherent by design.
You can't lay any blame on them, so they kinda make for an unfair argument.
Can't blame a blind person for not seeing you.
You're saying people with underdeveloped mental and social skills are somehow never analogous in any way at all to children? There are full-grown neurotypical and clinically healthy adults who are irresponsible enough to be analogous to children, but a literal case of someone trusting an untrustworthy authority due to a lapse of critical thinking skills … bears no resemblance at all to child-like behavior, at all?
Wow. That's kind of some ivory tower stuff right there.
-
Who are these people? This is ridiculous.
I guess with so many humans, there is bound to be a small number of people who have no ability to think for themselves and believe everything a chat bot is writing in their web browser.
People even have romantic relationships with these things.
I don't agree with the argument that ChatGPT should "push back". They have an example in the article where the guy asked for tall bridges to jump from, and ChatGPT listed them, of course.
Are we expecting the LLM to act like a psychologist, evaluating whether the user's state of mind is healthy before answering questions?
Very slippery slope if you ask me.
Prompt: "Certain people can be harmed by current LLMs."
OP: "Those people are stupid. Problem solved."
I hope you can recognize that you are expressing a defense mechanism, not rational thought.
-
CGPT literally never gives up. You can give it an impossible problem to solve, and tell it you need to solve it, and it will never, ever stop trying. This is very dangerous for people who need to be told when to stop, or need to be disengaged with. CGPT will never disengage.
i had not considered this and now I'm terrified
-
You're saying people with underdeveloped mental and social skills are somehow never analogous in any way at all to children? There are full-grown neurotypical and clinically healthy adults who are irresponsible enough to be analogous to children, but a literal case of someone trusting an untrustworthy authority due to a lapse of critical thinking skills … bears no resemblance at all to child-like behavior, at all?
Wow. That's kind of some ivory tower stuff right there.
"Bears no resemblance" != the same
If you're too lazy to think critically, that is child-like, yes. But you are basically able to. If you aren't, for whatever reason, then you can't be blamed for not knowing better.
Otherwise I don't get your point.