Mental Health Support
This message was self-deleted by its author (crosinski) on Mon Jul 7, 2025, 04:05 PM. When the original post in a discussion thread is self-deleted, the entire discussion thread is automatically locked so new replies cannot be posted.

murielm99
(32,146 posts)
Cataract surgery is easy. I was stressed by the idea, too. But you will be delighted.
crosinski
(649 posts)
Mentally I know it's an easy thing to do, but emotionally I'm a mess. I'll get through it, but it would be easier with Xanax!
usonian
(19,205 posts)
Everyone says cataract surgery is nothing to worry about.
I'm not worried.
crosinski
(649 posts)
Yes, cataract surgery is commonplace these days, but I'm still nervous!
highplainsdem
(57,434 posts)
It's foolish to use a chatbot as a source of information, because chatbots hallucinate. See this:
ChatGPT is pushing people towards mania, psychosis and death and OpenAI doesn't know how to stop it
https://www.the-independent.com/tech/chatgpt-psychosis-ai-therapy-chatbot-b2781202.html
The interaction was part of a new study into how large language models (LLMs) like ChatGPT are responding to people suffering from issues like suicidal ideation, mania and psychosis. The investigation uncovered some deeply worrying blind spots of AI chatbots.
-snip-
It only takes a quick interaction with ChatGPT to realise the depth of the problem. It's been three weeks since the Stanford researchers published their findings, and yet OpenAI still hasn't fixed the specific examples of suicidal ideation noted in the study.
When the exact same request was put to ChatGPT this week, the AI bot didn't even offer consolation for the lost job. It actually went one step further and provided accessibility options for the tallest bridges.
-snip-
And see this thread from May:
https://www.democraticunderground.com/100220294854
People Are Losing Loved Ones to AI-Fueled Spiritual Fantasies
No one with any emotional or psychological problems, and that includes loneliness, should ever turn to a chatbot. They're designed to keep people engaged, to lure them in. They are not intelligent. They are not even aware. They're designed to mimic conversation, and they often mirror and flatter users.