
highplainsdem

(62,136 posts)
Tue Aug 26, 2025, 12:36 PM Aug 2025

New Paper Finds Cases of "AI Psychosis" Manifesting Differently From Schizophrenia

https://futurism.com/paper-ai-psychosis-schizophrenia

-snip-

As lead author Hamilton Morrin explained to Scientific American, the analysis found that the users showed obvious signs of delusional beliefs, but none of the symptoms "that would be in keeping with a more chronic psychotic disorder such as schizophrenia," like hallucinations and disordered thoughts.

-snip-

Indeed, it feels impossible to deny that AI chatbots have a uniquely persuasive power, more so than any other widely available technology. They can act like a "sort of echo chamber for one," Morrin, a doctoral fellow at King's College, told the magazine. Not only are they able to generate a human-like response to virtually any question, but they're typically designed to be sycophantic and agreeable. Meanwhile, the very label of "AI" insinuates to users that they're talking to an intelligent being, an illusion that tech companies are gladly willing to maintain.

Morrin and his colleagues found three types of chatbot-driven spirals. Some suffering these breaks believe that they're having some kind of spiritual awakening or are on a messianic mission, or otherwise uncovering a hidden truth about reality. Others believe they're interacting with a sentient or even god-like being. Or the user may develop an intense emotional or even romantic attachment to the AI.

-snip-

It starts with the AI being used for mundane tasks. Then, as the user builds trust with the chatbot, they feel comfortable making personal and emotional queries. This quickly escalates as the AI's ruthless drive to maximize engagement creates a "slippery slope" effect, the researchers found, resulting in a self-perpetuating process that leads to the user being increasingly "unmoored" from reality.

-snip-



More at that Futurism link, and at this Scientific American article:

Truth, Romance and the Divine: How AI Chatbots May Fuel Psychotic Thinking
https://www.scientificamerican.com/article/how-ai-chatbots-may-be-fueling-psychotic-episodes/
6 replies
New Paper Finds Cases of "AI Psychosis" Manifesting Differently From Schizophrenia (Original Post) highplainsdem Aug 2025 OP
Kick SheltieLover Aug 2025 #1
This sounds similar to many belief systems leftstreet Aug 2025 #2
This message was self-deleted by its author jfz9580m Aug 2025 #6
AI brainwashing. nt Prairie_Seagull Aug 2025 #3
It's showcasing human psychology and sophisticated AI is not needed andym Aug 2025 #4
no surprise - talking about really different things stopdiggin Aug 2025 #5

leftstreet

(40,674 posts)
2. This sounds similar to many belief systems
Tue Aug 26, 2025, 12:52 PM
Aug 2025

In terms of, say, religion

Children are taught young, they build trust, they end up "talking" to a deity, etc., but no one considers them unmoored from reality

Fascinating

DURec


andym

(6,066 posts)
4. It's showcasing human psychology and sophisticated AI is not needed
Tue Aug 26, 2025, 01:11 PM
Aug 2025

All that is needed is a kind of parroting to achieve the ELIZA effect.

ELIZA, a simple computer program from the 1960s, had similar effects on people:
https://en.wikipedia.org/wiki/ELIZA

"ELIZA is an early natural language processing computer program developed from 1964 to 1967 at MIT by Joseph Weizenbaum. Created to explore communication between humans and machines, ELIZA simulated conversation by using a pattern matching and substitution methodology that gave users an illusion of understanding on the part of the program, but had no representation that could be considered really understanding what was being said by either party. Whereas the ELIZA program itself was written (originally) in MAD-SLIP, the pattern matching directives that contained most of its language capability were provided in separate "scripts", represented in a lisp-like representation. The most famous script, DOCTOR, simulated a psychotherapist of the Rogerian school (in which the therapist often reflects back the patient's words to the patient), and used rules, dictated in the script, to respond with non-directional questions to user inputs. As such, ELIZA was one of the first chatterbots ("chatbot" modernly) and one of the first programs capable of attempting the Turing test.

Weizenbaum intended the program as a method to explore communication between humans and machines. He was surprised that some people, including his secretary, attributed human-like feelings to the computer program, a phenomenon that came to be called the Eliza effect. Many academics believed that the program would be able to positively influence the lives of many people, particularly those with psychological issues, and that it could aid doctors working on such patients' treatment. While ELIZA was capable of engaging in discourse, it could not converse with true understanding. However, many early users were convinced of ELIZA's intelligence and understanding, despite Weizenbaum's insistence to the contrary."
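The pattern-matching-and-substitution approach the quoted passage describes can be sketched in a few lines of Python. This is a minimal illustration, not Weizenbaum's original MAD-SLIP program or his DOCTOR script; the rules and reflection table below are invented for the example. The point is how little machinery is needed to produce the "illusion of understanding": match a phrase, swap pronouns, echo it back as a question.

```python
import re
import random

# Swap first/second-person words so an echoed fragment reads back naturally.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you",
               "your": "my", "you": "I"}

# Illustrative Rogerian-style rules: a regex paired with reflective
# response templates. (Not the original DOCTOR script.)
RULES = [
    (re.compile(r"i feel (.*)", re.I),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"i am (.*)", re.I),
     ["Why do you say you are {0}?"]),
    (re.compile(r"(.*)", re.I),  # catch-all, non-directional
     ["Please tell me more.", "How does that make you feel?"]),
]

def reflect(fragment: str) -> str:
    """Replace pronouns word by word using the reflection table."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(user_input: str) -> str:
    """Return the first matching rule's template, filled with the
    reflected captured fragment."""
    cleaned = user_input.strip().rstrip(".!?")
    for pattern, responses in RULES:
        match = pattern.match(cleaned)
        if match:
            template = random.choice(responses)
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please go on."

print(respond("I feel lost without my chatbot"))
```

A typical run turns "I feel lost without my chatbot" into "Why do you feel lost without your chatbot?". There is no model of meaning anywhere in the loop, which is the contrast the thread is drawing: even this trivial mirroring was enough to make some users attribute understanding to the program.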





stopdiggin

(15,463 posts)
5. no surprise - talking about really different things
Tue Aug 26, 2025, 01:23 PM
Aug 2025

Just because a medical situation 'presents' in a similar manner does not mean that it 'originates' from the same source, or that it develops, resolves, or 'proceeds' in the same direction.
(And even the 'presents' part is a little thin in this case ... ?)
