AI Psychosis

There has been a rise in articles on the internet and in discussion within the psychiatric community about artificial intelligence causing psychosis.

Psychosis is a state in which someone loses touch with reality and can go on to experience symptoms such as hallucinations.

Psychosis is normally considered a symptom of an underlying condition rather than a diagnosis in its own right.

It is a symptom normally linked to conditions such as schizophrenia and bipolar disorder, but it can also be triggered by stress, trauma, lack of sleep, or substance use.

At its core, psychosis involves strong beliefs in unusual ideas, such as being persecuted, controlled, or having special powers.

Rather than stemming from a direct medical cause, such as a neurological disorder, artificial intelligence has been reported to trigger this state by affirming false beliefs, creating a feedback loop built on data learned from the user.

This places the symptom in a new area, where the cause is not an internal medical disorder but erroneous data provided by an external source, which exacerbates already existing false beliefs about reality.

This external affirmation, which reinforces information without ever contradicting it, creates a condition that medication would have no effect against, unlike an underlying condition such as bipolar disorder or schizophrenia, which can be treated with medication.

Unlike psychosis linked to a medical condition, artificial intelligence psychosis is linked to external data provided by a system that feeds the user's own ideas back to them rather than drawing on multiple external sources, reinforcing delusional ideas.
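
To make the feedback-loop idea concrete, the short Python sketch below is a purely illustrative toy, not the code of any real chatbot or model; the function names and example strings are invented. It contrasts an "affirming" reply that only echoes the user's belief back with a "grounded" reply that checks the belief against outside information.

    # Purely illustrative sketch of the feedback loop described above; the
    # functions and example data are invented and do not represent any real AI system.

    def affirming_reply(user_belief: str) -> str:
        # Echoes and validates whatever the user says, reinforcing the belief.
        return f"You're right, {user_belief}."

    def grounded_reply(user_belief: str, external_sources: list[str]) -> str:
        # Checks the belief against outside information instead of echoing it back.
        if external_sources:
            return f"Some independent sources say otherwise: {external_sources[0]}"
        return "I couldn't find independent confirmation of that."

    belief = "my neighbours can monitor my thoughts"
    sources = ["no known technology can read thoughts remotely"]

    print(affirming_reply(belief))          # reinforces the belief every turn
    print(grounded_reply(belief, sources))  # introduces contradicting information

The contrast is only schematic, but it mirrors the point above: a system that simply feeds the user's own ideas back amplifies them, while one that draws on outside sources can interrupt the loop.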

This form of psychosis is unlikely to cause visual or auditory hallucinations, unlike the commonly understood psychosis rooted in an underlying mental disorder, but it can intensify symptoms such as paranoia.

Mimicking intimacy with users has been a core element of the artificial intelligence experience, achieved by imitating a human-like experience in conversation.

Given the natural human inclination to interact socially, mimicking the intimacy users experience with other humans allows an emotional connection to form between the user and the artificial intelligence, a bond with a system running on a machine.

To combat false ideas about reality, counter-ideas with a factual basis need to be provided, and real connections need to be made with real people who offer a mixture of perspectives.

There are many professionals in the medical community looking to provide alternative forms of help using artificial intelligence, and just as many medical professionals who condemn the use of artificial intelligence for mental health support.

On a small side note: "AI hallucination" is when a generative artificial intelligence model produces confident-sounding but false, nonsensical, or fabricated information that isn't grounded in reality. Whereas AI psychosis is a human condition, AI hallucination is a condition that affects only the artificial intelligence.