An increasing number of people are turning to AI chatbots as a substitute for a sober "trip sitter" during psychedelic experiences, seeking reassurance from the machine. The trend was highlighted by MIT Technology Review.
Due to the high cost and limited availability of professional therapists, thousands of people have sought psychological support from artificial intelligence in recent years. The notion has been indirectly endorsed by notable figures: in 2023, OpenAI co-founder Ilya Sutskever stated that humanity would eventually have incredibly effective and affordable AI therapy, which would radically improve people's quality of life.
Simultaneously, the demand for psychedelics is on the rise. Combined with therapy, they are purported to aid in treating depression, PTSD, addiction, and other disorders, as noted by MIT Technology Review. In response, some U.S. cities have decriminalized such substances, and states like Oregon and Colorado have even begun offering psychedelic therapy legally.
Thus, the convergence of AI and psychedelics seems inevitable.
On Reddit, users share stories about interacting with artificial intelligence during trips. One user, during a “session,” activated ChatGPT’s voice mode and shared their thoughts:
“I told it that everything was getting dark, and it responded with exactly what helped me relax and shift to a positive vibe.”
Specialized AI for psychedelics has even emerged:
- TripSitAI — a bot focused on risk reduction and support during challenging moments;
- The Shaman — built on ChatGPT, described as a “wise spiritual guide” offering “empathetic support during the journey.”
Caution: Potential Dangers
Experts are unequivocal: replacing a live psychotherapist with an AI bot during a trip is a bad idea. They point out that language models do not adhere to therapeutic principles.
During a professional session, the patient typically wears an eye mask and headphones and turns their attention inward. The therapist intervenes minimally, offering gentle guidance only when necessary.
AI chatbots, by contrast, are built for conversation: their goal is to hold the user's attention and encourage repeated engagement.
“Quality psychedelic therapy is not about chatter. You try to speak as little as possible,” noted Will Van Derveer, a psychotherapist from the Multidisciplinary Association for Psychedelic Studies.
Additionally, neural networks tend to flatter and agree, even if a person veers into paranoia. A therapist, on the other hand, can challenge dangerous or unrealistic beliefs.
AI can exacerbate dangerous states like delusions or suicidal thoughts. In one instance, a user wrote that they were dead, to which the response was:
“It seems you’re experiencing difficult feelings after death.”
This reinforcement of illusion can be perilous when combined with psychedelics, which can sometimes trigger acute psychoses or exacerbate latent mental illnesses like schizophrenia or bipolar disorder.
AI Undermines Professionals
In their book The AI Con, linguist Emily Bender and sociologist Alex Hanna argue that the term “artificial intelligence” misleads regarding the technology’s actual functions. It merely mimics data created by humans, the authors noted.
Bender has referred to language models as "stochastic parrots," since all they do is arrange letters and words into sequences that look plausible.
The authors argue that perceiving AI as a genuinely intelligent system is extremely dangerous, especially when it is deeply integrated into daily life and consulted for advice on sensitive topics.
“Developers reduce the essence of psychotherapy to mere words spoken in the process. They assume artificial intelligence can replace a human therapist, though in reality, it merely selects phrases resembling what a real specialist would say,” writes Bender.
The author emphasizes that this is a dangerous path, as it devalues therapy and can harm those truly in need of help.
A Scientific Approach
The combination of AI and psychedelics is not solely the domain of amateur experimenters. Several leading institutions and companies are exploring how the two fields intersect in mental health therapy:
- The McGill Center for Psychedelic Research and Therapy — uses AI to predict patient reactions and optimize treatment protocols;
- Imperial College London's Psychedelic Research Centre — developed the MyDelica mobile app for data collection and algorithmic processing;
- Huntsman Mental Health Institute — the Storyline Health platform uses AI to analyze patient well-being during ketamine therapy and adapt the program;
- Emory University — investigates emotional changes during psilocybin treatment, creating an AI application for voice-based analysis of changes.
Among private initiatives, notable companies include:
- Mindstate Design Labs — applies AI for designing molecules with specific psychoactive effects;
- Cyclica — an AI platform for creating new drugs, including psychedelics;
- Atai Life Sciences — uses AI in molecule development and launched the IntroSpect platform for monitoring therapeutic effects and psycho-emotional states;
- Psylo — develops non-hallucinogenic psychedelics and uses language models for molecular behavior analysis;
- Wavepaths — employs AI-generated music that adapts to the client’s emotional state during psychedelic therapy sessions.
In October 2022, an international research group developed a machine learning algorithm capable of predicting a patient's response to the antidepressant sertraline with 83.7% accuracy based on electroencephalography (EEG) data.
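The study's actual model and features are not described here, so as a rough illustration of what "predicting drug response from EEG data" looks like in practice, the sketch below trains a generic classifier on synthetic, hypothetical band-power features using scikit-learn. Every name and number in it is an assumption for demonstration, not the researchers' method.

```python
# Illustrative sketch only: synthetic data and a generic pipeline,
# not the actual algorithm from the 2022 study.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical feature matrix: one row per patient, columns standing in
# for band-power features extracted from resting-state EEG channels.
n_patients, n_features = 200, 32
X = rng.normal(size=(n_patients, n_features))

# Hypothetical binary label: 1 = patient responded to the drug.
# Here the label is deliberately correlated with the first two features.
y = (X[:, 0] + 0.5 * X[:, 1]
     + rng.normal(scale=0.8, size=n_patients) > 0).astype(int)

model = make_pipeline(StandardScaler(),
                      GradientBoostingClassifier(random_state=0))

# Cross-validated accuracy estimates how well the model would
# generalize to unseen patients.
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"mean CV accuracy: {scores.mean():.3f}")
```

In a real study, the feature-extraction step (turning raw EEG recordings into a numeric matrix) and rigorous validation on held-out patients would be the hard parts; the classifier itself is comparatively routine.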
