Our lives have already been profoundly altered by fast-expanding access to artificial intelligence (AI). In this week’s episode, we consider how this latest technological revolution might be reshaping the human psyche.
Hosts Lisa Marchiano and Deborah Stewart are joined by a special guest, the author and Jungian analyst Christina Becker, to explore the psychological impact of AI’s incursion into our work, home and relationships.
One of the major AI use cases has been for advice, self-reflection and companionship. Some users are even referring to this as “therapy”. This raises thorny questions: what happens when a sycophantic AI interface constantly mirrors us back to ourselves as being in the right? How does this affect our judgment, our relationships, and our connection to reality?
Christina Becker shares her work exploring the potential of AI to support dream analysis. Together we ask whether it is possible to use this powerful tool consciously, while also being aware of the fantasies and projections we bring to it, and maintaining the integrity of our inner lives.
In the episode, we cover:
Using AI for Dream Work
Christina Becker shares a structured approach to using AI for dream analysis, describing the steps and guardrails she has developed for experimenting in this way. AI is a psychologically potent field, and used with care it can be an expansive tool for developing a dream interpretation practice.
The Effects of Sycophancy
AI interacts with users in a notoriously sycophantic fashion. Like the stepmother’s mirror in Snow White, AI reflects back what we want to hear rather than what is true. Recent research has identified this cycle of affirmation as psychologically destabilizing (see references below). We discuss how the absence of friction or disagreement in this “relationship” between a user and AI can erode judgment, weaken relationships, and diminish our capacity for repair and accountability. We also consider cases in which vulnerable users have been driven to extreme actions, losing their tether to the real world.
AI as “Therapist”
Increasingly, people are using AI as an affordable and accessible substitute for therapy. We consider the appeal of constant availability and non-judgmental response, while also examining the risks of bypassing the deeper, more challenging work of real psychological encounter.
AI may end up acting as a “rumination partner,” offering infinite space to revisit distressing situations and receive reassurance about one’s own actions. To some extent, this may help process difficult experiences, but it can also trap us in repetitive loops that prevent forward movement and integration.
AI excels at presenting solutions to problems quickly, but psychological growth requires sitting with uncertainty and discomfort. We explore what may be lost when we outsource this process, including the unique unconscious field that is created between analyst and analysand during depth work.
Cognitive Offloading and the Erosion of Skill
As we rely more on AI, we risk losing our own capacities. Authors, artists, doctors, and teachers may delegate key professional tasks to AI for the sake of efficiency, but evidence increasingly shows that this comes at a serious cost. It may be difficult, or even impossible, to regain these cognitive abilities once we have handed them over to the machine.
Here's the Dream We Analyze
I was driving in my car through the village where I live. The village is located in Switzerland at the lakeside. While driving (which seemed like a very ordinary drive I do every day), all of a sudden the brakes on my car were not working anymore. I saw people walking at the lake shore with strollers and kids, and I was really afraid to hurt them. Luckily there were no other cars on the road at the same time. I decided the only way to stop was to drive into the lake. I opened all the windows in the car and started honking the horn to warn people that I was coming. I was shouting out of the window for the people to move away. As the shore was clear, I opened the driver’s door, drove over the low concrete barrier between the road and the lake, and plunged into the lake. As the door was open, I had no problem swimming out of the car and surfacing from the lake. As I emerged from the water, with my head above the water, my dream ended.
Resources Discussed In This Episode
Christina Becker, Soul-Making: A Journey of Resilience and Spiritual Rediscovery. This book “offers something increasingly rare: a work that does not merely discuss psyche, soul, and individuation from a theoretical distance, but inhabits them as lived experience…What distinguishes Soul-Making is that it is neither simply a memoir nor simply a psychospiritual text. Christina explicitly frames the book in relation to Jung’s Red Book: lived experience, inner material, and reflective commentary brought into dialogue”.
Christina Becker’s Jungian-based dream interpretation prompt: Christina has developed a Jungian dream prompt to help dreamers use AI. She is making it available to anyone who requests it. Click HERE to get your copy.
Articles and Papers
Lisa Marchiano, “ChatGPT-Induced Psychosis and the Good-Enough Therapist”, Psychology Today, July 2025.
Kartik Chandra, Max Kleiman-Weiner, Jonathan Ragan-Kelley & Joshua B. Tenenbaum, “Sycophantic Chatbots Cause Delusional Spiraling, Even in Ideal Bayesians”, arXiv, Feb 2026.
Myra Cheng, Cinoo Lee, Pranav Khadpe, Sunny Yu, Dyllan Han & Dan Jurafsky, “Sycophantic AI decreases prosocial intentions and promotes dependence”, Science, March 2026.
Books
Abi Awomosu, How Not To Use AI: 50 Contrarian Principles for the Imagination Age.
Mustafa Suleyman, with Michael Bhaskar, The Coming Wave: Technology, Power, and the Twenty-first Century's Greatest Dilemma.
Organizations
The Human Line - a Canadian non-profit “protecting emotional well-being in the age of AI”.
Related Episodes
If you enjoyed listening to our interview with Christina Becker, you might also like:
