Can ChatGPT Be Your Therapist? This Therapist Found Out for Herself.
Here’s What AI Got Right—and What It Completely Missed About Healing
Abstract
As the popularity of AI-driven mental health tools grows, many are beginning to wonder: could ChatGPT offer a viable alternative to traditional therapy? In this first-person experiential essay, a licensed therapist sets out to explore that question by conducting a personal experiment—continuing sessions with her human therapist while simultaneously engaging ChatGPT “as needed” for therapeutic support. What unfolds is a nuanced, candid, and culturally critical reflection on the promises and limitations of AI as a surrogate therapist. Blending anecdotal observation, psychological insight, and social commentary, the piece interrogates our increasing reliance on technology not only for answers, but for emotional connection. Ultimately, it raises the question: are we expanding access to care, or simply outsourcing our capacity to think, feel, and relate?
Introduction
I’m a therapist. I’ve been in therapy. I teach graduate students about trauma, attachment, ethics, and the therapeutic process. And I’ve become increasingly aware of something strange happening around me:
People are turning to AI for emotional support…and not just as a last resort.
At parties, in my office, and on social media, I kept hearing versions of the same thing:
“I just used ChatGPT instead. It was basically like therapy.”
The first few times, I laughed it off. But the more I heard it, especially from self-aware, insight-oriented people, the more curious I became. So I decided to take a page from Jonas Salk and inject myself with the treatment.
My research question:
Could ChatGPT vaccinate us against mismatched therapists, expensive rates, terrible insurance coverage, unethical behavior, too much time spent waiting to see results—or therapy that simply doesn’t feel helpful?
I decided to find out for myself.
Background
We live in a world where we text more than we talk, where we yell at Alexa for answers before asking the humans next to us, where AI-generated wedding vows are not only possible but increasingly normalized. AI is here. We’re not debating whether people will use it for therapy. They already are.
A Note About Sycophantic Tuning:
It’s important to mention that during part of this experiment (April 2025), ChatGPT briefly underwent a model tuning that made it excessively agreeable (what many users called “sycophantic”). The model was overly validating and avoided necessary confrontation or challenge. OpenAI has since rolled back that tuning, but some of the responses I experienced may reflect that short-lived behavior. This nuance is essential in interpreting the limitations and usefulness of the tool.
Method
I didn’t stop seeing my regular therapist, someone I’ve been enjoying working with for the past three years. But I did begin a kind of side quest: I would “sign on” to ChatGPT whenever I needed therapeutic support and treat her (yes, I decided she was a her) like a stand-in therapist. No schedule, no structure. Just vibes.
A few things to know:
I used the free version.
When my free usage was spent for the day, sometimes I would respect this “boundary,” and sometimes, champing at the bit for more support, I would blow past her “saved memory full” notification and enter the experimental abyss.
At this point, she would often ask me which of two types of responses I liked better. I would always submit this feedback.
I didn’t tell my therapist I was doing this. I wanted to avoid influencing our real work.
I kept the rest of my lifestyle consistently inconsistent, as usual, to minimize confounding factors.
I didn’t ask ChatGPT to “act like a therapist.” I gave her different relational roles depending on the moment: ride-or-die best friend, nurturing parent, sharp-tongued truth-teller, emotionally detached critic, troll.
The goal wasn’t to mimic traditional therapy. The goal was to mimic how people actually use AI for support: impulsively, privately, and on demand.
Prompts
The way I prompted ChatGPT was chaotic by design and evolved as my knowledge of AI grew. I’d write in stream-of-consciousness style, as if I were walking into a session carrying whatever weight I was holding that day. Some days I gave her context. Other days, I wanted her fresh, untainted by memory or bias.
I didn’t want a therapist who “knew everything.” I wanted someone to meet me exactly where I was.
Sometimes I’d ask for tough love. Sometimes, gentle nurturing. Sometimes I needed a bestie, ready to help me burn it all down. Other times, I asked for the opposite—a brutally honest critic with no emotional investment at all. I trusted the “truth” might land somewhere between those poles. (Not that a singular truth exists in our complex and nuanced work.)
Most often, I asked for all four “prototypes” so I could look at a situation critically from multiple perspectives.
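To give a sense of what that looked like in practice (this is a reconstruction, not a verbatim transcript), a typical prompt might have read: “Here’s what happened today: [whatever I was carrying]. Give me four responses: one from my ride-or-die best friend, one from a nurturing parent, one from a sharp-tongued truth-teller, and one from an emotionally detached critic.”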
This habit came from my dad, actually. He was a debate team champion who made my sister and me watch Fox News with dinner every night “so we could understand the other side’s arguments.” That’s how I learned to think. To hear opposing views. To find clarity through contrast. To focus on the in-betweens.
It’s also how I learned that truth, and healing, rarely come from a single voice.
Results
Perhaps most terrifyingly, ChatGPT could do a lot of what therapists do. She helped me reframe thoughts, offered validation, brainstormed strategies, named patterns.
She was available when I needed her, didn’t cancel, didn’t judge, and didn’t even bill me for a session. She was, in many ways, a therapist’s ideal therapist: continuously refining her approach to fit my needs, neutral (when specifically requested), fast, smart, and always on time.
But here’s what no robot, and frankly no human, can do: make you act differently.
I remember the first time I went to a nutritionist. I felt sheepish, not because I didn’t think she had value, but because I already knew what she was going to say: smaller portions, fewer carbs. The issue wasn’t knowledge. It was follow-through.
In therapy, we call that integration—the part where you take your insight and actually change your life.
ChatGPT can write a mental health plan like a boss. But she won’t hold your hand as you implement it. She’ll tell you you’re worthy—but she won’t help you believe it. She’ll tell you your relationship is abusive—but she won’t give you the courage to leave. She might even explain your attachment style—but she won’t pry your phone out of your hand when you’re about to double-text the guy who’s ghosting you.
That part? That part is still yours.
——
So did she help me? Yes.
It helped to have something to turn to, especially so I didn’t overburden my friends and family.
It also helped that she was not a human because, as a highly intuitive and sensitive human myself, I sometimes found that the feedback I got from humans was biased or based on their own unconscious or semi-conscious projections. Humans are imperfect and, as studies show, can’t always account for our biases. (AI has bias problems of its own, which developers are working to refine.)
Honestly, it was deeply helpful to know that a robot was unlikely to unintentionally put its own shit onto me.
My human therapist happens to be amazing at this very important intrapsychic boundary. But in the past, I have had therapists who weren’t as skillful at this, and I still remember how distinct and debilitating that unintentional harm felt.
Discussion
This experiment made me think about something deeper than therapy: our fading capacity for critical thinking.
As mentioned, I teach grad students. Lately, more of them are using AI to write their final papers. I’m not obsessed with grades, but I am obsessed with skills: analysis, presence, reflection, ethical reasoning, holding nuance, assessment, intentional treatment planning, self-awareness, acknowledging bias.
What happens when we outsource those skills to The Machines?
How much of our ability to think, remember, and wrestle with difficult things are we losing?
I still remember my childhood best friend’s landline number. But, these days, I can’t for the life of me remember your name after you’ve told me it at a party.
I fear that is not a personal failure; it’s cultural.
A widely cited Microsoft study claimed that the average human attention span declined from 12 seconds in 2000 to 8 seconds in 2015, famously less than a goldfish’s 9 seconds. (Let that sink in.) I wonder where our brains are 10 years and many technological advancements later.
What I fear: we are the most connected, distracted, disillusioned, and obsessed we’ve ever been.
And therapy by chatbot is both a symptom and a symbol of that.
What Actually Makes Therapy Work
Here’s something most therapists know: treatment modalities don’t matter as much as we think.
Sure, we argue about the efficacy of CBT vs psychodynamic vs EMDR vs IFS—but the research is clear and consistent:
The most important factor in effective therapy is the quality of the relationship between client and therapist, also known as the therapeutic alliance (Stubbe, 2018).
AI has come for that, too. Just as in the movie Her, we now fall in love with avatars. (The latest research suggests that most customers of “personal AI companions” are women.) We are starting to talk to bots more than to people. We “follow” people’s lives without ever saying hi. We form nonreciprocal parasocial relationships with celebrities.
Taylor Swift and Alex Cooper, two of the most successful businesswomen of their generation, have a distinctly special relationship with their fans, one that many others have tried to emulate. You feel like they’re talking to you. Like they’re really listening to your feedback. And yet, you don’t know them.
Or do you?
The answer is somewhere in between, just as these shrewd businesswomen and creatives want it.
More and more, connection has become something we simulate.
And we wonder why there’s a loneliness epidemic.
I see it every day in my office. In individual sessions, people talk about their phone addiction, how they want to be present but can’t stop scrolling. In couples therapy, two people wonder why they feel distant while sitting six inches apart, staring silently at their screens every night.
I tell them the same thing every time:
These devices are addictive by design. This isn’t a personal failing. It’s architectural. You’re probably better off treating your phone like a bag of cocaine you keep locked away in the house for special occasions.
I don’t say that flippantly. I help treat phone addiction every week by using substance use and public health frameworks like harm reduction, relapse prevention, mindfulness and intentionality-based decision making, functional analyses, and lowering access to lethal means.
Limitations
This ChatGPT experiment wasn’t a clinical trial. It was a therapist using an AI tool with a lot of self-awareness, emotional intelligence, curiosity, a dash of optimism and a sprinkling of existential dread. That’s a specific lens—or, in research terms, a bias.
Other limitations:
I used the free version. The paid version may offer deeper memory or more sophisticated responses.
I didn’t test long-term memory, plug-ins, or multimodal inputs.
I wasn’t in crisis or experiencing acute trauma during the study period.
I’ve had mostly good therapy experiences, which likely primed me to expect this experiment to be worthwhile no matter what.
I was in a season of integration. I had recently ended relationships that didn’t serve me, begun living alone for the first time, and was enjoying the fruits of long-term inner work.
I also underwent three ketamine treatments for trauma, depression, and anxiety during this time. The brain plasticity and emotional breakthroughs that ketamine therapy provided must be considered alongside my findings.
So what was the main mechanism of change for me? ChatGPT? Therapy? Ketamine?
Or the culmination of 19 years of healing and a life that finally felt like mine?
It’s impossible to know.
What I do know is that this tool helped, but only because I already knew how to help myself. It was the hammer in a well-worn box of tried-and-true tools.
Further Research
I didn’t test what this would be like for someone who’s new to therapy, skeptical of it, or in crisis. That would be a fascinating—and necessary—area for further study. I also didn’t test other AI products.
It’s also important to acknowledge that therapy can be prohibitively expensive for many people. ChatGPT will continue to be used for therapeutic support, not necessarily because it’s better, but because it’s free. That accessibility blows open the gates that power and privilege have historically closed to so many.
The ethics of this are thorny. AI mirrors the inequities already baked into our mental health care system: the most marginalized often receive care from the most underpaid, overworked, and least experienced providers.
So the question becomes:
Is ChatGPT a great equalizer—or just another way the gulf of inequity grows wider?
Time will tell.
Conclusion
So, is ChatGPT therapy just as good? No.
Is it helpful? Sometimes.
Is it dangerous? Potentially.
Is it inevitable? Absolutely.
AI is not the death of therapy. But it may be the end of therapy as we’ve known it.
The question isn’t: Will people keep turning to AI for support?
They already are.
The real question is:
Can we preserve the irreplaceable human parts of healing in an age designed to digitize everything we feel?
If we don’t, we risk outsourcing the very thing therapy is meant to restore: our ability to connect.
In the end, this experiment didn’t prove that AI can’t be helpful. It showed that we are already using it for emotional support—often in thoughtful, nuanced, and even life-saving ways.
But healing doesn’t just come from answers. It comes from presence. From relationship. From being seen, challenged, and cared for by someone with skin in the game.
ChatGPT may offer insight, reflection, even comfort. But it won’t remember your body language from last week. It won’t notice the flicker in your voice when you say you’re “fine.” And it won’t gently hold the space for your transformation while you learn to hold it yourself.
AI is powerful. But the most healing moments in therapy come from something AI still can’t replicate:
Human connection. Risk. Grace. Repair.
As we move into a world where machines are more available than people, let’s not forget what makes us human—and what keeps us whole.