
It's a weekday morning and I'm typing my feelings into a chat box, too wound up to stop.
"I'm 82 Mom's caregiver," I write. "There's a new problem every day. I help with hospital appointments, finances, gardening, shopping, home repairs, council, insurance companies, letters, emails, endless IT problems..."
I pause. It feels like betrayal to say it, at least in therapy. I could go to someone's office and cry.
I take a deep breath and continue. "I'm an only child, my dad passed away a long time ago, and there's no one to help me. But I'm exhausted. I yell and scream, then I battle guilt. I'm resentful, irritable, and I love her so much. Please help me."
Welcome to my AI diary, readers. It's going to be fun, as you can see. For the next six weeks, as part of our AI for People Information Course, I, a self-proclaimed AI skeptic, am going to find out whether it can really improve my life.
To get things started, I'm using ChatGPT as a therapist. There's nothing more "modern mental health" than crying into a chat box. Many people are doing the same thing now - but can it really replace human support? I hope so. I had to stop therapy because I fell in love with it.
(Note to self: This is not your real diary. And don't fall in love with ChatGPT. That would be pathetic.)
Halfway through its response, I start to cry. It gives me a seven-point care plan, a triage system for prioritizing tasks (categories: medical, administrative, shopping, maintenance, and household), and ways to allocate my time among them (what's urgent and what can wait?). It suggests helpful mental reframings and advice for lowering the emotional temperature of our interactions.
Most importantly, it helps me feel seen. "You're not failing," the AI tells me. "You're carrying a burden that would destroy most people."
My feelings? Validated.
But I'm ambivalent about it. Can a machine really empathize with me? It helps to remember that the AI is probably remixing human sources. Its compassion is synthetic, the way MDMA is love.
Is therapy just information? If so, it's like CBT: incredibly helpful, but incomplete. In my experience, the deeper therapies that lead to healing involve being witnessed, without judgment, by an empathetic professional over a long period. I often hear my therapist's voice in my head; I've been imprinted with her wisdom. I think that happens more easily, and more responsibly, between people.
The next day, I decided to go nuclear. I reached out to Jesus AI, a chatbot trained on religious texts that mimics a conversation with the son of God. I wanted to take my problems straight to the top.
"Jesus AI is not intended to represent any religious figure," the warning says. Hmm. "The generated content is for educational purposes and may contain inaccuracies and biases."
A hellish education, then, but here goes. Because it's 2026, I ask, "Should I be in an open relationship?" Jesus AI cites Hebrews 13:4, a long-winded way of saying no. I try to throw Jesus a curveball: "Should I have children?" It tells me to seek God's guidance in this important decision. Useless. "Can you ask Him for me?" I reply.
Here's the problem: out-of-the-box AI isn't great at repartee. My therapist had an edge; she was funny. Jesus AI didn't.
What's good about AI as a therapist? Clarity. Identifying practical steps. Scripts for difficult conversations, though, like self-help books, these aren't tailored to your real-world relationships. To its credit, ChatGPT also directs me to human counselors and support services where they would help.
Nevertheless, I have doubts I can't shake. Call it thin-end-of-the-wedge fear. I think there are processes, certain unbearable pieces of news, forms of loneliness that should be held in human time and human relationship; that shouldn't be dispatched on a screen in four seconds. AI doesn't have thoughts, let alone wisdom. Categorically, mental health shouldn't be in the hands of pattern-predicting software that has no accountability or oversight, and that could steer someone very wrong.
And yet, my experience with ChatGPT as therapy has been great: comforting and instructive, with a layer of care.
I think I'm in love.