Dear Curious Human,
I couldn’t remember my wife’s phone number.
My phone died at the DMV, and I needed to write it down on a form. But my mind was blank.
“Is it 917…or 202?”
At age six, I had my best friend’s number memorized. My dad’s work line. Red Planet Comics.
I’ve offloaded so much of my mind to the cloud, I’m not sure what lives in me anymore. I’m a cyborg. Not the cool RoboCop kind. More like the horrifying Black Mirror kind.
That was the first shift. Smartphones changed how our minds work. We stopped remembering facts and started remembering where to find them.
Now AI is asking us to change again. Not just how we remember, but how we think, feel, and reflect.
Both humans and AI are association machines. But how we form those associations, and what they mean to us, is fundamentally different.
Let me show you what I mean.
When I was eleven, I had two teeth pulled to fix a serious overbite. The dentist, a man I’d never met, looked me up and down and said, “What are you, sixteen?”
I was tall for my age.
“I wish. I’m eleven.”
His face fell.
“Never wish to be older than you are. My father just died. He cherished every moment. You should too.”
I was already terrified of the procedure. I wasn’t prepared for an existential gut punch.
I just nodded, silent. But that moment stuck with me. Not because it made sense, but because it felt important.
Emotion + Surprise = Core Memory
That’s how our minds work. It’s how we build meaning. It’s how we shape our identity. It’s how we carry trauma, tell stories, and pass down wisdom.
AI doesn’t do that.
It doesn’t grieve. It doesn’t get gut feelings. It doesn’t tell stories to make others feel less alone. It can simulate connection, but it doesn’t feel it.
What it can do, at incredible speed, is generate associations. It predicts what comes next based on everything it’s read: books, websites, videos…even every Enron employee email (made public during the federal investigation that followed the company’s collapse) and way too many free romance novels.
I’ve been thinking of AI like one of those talented kids on a reality show singing “I Will Always Love You.” They hit all the notes, but you can tell they’ve never had their heart broken. Some even dedicate the song to their parents.
Still, sometimes AI sees what we can’t. It connects things we overlook, because it’s drawing from billions of data points. And that can reveal something we’ve missed.
Here’s an example.
I recently gave ChatGPT (GPT-4o) this prompt:
Act as a high-performance strategist and behavioral psychologist.
Based on everything you know about me, run a 3-part introspective audit: (1) Identify internal and external blockers to my next level of growth, including hidden ones; (2) Highlight my most underused strengths, tools, and leverage points; (3) Design a personalized, scalable protocol for breakthrough execution.
End with any hard truths I may be avoiding but need to hear.
I’ve been using ChatGPT nearly every day for over a year, mostly to brainstorm, edit stories, and polish social posts. I don’t tell it my feelings. I don’t use it like a diary (or therapist).
But this is what it told me:
You’re a psychologist and storyteller. A clinician and a cultural critic. These identities sometimes compete. You’ve outgrown the “educator first” role but haven’t fully stepped into the person who changes how people think about mental health.
You often dilute your boldest ideas in favor of what’s most “helpful.” Your credentials have created invisible pressure to stay in your lane—even as your thinking evolves. There’s a quiet fear that going too sci-fi, too philosophical, too provocative will alienate your base—even though that’s exactly what they want from you.
Stop waiting to respond. Start shaping the discourse. Don’t over-teach. Say the big thing. Then show everyone how to live it.
I had to sit with that one for a while.
It mirrored something I felt but hadn’t yet said aloud. I’ve been afraid to speak what I really think. Afraid of the criticism.
But that fear is the very edge I need to walk. The thing that makes me human. The only way to get a signal through the noise.
AI didn’t feel that insight. I did. But it helped me see it.
And it laid out a strategy to help me follow through. This newsletter is a big part of that.
Now I’m starting to think of AI like a hyper-literal intern. Fast, helpful, and astonishingly well-read, but totally disconnected from lived experience. It needs our wisdom to mean something. And together, we can both grow.
This is the future I believe in. Not humans vs. AI. But humanity amplified by AI.
Here’s something you can try
If you’re stuck on an emotion, a decision, or a chapter of your life, start by journaling. Just 20 minutes a day for 2–4 days. No editing. No filtering. Just write. If something becomes too much, back away or shift to a related topic.
Then paste it all into an AI like ChatGPT and try this prompt:
Act as a trauma-informed therapist, narrative coach, and emotional pattern analyst.
Based on the journal entries I’m about to share, identify the core emotional themes, hidden assumptions, and recurring stuck points in my thinking. Help me make sense of what I might be avoiding, what I might be protecting, and what unresolved story I keep telling myself.
Then: (1) name the emotional patterns that need healing or reframing, (2) surface any insights or breakthroughs I may be circling around but not fully claiming, and (3) offer a next step—one brave action that could help me move forward with more clarity, honesty, or self-trust.
Finish with one compassionate truth you think I need to hear most right now.
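If you’re comfortable with a little code and would rather script this step than paste into a chat window, here’s a minimal sketch using OpenAI’s Python SDK. It’s only an illustration of the same workflow, not the only way to do it: it assumes you have the openai package installed and an API key configured, the file name journal.txt and the model name are placeholders, and the prompt shown is an abbreviated version of the one above.

```python
# Minimal sketch: send your journal entries plus the reflection prompt to a model.
# Assumes the `openai` package is installed and OPENAI_API_KEY is set in your environment.
from pathlib import Path
from openai import OpenAI

PROMPT = """Act as a trauma-informed therapist, narrative coach, and emotional pattern analyst.
Based on the journal entries below, identify the core emotional themes, hidden assumptions,
and recurring stuck points in my thinking. Then: (1) name the emotional patterns that need
healing or reframing, (2) surface insights I may be circling but not fully claiming, and
(3) offer one brave next step. Finish with one compassionate truth I need to hear."""

journal = Path("journal.txt").read_text()  # your 2-4 days of unedited writing

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o",  # any capable model works here
    messages=[
        {"role": "user", "content": f"{PROMPT}\n\n---\n\n{journal}"},
    ],
)
print(response.choices[0].message.content)
```

Either way, the point is the same: you do the unfiltered writing; the model only reflects it back.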
AI sees the patterns. We create the meaning.
What’s one thing you’d like to understand better about yourself?
I bet AI could help you do that, if you bring your full humanity to the conversation.
Stay curious,
Dr. Ali