This week on The Intersect I’m in conversation with Dr. Grace Blest-Hopley, a neuroscientist who is the co-founder of Hystelica, a UK-based company breaking new ground in psychedelic research by centering women’s biology.
I reached out to Grace because I kept seeing stories about AI stepping in as a ‘trip sitter.’ That idea felt wild to me. Could a machine really stand in the room when your mind is cracked wide open? When you may need a human the most? Since Grace is asking how hormones and lived experience shape the way psychedelics affect us, I wanted to hear what she had to say.
Overall, Grace pointed me to the idea of AI as a tool for preparation and reflection. Think about it: our cycles, moods and energy states shift daily. Most of us can’t remember exactly how we felt last week let alone last month. But with the right inputs, AI can help track and analyze those fluctuations.
A quick note of your symptoms in an app or spreadsheet can become data you can bring to your doctor. AI can surface patterns, summarize trends and perhaps give you a clearer sense of when you might feel more open or more fragile heading into a psychedelic experience (or just daily life!).
If you’re still asking whether AI can be your ‘trip sitter’ …

Grace uses a vivid analogy: sometimes on psychedelics you’re like a curious toddler, open, chatty and easy. In those moments AI might be fine company. Other times you’re more like a crying toddler, emotionally dysregulated and in need of someone’s hand.
No current chatbot can give you that kind of safety.
Full episode here: Should AI guide your psychedelic journey? A neuroscientist reveals the key to a good trip
Some of what I am reading this week:
The day ChatGPT went cold
OpenAI’s latest release, GPT-5, sparked backlash after users felt the bot had turned colder and less empathetic, with some describing it as ‘wearing the skin of my dead friend.’ This pushed OpenAI to bring back the old model for paying customers, a sign that people are grieving chatbots like … lost companions.
By Dylan Freedman (New York Times)
AI is a mass-delusion event
Three years into the AI boom, Charlie Warzel argues the biggest side effect is a collective sense of ‘unreality’: a mix of grief, shock and disbelief that leaves people wondering if they’re losing their minds. The cultural impact of generative AI may be less about efficiency and more about disorientation, what the author calls ‘psychosis as a service.’
By Charlie Warzel (The Atlantic)
The Wall Street AI vibe shift?
Are Wall Street and Silicon Valley suddenly wobbling on their AI obsession? As models stumble and hype cools, some are reading this as an early correction or the first hints of an AI winter. Given how much money is at stake, expect the industry to race to prove otherwise.
By Allison Morrow (CNN)
Google reveals AI’s energy footprint
This week Google began disclosing to users how much water, energy and emissions are tied to Gemini AI prompts, and it is also urging industry-wide standards. Gen Z is already vocal about AI’s energy use, so we should expect the volume on this to go up.
By Ben Geman (Axios)