I’ve struggled with an eating disorder since I was a young girl. Therapy, inpatient treatment, halfway houses -- recovery has been the longest road of my life. When your brain is caught in that loop of perfectionism and self-harm, an endlessly patient companion ready to strategize with you feels like the most dangerous thing in the world.
So when I found Mallary Tenore Tarpley’s Teen Vogue article, “AI Therapy? How Teens Are Using Chatbots for Mental Health and Eating Disorder Recovery,” I was immediately drawn to her exploration of the link between AI and recovery.
Glossy magazines contorted what I considered an ideal body and aesthetic. But for younger people, a deep connection to social media, coupled with the rise of influencer culture, #skinnytok and the constant conversations we’re having with AI companions, is adding layers of pressure and confusion over how to feel secure in our bodies.
Today’s AI tooling can help you build a business, write a book or, in darker cases, become better at an addiction or disorder that is already trying to consume you. This is the reality for some young women using AI in 2025, as Mallary reports.
I was surprised and distressed to learn young people are already programming AI to act as anorexia coaches. Imagine a voice that never sleeps, never pushes back, never intervenes -- only encourages the behaviors that literally destroy you. It sounds extreme, but it is already here.
We like to think technology is neutral. That’s a nice idea … that it’s just a tool, like fire that can cook your dinner or burn your house down. But our use of technology is not neutral, because this technology is highly adaptive and intimate.
I’ve recently been thinking about how AI is becoming an evolutionary companion. It doesn’t judge, it doesn’t have oversight, it just helps you get where you’ve told it you want to go. Which means the danger and the promise are the same thing.
We are soon to be living in an era of democratized executive assistants that anticipate our needs before we articulate them.
There was further proof this week when I read the announcement from Fidji Simo, OpenAI’s CEO of Applications, about Pulse, OpenAI’s new “proactive, steerable assistant” that can predict your goals and needs before you may be “consciously” aware of them.
There’s a lot at stake.
The media, writ large, has handed us impossible ideals -- but the models on the screen didn’t follow us home in the same way. This new tech can. It shadows you, anticipates you and feeds your desires back to you.
That’s the paradox Mallary captures so well in her reporting. A tool powerful enough to scaffold recovery can also perfect the pathology. Which means the question isn’t whether the technology is good or bad, but whether we can build the guardrails before it becomes the invisible hand shaping how a generation survives.
Please listen to my full conversation with Mallary here: AI anorexia coaches and the future of treatment