Apple's AI Health Coach Failed: What 'Mulberry' Cancellation Means for Mental Health Tech
Apple reportedly shelved its AI-powered health coaching service after years of development. Here's why even Apple couldn't crack AI mental health—and what it reveals about the industry.
Apple—the company that revolutionized smartphones, tablets, and smartwatches—just quietly admitted defeat in AI mental health. Its ambitious health coaching service, codenamed "Mulberry," is reportedly on life support.
According to Macworld, the project has faced "significant delays and potential cancellation after multiple setbacks." If Apple, with its nearly $3 trillion market cap and world-class AI research teams, can't crack this problem—what does that tell us?
What Was Mulberry?
Apple's vision was ambitious: an AI-powered health coach integrated into the Apple ecosystem. Using data from the Apple Watch, iPhone, and the Health app, Mulberry would provide personalized wellness guidance—including mental health support.
The concept made sense on paper:
- Unmatched data access — Apple has more health data than almost any company on Earth
- Hardware integration — Watch sensors could detect stress, sleep quality, and heart rate variability
- Trusted brand — Users already trust Apple with sensitive health information
- Distribution — Instant access to billions of devices
So what went wrong?
Why AI Mental Health Is Harder Than It Looks
Apple's struggles reveal something important: AI mental health isn't a technology problem. It's a human problem.
1. Liability Is Terrifying
When your AI health coach gives fitness advice and someone pulls a muscle, that's a minor issue. When your AI mental health coach gives advice and someone hurts themselves, that's a lawsuit, a congressional hearing, and a PR nightmare.
Apple is notoriously risk-averse. Mental health AI carries risks they probably couldn't stomach.
2. "Good Enough" Isn't Good Enough
Apple's brand is built on polish. They don't ship products that feel half-baked. But AI mental health support is inherently imperfect—it can't replace human therapists, will sometimes give unhelpful responses, and needs to clearly communicate its limitations.
That messy reality doesn't fit Apple's "it just works" philosophy.
3. Mental Health Requires Specialization
Apple is good at many things. But building therapeutic AI requires deep expertise in psychology, clinical practice, crisis intervention, and cultural sensitivity. It's not something you can bolt onto existing health features.
The companies succeeding in AI mental health are laser-focused on it. It's their entire mission—not a feature in a broader health app.
4. Regulatory Uncertainty
The FDA is still figuring out how to regulate AI mental health tools. Apple likely didn't want to navigate that uncertainty, especially given ongoing scrutiny of tech companies and mental health.
What This Means for the Industry
Apple's retreat is both a warning and an opportunity:
The Warning
Big tech can't simply buy or build their way into mental health. The problem requires different expertise, different risk tolerance, and different metrics of success than consumer electronics.
Google Health has similarly struggled. Meta's mental health initiatives have been controversial. Amazon's healthcare ventures have stumbled.
Mental health isn't a feature. It's a mission.
The Opportunity
Apple stepping back leaves space for focused, mission-driven companies. The market isn't going to Big Tech—it's going to specialists who understand that mental health support requires:
- Authentic conversation — Not corporate-safe platitudes
- Deep memory — Understanding each user's unique situation
- Clear limitations — Honest about what AI can and can't do
- Crisis protocols — Knowing when to escalate to humans
What This Means for You
If you were waiting for Apple to solve AI mental health, it isn't coming. The company that makes your phone isn't going to be your therapist.
But that's probably for the best.
Mental health support works best when it's the primary focus—not an add-on feature to sell more watches. When you're struggling at 3AM, you want an app built specifically for that moment, not a general wellness platform with mental health bolted on.
The Focused Players Are Winning
While Apple retreats, specialized AI mental health apps are gaining traction. Companies that do one thing well—emotional support through AI—are building the features users actually need:
- Voice-first interaction for natural conversation
- Memory that builds across sessions
- 24/7 availability when human support isn't accessible
- Honest acknowledgment of limitations
Apple's Mulberry failure isn't the end of AI mental health. It's evidence that the space belongs to specialists, not generalists.
The Bottom Line
Apple couldn't crack AI mental health. That's not a failure of technology—it's a sign that this space requires focus, expertise, and a different approach than building consumer electronics.
The future of AI mental health isn't Big Tech. It's companies built from the ground up for emotional support.
Stella is built for one purpose: being there when you need support. Voice-first, memory-enabled, designed for the moments Apple couldn't figure out. Learn more.
Struggling with anxiety? Stella remembers your triggers so you don't spiral the same way twice.
Get Early Access