AI “griefbots” are chatbots designed to mimic the voice, messages, or personality of someone who has died, and they have recently been featured on TV shows like Lorraine. It’s easy to see the appeal. Who wouldn’t want one more conversation with someone they loved?

But while the idea sounds comforting, I don’t recommend griefbots as a tool for healing. Here’s why, and if you decide to try one anyway, how to protect yourself.

Why I Don’t Recommend Griefbots

1. There is no clinical proof that they help

We have research showing that structured online grief programmes (often CBT-based) can reduce grief symptoms. But to date, no trials have proven that griefbots ease grief.

2. They can distort memories

AI generates convincing but fabricated replies, putting words in your loved one’s mouth that they never actually said. Over time, that can blur precious real memories and create confusion.

3. Risk of prolonging grief

Chatting with a replica can keep people stuck in the yearning or avoidance patterns linked to complicated grief and Prolonged Grief Disorder.

4. Unsafe for young people

Children and teens may struggle to separate AI from reality, making it riskier for them. The accessibility of AI means that this age group can use these tools without adult support or knowledge.

5. Privacy concerns

Creating a griefbot means uploading sensitive, personal data. That raises consent and security issues. Be careful what you share.

If You Still Choose to Try One…

Set limits: No more than 10–15 minutes, 2–3 times a week.

Choose your moment: Avoid late-night or vulnerable times. Make sure you have someone you can turn to if you get distressed.

Ground yourself: Keep a reminder nearby — “This is AI, not my loved one.”

Talk it through: Debrief with a trusted person afterwards.

Protect your privacy: Don’t upload other people’s messages without their consent, switch off data-sharing where possible, and delete logs regularly.

Know your stop-rules: Step away if you feel worse, dependent, or stuck.


Healthier Digital Alternatives

Evidence-based grief programmes (often CBT-guided).

Journaling or memory apps that support reflection.

AI for prompts, not replicas (e.g. letter-writing support rather than simulations).

Final Thought

Griefbots aren’t evil, but they aren’t harmless either. They should never replace human support, therapy, or the natural grieving process.

Grief is painful because love and loss are real. Healing comes not from recreating the person we’ve lost, but from learning how to carry their love forward into our lives.

