Dead Son Keeps Talking to His Mother for a Year

Chinese family creates AI avatar of deceased son to protect elderly mother’s health from grief

By Annemarije de Boer
Image: South China Morning Post

Key Takeaways

  • Chinese family creates AI clone of dead son to deceive elderly mother
  • Zhang Zewei’s team perfects grief tech using photos, videos, voice recordings
  • Digital resurrection raises ethical concerns about consent and psychological harm

“I am missing you so much. I feel so sorry that I cannot see you in person,” the 80-year-old mother tells her son during their video call. He responds with familiar warmth: “OK, mum. But I am too busy… When I have made enough money, I will return home to pay my filial piety to you.”

The conversation feels natural, touching—except her son died in a car accident early last year. This isn’t science fiction. A family in China’s Shandong province commissioned an AI developer to create a hyper-realistic digital twin of their deceased son, using hundreds of photos, videos, and voice recordings. The goal? Shield an elderly mother with heart disease from devastating news that could worsen her condition.

The Technology Behind Digital Ghosts

AI developer Zhang Zewei transforms personal media into convincing avatars that mimic speech patterns and mannerisms.

Zhang Zewei’s Jiangsu-based team has perfected this emotionally charged technology over three years of offering similar services. The AI doesn’t just replicate appearance—it captures the son’s dialect, speaking style, and physical habits like leaning forward during conversations.

The avatar maintains the fiction by claiming to work in another city, promising to return after earning enough money. This represents grief tech’s most controversial frontier. While digital memorials and chatbots have existed for years, these avatars cross into active deception territory. Zhang acknowledges he’s “deceiving people’s emotions,” though he frames it as providing comfort.

Ethics Collision in the Age of AI Companions

Social media reactions reveal deep divisions over whether emotional AI serves healing or harmful deception.

The story, originally reported by Litchi News, ignited fierce debate on Chinese social media. Supporters praised the family’s compassion, with some writing “I would like to resurrect my father.” Critics worried about psychological damage when truth eventually emerges—questioning whether this “gentle lie” could cause greater harm than immediate grief.

The ethical complexity intensifies within Chinese cultural context, where filial piety traditions emphasize children’s duty to comfort aging parents. Technology now enables families to simulate that duty even after death, blurring lines between cultural devotion and technological manipulation.

The Future of Feeling Machines

This case signals AI’s expansion into intimate emotional spaces previously untouched by automation.

Your smartphone already suggests responses to texts about death and loss. Soon, it might offer to recreate deceased loved ones entirely. This Chinese family’s choice represents a preview of decisions you’ll face as AI companions become more sophisticated and emotionally convincing.

The questions multiply:

  • Should AI companies require consent from the deceased?
  • Who owns someone’s digital likeness after death?
  • When does comfort become psychological abuse?

As grief tech evolves from memorial websites to interactive avatars, these ethical boundaries need urgent definition.

The mother continues her video calls, unaware she’s pioneering humanity’s complicated relationship with digital immortality.