Exploitation of posthumous digital data looms without global regulation

CO-EDP, VisionRI | Updated: 07-10-2025 22:07 IST | Created: 07-10-2025 22:07 IST

With digital technologies advancing rapidly, a new form of vulnerability is emerging, one that persists beyond death. A recent study published in AI & Society warns that the lack of governance around Human Digital Remains (HDR) and Digital Human Twins (DHT) could lead to exploitation of the personal data and virtual identities of the deceased.

The study, titled “From Bones to Bytes: Anticipating and Addressing the Governance Challenges of Human Digital Remains and Posthumous Digital Human Twins”, draws a historical parallel between nineteenth-century grave robbing and today’s potential misuse of posthumous data. The authors argue that without immediate and targeted regulations, the economic value of HDR and DHT will invite new forms of abuse by platforms, data brokers, and commercial AI developers.

Why posthumous digital data is at risk

The research underscores that current privacy and data protection laws, including the EU’s General Data Protection Regulation (GDPR) and the AI Act, primarily safeguard the rights of living individuals, leaving the data of the deceased in a legal void.

The authors note that while a few jurisdictions, such as France, Spain, and Italy, provide limited posthumous data rights, often tied to heirs, most countries do not. In the United States, state-level publicity and consumer protection laws mainly protect celebrities. This fragmented legal landscape creates opportunities for unauthorized data use, from resurrecting a deceased person’s likeness in advertisements or entertainment to training generative AI models without consent.

The study highlights that technological advances are expanding both the volume and commercial value of HDR. Data stored across social media accounts, cloud services, metaverse platforms, and personal devices can now be aggregated into high-fidelity DHTs, AI-powered replicas capable of interacting with the living. This raises new risks, including identity misuse, reputational damage, and manipulation through digital resurrection.

Governance gaps and urgent ethical concerns

The authors argue that existing laws are insufficient to address the unique ethical and social challenges posed by HDR and DHT. Intellectual property laws rarely protect an ordinary individual’s digital persona, while platform terms of service often prioritize the interests of service providers over those of families or designated representatives.

Two examples illustrate these gaps. First, in cases of posthumous defamation, families have limited legal recourse when a deceased person’s reputation is harmed online. Second, posthumous data altruism, such as donating medical or social data for scientific research, lacks a clear legal framework, leaving potential public benefits unrealized.

A key ethical insight from the study challenges the traditional legal doctrine that the dead hold no rights. The authors argue that interest-based reasoning supports extending limited posthumous rights, since protecting dignity, reputation, and autonomy benefits both families and society at large. Such recognition, they contend, is crucial in a data-driven economy where the dead can still be commodified.

Policy roadmap for proactive regulation

The study offers a six-point governance blueprint to prevent a digital version of the historical abuses that once plagued physical graves. These measures aim to balance innovation, personal autonomy, and public interest:

  1. Establish HDR-specific anticipatory governance: Develop dedicated oversight bodies and fund forward-looking risk assessments and scenario planning.

  2. Create posthumous privacy and data protections: Enact laws granting limited, enforceable rights for data protection after death, empowering designated representatives to act.

  3. Protect autonomy through advance directives and trustees: Enable individuals to make binding decisions about their digital remains during their lifetime, and appoint trusted data stewards.

  4. Introduce a data-donor card for controlled sharing: Similar to organ donation, allow individuals to opt into posthumous data donation for research or public good, with auditable consent records (a minimal illustrative sketch of such a record follows this list).

  5. Extend the right to be forgotten to the deceased: Make it easier for families or trustees to remove or de-index data when consent exists.

  6. Update legacy and identity laws for DHTs: Provide legal options for erasure, memorialization, licensing, or donation of avatars and other DHT assets under robust safeguards.
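To make items 3 and 4 more concrete, the following is a minimal, purely illustrative sketch of what a machine-readable advance data directive with an auditable consent trail might look like. It is not drawn from the study: the class and field names (AdvanceDataDirective, ConsentEvent, PosthumousAction, and so on) are hypothetical assumptions, and the hash-chaining shown is only one possible way to make consent records tamper-evident.

```python
# Illustrative sketch only: a minimal "advance data directive" with an
# auditable consent trail. All class and field names are hypothetical
# assumptions, not terminology from the study.

from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
from enum import Enum
import hashlib
import json


class PosthumousAction(str, Enum):
    ERASE = "erase"              # delete or de-index the data
    MEMORIALIZE = "memorialize"  # freeze the account or avatar in a memorial state
    DONATE = "donate"            # release to approved research under set conditions
    LICENSE = "license"          # allow controlled use via the appointed trustee


@dataclass
class ConsentEvent:
    """One consent decision, linked to the previous event by its hash."""
    timestamp: str
    action: PosthumousAction
    data_category: str           # e.g. "social_media", "health_records", "avatar"
    granted: bool
    previous_hash: str

    def digest(self) -> str:
        payload = json.dumps(asdict(self), sort_keys=True, default=str)
        return hashlib.sha256(payload.encode()).hexdigest()


@dataclass
class AdvanceDataDirective:
    """A person's instructions for their digital remains, managed by a trustee."""
    subject_id: str
    trustee_contact: str
    events: list = field(default_factory=list)

    def record_consent(self, action: PosthumousAction,
                       data_category: str, granted: bool) -> ConsentEvent:
        prev = self.events[-1].digest() if self.events else "genesis"
        event = ConsentEvent(
            timestamp=datetime.now(timezone.utc).isoformat(),
            action=action,
            data_category=data_category,
            granted=granted,
            previous_hash=prev,
        )
        self.events.append(event)
        return event

    def verify_chain(self) -> bool:
        """Check that earlier events were not altered after later ones were appended."""
        prev = "genesis"
        for event in self.events:
            if event.previous_hash != prev:
                return False
            prev = event.digest()
        return True


if __name__ == "__main__":
    directive = AdvanceDataDirective(subject_id="person-123",
                                     trustee_contact="trustee@example.org")
    directive.record_consent(PosthumousAction.DONATE, "health_records", granted=True)
    directive.record_consent(PosthumousAction.ERASE, "social_media", granted=True)
    print("audit chain intact:", directive.verify_chain())
```

In practice, a record like this would only matter if backed by the enforceable posthumous data-protection laws the study calls for, and if held by a trustee or registry rather than by a single platform.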

The authors stress that proactive regulation must involve international coordination, given the cross-border nature of digital platforms and data flows. Without harmonized rules, they warn, jurisdictions with weaker protections could become havens for exploitative practices.

Overall, the study highlights the tension between emerging business models and public values such as dignity, informed consent, and equitable access to the benefits of digital technologies. Platforms, metaverse services, and AI developers already compete to capture and repurpose massive amounts of personal data. In the absence of strong governance, these private actors could determine the fate of HDR and DHT, leaving families and individuals with little control.

The authors call for citizen-centered frameworks, including user-friendly tools such as advance data directives and data-donor registrations, supported by enforceable laws. They recommend testing governance mechanisms through regulatory sandboxes in sectors such as healthcare, creative industries, and virtual reality platforms, where the stakes for posthumous data use are highest.

FIRST PUBLISHED IN: Devdiscourse