Future Tense

What Jack the Ripper’s Victims Can Teach Us About Digital Privacy

Life after death is real when it’s the persistence of your data on an unimaginable future internet.

A lamppost glitching on a dark London street.
Animation by Slate. Photo by George-Morris/iStock/Getty Images Plus.

One hundred thirty-two years ago, the serial killer now known as Jack the Ripper murdered five women in the Whitechapel and Spitalfields districts of East London. Today, the mortuary photos of the victims (and one crime scene photo) are strewn across blogs, websites, social media platforms, and YouTube videos. Sometimes students of Victorian history use the images for educational reference; other times, online tabloids deploy them as lurid clickbait.

Many of us labor under the assumption that we understand the risks of sharing and storing our personal data online, but we have limited insight into how technology will evolve in the next decade, let alone the next 132 years. Emerging technologies, like A.I.-powered virtual avatars, will introduce a host of legal and ethical complications we cannot foresee. The exponential growth in the amount of deceased people’s data online will only compound the problem.

The past decade has seen a surge in the presence of dead people online. As of 2016, 30 million Facebook users had passed away, and an estimated 8,000 Facebook users die every day, which amounts to nearly 3 million new profiles of deceased users every year. Facebook has responded by creating memorialized accounts, which essentially freeze a profile in time and allow Facebook friends to leave comments. You can appoint a legacy contact to manage your account after you die, but according to a 2019 survey, two-thirds of Facebook users haven’t bothered to do so.

It’s easy to forget, given its ubiquity, that the internet is only about 25 years old (depending on how you define “the internet”). It may appear that today’s digital mores are fixed, but they’re simply the default settings of a culture still learning to navigate a relatively new technology. By contrast, planning the allocation of one’s assets after death (that is, estate planning) stretches all the way back to the Roman Empire. Most people today with assets that can be passed on have some plan in place, whether it’s in the form of a will or life insurance. It’s only a matter of time before estate planning principles are regularly applied to digital assets. Everything goes digital eventually; banking, which has an equally long history, is a case in point.

Postmortem privacy is not limited to digital estate planning. It also concerns protecting your data against invasions of privacy by the technology companies that store and harness it, as well as by those who may lay claim to it, such as family members and hackers. There is a growing unease in popular culture about leaving an indelible digital footprint, darkly encapsulated by the “Delete My Browser History” meme. Many people don’t want their private information made public, their likeness used, or their accounts active, even after their deaths.

In 2016, a spam sexbot assumed control of the verified Twitter account of New York Times media columnist David Carr, to the dismay of his many followers, as Carr had passed away a year earlier. In a separate hack, Carr’s associated email address sent spam emails to its address book contacts. Though Carr’s accounts were restored within hours, the average user might not fare as well in a similar situation.

When accounts are vulnerable, so are digital assets, such as images and videos. In 2012, a couple of dating websites used the image of a deceased soldier in an online ad aimed at women. The soldier, Army 2nd Lt. Peter Burks, had been engaged to be married when he was killed in Iraq five years earlier.

Add to the mix a new wave of cybersecurity threats posed by 5G networks and emerging technologies like “deepfake” videos, which could be maliciously used to resurrect the dead, and a dystopian vision of postmortem data breaches starts to coalesce.

But postmortem data issues needn’t be dystopian to pose moral conundrums. Consider the simple question of whether a family should be able to access the email account of a deceased loved one who has left no clear instructions. That very question was answered by the Massachusetts Supreme Judicial Court, which ruled in favor of the decedent’s family in Ajemian v. Yahoo Inc. John Ajemian’s family gained access to his Yahoo email account in order to coordinate his memorial service and identify his assets, but what if his emails had contained sensitive information he did not want his family to see?

Cases like Ajemian v. Yahoo paved the way for the Revised Uniform Fiduciary Access to Digital Assets Act, or RUFADAA, which gives account owners the ability to plan their digital estates. Because it’s a U.S. model law, states are free to interpret and implement RUFADAA in different ways. This can create interstate conflicts, according to Edina Harbinja, senior lecturer in media and privacy law at the Aston Law School in Birmingham, England. (According to the National Conference of State Legislatures, 45 states have enacted some form of RUFADAA or UFADAA.) But RUFADAA’s most glaring limitation, Harbinja told me, is its potential conflict with the Electronic Communications Privacy Act, the federal statute cited by Yahoo in its attempt to block the transfer of Ajemian’s emails to his family. The statute prohibits electronic-communication companies from disclosing a person’s communications to third parties, even family, without that person’s consent.

Most digital assets have one thing in common: They are controlled by intermediaries, the Big Tech companies, through their terms-of-service contracts, and in the absence of applicable regulation, ownership of the data defaults to the terms of those contracts in many jurisdictions. “It is still uncertain under which circumstances the 9th Circuit Court would grant access to the deceased’s personal communications stored on one of the tech giants’ servers under the ECPA,” said Harbinja, referring to the federal court whose jurisdiction includes Northern California, where the majority of the largest tech companies are based.

As the legislative zeitgeist has shifted toward protecting individual account owners, deceased users have been largely left behind. Governments have introduced consumer data protection laws, like the EU’s General Data Protection Regulation and the California Consumer Privacy Act, but these pieces of legislation were formed as responses to rampant, black-hat consumer data mining practices perpetrated by companies like Facebook and the defunct Cambridge Analytica. The GDPR allows for some postmortem data protections, but how those protections are implemented varies widely across the European Union, because member states can decide for themselves whether to extend them to the deceased. Many have elected not to, on the grounds that deceased account owners cannot consent to the processing of their data.

Most legislative advances in postmortem data protections are limited to digital asset transmission, or digital estate planning, like RUFADAA. This is a necessary step, but it’s only one piece of the puzzle.

Harbinja makes a case for “nuanced and comprehensive legislation, not only in the area of succession and estate planning, but also in privacy law, data protection, property, contract and criminal law, as these often interact and conflict when we think more deeply about digital assets.” It may seem surprising that we are this far into the digital age without sound regulation in this area. But passing new legislation takes time, especially in the context of technology that is in constant flux.

Consider the case of nonconsensual pornography, colloquially known as “revenge porn.” For years, purveyors of digital pornography capitalized on the lack of legal precedent, and the accompanying murkiness, to hide behind the defense of user-generated content. But today, laws protecting victims of nonconsensual pornography exist in 46 states, D.C., and one territory. The laws aren’t perfect, as we see in a recent Minnesota case of “revenge porn” harassment, but they’re better than they were 10 years ago. It’s important to note, however, that the civic and political will for nonconsensual pornography laws is greater than the will for postmortem data protection, since such laws deal with the rights of living persons who have suffered immediate and persistent injury. Still, a parallel can be drawn: It takes time for the law to catch up to inequities produced by emerging technologies.

The Whitechapel victims suffered the ultimate theft of consent, the taking of their lives, a theft that has been reprised ad infinitum through the nonconsensual reproduction of their images online. The internet was created more than a century after their murders, but they have a digital legacy nonetheless. Theirs is an unlikely data protection cautionary tale and a clarion call for fortified postmortem privacy rights. Injustice invariably precedes law. The victims of Jack the Ripper will never get justice, but perhaps the egregious lack of consent they have endured will help make the case for greater accountability in the use of deceased persons’ digital assets for present and future generations.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.