“Social media’s greatest success might be in its ability to create virtual communities not easily accessed in the real world,” Clare Murphy wrote in a 2019 Dame article titled “Social Media is Changing the Way Americans Process Death.” In the piece, Murphy explores how digital connections allow us to grieve differently, often outside of typical family or friend groups: “Communities bonded by loss are common, and often form among people who wouldn’t typically connect.” But there is a dark side to grieving in online community with unknown others, one that’s unearthed when your grief is broadcast out of context to an audience that’s not bearing witness to it in good faith. Darker still is the reality that your own death might all too easily become a source of content, made into viral fodder for an internet audience that has no idea who you were before death, but is desperately curious regardless.
That’s what happened to Ronnie McNutt. Early this September, I noticed TikTok users warning one another: If you see a video of a man sitting at a desk, keep scrolling. Users reached out to people they had never met, and never would, to help one another sidestep the video. If he’s at a desk, if he has a beard, keep scrolling. They warned each other that the image might appear out of nowhere in an otherwise typical video. You might be learning a new TikTok dance, or laughing at cats sprinting around a room, and then, suddenly, the video would appear. It was the 2020 take on the early 2000s jump-scare chain email. But it was worse, because it was a video of a death by suicide. And because it was real. On August 31, McNutt streamed his suicide on Facebook Live. Because Facebook had partially dismantled its usual team of content reviewers due to the COVID-19 pandemic, it claimed to have been unaware of the video, and the livestream stayed on Facebook for more than two hours—during which it was reuploaded to other social networks and shared in Reddit threads like r/MorbidReality. (It was quickly removed by Reddit moderators.)
The video found a home on TikTok, where users were able to splice clips from it into other videos and avoid detection by the platform’s algorithm. Like those early 2000s jump-scares, you might find yourself idly perusing the app when, suddenly, the bait-and-switch occurs, pushing you from calm and inattentive to gripped with unexpected terror. In death, McNutt became inescapable. TikTok tried, and struggled, to stop the spread of the video. “On Sunday night, clips of a suicide that had originally been livestreamed on Facebook circulated on other platforms, including TikTok,” a TikTok spokesperson told CBS News. “Our systems, together with our moderation teams, have been detecting and removing these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide.” It was later revealed that the graphic video apparently went viral because “of a coordinated raid from the dark web” that sought to reupload and reshare the content.
This isn’t the only instance of death on TikTok. In June 2020, a group of Seattle teenagers found two dead bodies, apparent victims of a homicide, and filmed the experience—including the bodies—on TikTok. A month later, the platform was criticized for promoting antisemitic videos that memed and joked about Nazi death camps like Auschwitz. Accounts like Crime Scene Cleaning (CSC) film and circulate “decomp demolition” TikToks from homes where deaths occurred. These are easier than you might think to stumble on: Despite never engaging with death-related content or searching death-related terms on the app, I came across CSC entirely accidentally and suddenly found myself looking at blood splattered on a wall. It literally makes me sick to think about. The account has 2.7 million followers and is still active.
We’re still learning what to do with death, both online and off. In the United States, death and end-of-life doulas are very slowly on the rise, and their very existence calls into question the way that the United States handles death, both emotionally and logistically. With regard to suicide, a current cultural struggle finds journalists fighting over language and churches reckoning with how their own doctrine shapes attitudes about mental health. There’s a big difference between death positivity and submerging people in triggering material, but it’s a nuance that Americans are still meandering through. After all, we’re still calling each other “snowflakes” and intentionally muddying the concept of safe spaces. But as we struggle to find our footing around death, death is becoming part of the social-media landscape with disconcerting speed and scale. TikTok, a platform whose cultural standing was negligible in 2018, has experienced massive growth in the last year; Facebook continues to regularly shift its algorithm and content-moderation strategies in an attempt to remain relevant—but also to sidestep criticism.
Given the swiftness of these changes and the lack of transparency with which they’re implemented, it feels increasingly difficult to reach a place where our platforms are reflecting our current conversations rather than bullying us into them. The inability to avoid terrifying and triggering content isn’t new. Rather, it’s TikTok’s format that makes it so concerning. In 2020, TikTok is culture-building, and that requires us to recognize the risk that attends such a rapid proliferation of content—even more so given that platforms like Instagram have copied its format. What does it mean that a TikTok clone is now in the hands of Facebook (which owns Instagram), given how poor a job that platform has done protecting its user base? Equally concerning is that TikTok is overwhelmingly used by young people: Sixty percent of TikTok users in the United States are between the ages of 16 and 24, but many are likely much younger—I know I’ve reported and blocked preteens who looked closer to 8 or 10.
Indeed, the New York Times reported in September 2020 that up to one-third of U.S. users could be under the age of 14. Public suicides can be traumatic to anyone who unwittingly sees them and terrifying to young users with little understanding and context for such deaths. In 2012, the study “Social Media and Suicide: A Public Health Perspective,” published in the American Journal of Public Health, explained that “[t]he emerging data regarding the influence of the Internet and social media on suicide behavior have suggested that these forms of technology may introduce new threats to the public as well as new opportunities for assistance and prevention.” Its authors found that chat rooms and forums can influence decisions to die by suicide, especially for vulnerable groups. They likely weren’t expecting that, eight years later, these threats would burst out of chat rooms and into videos that can be streamed, spliced, and circulated worldwide.
In their story for Dame, Murphy writes that “[t]he infinite nature of social media paired with the finality of death can lock some in a bereavement loop—a constant connection related back to the death.” The increasing unavoidability of death on social media should compel us to address this issue at the speed of social media itself, and that requires us to confront our own cultural reluctance to address death, suicide, and grieving head-on. Likewise, it matters what we demand of TikTok, Facebook, and other platforms that too often ignore their own responsibilities to users. When social media is a numbers game, and impressions and engagements are king, how do we survive the proliferation of dangerous content that will outlive all of us? We know that there are adults on Facebook right now consuming fake news and political bile. But we also know that there are children on TikTok right now who could be absorbing death and all that comes with it. We don’t know what that’ll do to them given the rapid pace and particularly haunting nature of TikTok. And we don’t know that we can trust the platforms that push such content right before their eyes to do anything about it. I don’t know which is the scariest part.