This is a joke—and apparently a very relatable one for its target demographic, the millions of Generation Y and Z digital natives for whom memes are a mother tongue. A casual scroll on 9gag, which receives 3.5 billion page views a month, will turn up dozens of memes daily about self-harm or wanting to die, and young people are sharing, retweeting, and reblogging similar content across the social-media landscape. You’ll find storybook illustrations doctored to show children dreaming of grisly deaths, SpongeBob joyfully flailing to his doom during a bank stickup, and Obama about to throw himself off a bridge.
At first blush, these jokes couldn’t be in poorer taste. The World Health Organization ranks suicide as the second leading cause of death for youth worldwide. In the United States, data from the Centers for Disease Control and Prevention showed staggering 70 and 77 percent increases in suicide rates of white and black teens, respectively, between 2006 and 2016. In response, public-health officials and tech giants alike have been cracking down on potentially dangerous messaging on self-harm. Last Friday, Instagram rolled out a new policy banning “graphic” depictions of self-harm or suicide.
But memes about suicide remain largely uncharted territory. While disturbing, they’re far less graphic than actual depictions. And they’re often darkly funny. As the gatekeepers of social media are wrestling with how to police this trend, some suicide-prevention experts see a window of opportunity. Typically, suicide memers aren’t mocking suicidal thoughts; they’re commiserating and bonding over being suicidal. Morbid memes, these experts believe, may be a foot in the door to one of the most vulnerable and hard-to-reach populations: socially isolated young people.
April Foreman is a seasoned veteran of the dark web. As a licensed psychologist and executive board member at the American Association of Suicidology, she’s clicked through the foulest content on the internet to keep tabs on the volatile and high-risk souls that live there.
Foreman wasn’t surprised when suicide memes began to percolate up into the surface-level internet after a long incubation period in more hostile and conspiracy-laden depths (see: 4chan). In a way, she’s heartened by the memes’ increased social acceptability. Like so many anonymous platforms, 9gag struggles with pervasive racism, misogyny, and old-fashioned trolling. But while the predictable “lol, do it” replies pepper the comment sections to suicide memes, messages of support tend to be buoyed to the top by hundreds of upvotes. Internet scamps with usernames like necrolovertown gently direct suicide-meme posters to local suicide hotlines (or, in necrolovertown’s case, provide his Facebook contact info and a standing offer to chat—“any hour anytime I’ll be there”).
What we’re witnessing on 9gag, Foreman explains, is the writing of a new “social script.” Sometimes it’s tough to know what to say, “like if someone’s dog dies, or if you have to go to a funeral.” But through experience, communities develop a formula for how to respond supportively—something like, “‘Dude, that’s rough. I’ve gone through it. Here are the resources, let me know if you need support,’” Foreman says. Foreman has identified several corners of the internet that seem to have healthy social scripts for suicidal thoughts. “Reddit communities around certain video games”—like the Eve Online universe’s Broadcast 4 Reps—“tend to have communities where you talk about your mental health and you feel better. People help you.”
Still, Foreman cautions, destructive conversations about suicide abound deeper in the bowels of the internet. “We have people that go in there as trolls to really stir people up and make them feel worse,” she says. They make “‘sui-fuel,’ memes to get people even more depressed, with the idea that you might ‘rope’—which is kill yourself—or you might even go and do a murder-suicide.”
Foreman’s colleague Bart Andrews, another clinical psychologist and executive board member at the American Association of Suicidology, is a full-throated advocate for suicide memes as an alternative to these destructive depths. Andrews bucks the traditional wisdom on suicide contagion, the idea that suicidal thoughts can spread through a community like a virus. It’s an evidence-based notion that has gone largely unchallenged for decades, and it informs national and international guidelines for media coverage of suicide. Andrews acknowledges that irresponsible reporting of suicide—such as sensationalistic, needlessly graphic descriptions of celebrity suicides—likely has population-level effects. But if safe-messaging guidelines prevent people from having meaningful conversations, Andrews contends, they can be deadly.
“The very people we’re trying to reach, the youth—we’re telling them they can’t talk about suicide the way they talk about it,” Andrews says. “When you read the threads on these memes, people find them helpful. They don’t feel alone. It’s a way for them to anonymously communicate their inner pain in a way that’s artistic, super clever, and that people who are struggling identify with.”
Andrews believes that decades of an effective “gag rule” on suicide stifled conversation and perpetuated stigma—and that while the younger generations are more willing to talk, there’s still a vestigial wariness among listeners that the very act of discussing suicide could make their friends worse. He rattles off a list of meme formats that emphasize hope or resilience. Perennial favorites are “not today, old friend,” where Moe from The Simpsons decides not to kill himself, and “my mom would be sad.” “They get at reasons for living,” Andrews says. “And those can be really small.”
Another camp of suicide-prevention experts prefers to err on the side of caution. Jane Pirkis, the director of the center for mental health at the University of Melbourne and an expert on suicide-contagion theory, is the traditionalist yin to Andrews’ laissez-faire yang when it comes to safe messaging. “I wouldn’t say I’m alarmed, but I don’t think it’s very good,” she told me after reviewing a handful of 9gag memes. “The work we’ve done looking at traditional media definitely shows that representation that normalizes suicide or glorifies it at all can lead to so-called copycat acts.”
Pirkis concedes that the bulk of the scientific literature on contagion came from the pre-internet age, but she insists those lessons carry into social media. “They’re very basic, Psychology 101 principles about modeling behavior, and people learning what’s normal, what’s likely to get a response,” she says. “That’s why you don’t see depictions of smoking in film and television anymore.”
This conversation around suicide memes is complicated by a generation gap between suicide-prevention experts and the communities they serve. I talked with several mental-health experts who were well beyond the age of the average memer and entirely unaware that suicide memes exist. Once they recovered from the initial surprise at this undercurrent of dark humor, however, they warmed to the idea that memes about suicide could have a capacity to heal.
These experts emphasize that it’s a fine line between destigmatizing suicidal thoughts and normalizing them. The right messages can let people know they’re not alone and that it’s okay to reach out for help. But overexposure could, in theory, lead to the belief that thoughts about self-harm are normal and not a cause for concern. Further muddying the waters, the very meme that could inspire one teen to call a psychiatrist could dredge up painful memories of a prior attempt in someone else.
There’s a dearth of experimental research on how people respond to nongraphic content about suicide, so social-media platforms are left to cobble together their own policies through high-stakes trial and error. The changes to Instagram’s self-harm policy last week, for instance, were reportedly spurred by the death of a 14-year-old in the United Kingdom. Most social-media outlets draw the line at text, image, and video that appear to encourage suicide or self-harm. Facebook, Tumblr, and Instagram have “hot words” associated with self-harm that automatically trigger messages to users about mental health and links to the National Suicide Prevention Lifeline, a network of crisis hotlines that offer free counseling around the clock. But since image-based memes are hard for AI to parse, platforms generally rely on users to report sensitive material that isn’t simply text-based.
Foreman points to Tumblr as a platform that’s getting it right. Tumblr partners with mental-health advocacy groups, like the Suicide Prevention Lifeline and the National Alliance on Mental Illness, and reviews every post reported with the “self-harm” flag, according to Victoria McCullough, the company’s head of social impact and public policy. Depending on the post itself and its reception by the community, Tumblr might remove abusive responses, remove the post itself, or refer the creator to additional mental-health resources. McCullough says the company is very cautious about removing content altogether for fear of “undermining those recovery conversations.”
9gag added a tag specific to self-harm only within the past several months. “Personally, I don’t think any community can claim that users’ comments are 100% positive at all times. There’s no such thing in life either. LOL,” 9gag’s COO, Lilian Leong, told me via email. “Of course, we can always level up our filtering measures. But we are very cautious not to get over-engineered and overkilled.”
Unlike Facebook and Twitter, 9gag is a single-scroll platform; regardless of a user’s previous activity on the site, everyone sees the same grab bag of memes. What’s on the ‘hot’ and ‘trending’ pages is determined by users’ upvotes and any editorial choices 9gag makes. Leong did not respond to questions about specific curation decisions—like why users couldn’t search the tag suicide, but could search kill myself and suicidal—or describe the decision-making process behind the removal of a sensitive post. In the days following our exchange, however, 9gag plugged all the holes in its search system pertaining to self-harm.
At the end of my reporting for this story, I posted on 9gag asking users to talk about their experience with memes about suicide. You can see the full threads here and here. The replies were a case study of what happens when a diverse community is left all but unsupervised in their reactions to suicide memes.
Pirkis, the University of Melbourne mental-health expert, agreed with @infexo, saying it’s a deadly myth that only professionals can help people at risk of suicide. “This great unwashed population that we’re talking about has a role to play,” she says.
Foreman and her colleagues at the American Association of Suicidology look forward to seeing the dialogue expand around suicide memes, however inelegantly. “I’ve never known a single problem that got better by not talking about it,” Foreman says. “Not a single public-health problem has gotten better by reducing conversation.”