Logan Paul, a 22-year-old vlogger with 15 million subscribers on YouTube, published a video this week which showed the body of a suicide victim in the infamous Japanese forest of Aokigahara.
The abhorrent video clip features American Paul laughing and saying to his friends: “What? You never stand next to a dead guy?”
The huge prominence given to this clip has clearly raised awareness – not of the importance of mental health, as Paul claims, but of the dangers of covering the method, location and footage of a suicide in excessive detail.
Traditional media are explicitly warned against covering these details in order to prevent copycat suicides.
And while YouTube is not a news publisher, the platform must look to the hard lessons learnt from media coverage of suicides.
The Werther effect
It was well over 200 years ago that the impact of depicting suicide was first demonstrated. In Goethe’s 1774 novel, The Sorrows of Young Werther, the protagonist shoots himself out of unrequited love for a woman.
Following the novel's publication, a series of suicides occurred across Europe either in a similar manner or with a copy of the book on the victim, a phenomenon that became known as the ‘Werther effect’.
A 20-year-long study of the US press published in the 1970s, led by David P. Phillips, found that of the 33 months that featured a front-page suicide story, 26 saw a significant increase in the number of suicides.
The impact of copycat suicides has led to strict publishing guides for journalists reporting on suicide, including 2008 guidelines from the World Health Organisation (WHO).
The WHO guide to reporting on suicide says that the amount and prominence of coverage has a direct impact on the number of incidents, and gives a clear instruction not to use photographs or video footage.
YouTube’s own policy
YouTube’s own publishing guidelines – “It's not okay to post violent or gory content that's primarily intended to be shocking, sensational or gratuitous” – do not appear to be enforced.
There is also no specific suicide policy guidance, though the platform’s safety centre does suggest what to do if you come across content that appears suicidal. In addition, a search relating to self-harm will trigger a banner linking to the Samaritans at the top of the results.
Paul’s initial video was featured on YouTube’s 'Trending' page before it was deleted, a sure way of increasing its exposure. Re-purposed clips featuring the same footage were also trending on the site days later.
In a written statement that did not directly address copycat suicides, a YouTube spokesman said: “Our hearts go out to the family of the person featured in the video.
“YouTube prohibits violent or gory content posted in a shocking, sensational or disrespectful manner. If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases it will be age-gated.”
However, the platform’s previous form suggests that it will not punish the extremely popular vlogger too harshly.
The New Statesman found that YouTubers are remarkably resilient to any sort of public shaming.
An article in the magazine gave the example of a husband and wife vlogger who lost custody of their children over their prank videos, but subsequently received a framed trophy from YouTube for surpassing 100,000 subscribers.
Judging by his insincere apologies, Paul also seems unaware of the impact his videos could have on impressionable viewers.
First, he issued a written statement that included the unrepentant phrases:
“I do this sh*t [sic] every day. I’ve made a 15 minute TV show EVERY SINGLE DAY for the past 460+ days. One may understand that it’s easy to get caught up in the moment without fully weighing the possible ramifications.”
— Logan Paul (@LoganPaul), January 2, 2018
Second, in response to further criticism, Paul published a video message (which has had 23 million views to date). Titled ‘So Sorry’, the clip sees Paul list a litany of groups to apologise to.
However, the apology is undermined by the fact that Paul initially monetised this video. While it has since been demonetised, the blog We The Unicorns found that the clip is likely to have made thousands of US dollars before its ads were switched off.
The fatal impact demonstrated by previous research makes it imperative that YouTube clamps down on footage that depicts suicide which is posted in the name of entertainment.
YouTube must create a specific policy setting out its stance on depictions of suicide – one that acknowledges the direct effect of publishing explicit content. The platform must also abide by its own content policy against sensationalising violence and ban users who violate these rules.
In the UK, Samaritans can be contacted on 116 123.