Steve Stephens, the Ohio man who fatally shot 74-year-old Robert Godwin Sr., uploaded the video to Facebook, and then bragged about his crime on Facebook Live, killed himself Tuesday after a police chase. But while Stephens’ death brings an end to one saga, the implications for Facebook may have just begun.
Holding Facebook responsible for what Stephens did, or for the video’s impact on the victim’s family and the people who saw it, is basically anathema in America. In other countries, however, Facebook and livestreaming are more tightly regulated or even banned outright. What would have happened if Stephens had attempted his crime somewhere else?
Countries in Europe — especially Germany, which has strong consumer protection laws — are known for tougher regulation of the companies that make up the internet, from Google to Yelp to the companies that own the physical network. Germany’s strict anti-hate-speech laws were used to crack down on Facebook after the company failed to remove 70 percent of illegal content, including racist comments, within 24 hours of being alerted to it. “We must increase the pressure on social networks,” Heiko Maas, Germany’s minister of justice and consumer protection, said in March.
In April, the cabinet voted to fine companies up to €50 million for failing to remove hate speech, terrorist propaganda, and fake news that is defamatory or incites hate within 24 hours of a formal complaint.
“There is definitely a tendency in Germany to more heavily regulate social networks concerning the content that people are allowed to share, most specifically, when it comes to hate speech,” said Zohar Efroni, an attorney in Berlin and former Stanford Center for Internet and Society fellow.
Had Stephens committed the same crime in Germany, however, the outcome probably would have been the same. The video of the murder and the subsequent Facebook Live broadcast were viewable on Facebook for just two hours, and it took the company only 20 minutes to remove them after users flagged the content, well within the 24-hour window Germany’s new rules contemplate.
Legislation for live video is tricky, and particularly uncharted territory in Germany. Efroni says he’s unaware of any laws specific to Facebook Live, but that the challenge it presents is multifaceted. “The law should obviously prohibit the live broadcasting of crimes, but how do you enforce such a law without shutting down the platform completely?” he asked. “I think the solution lies in a combination of measures that are both legal and technological.”
Facebook Live expanded into Europe last year, and given Germany’s attitude toward these platforms, the country may one day impose additional restrictions on live broadcasts. Recently, its Commission for Authorization and Supervision (ZAK) asked PietSmietTV, a popular channel on the streaming platform Twitch, to file for a broadcasting license. The argument was that because PietSmiet’s streams ran round the clock on a published schedule, the channel was effectively operating as a TV network. The broadcasting authority, which functions like the German version of the FCC, admitted that its rules aren’t up to date with new technologies. In a press release, the ZAK said it was “intensively addressing the problem” of the rise of internet-enabled streams.
In countries with more overt censorship practices, the solution to live video streaming is complete control. In China, where Facebook has been blocked since 2009, legislation regulating live broadcasts passed last year, placing significant content restrictions on anyone hoping to stream on their own. The law prohibits streams on the country’s burgeoning crop of video broadcasting apps that “endanger national security and undermine social stability.” Under the new regulations, Chinese streaming companies must have the technical capability to block live streams, and they must retain users’ personal information for 60 days.
In America, the Easter murder is only the latest in a growing number of crimes broadcast on Facebook. Last month, in Chicago, a teenage girl was sexually assaulted by a group of boys, and the assailants broadcast the attack in real time over Facebook Live. Compounding the viciousness of the assault was the fact that Facebook sends notifications to a user’s friend list when they begin a live video. NPR reported that at least 40 people watched the attack.
Critics have since implored Facebook to intervene to prevent criminal uses of its live video platform. “Traditional media companies have finely-wrought guidelines and policies to help them make these decisions, but Facebook depends on us to do it,” Emily Dreyfuss wrote in Wired. “And now it might very well be time for the company to roll up its own sleeves and get to work.”
But what exactly would “getting to work” look like? Some hope that engineering solutions, namely algorithms that detect nudity and other nefarious activity, could prevent an ugly video from going live. But this may be magical thinking about how technology works. The ability to automatically screen and moderate live video at scale remains far off, if it is achievable at all. For now, Facebook and other social media companies rely on human moderators to sift through thousands of illicit images and videos.
In the U.S., there is a legacy of protecting the companies that host content, rather than the people who use their services. Section 230 of the Communications Decency Act shields platforms like Facebook and YouTube from responsibility for what their users post, minimizing any legal incentive to police it. Victims of revenge porn who want their images removed from a website, for example, are better off appealing under copyright law, which record labels and movie studios have used to establish an effective takedown system.
Some have argued that Facebook should perhaps just put Live on hold, or make it available only to select users, namely influencers and media companies. This runs counter to Facebook’s intent with the product in the first place. When the company launched Live last year, it was reckoning with what it called “context collapse,” the fact that Facebook users were sharing fewer details about themselves on the platform.

Regulating what appears on Facebook is arguably a crisis of free speech, but it shouldn’t be ignored that it is also a crisis of profit. The dark side of the company’s live streaming tool, which remains enormously popular, likely won’t go away without a dramatic push from the government or overwhelming public pressure. But as examples from other countries show, only the most extreme regulations can guarantee that no murders or rapes are broadcast live, and Americans are unlikely to embrace any form of censorship online. The chilling crimes broadcast on Facebook and beyond represent the challenge societies will face as tools emerge that make life, and crime, more sophisticated.