SAN FRANCISCO — Facebook is considering putting restrictions on who can post live videos on the social network following the deadly shootings at two mosques in New Zealand that were broadcast by the gunman, the company’s chief operating officer, Sheryl Sandberg, said Friday.
The social media giant will restrict who can use its “Live” feature based on factors such as prior violations of Facebook’s community standards, Sandberg wrote in a blog post.
Facebook has been under pressure to change its policies since a gunman livestreamed the killing of 50 people at two mosques in Christchurch, New Zealand, on March 15.
Critics say Facebook did not remove the video quickly enough, allowing it to spread across the internet and to be uploaded to other online services such as Google’s YouTube.
The original video was seen by about 200 people during the 17-minute live broadcast, Facebook said. Facebook removed the video 12 minutes after the livestream ended, but users grabbed the footage and reposted clips from it, making it challenging for Facebook to block all the footage.
Sandberg said Facebook identified more than 900 different videos showing portions of the attacks and deployed artificial intelligence tools to detect and remove them. The company also removed hate groups in Australia and New Zealand, including the Lads Society, the United Patriots Front, the Antipodean Resistance and National Front New Zealand.
Facebook is building “better technology” to quickly identify edited versions of violent videos and images and prevent people from re-sharing these new versions, Sandberg said.
“Many of you have also rightly questioned how online platforms such as Facebook were used to circulate horrific videos of the attack,” Sandberg wrote. “We are committed to reviewing what happened and have been working closely with the New Zealand Police to support their response.”
For years, Facebook has faced sharp criticism for rolling out the live-streaming product without an adequate plan to prevent acts of violence from being shown to its more than 2 billion users. Videos that glorify violence violate Facebook’s rules, but the company relies largely on users to alert it to videos that should be taken down.
In a 2017 interview, Facebook Chief Executive Officer Mark Zuckerberg agreed that his company shoulders the responsibility for halting violence on Live and on Facebook in general. The company has hired thousands more moderators, in part, to spot violence in live videos and is training its artificial intelligence to detect it as well.
Earlier this week, Facebook said it would ban explicit praise, support or representation of white nationalism and white separatism on Facebook and Instagram, including phrases such as “I am a proud white nationalist,” following the New Zealand shootings.
Last week, Facebook said it took down 1.5 million videos containing footage of the attacks within the first 24 hours.