Mark Zuckerberg believes Facebook Live could shine a light on wrongdoing around the world.
Facebook has yet to explain the details of its live video censorship policy. But today, in response to the police shooting of Philando Castile, whose last moments of life were broadcast on Facebook, CEO Mark Zuckerberg posted that “the images we have seen this week are graphic and heartbreaking, and they highlight the fear that millions of members of our community live with every day.”
The comments indicate that Facebook may lean towards less censorship of Live because of its potential to expose injustices.
Zuckerberg also expressed sympathy for the victim, others like him and their loved ones. “While I hope we never get to see another video like Diamond’s, it reminds us why coming together to build a more open and connected world is so important – and how far we still have to go,” Zuckerberg wrote.
The aftermath of the shooting and Castile’s last moments before death were broadcast on Facebook Live by his girlfriend. Facebook removed the video for an hour due to what it called a “technical glitch,” then restored it with an interstitial warning alerting users to its graphic content. Yet even the temporary removal has raised questions about Facebook’s role in the future of citizen journalism and censorship.
Facebook needs a “Graphic but Notable” content indicator
Facebook needs to decide whether it is ready to be a serious source of news, even when that content is difficult or uncomfortable for viewers looking for a lighter experience. While graphic content might seem likely to scare off some users or advertisers, it could also make Facebook a more popular destination for news consumption and discussion.
Right now, the Graphic Content Warning does a good job of shielding users from content they might find offensive without removing it. The problem is that the warning is currently triggered by other users reporting the content, and Facebook’s reporting options only let users flag something as graphically violent; there is no option to flag important graphic content that deserves a warning rather than removal.
Facebook’s best bet might be to add “Graphic but Notable” as a report option and apply the warning without temporarily removing the content. A member of its team could then review the content and confirm it deserves the warning.
But the point is, anyone watching a broadcast in real time cannot be warned about graphic content in advance. No one will see it coming. Facebook may simply have to accept this, perhaps adding a general disclaimer to live broadcasts, and tolerate that from time to time some users will be shocked or offended. That seems like an occasional price worth paying for Facebook to shine a light on wrongdoing.
Here’s an embed of Zuckerberg’s Facebook post in response to the shooting: