Here's how Facebook chooses what you see in Facebook Live

A Facebook employee holds a laptop with a 'like' sticker on it during an event at Facebook headquarters on April 4, 2013 in Menlo Park, California.

In addition to exploding watermelons and hilarious Chewbacca masks, Facebook Live can also shine a spotlight on darker moments -- like officer-involved shootings.

But when Diamond Reynolds used Facebook Live to document her boyfriend Philando Castile's fatal shooting at the hands of a police officer, the footage vanished from the site shortly after it was posted.

The company blamed the video's absence on a technical glitch; it reappeared on the site about an hour later with a graphic content warning. But the incident has prompted questions about how Facebook decides what content it allows on its platform.

Facebook's content guidelines say it bans violent or graphic images from its site "when they are shared for sadistic pleasure or to celebrate or glorify violence."

A Facebook representative clarified to TechCrunch that intent matters at least as much as the level of violence. The outlet writes, "If someone posts a graphically violent video saying 'This is great, so and so got what was coming to them,' it will be removed, but if they say 'This is terrible, these things need to stop,' it can remain visible."

Facebook has a team monitoring flagged content 24/7. That team can choose to remove violent content, keep it up with a disclaimer attached, or leave it untouched.