Eff Facebook’s Community Standards

We all know that offensive shit floats around on social media. A platform like Facebook with over a billion users is going to contain any number of fucked up opinions, because in a pool of a billion people, even a small fraction of fucked up people adds up to an awful lot of them. That being said, there are some “Community Standards” to “keep people feeling safe” when they use Facebook.

You guys, Facebook’s Community Standards are bullshit. Let’s explore.

They start off with this nice little disclaimer at the top of their community standards page (emphasis mine):

We want people to feel safe when using Facebook. For that reason, we’ve developed a set of Community Standards, outlined below. These policies will help you understand what type of sharing is allowed on Facebook, and what type of content may be reported to us and removed. Because of the diversity of our global community, please keep in mind that something that may be disagreeable or disturbing to you may not violate our Community Standards.

Ok, fair enough, I guess? I went through a phase when I was little where people saying “God” out loud offended me. If we tried to censor absolutely everything that absolutely everyone found offensive, there would probably be no content left on Facebook. But there are certain things that are universally offensive, racist, and disgusting, and they will not cease to be so no matter how desperately bigots argue otherwise. Let’s continue.

Under “encouraging respectful behavior,” they have guidelines to censor nudity, hate speech, and violent/graphic content. A closer look at each section:

1. Nudity

People sometimes share content containing nudity for reasons like awareness campaigns or artistic projects. We restrict the display of nudity because some audiences within our global community may be sensitive to this type of content – particularly because of their cultural background or age. In order to treat people fairly and respond to reports quickly, it is essential that we have policies in place that our global teams can apply uniformly and easily when reviewing content. As a result, our policies can sometimes be more blunt than we would like and restrict content shared for legitimate purposes. We are always working to get better at evaluating this content and enforcing our standards.

We remove photographs of people displaying genitals or focusing in on fully exposed buttocks. We also restrict some images of female breasts if they include the nipple, but we always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring. We also allow photographs of paintings, sculptures, and other art that depicts nude figures. Restrictions on the display of both nudity and sexual activity also apply to digitally created content unless the content is posted for educational, humorous, or satirical purposes. Explicit images of sexual intercourse are prohibited. Descriptions of sexual acts that go into vivid detail may also be removed.

That bit about always allowing breastfeeding and post-mastectomy photos would be great if it were true, but they only added it to their community standards after a Change.org petition amassed over 21,000 signatures demanding that Facebook stop erasing photos of post-mastectomy scarring. There are reports of them censoring breastfeeding photos as recently as August of this year, despite this change to their community standards going into effect in 2013.

They only changed their policy at all because Women, Action & the Media launched a campaign shedding light on all the gender-based hate speech that was “not in violation of community standards” while breastfeeding and mastectomy photos were censored. But more about that in a bit.

2. Hate Speech

Facebook removes hate speech, which includes content that directly attacks people based on their:

  • Race,
  • Ethnicity,
  • National origin,
  • Religious affiliation,
  • Sexual orientation,
  • Sex, gender, or gender identity, or
  • Serious disabilities or diseases.

Organizations and people dedicated to promoting hatred against these protected groups are not allowed a presence on Facebook. As with all of our standards, we rely on our community to report this content to us.

People can use Facebook to challenge ideas, institutions, and practices. Such discussion can promote debate and greater understanding. Sometimes people share content containing someone else’s hate speech for the purpose of raising awareness or educating others about that hate speech. When this is the case, we expect people to clearly indicate their purpose, which helps us better understand why they shared that content.

We allow humor, satire, or social commentary related to these topics, and we believe that when people use their authentic identity, they are more responsible when they share this kind of commentary. For that reason, we ask that Page owners associate their name and Facebook Profile with any content that is insensitive, even if that content does not violate our policies. As always, we urge people to be conscious of their audience when sharing this type of content.

While we work hard to remove hate speech, we also give you tools to avoid distasteful or offensive content. Learn more about the tools we offer to control what you see. You can also use Facebook to speak up and educate the community around you. Counter-speech in the form of accurate information and alternative viewpoints can help create a safer and more respectful environment.

So this is gross for a number of reasons. For one, they seem to think that people saying these things as themselves, publicly, will somehow cure their shitty worldviews. They claim that the “real name” policy exists to protect users and reduce the likelihood of harassment. I believe this to be absolute bullshit (they want your real name so they can better sell your information to advertisers), but that’s another article for another day. And the “tools” they give you to avoid offensive content amount to taking it upon yourself to message the offender, unfriend them, block them, or report them. So they won’t necessarily take it down, but you don’t have to look at it.

Now, the term “hate speech” implies that it must be a verbal attack. But because Facebook is an increasingly visual platform, I argue that images and videos containing hateful messaging also fall under this category. Last month, I reported a blackface video, thinly veiled as a “makeup tutorial,” that someone had posted on their page. When Facebook wrote back that it didn’t violate their community standards, I yelled about it on Twitter, posted about it on my Instagram page, and moved on to the next incredibly infuriating example of injustice/bigotry I found on the internet.

Just this morning, another friend posted about seeing two people using blackface images of themselves as their profile picture. When friends of my friend reported it, dozens of them were given the same “doesn’t violate our community standards” reply from Facebook. So, my conclusion is that Facebook doesn’t give a shit that racist white people are in blackface on their platform. Maybe they think it falls into the “humor/satire” exception that they allow. Either way, it’s gross. But this isn’t limited to blackface! Another friend on a different occasion reported multiple images of swastikas and anti-Semitic hate speech only to be met with the same non-response. They. Do. Not. Care.

3. Violent/Graphic Content

Facebook has long been a place where people share their experiences and raise awareness about important issues. Sometimes, those experiences and issues involve violence and graphic images of public interest or concern, such as human rights abuses or acts of terrorism. In many instances, when people share this type of content, they are condemning it or raising awareness about it. We remove graphic images when they are shared for sadistic pleasure or to celebrate or glorify violence.

When people share anything on Facebook, we expect that they will share it responsibly, including carefully choosing who will see that content. We also ask that people warn their audience about what they are about to see if it includes graphic violence.

You need only read about the overwhelming effort it took to get gender-based violent content and Facebook groups off their platform (TW: graphic, disgusting posts) here and here. If you don’t want to look at those links (and I don’t blame you), I can summarize: this section of their community standards policy is trash. They will gleefully allow rape jokes and horrifically violent memes targeting women. They only started taking down these pages after enough people complained about them, and even then, kicking and screaming. Hell, they still won’t take down a now-defunct group celebrating rape, even though it has been inactive for five years and anti-hate groups have continuously campaigned for its removal.

So, what the fuck, Facebook? You’ll only protect our safety if we’re loud enough about it? If you get a little bad PR because of it?

I know Facebook is not the only social media platform guilty of horrible responses to reported abuse. But it is the most widely used social media platform in existence, the one that everyone and their mom (literally) is on every day. As someone who works with social media, I know Facebook too often takes its users for granted. It is the biggest and richest social media platform, and in many ways it doesn’t have to care about how its users are treated. I fear that it has become so ubiquitous in our lives that we will keep using it no matter what it does wrong.

I condemn them for this. Blackface is never okay. Ever. It will never be funny, and it will never be inoffensive. Joking about or threatening women with domestic violence is never okay. Glorifying intimate partner violence is never okay. At least one-third of American women killed each year are killed by an intimate partner, and roughly one in four women has been the victim of sexual assault. Domestic violence affects almost 5 million American women every year. The swastika, though adapted from an ancient symbol with religious significance, has come to be associated with Nazism, racism, and anti-Semitism. It became the symbol for the massacre, torture, and dehumanization of millions of people in the largest genocide humanity has ever seen. Allowing blatant displays of support for the Nazi movement is about as hateful as it gets. Facebook has no excuse for tolerating any of these things.

Social media is often lauded as a tool for social justice, for spreading progressive ideas that can get humanity closer to treating each other equitably and bringing abusers to justice. It is just as often a tool for spreading ignorance, misinformation, and hate-filled ideas. We are all users of a platform that is an intimate part of our lives, and indeed is the tool with which we share our ideas for a better world. So how can we tolerate its tolerance of hateful, abusive behavior? We have to hold Facebook accountable. Otherwise, what are we doing there?
