Clearly, Facebook has some kinks to work out.
In the spring, things seemed to be looking up for having relevant conversations on Facebook when the social media site committed to taking rape and violent threats more seriously. But in the months since, it has become overwhelmingly clear that Facebook has no idea how to monitor content, be it misogynistic, violent, racist, or a combination of the above.
Earlier this month, Robert Jones, the sometimes controversial man behind the popular “literary, socio-political, sexual, pop culture” blog Son of Baldwin, informed his nearly 14,000 Facebook followers that he might have to find another place to hold crucial conversations about homophobia, racism, misogyny, trans issues, sexism, and ableism. Since 2010, Jones has been cultivating a safe place on Facebook for queer people of color, but according to the social media giant, Son of Baldwin was repeatedly violating its terms and conditions.
In the past, Facebook had warned Jones that it would shut down his page for using the n-word, which Jones had posted in a quote from James Baldwin that he was discussing academically.
Of the most recent shutdown threat, Jones wrote, “Apparently, one of the Marvel groupies reported the page because they don’t like that I’m criticizing Marvel Comics from a racial lens. Facebook, which will leave up rape pages and pages where Trayvon Martin is called a baboon—and more than leave them up, will say that there is nothing problematic about those pages—wants to remove Son of Baldwin for having critical and intellectual conversations about bigotry.”
Sadly, what happens to the Son of Baldwin page regularly is just one example of how Facebook is failing its members, especially those who are people of color, queer-identified folks, and women.
Last spring, a coalition of groups led by Women, Action, and Media (WAM) pushed the world’s most popular website to restructure the moderation system that had let rapey and violent memes spread like wildfire for years. A few months later, Twitter followed suit. Facebook also invited WAM, the Everyday Sexism Project, and other members of the coalition to work closely with it on how Community Standards around hate speech should be evaluated.
Today, the site still moderates content in a hodgepodge way, taking action against pages where people use sexual language or epithets in an academic or positive context while refusing to remove posts that are genuinely threatening or outright racist.
In addition to the Son of Baldwin example, WAM executive director Jaclyn Friedman has heard from people that the site has removed BDSM pages promoting kinky, consensual sex. This upsets the sexual liberation activist.
“I’d like to go on record saying that we didn’t ask Facebook to do that,” Friedman says. “We’re getting blamed for it, but we fully support content featuring consensual sexual activity between adults.”
Even though it’s not WAM’s mess to clean up, the organization is actively pressuring Facebook to restore consensual BDSM pages.
“The biggest problem for Facebook is that they have terms of service, but there is no adequate system in place to monitor content being posted,” Friedman says. “Oftentimes its moderators are making snap judgments regarding images, with much of the work being outsourced to countries that aren’t exactly beacons for women’s rights. So if there is no system in place, who gets to decide what’s appropriate – like an image of a woman being beaten – and what’s not, like an image of a woman breastfeeding? I don’t believe this is a big conspiracy or that Facebook hates women. I think there is a gap between the people making the policies and the people implementing the policies and that’s what we’re trying to fix.”
That case-by-case oversight means that Facebook’s one billion users might be subjected to memes like “Slap a Bitch Day,” while sex-positive and innocuous naked art could get flagged as offensive.
For years, Facebook was unwilling to address these concerns, contending that content endorsing violence against women was an expression of free speech or fell under the “humor” portion of its guidelines. In one specific example WAM documented, numerous people reported a widely shared image of a scantily clad woman lying on the cement with her head bashed in. The text surrounding the image said, “I like her for her brains.” After the image was reported for containing graphic violence, Facebook issued its standard statement for when a reported image passes moderation: “Thank you for your report. We reviewed the photo you reported, but we found it doesn’t violate Facebook’s Community Standard on graphic violence, which includes depicting harm to someone or something, threats to the public’s safety, or theft and vandalism.”
But even after WAM’s campaign and Facebook’s promise to change, the site’s guidelines remain murky and ambiguous at best, and it’s difficult to understand the thought process behind what gets removed from the site and what doesn’t.
New Jersey-based activist Lorena Ambrosio, a frequent guest on the Joy of Resistance radio program, learned that the hard way last month when she was temporarily blocked from logging in to her Facebook account after she posted a voicemail from a well-known Occupy Wall Street activist and supposed ally to women. In the voicemail, left for Ambrosio’s friend, the activist refers to her as a “bitch.” It was important to Ambrosio that fellow activists understood how he speaks about women and the tactics he uses to intimidate women who speak out against him.
Almost immediately, the voicemail was reported as abuse and deleted by Facebook, and Ambrosio was blocked from accessing her account. Yet two images Ambrosio reported for depicting rape and violence against women (one of which featured a drugged woman being carried by a man) remained on Facebook. One, she was told, fell under the humor guidelines. The other was still waiting to be reviewed a month after she reported it, meaning the harmful, triggering image stayed online until a decision was made.
When lamenting the difficult position Facebook put Jones in—choosing between continued takedowns, with the threat that his page would one day be removed permanently, or taking his community elsewhere before the decision was made for him—the blogger wrote, “Facebook is the master’s tool.” It was a reference to Audre Lorde’s groundbreaking 1984 essay “The Master’s Tools Will Never Dismantle the Master’s House.” And it’s true. If Facebook is a place where racism, misogyny, and homophobia are not only deeply ingrained but willfully perpetuated, how can the site be used as a tool to combat racism, misogyny, and homophobia? Short answer: it can’t. Unless Facebook addresses these issues in a serious way that results in actual, noticeable, trackable change, the site will continue to be the bigot’s playground, where the only people losing are those most subjected to the hate and violence depicted in the images and pages Facebook continues to protect. In other words, it’s business as usual.