
Six things Facebook’s new community rules won’t allow

Facebook is keen to keep content flowing freely on its social-media platform, which is used by nearly 1.4 billion people at least once a month. But it is also coming under pressure to police some of the graphic videos and photos that users routinely share. On Monday, Facebook released its updated community standards. While it won't get into the business of scanning content – it will continue to rely on users to report abuses – it is giving users clearer rules about what kind of content they can report to Facebook and have removed.

A Facebook Canada spokesperson told The Globe and Mail that the social-media network had always prohibited a range of content – including nude photos, graphic violence, hate speech and bullying – and that it was now making those rules more explicit, detailed and accessible to users. "In the clarify[ing] process there are new rules that are being developed [by Facebook]," said Jenna Jacobson, a social-media researcher and PhD candidate at the University of Toronto.

“In this context, Facebook is making a moral judgment about what we can and cannot post online,” she added.  But there’s a caveat: “Because of the diversity of our global community, please keep in mind that something that may be disagreeable or disturbing to you may not violate our Community Standards,” the updated guidelines stated.  Here are some of the highlights from Facebook’s new content rules:

EXPLICIT PHOTOS

Photos showing genitals or focusing on "fully exposed buttocks" will be removed. "We also restrict some images of female breasts if they include the nipple, but we always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring," the guidelines state. The rules also extend to digitally created content unless the images are used for educational or satirical reasons. The nudity rules do not apply to photos of nude sculptures and paintings.

GRAPHIC VIOLENCE

When it comes to graphic violence, Facebook acknowledges that such content can raise awareness about human-rights abuses around the world. But how to warn people that the content they are about to see is graphic? Earlier this year, Facebook started adding warnings over videos that could “shock, offend and upset.” It is also limiting graphic photos and videos that can be seen by users under the age of 18.

In its guidelines, Facebook now urges anyone posting graphic violent images or videos for educational reasons to warn Facebook users. Where there is no educational purpose, Facebook stated: “We remove graphic images when they are shared for sadistic pleasure or to celebrate or glorify violence.”

HATE SPEECH

When it comes to hate speech, Facebook will continue removing content targeting people based on race, ethnicity, nationality, religion, sexual orientation, gender and disabilities – among other things. But Facebook has updated its rules so that sharing hate speech content in order to raise awareness and educate people is acceptable. “When this is the case, we expect people to clearly indicate their purpose, which helps us better understand why they shared that content,” it stated.

SELF-HARM

Any content that promotes suicide or self-injury – such as self-mutilation and eating disorders – is not allowed. Nor is content that attacks victims of suicide or self-injury.

BULLYING AND REVENGE PORN

Content that qualifies as revenge porn is banned under the new guidelines. This includes intimate photos and videos shared without the permission of the individuals shown. The bullying guidelines now include restrictions on content with images "altered to degrade private individuals," as well as photos and videos showing "physical bullying posted to shame the victim."

TERRORISM AND CRIME

The updated guidelines devote a section to dangerous organizations engaged in terrorist or organized criminal activity. They extend the ban on groups like Islamic State to those who support or praise the group's leaders or condone its violent activities.

For Facebook watchers and users, the updated guidelines are the result of growing pressure on the network – and are seen by some as a positive development. "The main pressure is from users who post something and it's blocked or removed. Or, Facebook users see an inappropriate image online and they flag it – and it's not going away," said Anatoliy Gruzd, a Ryerson University associate professor and director of the Ryerson Social Media Lab. The frustration of users unable to influence what is happening in their online community is forcing Facebook to clarify its content rules, he added.

The updated rules are likely to be tested by users, said Ms. Jacobson. The section on nudity relating to female breasts states that Facebook will "restrict some images of female breasts if they include the nipple, but we always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring." "But will inactive breastfeeding be okay?" wonders Ms. Jacobson. "They're certainly responding to the ambiguity of previous community guidelines but there will never be a set of guidelines that dictates acceptable use in every situation," she said. And therein lies the challenge for Facebook's Silicon Valley operators as they outline rules for a global audience.

"Facebook is operating in multiple countries where there are multiple social norms, as well as different age groups. So it's very difficult to have a one-size-fits-all approach. But I understand the attempt to do so," she added. There is another driving force – attracting new users. Prof. Gruzd said comScore data he has seen from a year ago show Tumblr and Twitter attracting new users at a higher rate than Facebook, and that the updated guidelines can be seen as an attempt to make the network more welcoming to newcomers. "We have seen a lot of negative publicity in the context of terrorism [on Facebook], in the context of people being cyber bullied – and they just decided to deal with it head on," he said.

Prof. Gruzd doesn't think the latest document, which he said sets out to clear up "gray areas," is the final word on disputes over content. But it is a sign that the relationship between the network's Silicon Valley operators and a diverse global audience is maturing, he added.

Globe and Mail