Joel Kaplan, Facebook's Vice President of Global Public Policy, and Justin Osofsky, Vice President of Global Operations and Media Partnerships, jointly published an article on the current state of Community Standards on Friday, October 21.
The article, posted on Facebook’s Newsroom, responds to feedback from users and partners about what can (and cannot) appear on the timeline. It updates the company’s views on these rules and outlines possible changes it could make to its policy.
Kaplan and Osofsky highlighted the complexity of setting global standards for the Facebook community, given recent mixed public reactions to individual posts deemed inappropriate for the News Feed.
Adult and violent content could start appearing on Facebook’s timeline
According to the article, Facebook will begin allowing more content that would otherwise violate its Community Standards, provided the content is newsworthy, significant, or important to the public interest.
The two vice presidents will work with the Facebook community and partners to determine exactly how to implement these changes, possibly by adding new censorship tools and enforcement controls to the site.
The goal of this new approach is to let more relevant information (which might be graphic, adult-oriented, or offensive to certain groups) appear in the timeline without creating additional safety risks for users, particularly Facebook’s large underage population.
The article does not say when these changes will take effect, only that they are coming “in the weeks ahead.” Facebook will partner with experts, journalists, law enforcement agencies, publishers, and others to make the change a reality.
Facebook encourages people to customize what kind of content they see on their timelines
Facebook’s current Community Standards are divided into four categories outlined on its official website. Posts can be removed from the site if they pose a physical threat to users, contain offensive wording or images, are intended to commit fraud, or violate intellectual property rights.
Subcategories of these post types (which range from status updates to pictures, videos, and links to offensive websites) include bullying and targeted harassment, suicide-related content, terrorist activity, and the sale of illegal and regulated goods.
Facebook also states that not all ‘disagreeable’ or ‘disturbing’ content directly violates its Community Standards, which is why it actively encourages users to customize their profiles to hide any content that might offend them.
Source: Facebook Newsroom