The controversy over Facebook’s handling of fake news has sparked extensive debate about who is to blame for the spread of misinformation.
On the one hand, the social media giant does have a responsibility to protect the integrity of its users’ feeds and ensure they receive quality content, but there is only so much Facebook can do.
On the other hand, users bear significant responsibility for misinformation. Fake news sites do not spread across the internet only by forcing themselves into people’s feeds, but also by being shared, endlessly, by people who do not bother to fact-check.
Should fake news be Mark Zuckerberg’s responsibility?
Though most media outlets are currently reviewing Facebook’s updated tools for fighting misinformation, only a few mention that user responsibility should also be a concern.
Of course, user education is arguably harder to address than countering fake news on Facebook through third-party apps and groups, but as long as a million people are willing to click the ‘Share’ button without confirming an article’s authenticity, Facebook has already lost that battle.
There are many ways to find out whether a story is false, most taking no more than 20 seconds. All of them, however, require the user not to believe everything they read online, which is truly the hard part.
Sites like Snopes.com, which will be partnering with Facebook on the fight against misinformation, already contain guidelines to spot fake news websites. The tools are there for the taking.
A time of crisis for verified content and rampant online falsehood
A Pew Research Center survey recently stated that nearly one-fourth of Americans admitted to sharing a fake news story online, with 14% indicating that they did so with the knowledge that the story was false.
Pew surveyed 1,002 adults from December 1 to 4; 16% of respondents said they discovered the stories were fake only after sharing them. The survey arguably should have included one more demographic: individuals who share a story after reading only the headline, not the article.
False news reports have been on the rise in America since the latest election, with many clickbait sites exploiting out-of-context quotes and plausible fabricated events that seemed real to any detractor (or supporter) of Trump or Clinton.
The American public’s stance on fake news and Facebook’s new fact-checking army
Zuckerberg’s company is testing updated reporting systems for misinformation, adding an option to flag a post as false in the standard survey box shown to users when they choose to hide a particular publication.
A close relationship with fact-checking groups will also be used to attach warnings to websites appearing in a user’s feed, a function ironically similar to ‘B.S. Detector,’ a free fake news browser plugin that Facebook recently blocked.
An article from TIME put the focus on ‘real news,’ arguing that it is the only true antidote to this epidemic, an epidemic that ultimately generates more profit for Facebook than for legitimate journalistic outlets.
Most Americans currently believe that fake news has a large impact on society, again according to Pew. 45% of respondents were ‘somewhat confident’ in their ability to recognize false journalism, and 39% were ‘very confident.’
When asked who is supposed to deal with misinformation, surveyed adults responded with similar figures for ‘Members of the public,’ ‘Govt., politicians, and elected officials,’ and ‘Social networking sites and search engines.’
Once again, user accountability for sharing false articles fails to appear, giving the fake news ad market room to thrive at the expense of real journalism.
Source: Pew Research Center