Image: Blurry Facebook wallpaper. Credit: Facts Reporter.

On Wednesday, Facebook CEO Mark Zuckerberg announced that the company is hiring 3,000 additional content moderators to review content uploaded to the social network or streamed on Facebook Live.

The tech giant’s move responds to a recent string of violent incidents broadcast on its platform. Zuckerberg personally announced the hiring plan in a post on his Facebook page, and later confirmed his intentions during a meeting with investors to discuss the latest quarterly earnings.

Bolstering the company’s content review workforce is necessary, Zuckerberg said, because its artificial intelligence systems cannot yet match human judgment. The 3,000 new hires may also help combat the spread of fake news on the social network.

Zuckerberg and his team want to save lives with the social platform

Facebook is making this effort because it doesn’t want its platform to become a stage for criminals to broadcast their crimes to the world. It also wants to reach people who may be considering suicide and stop them before they go through with it on a live broadcast.

“Over the next year, we’ll be adding 3,000 people to our community operations around the world – on top of the 4,500 we have today – to review the millions of reports we get every week, and improve the process for doing it quickly.”

Nearly 2 billion people use Facebook every month, and close to 1.3 billion of them log in to view and share content daily. The platform is an enormously high-exposure window onto the world, which is why Zuckerberg and his fellow executives regularly roll out new tools to keep everything in order.

In recent months, the social network has released tools to help prevent suicide and to stop users from sharing sensitive content that has been flagged as revenge pornography. In his Wednesday post, the CEO said he wants to make these tools even simpler, faster, and easier to use.

The worst cases reported on Facebook Live

Last week, a 49-year-old man from Alabama live-streamed his suicide on Facebook Live. Days before, a 20-year-old father in Thailand broadcast the killing of his baby daughter; more than 370,000 people had seen the video before it was taken down.

A notorious case occurred last month, when a Cleveland man broadcast himself randomly shooting and killing a senior citizen in broad daylight. The perpetrator later killed himself, although that was not streamed or recorded anywhere.

Earlier, in January, four young men in Chicago kidnapped and tortured an 18-year-old and streamed the assault on Facebook Live. Authorities later learned the teenager was disabled; the attackers have since been caught and charged, with the recording used as evidence.

Zuckerberg also said that just last week a girl attempted suicide on Facebook Live, but that thanks to the platform’s prevention tools, local authorities were able to reach and help her before she went through with it.

Reuters reports there have been at least 50 violent incidents recorded on the live-streaming service since its launch. Facebook currently employs just over 18,000 people, and the additional 3,000 content reviewers will push that figure above the 20,000 mark for the first time.

Source: Facebook