On Monday, YouTube CEO Susan Wojcicki announced in an official blog post that the platform would undergo several changes to address its ongoing content crisis. Most notably, she said the company would expand its content review workforce to more than 10,000 people next year.
The move comes after weeks of reports denouncing everything from harassment and hate speech to inappropriate videos in the site’s kids section. Facebook recently responded to similar concerns by announcing a closed, kid-focused messaging app with parental controls to let children chat safely.
YouTube says it will double down on its strategy to combat extremist videos, a phenomenon that struck the community earlier this year and led advertisers to threaten to pull their ads from the platform. Part of the policy overhaul coming next year also involves changes to monetization.
— Susan Wojcicki (@SusanWojcicki) December 5, 2017
YouTube will change content reviews, comment sections, and ads
As part of the new wave of changes set to hit YouTube soon, the most urgent appears once again to be content review. With this in mind, the Google-owned site will expand its review workforce to more than 10,000 people over the course of 2018.
These reviewers will follow an approach similar to the one used to take down videos of terrorism and extremism: they will watch and flag content deemed inappropriate under the new standards, and that information will, in turn, be used to train machine learning systems that will take over more of the task later down the road.
Since AI cannot do everything, humans will also oversee a broader effort to regulate comment sections. YouTube says it will introduce new moderation tools to address hate speech, harassment, and bullying in these small forums.
Last but not least, the company will hold talks with both content creators and advertisers over the next couple of weeks to lay down new ground rules for monetization on the platform. Executives at YouTube have heard the community “loud and clear,” and they want to propose new guidelines for fair business on the site.
YouTube says its going to find "a new approach on advertising." As of this morning ads from Lyft, Adidas, and others were running before ads of infants being held down & crying while getting shots https://t.co/iPs4XhEOLN pic.twitter.com/EjHPziNhqd
— Charlie Warzel (@cwarzel) December 5, 2017
Other platforms should follow suit before it’s too late
In the wake of these scandals, other video platforms have been singled out as hubs where both developers and the community should take action against the unruly practices plaguing their sites. Twitch is currently the most notorious example.
Once revered as a haven for gamers, geeks, and nerds, the streaming community is now crowded with so-called cam girls who essentially put on a show, wearing little clothing and at times engaging in suggestive activities for views and contributions from viewers.
Neither viewers nor streamers want to spark the next Gamergate over allegations of sexism on Twitch, but some of these newcomers are disrupting the dynamic of the site and even siphoning views from people who make their livelihood from streaming.