YouTube to Expand Teams Reviewing Extremist Content

As media coverage of inappropriate YouTube content continues to grow, especially in tandem with rising mobile usage among U.S. children, the Google-owned platform has been vocal about the initial steps it is taking to ensure its youngest viewers aren't exposed to harmful material.

"We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018".

"I've seen how some bad actors are exploiting our openness to mislead, manipulate, harass or even harm", she wrote in a blog post.

"Since June, our trust and safety teams have manually reviewed almost two million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future".

"Our advances in machine learning let us now take down almost 70 percent of violent extremist content within eight hours of upload and almost half of it in two hours and we continue to accelerate that speed.", said Susan.

She said advances in machine learning meant Youtube could take down almost 70 per cent of violent extremist content within eight hours of it being uploaded and almost half of it within two hours.

The company is taking this step amid criticism over its inability to block violent, extremist, and disturbing videos and comments. In November, The New York Times reported on a number of offensive and sometimes brutal videos found on the family-friendly YouTube Kids app.

The move comes a month after YouTube closed 50 channels that targeted young viewers with inappropriate content, including Toy Freaks, and deleted thousands of videos that had received tens of billions of views combined.

YouTube has developed automated software to identify videos linked to extremism and is now aiming to do the same for clips that portray hate speech or are unsuitable for children. The system's limits remain apparent: a conspiracy-theory video surfaced in YouTube search results a day after the October Las Vegas shooting that killed dozens.

Wojcicki also made clear that YouTube is developing advanced technology to automatically flag content for removal.

She added: "We're also taking actions to protect advertisers and creators from inappropriate content."

The platform recently lost advertisers after ads appeared next to videos with inappropriate content involving children.
