Trinity Mount Ministries

Tuesday, December 5, 2017

YouTube to boost moderator team preventing child exploitation videos

By Ben Lovejoy

Following reports of sexualised videos of children attracting hundreds of comments from suspected pedophiles, YouTube has announced that it will be boosting its content moderator team to 10,000 people – but only by the end of 2018. The current team is reported to be around 8,000 people.

A Times investigation first uncovered the scale of the problem eleven days ago, with volunteer flaggers claiming that YouTube wasn’t taking the problem seriously …

YouTube, part of one of the most valuable enterprises in the world, gave them no help, according to one of the flaggers who spoke to The Times anonymously.

YouTube’s initial response was to acknowledge that it needed to do more to tackle the problem, through both machine learning and additional human moderators.

A few days later, the company said that it had removed 150,000 videos, turned off comments on 625,000 more and terminated 270 accounts.

In a new blog post last night, YouTube CEO Susan Wojcicki announced the additional measures the company would be taking.

We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018.

At the same time, we are expanding the network of academics, industry groups and subject matter experts who we can learn from and support to help us better understand emerging issues.

We will use our cutting-edge machine learning more widely to allow us to quickly and efficiently remove content that violates our guidelines.

Wojcicki said that machine learning had already shown ‘tremendous progress’ in tackling extremist content, and that the company had begun training AI systems to detect videos that may impact child safety, though BuzzFeed notes that the challenge here may be greater.

It’s unclear whether machine learning can adequately catch and limit disturbing children’s content — much of which is creepy in ways that may be difficult for a moderation algorithm to discern.

Part of the issue involves videos uploaded by children themselves, which may be innocent in intent but attract creepy comments and worse from pedophiles.

Screenshots show how one YouTube user left dozens of inappropriate comments on videos posted by boys aged between seven and 14.

“Send me the 2 minute naked wrestling match private,” the user wrote. “And I will tell you everything you need to know to grow your channel. I wanna see u naked. It feels awesome to be naked on YouTube, try it bro.”

The blog post says that YouTube is also ramping up its team of ad reviewers to ensure advertising doesn’t appear alongside or within inappropriate videos.

We believe this requires a new approach to advertising on YouTube, carefully considering which channels and videos are eligible for advertising. We are planning to apply stricter criteria, conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should.

