So to have FULL moderation of all uploaded content, you'd need to subdivide it into one-minute strips for people to review.
For those saying Google needs to moderate the videos, let's do some simple math.
The latest statistics for YouTube are that roughly 300 hours of video are uploaded *every minute*.
300 hours * 60 = 18,000 minutes of video arriving per real-time minute, so you'd need 18,000 people watching continuously just to keep up.
Now obviously humans don't work 24 hours a day, so you'd need at least three shifts, which means you need at LEAST 54,000 people just to consume today's ocean of uploads, never mind future growth. And since no one can work 8 hours STRAIGHT, we'd need to double that to allow for breaks and to cover other languages. So we're now at 108,000 people that need to be hired.
This number should 100% be higher to actually tackle the problem, but let's say this will work.
And this is only to moderate the stuff coming *IN*; we aren't even counting the moderation needed to go over every comment. But let's say they can squeeze that in during whatever slack time people have.
Now let's say they all get the absolute minimum wage. That puts us at roughly $30k per person including benefits and administration fees.
That would cost Google over $3 billion per year (for today's volume alone) on a platform that *doesn't actually make Google any money*, and it would barely be enough to actually "moderate" the platform.
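The back-of-envelope arithmetic above can be sketched out explicitly. All the figures here are the post's own assumptions (the reported 300 hours/minute upload rate, 8-hour shifts, a 2x multiplier for breaks and language coverage, and $30k per moderator including overhead), not official numbers:

```python
# Assumed figures from the estimate above -- not official YouTube data.
UPLOAD_HOURS_PER_MINUTE = 300   # reported upload rate
SHIFTS_PER_DAY = 3              # three 8-hour shifts to cover 24 hours
COVERAGE_MULTIPLIER = 2         # breaks, language coverage, etc.
COST_PER_MODERATOR = 30_000     # USD/year, wages plus benefits/admin (assumption)

# 300 hours of video arrive every real-time minute, i.e. 18,000
# video-minutes per minute -> 18,000 people watching at 1x speed.
viewers_needed = UPLOAD_HOURS_PER_MINUTE * 60          # 18,000
round_the_clock = viewers_needed * SHIFTS_PER_DAY      # 54,000
total_headcount = round_the_clock * COVERAGE_MULTIPLIER  # 108,000

annual_cost = total_headcount * COST_PER_MODERATOR
print(f"{total_headcount:,} moderators, ~${annual_cost / 1e9:.2f}B/year")
# -> 108,000 moderators, ~$3.24B/year
```

Even with these deliberately conservative inputs, the estimate lands north of $3 billion per year, which is the point of the argument.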
Or in other words, it wouldn't be feasible.
Secondly, this is as much a "Ring" as a "ring" was in the basement of a pizza parlor.
The only "ring" here is the algorithm noticing patterns of people viewing this content. Not the other way around.
This seems to be the only post to properly describe the scale of the challenge of comprehensive human moderation on YouTube.
Another angle: even if YouTube has 10,000 moderators, users outnumber moderators by somewhere between 10,000 and 100,000 to 1. It is entirely possible (and very probable) that users will spot things that moderators don't, and that many (most?) reports never get seen, much less thoroughly investigated.
It's disingenuous to say that YouTube/Google, through inaction, is complicit or endorses the content.
Advertisers (and offended users) should see it for what it is: edge-case problems with a largely automated system tasked with overseeing content of unprecedented scale and variety. I'd bet the overwhelming majority of YouTube users have never seen, nor been recommended, "CP" content at any time.
Throwing away whole features that clearly provide exceptional UX value (disabling comments and automated content suggestions?) does not make sense.