The issue came to the forefront last year after a report by The Times, but many content creators say YouTube's updated policies have made it much harder to monetize on the platform, even though their videos don't violate its rules.
The staggering extent of dodgy content uploaded to YouTube has been laid bare, after the Google-owned video streaming giant released its first ever Community Guidelines Enforcement Report. The inaugural report, which covers the last quarter of 2017, follows up on a promise YouTube made in December to give users more transparency into how it handles abuse and decides which videos will be removed.

Machine learning, which YouTube began using in June 2017, has helped the company catch high-volume problem videos such as spam. It has also made removals faster: YouTube says that more than 50% of the videos removed for violent extremism now have fewer than 10 views (in a footnote, YouTube clarified that this figure does not include videos that were automatically flagged before they could be published and therefore received no views). Of the total videos removed, about 6.7 million were flagged by machines alone.

Earlier this year, YouTube announced the creation of an "Intelligence Desk" and plans to add 10,000 new content moderators. The company has also just reinstated a video from watchdog group Media Matters that shows how some of the outrageous claims made by conspiracy theorist Alex Jones are lies.
With up to 300 hours of content uploaded to YouTube every minute, staying on top of dodgy videos is clearly a Herculean task for the service. After reinstating the Media Matters video, YouTube said that removing it had been a "wrong call".