To placate critics who accuse Google of letting terrorist groups breed on YouTube, the company is reportedly hiring more people to review content and enforce its policies on the video-sharing site. "Over the past month, our initial use of machine learning has more than doubled both the number of videos we've removed for violent extremism, as well as the rate at which we've taken this kind of content down," the company said.
In addition to removing videos that violate YouTube's user agreement, the site will place some videos in a "limited state" designed to keep them from receiving attention.
Google also said that videos flagged by users that do not contain illegal content may now have likes and comments disabled. Such videos will be "placed in a limited state," meaning the content will sit behind an interstitial warning.
YouTube is also collaborating with other digital giants, including Facebook, Microsoft, and Twitter, to develop technical solutions and build more tools for dealing with extremist content.
YouTube will begin rolling out these changes on desktop in the coming weeks, with mobile versions to follow.
According to AdWeek, Google will work with 15 additional non-governmental organizations to help curb "hate speech," and will continue to improve its machine learning in order to censor, demonetize, and de-platform content it deems offensive.
Google also says these automated systems have been more accurate than its human staff at flagging videos for removal, "in many cases" at least. The move is part of a four-pronged strategy to combat the spread of such content online, following a brand-safety crisis earlier this year in which advertisers such as M&S and the Guardian, along with the United Kingdom government, pulled ad spend from YouTube and the Google Display Network over concerns about unintentional ad misplacement.
The company is planning more stringent screening processes as well. YouTube is now working with more than 15 new organizations, including the No Hate Speech Movement, the Institute for Strategic Dialogue, and the Anti-Defamation League.
The company describes the improvement from these automated systems as "dramatic," saying they outperform human reviewers.
YouTube added that more is to come in its fight against online extremism, and that there "is always more work to be done".