As Neal Mohan, YouTube's chief product officer, told the Financial Times, YouTube is returning more of its content moderation work to human reviewers.
At the height of the pandemic, YouTube had to reduce the personnel and workload of its in-office human moderators. Instead of depending on its 10,000-person staff, the organisation gave more content moderation control to automated systems that detect apparently harmful content and remove it immediately.
YouTube announced at the end of August that 11.4 million videos had been deleted from the site over the previous three months for breaking its Community Guidelines. This is the largest number of videos removed from YouTube in any three-month span since the site launched in 2005.
That April-to-June total was far larger than average, and it suggests YouTube's AI programmes erred on the side of caution, deleting many videos that did not technically violate any rules.
According to the FT, YouTube reversed its content moderation rulings on 160,000 videos. YouTube typically overturns fewer than 25 percent of appealed decisions; under AI moderation, roughly half of all appeals were successful.
“One of the decisions we made [at the beginning of the pandemic] when it came to machines who couldn’t be as precise as humans, we were going to err on the side of making sure that our users were protected, even though that might have resulted in a slightly higher number of videos coming down,” Mohan said.