YouTube is serious about making sure inappropriate video content isn't seen or shared on its service. The Google-owned company announced it took down 8.7 million videos in the last three months of 2017 with the help of machines.
Yeah, they literally have a machine behind them now.
In an official blog post, YouTube says it removed more than 8 million flagged videos, most of them spam or porn that users attempted to upload. Machines did most of the work, flagging 6.7 million of those videos, and 76 percent of the machine-flagged videos were removed before they received a single view.
YouTube reveals in the post that it introduced machine-learning flagging back in June 2017. Before that move, only 8 percent of videos flagged for violent extremism were removed with fewer than 10 views. With machine-learning flagging in place, more than half of the videos taken down for violent extremism come down with fewer than 10 views.
Even with this video-removing cheat code, YouTube still relies on humans to combat inappropriate material. The company also announced a new tool for users in the fight against suspect videos: a Reporting History Dashboard that lets users track the videos they have flagged and see whether each one has been removed from the video-sharing service.
At a time when fake news is rampant and propaganda is being shared at an alarming rate, these machines should indeed help YouTube combat these problems more efficiently. So if you're even thinking about uploading something inappropriate to the service, you're wasting your time at this point.
Photo: Sergei Konkov / Getty