YouTube on Tuesday began sharing new information about the impact of bad behavior on Google’s massive video site: its violative view rate, or how many times rule-breaking videos get watched before YouTube takes them down. But the sheer scale of YouTube’s total viewing, which the company doesn’t detail, means it’s still tricky to gauge how much people are actually watching these misleading, dangerous, hateful or offensive videos.
YouTube’s latest violative view rate shows that for every 10,000 views, about 16 to 18 of those were of videos that were later removed for violating the site’s community guidelines. That’s equivalent to 0.16% to 0.18% of YouTube’s total views, a rate that has roughly held steady for the last year. And the data show YouTube’s violative view rate has meaningfully come down from three years earlier, when it was 0.64% to 0.72%.
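The per-10,000 figure converts directly to the percentages YouTube cites. A minimal sketch of that arithmetic (the function name is illustrative, not anything YouTube publishes):

```python
def violative_view_rate_percent(violative_per_10k: float) -> float:
    """Express a views-per-10,000 figure as a percentage of total views."""
    return violative_per_10k / 10_000 * 100

# YouTube's reported range: 16 to 18 violative views per 10,000.
low = violative_view_rate_percent(16)
high = violative_view_rate_percent(18)
print(f"{low:.2f}% to {high:.2f}%")  # matches the 0.16% to 0.18% range
```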
But putting the new rate in context is difficult because YouTube has never said how many total views its massive library gets, obscuring just how much people are actually watching these rule-breaking videos.
YouTube is the world’s biggest online video resource, with more than 2 billion users and more than 500 hours of video uploaded to it every minute. But even those figures are too general to draw conclusions from, and they’re outdated. YouTube first crossed the 2 billion user mark two years ago and hasn’t updated the figure since. The stat about 500 hours of uploads per minute hasn’t been updated in at least three years.
“We chose to actually report [violative viewing] as a percentage so you can get a sense of how meaningful [it is] overall to the platform,” Jennifer O’Connor, a product management director of YouTube’s trust and safety department, said Monday during a discussion of the new data with members of the press.
YouTube — like Facebook, Twitter, Reddit and many other internet companies that give users a platform to post their own content — has grappled with how to balance freedom of expression with effective policing of the worst material posted on its platforms. Over the years, YouTube has reckoned with conspiracy theories, discrimination and harassment, videos of mass murder, and child abuse and exploitation, all at an unprecedented global scale. Critics of YouTube argue the company’s content moderation efforts still fall short too often.
“We don’t catch everything,” O’Connor said. “So we try to track what’s the impact of that on our viewers.” The violative view rate is one of the measures that guides YouTube’s trust and safety team to understand how much rule-breaking videos are still getting watched, she said.
The violative view rate measures individual views of videos that were later removed. Whether a user watched 30 seconds or 30 minutes of a violative video, that counts as one view. YouTube also counts a violative view if a user stopped watching the video before the violation actually occurred.
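Under those counting rules, each view of a later-removed video contributes exactly one count, whatever its duration. A hypothetical sketch of the tally (the view log and its fields are assumptions for illustration, not YouTube’s data):

```python
# Hypothetical view log: each entry is (video_id, seconds_watched).
views = [
    ("a", 30), ("a", 1800),  # two views of removed video "a"
    ("b", 5),                # viewer stopped before the violation occurred
    ("c", 600), ("c", 60),   # "c" was never removed
]
removed = {"a", "b"}  # videos later removed for guideline violations

# Every view of a removed video counts once, regardless of watch time,
# even if the viewer stopped before the violating moment.
violative_views = sum(1 for vid, _ in views if vid in removed)
rate_per_10k = violative_views / len(views) * 10_000
print(violative_views, rate_per_10k)
```

On this toy log, three of five views hit later-removed videos, so the rate comes out far higher than YouTube’s real-world 16 to 18 per 10,000; the point is only the counting rule.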
YouTube didn’t specify the kinds of policy violations that are getting seen before videos are removed. But O’Connor said that the breakdown is similar to the violation categories of removed videos, a measurement that YouTube already releases in its routine transparency reports.
In the latest period, for example, videos violating YouTube’s child safety policies were the biggest violation type triggering a removal, at 41% of all removed videos in the last three months of 2020. That was followed by violent or graphic content at 20.6%, nudity or sexual content at 15.8% and spam or misleading content at 15.5%. Violation types like hate, harassment or violent extremism each accounted for 1% or less of total removed videos.