Rule-Breaking Videos Get Few Views, Claims YouTube

With increasing scrutiny of the content on its platform, YouTube has begun reporting how widely videos that break its rules are actually being viewed.

The new Violative View Rate (VVR) is the number of views, in whole or in part, of videos that violate YouTube’s Community Guidelines, expressed as a percentage of total views. Violations include breaches of the company’s child safety guidelines, violent or graphic content, nudity and sexual content, spam, and hate speech.
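In other words, the metric is a simple ratio. Here is a minimal sketch in Python, using hypothetical view counts chosen to fall in the range YouTube reports:

```
def violative_view_rate(violative_views: int, total_views: int) -> float:
    """Return the Violative View Rate (VVR) as a percentage of total views."""
    if total_views <= 0:
        raise ValueError("total_views must be positive")
    return 100 * violative_views / total_views

# Hypothetical numbers: 17 views of violative videos out of
# 10,000 total views gives a VVR of 0.17 percent.
print(violative_view_rate(17, 10_000))  # 0.17
```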

VVR will be the primary metric the company uses to measure its performance on accountability, according to YouTube.

“Other metrics, such as the turnaround time for removing a violative video, are important, but they don’t fully capture the actual impact of a violative video on viewers,” said Jennifer O’Connor, director of trust and safety at YouTube.

“For example, compare a violative video that got 100 views but stayed on our platform for more than 24 hours with content that received thousands of views in its first few hours before it was removed. Which ultimately has more impact?”

Currently, the VVR stands at 0.16 to 0.18 percent, meaning that out of every 10,000 views on YouTube, 16 to 18 are of violative content. That’s down more than 70 percent from the same quarter of 2017, an improvement YouTube credits largely to its investments in machine learning.

The company calculates the rate by selecting videos at random and sending them to its content reviewers. It’s worth noting that each time the Community Guidelines are updated, the rate can rise briefly while YouTube’s systems catch up.
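Because reviewers cannot watch everything, the published figure is an estimate from a random sample rather than an exhaustive count. Here is a minimal sketch of that sampling approach, with entirely made-up data and a simplified per-view (rather than per-video) sample:

```
import random

# Hypothetical corpus of one million views; True marks a view of a
# violative video. In reality the labels come from human reviewers.
views = [random.random() < 0.0017 for _ in range(1_000_000)]

# Randomly sample views for review and estimate the VVR from the sample.
sample = random.sample(views, 10_000)
estimated_vvr = 100 * sum(sample) / len(sample)
print(f"Estimated VVR: {estimated_vvr:.2f}%")  # roughly 0.17%
```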

YouTube has been particularly criticized because its recommendation system can lead viewers further and further down the rabbit hole of extremist content.

Having begun investing heavily in machine learning four years ago, the company says it now detects 94 percent of all violative content through automated flagging, and removes three quarters of it before it receives even 10 views.

Since the publication of its first Community Guidelines Enforcement Report in 2018, YouTube says it has removed more than 83 million videos and seven billion comments for violating its guidelines.

It’s important to note, however, that these numbers apply only to content that YouTube’s own reviewers consider violative, and many believe the company is far too lenient.

And since the platform receives billions of views every day, even a low VVR can mean millions of people are exposed to rule-breaking content. Still, the company hopes the new metric will convince lawmakers that it is doing everything in its power to police its platform effectively.
