Following backlash, YouTube vows to tighten its harassment policy

The video platform has drawn criticism following its decision not to ban a controversial commentator’s content but instead to restrict him from making money on the site.

YouTube is struggling to banish extremist content without enraging users who claim such bans violate free speech.

On Wednesday, the platform announced that it was “taking a closer look” at its harassment policies, which also affect videos that are hateful or discriminatory.

In a blog post, YouTube said:

The openness of YouTube’s platform has helped creativity and access to information thrive. It’s our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination and violence. We are committed to taking the steps needed to live up to this responsibility today, tomorrow and in the years to come.

CNBC reported:

The move comes as YouTube continues to grapple with hateful content, misinformation and other abusive videos across the site. This week, The New York Times reported that YouTube’s recommendation system showed videos of underage girls to users who watched erotic videos. Also this week, YouTube said it would not take down videos from a conservative YouTube channel that repeatedly criticized Vox journalist Carlos Maza’s sexual orientation, even though the videos appeared to violate YouTube’s content policies.

Last week, Maza posted a Twitter thread in which he wrote:

… This isn’t about “silencing conservatives.” … But by refusing to enforce its anti-harassment policy, YouTube is helping incredibly powerful cyberbullies organize and target people they disagree with.

… YouTube does not give a f*** about diversity or inclusion. YouTube wants clicks.

On Monday, YouTube responded to Maza in a series of tweets.

On Tuesday, YouTube announced it wouldn’t remove commentator Steven Crowder’s channel or his videos, but on Wednesday, YouTube demonetized Crowder’s content.

The New York Times reported:

“Opinions can be deeply offensive, but if they don’t violate our policies, they’ll remain on our site,” YouTube said in a statement about its decision on Mr. Crowder.

On Wednesday, YouTube appeared to backtrack, saying that Mr. Crowder had, in fact, violated its rules, and that his ability to earn money from ads on his channel would be suspended as a result.

“We came to this decision because a pattern of egregious actions has harmed the broader community,” the company wrote on Twitter.

The whiplash-inducing deliberations illustrated a central theme that has defined the moderation struggles of social media companies: Making rules is often easier than enforcing them.

Following YouTube’s decision, Crowder lashed out on Twitter against what he dubbed the #VoxAdpocalypse.

As social media users took to Twitter to argue either side of the debate, #VoxAdpocalypse became the top trending hashtag in the United States.

“YouTube’s policies have satisfied no one in this very public debacle, which is likely why the company is now reconsidering them,” The Verge reported.

In its blog post, YouTube outlined several steps it was taking to crack down on harassment, hate speech and supremacist content:

YouTube has always had rules of the road, including a longstanding policy against hate speech. In 2017, we introduced a tougher stance towards videos with supremacist content, including limiting recommendations and features like comments and the ability to share the video. This step dramatically reduced views to these videos (on average 80%). Today, we’re taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status. This would include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory. Finally, we will remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place.

We recognize some of this content has value to researchers and NGOs looking to understand hate in order to combat it, and we are exploring options to make it available to them in the future. And as always, context matters, so some videos could remain up because they discuss topics like pending legislation, aim to condemn or expose hate, or provide analysis of current events. We will begin enforcing this updated policy today; however, it will take time for our systems to fully ramp up and we’ll be gradually expanding coverage over the next several months.

The New York Times reported:

YouTube did not name any specific channels or videos that would be banned. But on Wednesday, numerous far-right creators began complaining that their videos had been deleted, or had been stripped of ads, presumably a result of the new policy.

Along with removing extremist content, YouTube said it had tested a feature that limited recommendations for videos that contained “borderline content and harmful misinformation, such as videos promoting a phony miracle cure for a serious illness, or claiming the earth is flat.” YouTube said this feature caused views within the U.S. on these videos to drop “by over 50%.”

YouTube also said it was cracking down on channels whose content violates its hate speech policies by demonetizing them, blocking those channels from running ads or from using other ways to make money on the platform. However, even demonetized channels can still share links to products and services, along with donation links.

Despite its promise to adjust its harassment policies and to better police extremist content, the platform continues to face backlash from social media users on both sides of the debate.

How would you advise YouTube to proceed amid this backlash, PR Daily users?

