Organizations are pulling ads from YouTube following accusations that the platform’s algorithm is assisting child predators.
The boycott came after the viral circulation of a video published Sunday by Matt Watson, a YouTube content creator. The video had nearly 2 million views at the time of publication. [Editor’s note: The video contains content that viewers may find disturbing.]
In the video, Watson says:
Every algorithm under the sun is detecting you swear more than two times and make a video about panic attack[s] and depression, and yet this is going on, with adverts playing before them. And it’s been in the public consciousness for over two years, and yet nothing is being done. I’m disgusted.
… For the most part, the videos targeted by pedophiles did not violate YouTube’s rules and were innocent enough — young girls doing gymnastics, playing Twister or stretching — but the videos became overrun with suggestive remarks directed at the children.
The commenters left time stamps for parts of the video that can appear compromising when paused — like a girl’s backside or bare legs. They also posted remarks that praised the girls, asked whether they were wearing underwear, or simply carried a string of sexually suggestive emojis.
But with a blank YouTube account, and a couple of quick searches, hundreds of videos that are seemingly popular with paedophiles are surfaced by YouTube’s recommendation system. Worse still, YouTube doesn’t just recommend you watch more videos of children innocently playing, its algorithm specifically suggests videos that are seemingly popular with other paedophiles, most of which have hundreds of thousands of views and dozens of disturbing comments. Many include pre-roll advertising.
… Even search suggestions help funnel people down that path – start typing “girl yoga” and the autocomplete options include “young” and “hot”. Enter “twister girl” and autocomplete suggests “little girl twister in skirt”.
… The majority of the most troubling conversations are in Spanish, Russian and Portuguese, indicating that YouTube’s automatic systems that detect predatory comments may not work as well for non-English languages. But many of the comments we saw were also in English.
In his video, Watson showed that many of the videos in what he called a YouTube “wormhole” are monetized with ads from Grammarly, Disney, Clorox, Modern Line Furniture, GNC, Dr. Oetker, Bell, Fairlife, Vitacost, McDonald’s and other organizations.
As the video went viral and headlines appeared, YouTube’s marketing partners quickly distanced themselves from the platform. Several brand managers also made statements denouncing the behaviors that Watson revealed.
The New York Times reported:
“When we learned of this issue, we were — and still are — absolutely horrified and reached out to YouTube to rectify this immediately,” Senka Hadzimuratovic, a spokeswoman for the online grammar tool Grammarly, said in an email. “We have a strict policy against advertising alongside harmful or offensive content and would never knowingly associate ourselves with channels like this. It goes against everything our company stands for.”
A spokesperson for Fortnite publisher Epic Games said it had paused all pre-roll advertising on YouTube. “Through our advertising agency, we have reached out to YouTube to determine actions they’ll take to eliminate this type of content from their service,” the spokesperson added. A World Business Forum spokesperson said it found it “repulsive that paedophiles are using YouTube for their criminal activities”. A Peloton spokesperson said it was working with its media buying agency to investigate why its adverts were being displayed against such videos.
Nestle (NSRGF), the owner of brands like Kit-Kat and Nespresso, decided to “pause” its advertising on YouTube after some of its ads were shown on videos “where inappropriate comments were being made,” a company spokesperson said.
… Dr. Oetker asked YouTube “to explain how it could happen that advertising of our company was placed in an environment that we strictly reject and consider highly reprehensible ethically,” the company said in a statement. It added that it expects YouTube to “immediately remove from its site any contributions that threaten the integrity and protection of minors.”
Bloomberg reported that The Walt Disney Co. had pulled its advertising from YouTube, but the company has not responded to reporters’ requests for comment.
As backlash grew and brand managers pulled their ads, YouTube outlined how it moved to resolve the crisis.
The New York Times reported:
Chi Hea Cho, a spokeswoman for YouTube’s parent company, Google, said it had deleted the accounts and channels of people leaving the disturbing comments, deleted comments that violate its policies and reported illegal activity to the authorities.
… In response to the latest concerns, Mrs. Cho said, YouTube disabled comments on tens of millions of videos featuring minors and removed thousands of inappropriate comments on videos with young people in them. She said YouTube had also terminated over 400 YouTube channels for comments that they left on videos and reported illegal comments to the National Center for Missing and Exploited Children.
YouTube also called the behavior that Watson addressed “abhorrent.”
“Any content—including comments—that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments,” a spokeswoman for YouTube said in an email.
Total ad spending on the videos mentioned was less than $8,000 within the last 60 days, and YouTube plans refunds, the spokeswoman said.
On Tuesday, YouTube changed its “strikes” system for creators who violate the platform’s community guidelines.
- The first strike will result in a one-week freeze on the ability to upload any new content to YouTube, including live streams and other channel activities. Strikes will expire after 90 days.
- The second strike in any 90-day period will result in a two-week freeze on the ability to upload any new content to YouTube.
- The third strike in any 90-day period will result in channel termination.
However, YouTube’s adjusted punishments don’t address how the platform will cut down on inappropriate content in the future, or how it will stop its algorithm from recommending videos that feed child-predatory behavior.
YouTube has not discussed its plan for “protecting families on YouTube” since a blog post published on Nov. 22, 2017, when it was under fire for another crisis involving content for children.
… In 2017, YouTube updated its policies to address an event known as “ElsaGate,” in which disturbing, sexualized kids’ content was being recommended to children. That same year, YouTube decided to close some comment sections on videos with children in an attempt to block predatory behavior from pedophiles. As early as 2013, Google changed its search algorithm to prevent exploitative content from appearing in searches on both Google and YouTube. But despite years of public outcry, YouTube still hasn’t found a way to effectively deal with apparent predators on its platform.
YouTube has also struggled to keep advertisers on board as reports of bigoted content grew.
This isn’t the first time companies have distanced themselves from the platform. In 2017, companies including Walmart, PepsiCo and Dish Network pulled their ads from YouTube after the ads appeared alongside videos espousing racist and anti-Semitic views. YouTube has also been criticized for failing to quickly remove disturbing content aimed at children.
How would you advise YouTube’s communications team to handle this crisis, PR Daily readers?