At TED 2019, Twitter CEO talks of follower counts, ‘likes’ and Nazis

Jack Dorsey fields questions about the platform’s future as public trust in social media ebbs. In response, users tweet to bash and fact-check his remarks.

Are Twitter’s struggles to regain consumers’ trust so ingrained in its structure that a meaningful remedy would require an overhaul, rather than targeted tweaks?

Twitter CEO Jack Dorsey sat down Tuesday for a casual interview at TED 2019, the popular conference series that focuses on technology, entertainment and design.

Dorsey’s responses suggest that quick fixes would do little to quell the rise of abusive language and misinformation on the platform.

He’s the latest social media honcho to venture into a high-profile setting to convince the public that the industry overall, and his company in particular, are striving to improve and deserve consumers’ patience. Twitter has faced scrutiny in recent years over how it handles abuse, hate speech and aggressive behavior on its platform.

Moderator Chris Anderson depicted Dorsey as the captain of the Titanic and questioned whether the CEO grasped the fear and urgency that so many feel about social media platforms’ outsize role in shaping public opinion.

TechCrunch wrote:

For most of the interview, Dorsey outlined steps that Twitter has taken to combat abuse and misinformation, but Anderson explained why the company’s critics sometimes find those steps so insufficient and unsatisfying. He compared Twitter to the Titanic, and Dorsey to the captain, listening to passengers’ concerns about the iceberg up ahead — then going back to the bridge and showing “this extraordinary calm.”

“It’s democracy at stake, it’s our culture at stake,” Anderson said, echoing points made yesterday in a talk by journalist Carole Cadwalladr. So why isn’t Twitter addressing these issues with more urgency?

“We are working as quickly as we can, but quickness will not get the job done,” Dorsey replied. “It’s focus, it’s prioritization, it’s understanding the fundamentals of the network.”

Dorsey claims that the reason Twitter appears so ineffective in responding to abusive speech on the platform is that any potential fixes would strike at the fundamentals of how the platform operates. At times, his solution appeared to involve remaking the platform entirely.

TechCrunch continued:

Dorsey recalled that when the team was first building the service, it decided to make follower count “big and bold,” which naturally made people focus on it.

“Was that the right decision at the time? Probably not,” he said. “If I had to start the service again, I would not emphasize the follower count as much … I don’t think I would create ‘likes’ in the first place.”

Since he isn’t starting from scratch, Dorsey suggested that he’s trying to find ways to redesign Twitter to shift the “bias” away from accounts and toward interests.

More specifically, TED current affairs curator Whitney Pennington Rodgers asked about the frequent criticism that Twitter hasn’t found a way to consistently ban Nazis from the service.

“We have a situation right now where that term is used fairly loosely,” Dorsey said. “We just cannot take any one mention of that word accusing someone else as a factual indication of whether someone can be removed from the platform.”

The critique comes at a time of increased public divisiveness and rancor. Just this week, false claims about the fire that engulfed Notre Dame Cathedral spread rapidly on the platform.

However, Dorsey avoided specifics in his TED interview.

Wired reported:

Dorsey didn’t address any of these incidents specifically at TED. In fact, his answers lacked specificity overall. When he was asked pointed questions, he evaded them, as he often does. Rodgers asked him how many people are working on content moderation at Twitter—a number the company has never published. Tuesday was no exception.

“It varies,” Dorsey said. “We want to be flexible on this. There are no amount of people that can actually scale this, which is why we have done so much work on proactively taking down abuse.”

That proactive work was the big news Dorsey announced from the stage: A year ago, Twitter wasn’t monitoring abuse proactively using machine learning at all. Instead, it relied entirely on human reporting—a burden Dorsey was quick to recognize was unfairly placed on the victims of the abuse. “We’ve made progress,” he said. “Thirty-eight percent of abusive tweets are now proactively recognized by machine-learning algorithms, but those that are recognized are still reviewed by humans. But that was from zero percent just a year ago.” As he uttered those words, Twitter sent out a press release with more information on the effort, highlighting that three times more abusive accounts are being suspended within 24 hours of getting reported compared with this time last year.

At times, Dorsey seemed focused on optics rather than the deeper problems that plague Twitter.

Wired continued:

Dorsey did bring up one specific fix. “The first thing you see when you go to [the page to report abuse] is about intellectual property protection. You scroll down and you get to abuse and harassment,” he noted. “I don’t know how that happened in the company’s history, but we put that above the thing that people actually want the most information on. Just our ordering shows the world what we believed was important. We are changing all that, we are ordering it the right way.”

For all his insistence on the bigger picture, this was a very small problem for Dorsey to point out, and one with a very obvious solution. Nevertheless, Twitter is not fixed. Why? The reasoning here is agonizingly circular: because Dorsey says he doesn’t want to make a bunch of small, iterative quick fixes; he wants to fundamentally rebuild the site to encourage better conversations, and that will take time—time it’s unclear the world can afford.

In a twist, users took to Twitter’s own platform to badger and press the CEO, fact-checking his claims in real time. Others tweeted that Dorsey doesn’t understand what users want.

Twitter says it has made progress on removing abusive content and accounts from the platform and on proactively protecting users.

It wrote in a blog post:

People who don’t feel safe on Twitter shouldn’t be burdened to report abuse to us. Previously, we only reviewed potentially abusive Tweets if they were reported to us. We know that’s not acceptable, so earlier this year we made it a priority to take a proactive approach to abuse in addition to relying on people’s reports.

This time last year, 0% of potentially abusive content was flagged to our teams for review proactively. Today, by using technology, 38% of abusive content that’s enforced is surfaced proactively for human review instead of relying on reports from people using Twitter. This encompasses a number of policies, such as abusive behavior, hateful conduct, encouraging self-harm, and threats, including those that may be violent.

What do you think of Dorsey’s TED interview, PR Daily readers?


One Response to “At TED 2019, Twitter CEO talks of follower counts, ‘likes’ and Nazis”

    Sabrina says:

    Interesting recap! Outside of trending news, I don’t get much value from Twitter anymore, which shows how far removed he is from his product. He talks about quick fixes—even mentioning that he would not create “likes” if he could start over—but has taken no steps to eliminate that feature. As both a personal user and a brand user, I find the “likes” function to be completely useless. Twitter is only as good as its engagement, and he needs to take major steps to improve that while addressing the abusive and aggressive behavior that has flourished on the platform.
