To regain trust, Facebook purges over 800 political accounts

The social media platform says the accounts' content itself was acceptable; instead, it cited deceptive tactics to justify their removal. The company hopes to restore public trust after multiple data scandals.

Facebook wants to regain consumer trust—and it’s taken a major step by scrutinizing political speech on the platform.

The company announced it would remove more than 800 accounts and Pages, each with a strong political bent, that it says engaged in tactics that violate its policies. The company called the behavior "inauthentic" and promised to crack down on bad actors.

It wrote in its online newsroom:

People need to be able to trust the connections they make on Facebook. It’s why we have a policy banning coordinated inauthentic behavior — networks of accounts or Pages working to mislead others about who they are, and what they are doing. This year, we’ve enforced this policy against many Pages, Groups and accounts created to stir up political debate, including in the US, the Middle East, Russia and the UK. But the bulk of the inauthentic activity we see on Facebook is spam that’s typically motivated by money, not politics. And the people behind it are adapting their behavior as our enforcement improves.

Facebook acknowledges that its rules about what content Pages and accounts may share are murky.

It concluded:

Of course, there are legitimate reasons that accounts and Pages coordinate with each other — it’s the bedrock of fundraising campaigns and grassroots organizations. But the difference is that these groups are upfront about who they are, and what they’re up to. As we get better at uncovering this kind of abuse, the people behind it — whether economically or politically motivated — will change their tactics to evade detection. It’s why we continue to invest heavily, including in better technology, to prevent this kind of misuse. Because people will only share on Facebook if they feel safe and trust the connections they make here.

Some applaud the move.

The Washington Post reported:

Facebook has long struggled with where to draw lines around domestic content. After the 2016 election, company executives declined to purge thousands of misleading pages for fear that doing so would alienate conservatives, according to two people familiar with the discussions.

“It is totally reasonable for companies to say if you abuse our mechanisms, we will punish you, even if the individual content is okay,” said Alex Stamos, who resigned as Facebook’s chief security officer this summer and is now an adjunct professor at the Center for International Security and Cooperation at Stanford University and a Hoover Fellow. “Facebook first reduced the ability to use ads to punish extreme content. Now they are attacking organic recommendation systems, such as the likes and shares used to artificially inflate posts.”

Some question whether money was truly the overriding factor in targeting which accounts to purge.

Engadget wrote:

While Facebook emphasizes that these accounts were motivated by money, the New York Times reports that many of those taken down today were political in nature and relied on fake news to generate clicks. The social media company noted that while spam accounts have typically used topics like celebrity gossip or natural disasters to generate traffic, they now often use “sensational political content” to do so, regardless of the political leaning. Facebook told the New York Times that the removals today include the largest number of domestic accounts and Pages involved in influence campaigns, a shift from the foreign-based campaigns we’ve seen in the past.


Facebook has declined to identify the purged accounts, offering only a handful of examples.

Gizmodo wrote:

We reached out to Facebook to ask for a complete list of the removals and a spokesperson told us that information will not be released. Facebook provided the following examples: The Resistance, Reasonable People Unite, Reverb Press, Nation in Distress, Snowflakes. It’s likely Facebook wants to keep this information to itself in order to avoid giving any political factions ammunition to cry foul.

It appears Facebook briefed the New York Times and the Washington Post ahead of the removals, and each outlet named the same pages that were supplied to Gizmodo. The selections were characterized as being variously dedicated to right- and left-wing views.

Some are questioning the wisdom of Facebook’s inclination to keep its cards close to the vest.

Gizmodo concluded:

The fact that Facebook is keeping almost all of the details about this action under wraps may save it some short-term pain, but it just gives everyone’s imagination the chance to run wild. In its Thursday announcement, Facebook wrote it was taking this action because “people will only share on Facebook if they feel safe and trust the connections they make here.” It’s a good point that applies to the connection between Facebook and its users as well.

Some on Twitter called for more transparency, while others speculated about Facebook’s motives.

Because it won’t share the complete details of the move, Facebook will face more questions from consumers and critics. Might greater transparency be yet another step toward cultivating public trust?

What do you think, PR Daily readers? How else might the social network help restore consumer trust?


