Can Meta Really Avoid Political Content?

Meta's anti-politics stance may not be sustainable long-term. 

Here’s the thing about Meta’s public stance of distancing itself from political content: it doesn’t mean that Meta’s apps won’t be used for political influence anyway.

Last week, Forbes reported that Facebook is hosting hundreds of ads that distribute misinformation about the upcoming election, with Meta taking in millions of dollars from these campaigns, even though the ads clearly violate the platform’s rules.

As per Forbes:

One of the ads features a stylized image of Vice President Kamala Harris with devil horns and an American flag burning behind her. Other ads feature images of Harris and VP candidate Tim Walz interposed with post-apocalyptic scenes, and pictures of Walz and President Biden mashed up with images of prescription drugs spilling out of bottles. One features an apparently AI-generated image of a smiling Harris in a hospital room preparing to give a screaming child an injection. Another features images of anti-vaxxer and third-party candidate RFK Jr. Some of the ads question whether Harris will remain in the race and suggest that America is “headed for another civil war.”

Which is no surprise. In the 2016 election, Russia-based operatives used Facebook ads to push a range of conflicting claims about U.S. political candidates in order to sow discord among American voters. The ultimate aim of that push was unclear, but Facebook’s massive reach made it a significant lure for such operations, and the fallout eventually saw Meta CEO Mark Zuckerberg hauled before Congress to answer for the role his platforms had played in spreading election misinformation.

That, coupled with media entities pushing to charge Meta for the use of their content, formed the impetus for Meta’s anti-politics push, and the company has been gradually stepping back from news and politics ever since. It’s cut its dedicated news section and ended its deals with news publishers, while earlier this year, Meta announced its intention to move away from political content entirely, in favor of more entertaining, less divisive interaction in its apps.

Which was timely, in getting ahead of the U.S. election push. But now, Meta’s being caught up in the same way it was when it was more open to political discussion. So is its public stance actually going to have any effect, or is it more of a PR move to appease regulators?

Really, Meta can’t avoid politics, as it’s reliant on what users post in its apps. All it can do, as it’s been seeking to implement, is reduce the reach of political posts in order to lessen their presence. But politics is also a key element of discussion and public interest, and if Meta’s going to keep serving as an informational and interactive source, it can’t cull politics completely.

That’s particularly true in the case of Threads, its Twitter clone app, which is aiming to facilitate real-time discussion and engagement. Doing so while also trying to side-step politics isn’t going to work, and it does seem that, eventually, Meta’s going to have to revise its thinking on this element if it wants to maximize the potential of the app.

Yet Meta also says that it’s responding to user requests in reducing in-stream political discussion.

As Zuckerberg noted in a Facebook earnings call on January 27, 2021:

“One of the top pieces of feedback we’re hearing from our community right now is that people don’t want politics and fighting to take over their experience on our services.”

Meta’s since been able to drive far more engagement with clips from old TV shows re-packaged into Reels, which it’s injecting into Facebook and IG feeds at ever-increasing rates.

But still, it seems like Meta’s always going to be fighting a losing battle in reducing political content, no matter how it looks to approach this.

So is this a sustainable strategy? Well, Meta’s still playing a part in distributing political misinformation right now, and it will remain a factor in such efforts.

Should Meta just remove all of its political restrictions and let people discuss what they want? That could also be a losing game, if it impacts engagement negatively. But I do think that Meta will need to take a more variable approach to this, especially when you consider Meta’s current definition of “political” content:

“Informed by research, our definition of political content is content likely to be about topics related to government or elections; for example, posts about laws, elections, or social topics. These global issues are complex and dynamic, which means this definition will evolve as we continue to engage with the people and communities who use our platforms and external experts to refine our approach.”

The parameters here are pretty vague, and I do think that Meta will have to be clearer on this moving forward.

I also suspect that Meta’s main concern was to avoid stoking division in the lead-up to the U.S. election, and in the wake of the poll, we may well see Meta revise its political approach, with Threads, in particular, taking a new tack on this front.

But either way, Meta’s not avoiding scrutiny on this front, which is impossible when your platforms reach 40% of the people on the planet.