The thing about Meta's public stance of distancing itself from political content is that it doesn't mean Meta's apps won't still be used for political influence.
Last week, according to Forbes, Facebook hosted hundreds of ads spreading misinformation about the upcoming election, with Meta raking in millions of dollars from campaigns that clearly violate the platform's rules.
Here's Forbes:
One of the ads features a cartoon image of Vice President Kamala Harris with devil horns and an American flag burning behind her. Other ads feature images of Harris and VP candidate Tim Walz interposed with post-apocalyptic scenes, and pictures of Walz and President Biden mashed up with images of prescription drugs spilling out of bottles. One features an apparently AI-generated photo of Harris smiling in a hospital room, ready to administer an injection to a screaming child. Another shows images of anti-vaxxer and third-party candidate RFK Jr. Some of the ads wonder if Harris will stay in the race and declare that America is "headed for another civil war."
Which is no surprise. In the 2016 presidential election, Russian operatives used Facebook ads to spread conflicting reports about U.S. political candidates in an effort to sow discord among American voters. The precise aim of that push was never clear, but Facebook's massive reach made it a prime vehicle for such operations. Meta CEO Mark Zuckerberg was later summoned before Congress to answer for his platforms' role in spreading election misinformation.
That, combined with news organizations pushing to make Meta pay for its use of their content, is what spurred Meta's anti-politics stance, and Meta has been incrementally backing away from political content ever since. It shut down its news center and wound down its news partnerships, and then, this spring, Meta publicly declared that it would move away from political content entirely, in an effort to foster more engaging, less divisive interaction in its apps.
Which was well-timed, getting ahead of the U.S. election cycle. Now, however, Meta is being drawn into the same controversies it faced when it was more open to political discussion. So is its public stance actually going to have an effect, or is it merely a PR move to pacify regulators?
Truth be told, Meta can't fully opt out of politics, especially because it depends on the flow of user posts across its apps. The most it can do, as it has been aspiring to, is reduce the number of people who see political posts and keep such content at bay. But when politics becomes a central ingredient of public conversation and interest, Meta can't remove it altogether.
This is especially true of Threads, Meta's Twitter clone, which aims to enable real-time discussion and engagement. It can't hope to do that while also avoiding politics, which seems a pretty impossible task. Indeed, sooner or later, Meta is going to have to rethink this element if it wants to maximize the app's potential.
But Meta also says that it's acting on user demand by cutting back on political discussion in-stream.
As Meta CEO Mark Zuckerberg said on Facebook's earnings call on January 27, 2021:
"One of the top pieces of feedback we're hearing from our community right now is that people don't want politics and fighting to take over their experience on our services."
Meta has since been able to drive much more engagement with clips from old TV shows repackaged into Reels, which it's pushing into your Facebook and IG feeds at ever-increasing rates.
Still, it seems like Meta will always be fighting a losing battle in reducing political content, no matter how it approaches this.
But is this a sustainable strategy? Of course not: Meta still plays a role in disseminating political misinformation today, and it will remain a vector for such efforts.
Should Meta simply lift all political restrictions and let people say whatever they want? That's also a losing game if it hurts engagement. But I do believe Meta will need to take a more nuanced approach, particularly with regard to its own definition of "political" content:
Informed by research, our definition of political content is content likely to be about issues related to government or elections; for example, posts about laws, elections, or social issues. These global issues are intricate and dynamic, so this definition will evolve as we continue to engage with the people and communities who use our platforms, and with external experts, to refine our approach.
Those parameters are very vague, and I do believe Meta will need to be clearer on them in the future.
I suspect Meta's biggest fear was amplifying divisions in the lead-up to the U.S. election, and that, post-election, we'll see an evolution in its approach to political content, with Meta adopting different policies. Threads will be no exception in this regard.
Either way, Meta isn't going to avoid scrutiny on this front, which is impossible when your platforms reach 40% of the people on the planet.