One interesting side note heading into what may be one of the most tumultuous election periods on record is the differing ways that each of the social apps are approaching political content.
In addition to the U.S. election, which falls on November 5th, there's also India (May), the European Union (June), the U.K., South Africa, and a heap more heading to the polls this year.
And with all of this underway in countries with very high social media usage, each app is looking to either reduce or increase the presence of political debate on its platform, which could have a major impact on voting outcomes, depending on how things actually play out.
Let's start with Meta, now taking definitive steps in actually reducing the presence of political content within its apps.
Last week, Meta announced it was killing its Facebook News initiative and ending all agreements with local news publishers, in an effort to reduce news content -- and the debate that follows it -- on Facebook and Instagram.
In fact, that shift has been underway for some time. Following the Capitol riots in early 2021, Meta CEO Mark Zuckerberg noted that one of the most common pieces of feedback the company had been getting was that users no longer wanted to see divisive political posts in their feeds.
After years of angst-ridden posts sparking tension between friends and family, Meta concluded that the negative associations weren't worth the engagement benefits. That set the Meta team on a mission to reduce political content however it could, and the company has since leaned further into AI-recommended content, mostly short-form video clips, which have become a far bigger element of user feeds.
Indeed, through the last quarter of last year, AI-based recommendations made up 40% of the content people saw on Instagram, with the percentage a hair lower on Facebook. The outcome? Over the last twelve months, Meta's seen time spent on Facebook grow 7%, and time spent on IG grow 6%.
At the same time, news content now makes up less than 3% of what people around the world see in their Facebook feeds - and it's still dropping.
With this backdrop, and the reputational damage that news and political debate have caused Meta in the past, the company is now looking to step away from both altogether, in favor of a more lighthearted, entertainment-based approach to content.
Which makes even more sense given how much news and political debate have cost Meta in recent years.
Altogether, as a result of the Cambridge Analytica scandal, in which political operatives reportedly harvested Facebook user data to fuel voter influence operations, Meta has had to shell out nearly $6 billion in direct costs, via a penalty from the FTC, a settlement over the "data breach," and fines from the U.S. Securities and Exchange Commission.
But the reputational damage may have been even worse. In 2022, Meta said that Apple's new opt-in data tracking prompts would cost its ad business over $10 billion that year alone, a change that can also be linked back to the erosion of trust in the company following the Cambridge Analytica incident.
On balance, then, it makes sense for Meta to step away from news and politics where it can. But what will that ultimately mean for voters?
According to Pew Research, 30% of Americans get at least some of their news from Facebook, so the platform's deliberate move away from news has to have an impact.
The end result is that Meta will be able to throw up its hands and claim it had nothing to do with the outcome of any election, whatever that outcome may be, which could save it from similar negative headlines. But is that good for democracy, and will it lead to a less-informed public?
Meanwhile, the app formerly known as Twitter is doubling down on politics, as owner Elon Musk makes it his personal megaphone for whatever political issue he believes needs to be shouted about on any given day.
Musk regularly tweets about the number of immigrants pouring into the United States, about the war in Ukraine, about drug policy, about corporate governance issues, and how America's "great" cities are devolving. He also quite regularly points fingers at sundry politicians, from the President on down, and as the most-followed person in the app, who's also (reportedly) tilted the algorithm in favor of showing his posts to more people, he alone has significant influence over the general discussion among X users.
In fact, X's even more "free speech"-aligned approach to content moderation, where it places greater reliance on its crowd-sourced "Community Notes" to police misinformation, has opened it to increased manipulation, which can further skew political discussion in the app.
But that's pretty much how Elon wants it, with his stated view being that people should be able to see all opinions, no matter how wrong or ill-informed, so they can decide for themselves what they believe and what they don't.
Which seems to brush off the historical harms that misinformation has inflicted. But still, Musk believes that no one should be censored for holding an opinion, and that all viewpoints merit examination.
The result is more political debate in the app. And with Meta cutting back on political and news content, X may actually be winning out, becoming the social app of choice for political debate.
Is that a good thing?
In theory, as noted, X's Community Notes system should enable "the people" to decide what they believe, and what should be left up, or "noted," in the app. But Community Notes are only displayed on posts once contributors of opposing political viewpoints agree that a note is necessary, which, for many of the most divisive political debates, is never going to happen. So for a lot of claims, X is effectively facilitating the spread of misinformation.
Indeed, studies indicate that the expansion of Community Notes has done little to reduce exposure to misinformation in the app.
Factor in reports that organized groups within the Community Notes system are working to amplify and/or quash notes that oppose their agendas, and that some of those groups may be acting on behalf of foreign governments, and X appears to be offering little protection from voter manipulation heading into the election.
That could color the political debate, and therefore sway voters, with Musk himself particularly keen to shift sentiment toward Republican candidates.
X has far less direct reach than Facebook (Pew Research finds that 12% of U.S. adults regularly get news in the app), but X/Twitter has always wielded disproportionate influence over political debate, because journalists and newshounds rely on the app for so much of their information, then spread those ideas to other platforms.
That's why Donald Trump was able to use Twitter so effectively, and probably why Elon was so attracted to it.
The consequence is that more voters will end up being informed by misinformation spread through the app, with some of the most inflammatory, angst-inducing claims already originating from X posts.
Will that affect the vote? Most probably, and with Meta offering no counter, that influence could prove significant.
In the end, though, Meta appears to be more interested in protecting its business than in what it does, or doesn't do, in politics. Which, again, fits with the cost-benefit analysis for the company. But the worry is that X-sourced, unfounded conspiracy theories will take hold among enough voters to influence election results, and could cause even more harm down the line.
Of course, it's not Meta's job to referee such, and it's also fair to note that TikTok has similar problems – and a difficult position, given its alleged ties to the Chinese Government and how that might play into what users see in that app.
But it is a potentially worrying scenario heading into the various polls, with X's more "free for all" approach looking much more like the lead-up to the 2016 election than like the lessons learned from it.
The worst part is that seemingly nothing can be done about this, and all of the analysis and attribution will be conducted in retrospect.
And for many, many people, that may come too late.