It may not be intended as such, but Meta's latest "media responsibility" push feels like a shot at Elon Musk, and the revised approach that X is taking to content moderation in its app.
Today, Meta has announced new Media Responsibility guidelines: the core principles that it uses to inform its content moderation and ad placement policies, in an effort to ensure even greater safety and protection for all users of its products.
As Meta explains:
While the advertising industry has rallied around media responsibility, there's still no industry-wide definition of it. At Meta, we believe it's the responsibility of the entire marketing industry to help make the world a better place through a more accountable, equitable and sustainable advertising ecosystem.
To this end, Meta recently launched a new mini-site summarizing its "four pillars of media responsibility".
Those four pillars include:
Safety and expression – Ensuring everyone has a voice, while protecting users from harm
Diversity, equity and inclusion – Ensuring that opportunity exists for all, and that everyone feels valued, respected and supported
Privacy and transparency – Building products with privacy "at their very core" and ensuring transparency in media placement and measurement
Sustainability – Protecting the planet, and having a positive impact
The mini-site provides a deeper overview of each element, along with explainers on exactly how Meta is looking to enact these principles within its platforms.
Meta says the goal of the mini-site is to let ad partners and users "hold us accountable, and see who we're working with", giving more assurance and transparency around its various processes.
And yes, that does feel a bit like Meta is taking pot shots at Elon and Co. here.
The new X team is increasingly placing its trust in crowd-sourced moderation via Community Notes, which appends user-generated fact-checks to posts that include questionable claims in the app.
That approach is fundamentally flawed, because Community Notes relies on consensus among ideologically diverse contributors before a note is displayed in the app. On the most controversial topics, that accord never comes, meaning that many false claims are left unchallenged.
But Musk believes that "citizen journalism" is more accurate than the mainstream media, and that Community Notes are therefore more representative of the truth, even when much of what goes unchecked is demonstrably false.
As a result, claims about COVID, the war in Israel, U.S. politics, and virtually every other contentious debate of the day see at least some level of misinformation seep through on X, because Community Notes contributors cannot agree on the core facts.
That's part of the reason so many advertisers are steering clear of the app, while Musk himself continues to peddle misinformation or outright lies, and amplify hate speech profiles, further eroding trust in X's ability to manage information flow.
Of course, some will see this as the right approach, as it enables users to push back on what they view as falsehoods from the media. Meta, by contrast, is harnessing the knowledge and experience it has accumulated over time as a tool for minimizing harmful content.
All of the company's approaches are laid out in more detail on the mini-site, with the added description intended to serve both transparency and accountability.
Either way, it's an interesting overview, offering more insight into Meta's various strategies and initiatives.