Cross Check, a Facebook and Instagram program giving special treatment to celebrities and other high profile users, is opaque, “flawed” and needs to be overhauled, according to recommendations released by parent company Meta’s semi-independent Oversight Board on Tuesday, which offer a rare insight into the controversial content moderation initiative and underscore the differences between the tech giant’s actions and its stated values.
The Key Facts
Cross Check, which separates the moderation of high profile or sensitive accounts from other accounts and adds an additional layer of scrutiny before content decisions are made, is designed to “satisfy business concerns” rather than protect the public or implement Meta’s stated values, the Oversight Board said.
The program appears driven more by Meta’s desire to avoid “provoking” VIPs and facing accusations of censorship than by its professed aim of safeguarding the public, the board added, and the company’s numerous public statements that it applies platform rules evenly were misleading.
The board called for “significant improvements” to the program and offered 32 recommendations to change the process such as publishing key metrics surrounding the program, implementing audits to make sure it’s working effectively and hiding posts while they are evaluated (presently, they remain public pending review).
Other suggestions included increasing transparency by labelling covered accounts publicly, removing repeat offenders from Cross Check and beefing up content moderation resources.
Nick Clegg, Meta’s president of global affairs, said the company will fully address the board’s recommendations and respond to the report within 90 days.
Clegg said Meta has already made some changes to the program, including adding formal criteria, expanding eligibility and creating annual reviews.
What We Don’t Know
Meta is not obligated to accept or implement the Oversight Board’s recommendations, though it must respond to them. Some, like publicly labelling accounts covered by the scheme, could likely be implemented quickly and inexpensively. Others, such as those requiring significant expenditures or more staff focused on moderation, could prove more complicated: Meta recently made deep cuts to its workforce, laying off employees in response to gloomy economic forecasts.
The Crucial Quote
Cross Check is one way of managing the challenge of moderating the vast quantities of content posted to Meta’s platforms every day, the board said. While that task is difficult, the board argued it is unfair for Meta to address problems such as falsely flagged (or unflagged) rule-breaking content only for the most influential people. “Meta has a responsibility to address its content moderation challenges in ways that benefit all users and not just a select few,” the report said.
The Key Background
Facebook, Meta and executives like Mark Zuckerberg have long insisted that public figures and other highly visible users are subject to the same rules as everyone else. That position was explosively dismantled last September, when the Wall Street Journal revealed the secretive set of rules and procedures shielding VIP users—including celebrities, politicians, journalists and advertisers—from the normal moderation process. The program covers millions of users and allows rule-breaking material, in one instance nonconsensual pornography, to stay up far longer than normal, the Journal reported. Following the report, the Oversight Board investigated and reprimanded the company for concealing the program’s true scale and scope.
Further Reading
Facebook Oversight Board Will Investigate If The Company Has Different Rules For Powerful Users (SME)
Inside Meta’s Oversight Board: 2 Years of Pushing Limits (Wired)