The Oversight Board, an independent body set up to review Facebook and Instagram content decisions and policies, criticized the company Tuesday over its cross-check program. In its statement, the Oversight Board outlined a number of changes it expects Facebook’s parent company, Meta, to make regarding content moderation on its social media platforms.
Cross-check is an internal Facebook program that has been touted as a “quality control” measure – a way to double-check a content decision before moderation is applied to high-profile Facebook users. Because Facebook reviews millions of pieces of content a day, the company is bound to make mistakes. The cross-check system was implemented to help limit mistaken removals of content posted by users the company considers a business priority.
However, according to a report from the Wall Street Journal, the program effectively created a two-tier moderation system: one for high-profile Facebook users and another for everyone else.
Thanks to cross-check, celebrities, politicians, and other influencers have been able to regularly break Facebook and Instagram rules without incurring the penalties given to all other users. As many as 5.8 million accounts have been on the cross-check whitelist at any given time, including former President Donald Trump and Mark Zuckerberg himself.
The board was blunt in its statement, accusing Meta of not being upfront with it about the cross-check program.
“During our review, we found several deficiencies in Meta’s cross-check program,” the Oversight Board wrote. “While Meta told the board that cross-check was intended to advance Meta’s human rights commitments, we found that the program appears more directly structured to address corporate concerns.”
Although the board said it understands Meta is a business, it found that the company failed to follow its own content policies, did not track data on the program’s results, and was not transparent about the program.
The cross-check program first came to light when Facebook whistleblower Frances Haugen shared internal documents detailing the real-world harm caused by the social media platform. Haugen briefed the Oversight Board on cross-check, among other issues revealed in the documents.
The Oversight Board has suggested a number of changes to the program, mainly around transparency. For example, the board said Meta should publicly mark user accounts that are part of the cross-check program. It suggested that this “would allow the public to hold privileged users accountable for living up to their commitment to follow the rules”.
In addition, the Oversight Board recommended that Meta always remove certain “high-severity” content, even when the user who posted it is part of the cross-check program. And if a whitelisted user continually violates the rules, the Oversight Board suggests that Meta remove their account from the program.
While Meta acts on the board’s rulings in specific content moderation cases, such as reinstating a particular user’s post, the board’s policy recommendations are just that: recommendations. Meta is not obligated to adopt the changes the Oversight Board has suggested to the cross-check program.