The lone Southeast Asian representative on Facebook’s Oversight Board (OB) asserted that the body should be “as independent as it could get,” even as free expression and human rights advocates caution against high expectations from what has been dubbed the “supreme court” of the world’s largest social networking site.
Endy Bayuni, senior editor of The Jakarta Post and guest speaker at an online forum organized by the Consortium of Democracy and Disinformation (D&D) and Ateneo de Manila University’s Asian Center for Journalism last May 27, described the OB as “an additional layer of content moderation effort” by Facebook, one that will nonetheless be independent of the tech giant, making binding decisions on content regulation and non-binding recommendations on Facebook’s content policy.
The OB is the fulfillment of Facebook Chief Executive Mark Zuckerberg’s 2018 promise of establishing an independent body that will rule “in the best interests of our community and not for commercial reasons”.
Representation in the board
The Board is envisioned to have 40 members from different parts of the world, with diverse backgrounds and fields of expertise. Last May 6, Facebook released the names of the first 20 members.
However, questions have been raised about the OB’s independence, its limited diversity of representation, and its limited authority.
Bayuni is one of only three members representing the Asia Pacific and Oceania region. Five of the first 20 members of the board come from the United States, and two of them are among the OB’s four co-chairs.
Facebook justifies this U.S.-dominated composition by saying that most of the complaints the platform receives from its users come from the U.S.
However, Jenny Domino, a Myanmar-based human rights lawyer who has done extensive research on hate speech and social media regulation and was a panelist at the D&D forum, said, “if anything, that might indicate the kind of outreach and the limited procedure that has characterized Facebook’s operations before.”
Bayuni agreed that diversity is important “for [the OB] to succeed,” and expressed hope for more representation from Southeast Asia as the board fills out its 40 seats.
Australia-based Filipino sociologist Nicole Curato, who was also in the panel, remarked that the board’s high regard for experts and its supposed integrity coming from “the principles of independence and separation of powers” could be a problem since “the world is tired of liberal democracy,” with people becoming skeptical and dismissive of “the elitism of experts.”
“My worry is that when, for example, the oversight board renders a decision, they can easily be dismissed as biased — liberal bias, giving voice to the elites again,” she said.
She suggested a citizens’ assembly model for the OB, citing how the French government recently employed one, randomly selecting ordinary citizens to decide France’s policy on carbon emissions, with the goal of cutting those emissions by 40 percent before 2030.
She said that people all over the world would acknowledge the legitimacy of decisions reached by a board in which ordinary citizens like them participated. She likened this to a jury model, where a judgment derives its legitimacy from the idea that it was rendered by the individual’s peers.
“Questions of ethics, questions of what’s acceptable and what’s not acceptable content, are questions that must not be exclusively left to experts. They must not be left to the best and the brightest, they must be left to the most ordinary citizens who can systematically reflect on these issues,” Curato added.
The board’s scope of authority
Domino emphasized that the board will only be able to tackle “a very thin slice of content moderation issue[s].” This is because of the several criteria that a case must first pass before it falls under the OB’s jurisdiction.
The board can only hear a case that meets the “significant” and “difficult” criteria. The OB’s by-laws define “significant” as issues with “real-world impact (…) that are severe, large-scale, and/or important for public discourse,” while “difficult” cases are contents that raise concerns about “current policies or their enforcement, with strong arguments on both sides for either removing or leaving up the content under review.”
A “significant and difficult” case can reach the OB in two ways: through a referral from Facebook, or through a request filed directly by a user who disagrees with Facebook’s decision to take down content. However, netizens can submit a direct request to the board only if they have already “exhausted appeals” with Facebook.
At the start of its operations, the OB can review only individual pieces of content “that [have] been removed for violations of content policies” on Facebook and Instagram. This means that content such as “groups, pages, profiles, and events,” as well as content that Facebook has allowed to stay on its platforms, falls outside the scope of the board’s authority.
This is a “significant limitation,” according to Evelyn Douek, a lecturer on law at Harvard Law School whose expertise includes online speech regulation and private content moderation. In her piece “What Kind of Oversight Board Have You Given Us?,” Douek noted that “the most controversial content moderation decisions” of Facebook “have been decisions to leave content up, not take it down.”
This gives Facebook a way to escape the OB, Douek said: since the board can deliberate only on take-downs, Facebook can “simply down rank” the content in question in its algorithm, pushing these cases to the bottom of users’ feeds.
All this further puts under scrutiny the legitimacy of the OB’s claimed “independence” from Facebook, making the board “less (of a) ‘supreme court’ and more (of an) ‘optional consultant’” in Facebook’s content moderation decisions, Douek said.
Community standards and international human rights norms
Decisions of the OB will largely be based on the existing community standards or guidelines of Facebook and Instagram. The board, as declared in its charter, “will pay particular attention to the impact of removing content in light of human rights norms protecting free expression.”
This is likely because Facebook, according to Bayuni, “is trying to improve its performance, taking into account the human right[s] elements” in its community standards, especially since the tech giant’s operations are “impacting on people’s human rights.” He also gave assurance that cases dealing with the “worst human rights violations [are] definitely something that [the OB] will hear.”
However, this is precisely where the major criticisms of the OB from the human rights community stem from. Domino pointed out that the values and community standards employed by Facebook, which will guide the OB’s decisions, “do not clearly align with human rights law.” It is now up to the board to address how it will balance the five values on which Facebook’s community standards are based (voice, authenticity, privacy, safety, and dignity) while also taking into account both local and international human rights norms.
Bayuni, for his part, said a debate on the matter is to be expected. “The freedom of expression is very important for all of us, but when freedom of expression clash[es] with the other values — privacy, safety, dignity, and authenticity — I expect there will be a debate, an argument within the oversight board in coming up with a decision,” he said.
Disinformation on social media
Asked about the OB’s stand on political advertisements that potentially spread falsehoods on Facebook but which the tech giant refuses to ban, Bayuni said the board will ultimately adhere to human rights criteria.
“If the contents of these political ads or political propaganda — if the content has impacts on the human rights of other people, then it’s definitely something that we want to handle — we can handle,” Bayuni said.
Curato, on the other hand, pointed out that regulating deliberate disinformation campaigns and hate speech on social media is beyond the board’s mandate, as the OB addresses “crystalized issues,” or those raising questions “that have clear, ethical dimensions.” Online disinformation and atrocity speech, by contrast, are “very insidious” and “very subtle.”
“One of the issues that cannot be resolved by [the OB] is the very character, the dynamic character of disinformation. Once troll farms, manipulative state actors realize that there is a board that can decide against them, then they can change their habits, they can change their disinformation and manipulation habits,” Curato said.