Facebook’s new Charter, setting out its approach to content moderation and enforcement—including the establishment of an independent Oversight Board—was published on Tuesday of this week.
The concept of the Charter was first announced in a Facebook post by CEO Mark Zuckerberg in November 2018, which sketched out the broad parameters of a new framework aimed at “proactively enforcing [Facebook’s] policies to remove more harmful content, preventing borderline content from spreading, giving people more control of their experience, and creating independent oversight and transparency into our systems.”
In January 2019, Facebook published an initial draft of this Charter, then opened a full public consultation on it in May. In our response to this consultation, we found many things to welcome in the draft, but also several areas where clarification was needed—and we set out the key criteria the Charter would need to meet to make content governance at Facebook transparent, accountable, and rights-respecting.
So, how far does the new, final text of the Charter meet these criteria? We assess it point by point below:
1. The selection committee for the Oversight Board will include independent stakeholders as well as representatives of Facebook
Facebook will still select an initial (unspecified in number, but “small”) slate of Board members, without input from independent stakeholders. After this, however, the Board will “take the lead” in selecting all future members, working alongside executive search firms. A “members recommendation portal” will also be established, allowing “anyone interested” to recommend potential new Board members to the Board.
In our view, this is a significant improvement on the previous version of the Charter. While Facebook will remain the sole arbiter of the initial crop of Board members (problematic in our view, though operationally justifiable), it has gone much further in its commitment to the Board’s future independence.
The inclusion of more robust safeguards around conflicts of interest is also welcome: Facebook has committed to ensuring that the Board excludes “anyone who is a current or former employee of Facebook, or a spouse or domestic partner of an employee; a current government official or lobbyist working on behalf of any government; a high-ranking official within a political party; or a significant shareholder of Facebook.” This provision goes some way towards assuaging concerns that the initial, Facebook-selected crop of Board members could affect its long-term legitimacy; and if the Board does, indeed, come to comprise the 40 members which Facebook anticipates, the influence of this initial vanguard will gradually diminish anyway.
2. The Oversight Board will be able to influence Facebook’s rules and policies. The Board will be kept informed by Facebook of how its decisions are being implemented
In Article 3, Section 7.3, the Charter sets out the following:
Independent of any pending case, Facebook may request policy guidance from the board. This guidance may concern the clarification of a previous decision by the board or guidance on possible changes to Facebook’s content policies. All guidance will be advisory.
And in Article 3, Section 4, covering the Board’s decisions, it says:
At the board’s discretion, the final decision may include a policy advisory statement, which will be taken into consideration by Facebook to guide its future policy development.
From this, we can deduce a few things. First, the new Charter goes much further than the previous draft, which made no explicit formal provision for Facebook to receive any guidance from the Board.
Second, Facebook is retaining a great deal of discretion over whether to accept this guidance. Any guidance coming from the Board will be “advisory” rather than binding. It also cannot come unsolicited: it either has to accompany a decision, or Facebook has to “request” the guidance from the Board.
This is a reminder that the Oversight Board is not, after all, an Advisory Board. Rather, it is an entity with a narrow remit to adjudicate on specific content cases, which may, in the course of its deliberations, generate non-binding recommendations on Facebook’s policies. This may disappoint some observers, but this direction has been clear since early in the development process.
Some commentators have suggested that certain wording in the Charter—notably “Facebook will support the board to the extent that requests are technically and operationally feasible and consistent with a reasonable allocation of Facebook’s resources” in Section 3—constitutes a “grey area” which the company will use as an excuse for ignoring the guidance of the Board. If this is Facebook’s intention, it will likely come to light relatively quickly, because in Article 4 of the Charter the company has committed to “transparently communicating” about its response to any guidance from the Board.
3. Individual users will be able to request the review of a decision by the Board
In instances where people disagree with the outcome of Facebook’s decision and have exhausted appeals, a request for review can be submitted to the board by either the original poster of the content or a person who previously submitted the content to Facebook for review. (Article 2, Section 1)
4. Facebook will pay the costs of compensating the Board members and its supporting staff
The trust will arrange for compensation of members for their service on the board. Member compensation will be issued on a schedule based on the fulfillment of duties and will not be conditioned or withheld based on the outcome of board decisions. (Article 1, Section 5)
5. The operations of the Board will be as transparent as possible—with, for example, any internal procedural rules made public, and a report published annually on the Board’s operations over the previous year
The board will have a full-time staff. Staff will be responsible for supporting the board’s administration and operations. Their primary duties will include reviewing case submissions and coordinating outside research and statements for selected cases. Their work will enable the board to review cases, issue decisions and recommendations, publish decisions and release reports. (Article 3, Section 1)
In a post on Facebook’s Newsroom, its Director of Governance and Global Affairs Brent Harris also makes clear that the bylaws and governance documents of this Trust will be made public.
6. The final Charter should strongly emphasise the importance of the right to freedom of expression
The Charter begins with these words:
Freedom of expression is a fundamental human right […] Free expression is paramount, but there are times when speech can be at odds with authenticity, safety, privacy, and dignity. Some expression can endanger other people’s ability to express themselves freely. Therefore, it must be balanced against these considerations. (Introduction)
This is a strong statement in support of freedom of expression, and its framing of legitimate restrictions is in line with the international human rights framework—as set out in, for example, the UN Guiding Principles on Business and Human Rights.
The Charter also clarifies that “members [of the board] must have…familiarity with matters relating to digital content and governance, including free expression, civic discourse, safety, privacy and technology.” (Article 1, Section 2)
And, in the section of the Charter which sets out the responsibilities of the Board, it includes a clause that “members will contribute towards building a board that, as an institution, upholds and advances free expression”. (Article 1, Section 6)
7. The Board’s decisions should be as consistent as possible, with any decision made on a particular piece of content generally serving as a precedent
For each decision, any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar. (Section 6)
The Charter is a significant step forward from the previous draft version. As a document, it meets, either entirely or partially, all of the baseline criteria that we believe a transparent, accountable and rights-respecting model of content moderation oversight would need to meet.
It’s important to emphasise, however, that the Charter is not the full picture. We are yet to see the bylaws for the Oversight Board, which will set out many of the procedural aspects of the Board’s day-to-day operations in much greater detail: from how board members will be selected to review cases, to the level of confidentiality imposed on board members when it comes to disclosing board deliberations. These bylaws will initially be drafted by Facebook, with control over their review and implementation passing to the Board at the end of 2019.
The publication of these bylaws will hopefully answer many of the remaining questions we have about the model. Key areas where we look forward to clarification include:
- How the board will seek external expertise to support decisions where board members require further guidance;
- The timeframe for decision-making and the modus operandi of the board’s deliberation process;
- How the member recommendation portal will function in practice; and
- How often amendments and bylaws will come up for review.
GPD will continue to monitor the progress and implementation of Facebook’s Oversight Board. For monthly commentary and analysis on all aspects of the evolving online content regulation debate, sign up to our newsletter here.