30 May 2018

GPD’s response to David Kaye’s report on Platform Content Regulation


Earlier this week, David Kaye, the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, published his latest report to the UN Human Rights Council: “A Human Rights Approach to Platform Content Regulation”. This report (to which Global Partners Digital contributed a submission) addresses the regulation of user-generated content online. It looks at some of the key issues and challenges surrounding restrictions on what online content is permitted, both in terms of national legislation affecting platforms, and the rules and moderation processes that platforms themselves develop and implement. The report concludes with a series of recommendations for both states and platforms.

While much of the report reiterates existing international human rights standards relating to the right to freedom of expression, it marks the first time a UN mechanism has examined their specific application to online content and the moderation processes of platforms.

Among the many welcome aspects of the report, three points stand out in particular:

First, the clear assertion that states must ensure that the legal and policy framework provides an enabling environment for freedom of expression online. As we emphasised in our own submission, this means repealing legislation which criminalises or unduly restricts expression online, and refraining from imposing inappropriate liability on platforms for the content they host, such as through disproportionate sanctions or requirements for the proactive monitoring or filtering of content. The report also suggests that states publish detailed transparency reports on requests for removal of content, which would be a welcome and valuable step.

Second, the report calls for online platforms to use international human rights law and standards as the basis for developing and implementing their content standards. Many of the challenges identified in the report – such as a lack of clarity in terms of service and seemingly arbitrary or discriminatory decision-making – could be addressed by adopting a “human rights by default” approach. As we suggested in our submission, this would mean companies incorporating directly into their terms of service “relevant principles of human rights law that ensure content-related actions will be guided by the same standards of legality, necessity and legitimacy that bind State regulation of expression”. The report’s strong affirmation of the responsibility of platforms, under the UN Guiding Principles on Business and Human Rights, to adhere to the same standards as states when it comes to respecting the right to freedom of expression, is important and timely.

Finally, we welcome the report’s call for greater means for users to appeal against and obtain remedies for wrongful decisions, as well as for greater public accountability. This would mean platforms ensuring “robust remediation programmes”, with appropriate remedies provided when the right to freedom of expression is adversely impacted.

Also mooted in this recommendation is the creation of a “social media council” with the power to evaluate complaints against online platforms. This would be an important and welcome step towards the independent oversight of platforms’ content moderation processes. However, we believe that the proposal could go further. Based on our own research and consultations, we judge that – in order to be truly effective – an oversight model would have to review the full development and implementation of platforms’ terms of service, as well as monitor the functioning of their grievance and remedial processes. In our new white paper, we propose a model in which online platforms would agree to a set of Online Platform Standards, setting out minimum standards related to these activities. Compliance would be monitored and assessed by a global multistakeholder oversight body, with platforms failing to meet the Standards publicly called out and provided with recommendations for improvement. We invite further discussion, feedback and debate on the practicalities of this approach.

Overall, David Kaye’s report provides an important input into the ongoing debate relating to freedom of expression online, and the role of online platforms in particular. By clearly setting out how international human rights standards apply to freedom of expression online, and the roles and responsibilities of both states and platforms, the report moves the debate forward and provides clear markers by which the actions of these actors can be assessed. With platforms coming under unprecedented scrutiny, and governments turning to greater regulation as the solution to the challenges of unlawful and harmful content online, we hope to see the report’s recommendations widely accepted and implemented.