In September, the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression launched a consultation on ‘Content Regulation in the Digital Age’. The consultation, which closed today, examines two broad issues: the standards that social and search platforms apply to online content under their terms of service, and the processes that platforms implement when evaluating whether content violates those terms of service.
These issues, and the questions they raise, present, as the Special Rapporteur noted in the consultation’s concept note, ‘some of the most urgent challenges to freedom of expression on private platforms today’. We agree. These platforms now give billions of people an unprecedented ability to share and receive information, ideas and other forms of expression. But in recent years, new challenges have emerged: the spread of ‘hate speech’ and incitement to hatred and violence online; the use of platforms for radicalisation and the sharing of ‘violent extremist’ content; and questions around the availability of ‘fake news’ and disinformation, and the impact that this may have on political processes. How to address these forms of content, as well as others such as child pornography and copyrighted content, in a rights-respecting way is a difficult question with which many states and platforms are grappling.
The report which will follow on from the consultation seeks to address both platforms and states, and will include recommendations about ‘appropriate private company standards and processes’ as well as ‘the role that States should play in promoting and protecting freedom of opinion and expression online’.
Building on our existing work on content regulation and intermediary liability, GPD has responded to the consultation. In doing so, we have sought to answer the questions posed, but we also think that it is important to ask some broader questions which we hope the Special Rapporteur will address in his report:
- Given the new forms of expression which the internet has facilitated, precisely what is the scope of online ‘content’?
- What are the differences between offline and online freedom of expression that might justify different approaches when responding to content which is unlawful or harmful?
- Given that social and search platforms are neither mere hosts of content nor publishers, what is their role and status today?
- Given the wide range of actions that can have an impact upon the availability of online content, how should we define ‘regulation’ of content?
- Do the very different types of content which can be unlawful or harmful need to be treated differently when developing responses?
In looking at these broader questions, we believe that:
- There is a need to reframe these challenges, moving from platform-specific responses to issue-specific responses;
- Addressing these issues requires a holistic approach which addresses both the offline and the online dimensions;
- There is a need to ensure that a variety of different, appropriate actors are included in the framing of these issues and in the development of responses to them;
- Stronger consideration should be given to the wide diversity of tech companies involved, taking into account their different statuses, roles and models.
On the specific questions in the consultation, we have set out our thoughts, as well as a series of recommendations to both platforms and states on how to ensure that the right to freedom of expression online is protected and respected. In particular, we examine:
- How platforms should respond to national content regulation laws and measures that may be inconsistent with international human rights standards;
- How companies should deal with demands in one jurisdiction to take down content so that it is inaccessible in other jurisdictions;
- What processes platforms should employ to develop their terms of service and assess content against those terms of service;
- What processes platforms should employ to enable users to appeal restrictions or the removal of content, as well as the remedies that should be available;
- The appropriate role of automation and algorithmic filtering in regulating content, including safeguards;
- Principles for standardising content regulation through technology, as well as human and other resources;
- The level of transparency that platforms should provide to users about their terms of service and internal policies and processes when content is restricted or taken down; and
- The role of states, and their obligations with respect to the right to freedom of expression when it comes to platforms and content regulation.
As well as reviewing the submissions made in response to the consultation, the Special Rapporteur will be visiting social media and search companies worldwide and conducting civil society consultations focusing on thematic and regional concerns around content regulation. The final report will be published in June 2018. More information about the consultation can be found on the Special Rapporteur’s website, and you can read GPD’s full response to the consultation here.
For further information, contact Richard on richard[at]gp-digital.org.