GPD has responded to a call for input on the upcoming thematic report by the UN Special Rapporteur on freedom of opinion and expression, which will focus on disinformation.
In our submission, we assess the key challenges posed by disinformation, examining its specific human rights impacts as well as state and platform responses, drawing on insights from GPD’s wider work on these issues. We highlight the negative impacts, exacerbated by the current pandemic, that disinformation has on the right to free and fair elections, the right to health, the right to non-discrimination and, in some cases, the right to life. We then consider how short-sighted responses by states and platforms have often themselves posed serious risks to individuals’ rights to freedom of expression and privacy.
In addition to outlining these challenges and trends, we provide a number of recommendations for states and platforms to address disinformation in a rights-respecting manner.
Key recommendations include:
- States should avoid content-based restrictions on disinformation, particularly through criminal laws, which should be used only in the most severe circumstances, where there is an intention to cause some clear, objective public harm.
- States should not require platforms to make determinations on the legality of content, including disinformation. However, it may be appropriate for states to require platforms to ensure that their terms of service are clearly understood, with mechanisms for appeal.
- States should require certain platforms to submit transparency reports or relevant information on their advertising, targeting practices, and algorithmic decision-making, particularly as they relate to democratic processes and public health crises. However, the scope of platforms to be included must be proportionate: platforms should be included only where there is clear evidence of harm being caused or facilitated by their services.
- States should consider legislation that requires platforms to allow users to understand what types of controls or algorithms are in place, including the data points used to make recommendations. This should be provided in an accessible format.
- The international human rights framework should guide the development of all corporate policies. Companies should acknowledge their responsibilities under the UN Guiding Principles on Business and Human Rights, and develop content moderation policies which align with the principles of legality, legitimacy, necessity and proportionality.
- Companies should develop terms of service and ensure that these are clearly understood by users, with established appeal mechanisms to ensure they are enforced fairly and consistently. This should include translating such terms into local languages.
- Companies should conduct periodic assessments of their content moderation policies and appeal mechanisms to ensure they do not pose risks to individuals’ human rights, particularly freedom of expression and the right to an effective remedy.
- Terms of service should apply to all users and be applied in a consistent and transparent manner. Companies should specifically consider their approaches to public figures, ensuring that any decisions relating to public figures are evaluated against international human rights standards and do not infringe upon individuals’ right to access information.
This consultation will inform the development of the Special Rapporteur’s thematic report, to be presented to the Human Rights Council at its 47th session in June 2021. To continue following this process and for more updates on disinformation, sign up to our monthly Digest.