Online platforms and spaces are critical to the enjoyment of freedom of expression in the digital age. But they can also facilitate the spread of harmful content, such as disinformation and hate speech.
In recent years, this has resulted in increasing regulation by governments and stricter moderation by platforms. These responses tend to be developed in closed spaces and fail to adequately reflect human rights standards.
At GPD, our focus is twofold: making the spaces where content policies are developed more open, inclusive and transparent; and engaging directly to ensure approaches to content regulation and moderation are rights-respecting.
- NEWS, 23 Jan 2023: GPD provides comments on UNESCO content regulation guidance
- POST, 20 Dec 2022: The return of the UK’s Online Safety Bill: what’s changed and what’s next
- POST, 26 Sep 2022: The ITU: a brief explainer
- POST, 26 Sep 2022: Three reasons human rights defenders should care about the ITU
- POST, 15 Sep 2022: Ukraine: still a ‘not so cyber’ conflict?
- NEWS, 8 Sep 2022: GPD input to Ofcom’s call for evidence on its role under the UK Online Safety Bill
- POST, 14 Jul 2022: The EU Digital Markets Act: is interoperability the way forward?
- POST, 5 Jul 2022: How will the Digital Markets Act affect human rights? Four likely impacts