10 Mar 2026

EU DSA: Challenges for Global Majority Researchers

This blog is based on a longer report by Agustina Del Campo with input from Maria Paz Canales: “Data access for researchers within digital platforms: Unpacking the human rights implications, practical opportunities and challenges for the Global Majority.”

——

Platform accountability and access to data

Access to data is central to accountability in any digital governance ecosystem. Yet, as debates about online safety, illegal content and the protection of fundamental rights continue, those seeking to understand the impacts and influence of large platforms often lack access to the information they need. These platforms hold vast amounts of data, in particular about how content spreads, how moderation decisions are made, and how online harms evolve. Without access to this data, policymakers, civil society, and researchers are left trying to make sense of complex digital ecosystems with only a partial and sometimes distorted view.

What’s the DSA?

The EU’s Digital Services Act (DSA) was designed, in part, to address this problem. Through new and ambitious transparency obligations, the DSA (and the 2025 Delegated Act) seeks to reduce information asymmetries between platforms, regulators, and the public. Article 40 is one of the most significant provisions of the DSA, allowing “vetted researchers” to request access to certain platform data to study systemic risks, including the spread of illegal and harmful content. 

Importantly, this mechanism is not limited to European researchers. In principle, applicants from anywhere in the world can qualify if they meet requirements around institutional affiliation, independence, funding transparency, and data security, giving the DSA potential global reach. 

Challenges for Researchers

Unfortunately, formal openness does not automatically translate into meaningful access. For many researchers outside well-resourced institutions and regions, there are legal, financial, and structural barriers which ultimately determine who can realistically benefit from the DSA.

Convoluted request processes

Article 40 is far from straightforward. The request process is lengthy and highly procedural. Researchers must first apply to a “Digital Services Coordinator,” a national authority responsible for supervising and enforcing the DSA, and justify why the data cannot be obtained elsewhere, while also demonstrating compliance with strict GDPR-level security standards. All of this happens before a platform even decides how, or whether, to grant access. Applicants are expected to submit narrowly tailored, highly specific requests despite having limited visibility into what data platforms actually hold. In practice, this is comparable to going to a library and being required to list, in advance, every book you will need by exact title and author, without access to the catalogue. 

Superficial accountability measures

At the same time, Article 40 and the Delegated Act – which sets out the rules for vetted researchers – seem to reinforce a largely quantitative model of transparency. The system relies heavily on metrics such as how many posts were removed or how many accounts were suspended. These figures may show scale, but they say little about context, proportionality, or impact. For example, a company could show a high volume of content removed as evidence that it is actively tackling disinformation, without explaining whether mistakes were made, how decisions were reached, or whether certain communities or types of speech were disproportionately affected. Without clearer definitions and access to more qualitative, contextualised information, it becomes difficult to make sense of what companies have reported. The danger is that researchers and policymakers build their analysis on incomplete data, potentially leading to partial or misleading interpretations.

Complex EU “Digital Rulebook”

The DSA does not operate in isolation. It sits within a dense and interconnected EU legal framework that shapes how Article 40 is interpreted and implemented (including the GDPR, the AI Act, the Digital Markets Act, various Codes of Conduct, and most recently the European Democracy Shield). Navigating this framework requires a high level of institutional capacity and regulatory literacy. For researchers already embedded in European regulatory debates, this landscape may be familiar. For those who are not, it presents a significant hurdle, raising the practical threshold for participation and reinforcing existing imbalances in who can effectively engage.

European-centric support networks

The research ecosystem that is currently forming around the DSA remains largely European. Many of the emerging collaboratives and support initiatives are anchored in well-resourced institutions. For example, the DSA40 Data Access Collaboratory, which was created to connect and support researchers submitting Article 40 requests, currently only lists European researchers as part of the consortium. While these efforts are important, they also highlight how coordination risks becoming regionally concentrated. 

If Article 40 is to live up to its global potential, more inclusive and genuinely international networks are essential. Researchers from the Global Majority need spaces to connect, define shared priorities, and articulate common research questions that can underpin strong, coordinated data requests. These networks give researchers the space to refine and clarify what data they need, leaving them better positioned to advocate for access in the public interest. In that sense, building collective voice and strategic coordination may be just as important as the legal right to request data itself.

Narrow scope of harms lacks relevance in Global Majority contexts

The DSA also poses some important substantive challenges for Global Majority researchers, particularly around whose risks and priorities are ultimately reflected. The limited engagement of Global Majority actors in shaping the drafting and interpretation of Article 40 means the framework rests on assumptions drawn from specific political and social contexts. Article 40 restricts permissible research to systemic risks as defined in Articles 34 and 35, anchoring valid research questions to European priorities and conceptions of harm. For example, in Global Majority contexts, concerns often centre on state-driven disinformation, censorship, or government–platform entanglements, issues that may not be fully captured by the DSA's framing.

Why it matters that the EU DSA falls short

Article 40 holds real promise as an inclusive mechanism for enabling access to data for a broad community of researchers. But formal openness alone is not enough. If meaningful participation from Global Majority researchers remains limited, the DSA risks creating a system that reproduces the very inequalities and asymmetries it aims to address. This is not simply a question of fairness. It directly shapes how effectively we can respond to online harms. Disinformation campaigns, algorithmic bias, and content moderation practices are all felt beyond borders and across jurisdictions, manifesting differently depending on the political and social context. Understanding them requires global perspectives and comparative insight. Ensuring Global Majority researchers can utilise the DSA would create a more diverse evidence base, which in turn would strengthen responses to online harm worldwide. For the EU, it would help anticipate the unintended consequences of its own rules, making its regulatory framework more resilient, credible, and ultimately safer for everyone.