In this first blogpost of a two-part series, we lay out some of the key provisions of the European Digital Markets Act and four impacts it will likely have on human rights.
In the upcoming second blogpost, we’ll take a closer look at its interoperability requirements, assessing their objectives, their technical feasibility and their potential impacts on privacy and encryption.
This week, EU lawmakers adopted the Digital Markets Act (DMA), after the text was agreed by the European Council and the European Parliament in March. The Act is part of the European Union’s Digital Services Package, and sister to the Digital Services Act (DSA), which introduces new rules for platforms around their content moderation policies and practices.
While most of the new obligations that the DMA introduces for large online platforms are directed at market fairness and consumer protections, they also have implications for the human rights of users. In this blog post, we provide a top level overview of the potential human rights impacts of key provisions of the DMA, such as its prohibitions on monopolistic market practices, its protections for user privacy, and its interoperability requirements.
Impact 1: Preventing monopolistic market practices
The core aim of the DMA is to address unfair business practices by “gatekeeper” platforms: those that provide core or essential online services and hold significant and entrenched market power (see Article 3 for a full definition). These gatekeeper platforms often control entire online ecosystems of different platform services, and hold huge influence over users’ online lives, interactions and transactions. For example, companies like Apple, Google and Microsoft produce hardware, software, development environments, app stores, operating systems, payment services and cloud storage services; and companies like Meta and Amazon own a wide range of products and services that interact and interface with each other.
The DMA will prohibit gatekeepers from engaging in “unfair” business practices, including:
- ranking their own products or services higher than those of their competitors (Article 6.5)
- using non-public data generated by their business users – including click, search, view and voice data generated by the business user or their end users – in competition against them (Article 6.1)
- forcing users to exclusively use gatekeeper services unless there is a genuine security concern associated with using a third-party option (Articles 5.3, 5.4, 5.5, 5.7 and 6.6)
- requiring users to install, download or use non-essential gatekeeper services, or preventing them from uninstalling or unsubscribing from such services (Articles 5.8, 6.3 and 6.4)
By prohibiting these self-preferencing market practices, which entrench the market dominance of the gatekeepers, the DMA aims to prevent online platforms from exploiting the attention or custom of their users for financial profit, and to increase the choices available to individuals about how they conduct their online lives, including how they seek information and express themselves online. These provisions also seek to foster greater competition among small and medium-sized enterprises.
One potential outcome of these provisions would be that products designed specifically for marginalised groups—for example, for religious minorities, disabled people or speakers of indigenous languages—could remain competitive despite their smaller user bases, which would have a positive impact on the rights to equality and non-discrimination.
Impact 2: Increased user choice through interoperability
As well as preventing gatekeepers from pre-installing or self-preferencing their own products or services, the DMA also requires gatekeepers to ensure that their operating systems and key hardware and software features are interoperable with third-party software applications, hardware providers and service providers (Articles 6.4 and 6.7). This would mean, for example, that Apple would have to make the near-field communications chip in iPhones compatible with payment processing services other than Apple Pay, and that a Google Pixel Watch would have to be compatible with non-Google operating systems. Other provisions also mandate data portability and interoperability of interpersonal messaging services (Articles 6.7 and 6.9), meaning that users could switch to alternative services without losing all of their data, or switch to alternative messaging platforms without losing the ability to send messages to those using the gatekeeper messaging platform.
We explore these interoperability requirements in more detail in our second blogpost. But, in general, they are intended to reduce network effects, meaning that users are not forced to choose a particular service simply because of the size or dominance of its networks, and can instead choose the service that performs the best on the metrics most important to them. For example, a user might choose a social network platform with greater protections against hateful or misogynistic speech; or they might choose a search engine which has better protections for data privacy. Beyond allowing individuals more choice over how they conduct their online lives, with positive implications for free expression, privacy, and non-discrimination, this requirement may also increase the pressure on gatekeeper platforms to address a range of other human rights risks posed by their business models if they wish to remain competitive to users.
While the overall aims of the interoperability requirements are positive, several questions remain for EU policymakers. Some argue that interoperability will actually reduce “multi-homing” (using multiple services), which could lead to even more people simply using the most popular messenger on an exclusive basis. Others have concerns about the technical implementation of the interoperability provisions, such as how encrypted communications can be made interoperable without introducing privacy or security risks, and who would oversee the licensing system that would likely be needed for gatekeepers to provide third parties with open APIs to their services. While it is likely that there are indeed solutions to these problems, they may take far longer than the proposed six-month period to implement, and it may be wise to postpone these requirements until such questions can be answered. We’ll take a closer look at these issues in the forthcoming second blogpost.
Impact 3: More robust data protection
Article 5.2 of the DMA prevents gatekeepers from collecting, storing or processing personal data collected through a third party but hosted on the gatekeeper’s service, unless the user has given their explicit consent for this purpose. The introductory text of the DMA also clarifies that gatekeepers are not allowed to use “dark patterns” (manipulative or distorted design of websites or web interfaces) to implicitly nudge users towards giving consent for such practices.
These provisions limiting gatekeeper use of third party data will strengthen the security and privacy of users’ communications and personal information. By protecting users from interference or processing of their data by gatekeepers for financial or political gain, the DMA seeks to enhance individuals’ right to privacy and to limit the ability of gatekeepers to manipulate users into buying, thinking or believing certain things through targeted advertising. By strengthening privacy protections around user data, these provisions are also likely to have a positive impact on a number of other rights, allowing individuals to more freely express themselves online, form their own opinions, and vote in elections free of political manipulation.
Impact 4: Increased transparency
The DMA introduces a number of requirements forcing gatekeepers to be more transparent about their business operations. Most of these relate to business users, such as requiring search engine gatekeepers to explain their pricing and fee systems to advertisers and publishers (Articles 5.9 and 5.10), and to give them access to their marketing and advertising performance data (Article 6.8); or requiring gatekeepers operating software application stores to publish general conditions of access to the store by third parties (Article 6.12). Gatekeepers will also have to notify the European Commission in advance of any mergers or acquisitions of other companies (Article 14), and to provide the Commission with an annual report on their compliance with the DMA (Article 11). By mandating the disclosure of this information, the DMA will provide users, policymakers and researchers with the information necessary to critique and evaluate gatekeeper policies and practices, increasing the accountability of gatekeeper services.
There is also a requirement in Article 5.6 that gatekeepers must not prevent business users or individual users from making complaints about the gatekeeper to relevant authorities. Ensuring that users have access to both judicial and non-judicial reporting systems is an essential part of an online platform’s responsibilities under the UN Guiding Principles on Business and Human Rights, and it is therefore encouraging to see this explicitly set out in the DMA.
The DMA includes a number of provisions which are likely to positively impact human rights. Some of these impacts are direct, such as improving protections for user privacy or ensuring access to user reporting mechanisms. Other impacts are longer-term and more indirect, such as improving free expression and non-discrimination by ensuring that a more diverse selection of services is available and usable.
While the broad aims and overall approach of the DMA are certainly commendable, we remain cautious: many of its provisions are novel and as-yet-untested methods for addressing the significant power of large online platforms, going beyond existing competition law and existing standards around interoperability. This means that some of its human rights impacts may only be fully understood once the Act has begun to be applied and enforced in practice.
In the upcoming second post in this series, we take a deep dive into the DMA’s interoperability requirements and assess how they might positively or negatively impact individuals’ human rights.