May 17, 2024, 3:38


When the dust settles: Remaining Questions of Privacy vs Utility under the DMA


In March 2024, the European Commission held a series of workshops to provide stakeholders with an opportunity to pose questions regarding the gatekeepers’ compliance solutions with respect to the various obligations imposed by the Digital Markets Act (DMA) [1]. The interactions throughout the workshops showed that implementation must be an iterative process and that questions remain for the Commission and other actors to answer regarding the DMA’s data portability obligations under Articles 6(9), 6(10), and 6(11) DMA.

One of the DMA’s objectives is to facilitate the movement of data from gatekeeper platforms and services to third parties through a number of data-sharing obligations in business-to-business (B2B) and business-to-consumer (B2C) contexts. As the European Data Protection Supervisor (EDPS) pointed out in its DMA opinion [2], such data sharing will inevitably also include personal data provided by the end user, for instance via search queries in which the individual enters personal data, or data the end user generated through the use of the gatekeeper’s respective core platform service. The processing of personal data in the EU is regulated by the GDPR. This raises specific questions about the interaction of the GDPR and the DMA in this context.

The European Data Protection Board (EDPB) has recognised the impact of the DMA as part of the new digital framework. The 2024-2027 EDPB strategy includes closer cooperation with other digital regulatory authorities and an active role in the DMA High-Level Group as a key action point [3]. It is clear that effective data protection, as a fundamental right, should not be hampered by decisions made to improve contestability or fairness in the market. Conversely, though, the GDPR should also not be interpreted in a way that unnecessarily undermines effective competition and economic growth as intended by the Commission’s digital strategy. Failure to adopt a coherent approach between the competent regulators would undoubtedly create legal uncertainty in the implementation of data portability and potentially undermine its full potential.

This leads to one central question as the DMA is implemented: how should the appropriate balance between data protection and competition be struck? Recital 61 of the DMA, which corresponds to Article 6(11) DMA, attempts to address this.

Article 6(11) DMA obligates gatekeepers to share ranking, query, click, and view data from online search engines with third-party search engine providers on FRAND terms [4]. Anybody who has ever used a search engine will be aware of the potential for large amounts of personal or even sensitive data to be included in any given search query [5]. Article 6(11) DMA recognises this and requires personal data included in the data sets to be anonymised before sharing. Recital 61 expands on this and clarifies that gatekeepers “should ensure the protection of personal data of end users, including against possible re-identification risks, by appropriate means, such as anonymisation of such personal data”. Recital 61 adds another requirement, though: these measures should be applied without “substantially degrading the quality or usefulness of the data”.

This raises two questions: first, what constitutes “appropriate means” under Recital 61 DMA; and second, what is the right balance between protecting personal data and preserving the utility of the data for the purposes of the DMA? Or, put differently, what is the minimum viable data set before utility for the purposes of the DMA is lost?

While Article 6(11) DMA refers to anonymisation only, Recital 61 appears to treat anonymisation as one example of such “appropriate means” (“such as”). However, the remainder of Recital 61 picks up the language of Recital 26 GDPR, which would imply a more limited interpretation of “anonymisation”. The GDPR itself does not define “anonymous” any further, but historically, interpretations by data protection authorities (DPAs) set an extremely high bar for anonymisation, including making the deletion of the original data set a condition of rendering the data set (irreversibly) anonymous [6]. More recently, the Spanish AEPD and the EDPS issued a joint paper to further clarify the DPAs’ understanding of properly anonymised data. The paper emphasises that, from a data protection point of view, 100% anonymisation in a strict sense is the aim, but it concedes that this may not always be possible [7]. The conclusion the AEPD and the EDPS reach for such a case is for the controller “to choose between processing personal data (and use, e.g. pseudonymisation) and apply the GDPR, or not to process the data at all” [8].
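
To make the distinction between pseudonymisation and anonymisation concrete, here is a minimal sketch in Python, with invented records and a hypothetical key; nothing here reflects any gatekeeper’s actual pipeline. A keyed hash replaces the direct identifier, but the records remain linkable, and anyone holding the key or comparable auxiliary data can re-identify the individuals, which is why pseudonymised data still counts as personal data under the GDPR.

```python
import hashlib
import hmac

# Hypothetical search-log records; the field names are illustrative only.
records = [
    {"user_id": "alice@example.com", "query": "diabetes treatment vilnius"},
    {"user_id": "bob@example.com", "query": "cheap flights berlin"},
]

# Whoever holds this key (or comparable auxiliary data) can re-link the
# output to individuals, which is why it remains personal data.
SECRET_KEY = b"held-by-the-controller"

def pseudonymise(record: dict) -> dict:
    """Replace the direct identifier with a keyed hash (a pseudonym).

    The mapping is deterministic, so all records of the same user stay
    linkable: good for utility, but also a re-identification vector.
    """
    token = hmac.new(SECRET_KEY, record["user_id"].encode(), hashlib.sha256)
    return {"user_token": token.hexdigest()[:16], "query": record["query"]}

for row in (pseudonymise(r) for r in records):
    print(row)
```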

In the context of Article 6(11) DMA, however, this is unlikely to be an option. The gatekeeper is obligated to provide the data set anonymised where it contains personal data, but also “without substantially degrading the quality or usefulness of the data” for the purpose of the DMA. The AEPD-EDPS joint paper accepts that there are cases where the balance between utility and re-identification may not be found, for instance where the total number of individuals in a data set is too limited [9]. In the context of search data, this may be the case where a query is conducted in a less widely spoken language (e.g. Lithuanian) or in the context of a small country (URLs are often country-specific), where the possibility of re-identification of individuals is considerably higher [10]. This may be amplified by the type of searches entered or other user behaviour.
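
The following sketch, a simplified k-anonymity-style suppression over invented data, illustrates the tension: records whose (language, country) group is smaller than a threshold k are withheld, and for a rare language or a small country this means withholding almost everything, so the data set either stays re-identifiable or loses precisely the utility a third-party search engine would need.

```python
from collections import Counter

# Invented query log; (language, country) act as quasi-identifiers.
queries = [
    ("en", "DE"), ("en", "DE"), ("en", "DE"), ("en", "DE"), ("en", "DE"),
    ("en", "FR"), ("en", "FR"), ("en", "FR"), ("en", "FR"),
    ("lt", "LT"), ("lt", "LT"),  # two queries in a small market
]

K = 4  # minimum group size before a record may be released

def k_anonymous_release(rows, k):
    """Keep only records whose quasi-identifier group has at least k members."""
    sizes = Counter(rows)
    kept = [row for row in rows if sizes[row] >= k]
    return kept, len(rows) - len(kept)

kept, suppressed = k_anonymous_release(queries, K)
print(f"released {len(kept)} of {len(queries)} records, suppressed {suppressed}")
# Output: released 9 of 11 records, suppressed 2 -- the Lithuanian queries
# are dropped entirely: protecting those individuals removes exactly the
# data a competitor targeting that market would need.
```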

Additionally, third-party recipients of the ported data would have the opportunity to combine data sets received under Article 6(11) DMA with existing data sets, increasing the potential for re-identification despite the gatekeeper’s best efforts [11].
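
A minimal sketch of such a linkage attack, again with entirely fictitious data: even after direct identifiers have been removed, joining ported records with an auxiliary data set on shared quasi-identifiers (here a hypothetical timestamp and city pair) can single an individual out.

```python
# Fictitious ported data set: direct identifiers removed, but a
# timestamp/city pair survives as a quasi-identifier.
ported = [
    {"ts": "2024-03-01T09:14", "city": "Vilnius", "query": "loan default advice"},
    {"ts": "2024-03-01T10:02", "city": "Berlin", "query": "weather"},
]

# Auxiliary data the recipient already holds (e.g. its own logs).
auxiliary = [
    {"ts": "2024-03-01T09:14", "city": "Vilnius", "name": "J. Example"},
]

# A naive join on the quasi-identifiers re-attaches an identity.
for p in ported:
    for a in auxiliary:
        if (p["ts"], p["city"]) == (a["ts"], a["city"]):
            print(f'{a["name"]} likely searched: {p["query"]}')
```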

The right balance between the goals of the DMA’s data-sharing obligation and the protection of personal data as required by the DMA will have to be case-specific and context-driven. By comparison, Recital 64 of the Data Act sets out, in the context of B2G data sharing, that “the data holder should take reasonable efforts to anonymise the data or, where such anonymisation proves impossible, the data holder should apply technological means such as pseudonymisation and aggregation, prior to making the data available.” This suggests that, in the context of the Data Act at least, the standard for anonymisation is “reasonable effort”, and that privacy-enhancing technologies such as differential privacy could find consideration. The same standard should be considered in the application of the DMA.
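
As a rough sketch of what such a privacy-enhancing technology could look like for search data (illustrative only, not a compliance recipe): under differential privacy, aggregate query counts are released with Laplace noise calibrated to a privacy budget ε, so the Recital 61 trade-off becomes an explicit parameter, with smaller ε meaning stronger protection but noisier, less useful, output.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    Assuming each user contributes at most one occurrence per query,
    one user changes a count by at most 1 (sensitivity = 1), so
    Laplace noise with scale 1/epsilon is sufficient.
    """
    return true_count + laplace_noise(1.0 / epsilon)

# Invented aggregate: how often each query was issued in a week.
query_counts = {"flights berlin": 1520, "diabetes treatment": 37}

EPSILON = 1.0  # smaller epsilon = stronger privacy, noisier output
for query, count in query_counts.items():
    print(f"{query!r}: true={count}, released={dp_count(count, EPSILON):.1f}")
```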

A consistent approach to overlapping legal concepts in the DMA, the Data Act, and the GDPR is vital for coherent implementation and for avoiding fragmentation. This requires a unified approach and cooperation among the competent supervisory authorities, and potentially joint guidance issued by the competent regulators.

References:

  1. Regulation (EU) 2022/1925.
  2. European Data Protection Supervisor, Opinion 2/2021 on the Proposal for a Digital Markets Act, para. 32, p. 12, available at https://www.edps.europa.eu/system/files/2021-02/21-02-10-opinion_on_digital_markets_act_en.pdf.
  3. European Data Protection Board Strategy 2024-2027, Pillar 3, p. 4, available at https://www.edpb.europa.eu/our-work-tools/our-documents/strategy-work-programme/edpb-strategy-2024-2027_en.
  4. FRAND stands for fair, reasonable, and non-discriminatory.
  5. Searches may include medical conditions such as specific symptoms, diseases, treatments, or medications; queries regarding side effects of prescription drugs; financial concerns such as debt, bankruptcy, loan applications, or unusual spending patterns; searches related to specific laws, criminal activity, and lawsuits (“what is the penalty for [crime]”); searches related to sexual preferences, etc.
  6. Article 29 Data Protection Working Party, Opinion 05/2014 on Anonymisation Techniques. The Opinion was published under Directive 95/46/EC.
  7. AEPD-EDPS joint paper on 10 misunderstandings related to anonymisation, misunderstanding no. 5: “The expression ‘anonymous data’ can not be perceived as if datasets could simply be labelled as anonymous or not. The records in any dataset have a probability of being re-identified based on how possible it is to single them out. Any robust anonymisation process will assess the re-identification risk, which should be managed and controlled over time. Except for specific cases where data is highly generalised (e.g. a dataset counting the number of visitors of a website per country in a year), the re-identification risk is never zero”, p. 5, available at https://www.edps.europa.eu/data-protection/our-work/publications/papers/aepd-edps-joint-paper-10-misunderstandings-related_en.
  8. Additional guidance on anonymisation is also expected from the EDPB. A new interpretation may also come from the CJEU in the proceedings in SRB v EDPS (Case T-557/20, SRB v EDPS, and Appeal C-413/23 P). The case appears to emphasise that determining whether data has been anonymised requires a risk-based and contextual assessment of the risk of re-identification.
  9. AEPD-EDPS joint paper, p. 4.
  10. For example, by combining a financial query with location information; searches may include specific addresses or pinpoints from location features.
  11. Recital 39 of the Data Act indicates that third-party data recipients should “refrain from using data falling within the scope of this Regulation to profile individuals”; the DMA does not have an equivalent recital.