“Clear is kind. Unclear is unkind,” according to author Brené Brown.

A joint opinion from the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) on the European Union’s proposed digital and data strategies – including the Digital Services Act (DSA), the Digital Markets Act (DMA), the Data Governance Act (DGA) and the Regulation on a European approach for Artificial Intelligence (AIR) – says “legal uncertainty is unkind” and urges EU regulators to amend the proposed legislation accordingly.

“Without further amendments, the proposals will negatively impact the fundamental rights and freedoms of individuals and lead to significant legal uncertainty that would undermine both the existing and future legal framework. As such, the proposals may fail to create the conditions for innovation and economic growth envisaged by the proposals themselves.”

Key concerns:

(1) Lack of protection of individuals’ fundamental rights and freedoms

  • Use of AI systems that categorize individuals from biometrics (such as facial recognition) according to ethnicity, gender, political or sexual orientation, or other prohibited grounds of discrimination should be banned.
  • Use of AI to infer the emotions of a natural person should be prohibited, except for certain well-specified use cases, namely health or research purposes, subject to appropriate safeguards, conditions and limits.
  • Any use of AI for automated recognition of human features in publicly accessible spaces – not only faces but also gait, fingerprints, DNA, voice, keystrokes and other biometric or behavioral signals – should be banned in any context.
  • After a phase-out period, targeted advertising based on pervasive tracking, as well as the profiling of children, should be prohibited.
  • The laws should require interoperability, making it easier for people to switch digital providers.
  • The obligation of data protection by design and by default should be paramount, particularly in the context of ‘connected objects’ (e.g. the Internet of Things and the Internet of Bodies), due to the significant risks to the fundamental rights and freedoms of the persons concerned.

(2) Fragmented supervision

  • The laws should provide that, insofar as personal data are concerned, the relevant competent authorities are the data protection supervisory authorities.
  • They should also specify what happens in cases of overlapping competence between the data protection supervisory authorities and the new supervisory authorities created by the legislation.
  • It should be clear how certificates and codes of conduct under the proposed AI regulation interface with requirements under the GDPR.
  • The legislative proposals should provide for an explicit legal basis for the exchange of information necessary for effective cooperation and identify the circumstances in which cooperation should take place.
  • The proposals should also enable the competent supervisory authorities under each proposal to share information obtained in the context of any audits and investigations that relate to the processing of personal data with the competent data protection authorities, either upon request or on their own initiative.

(3) Risks of inconsistencies

  • The proposals should clearly state that they shall not affect or undermine the application of existing data protection rules and ensure that data protection rules shall prevail whenever personal data are being processed.
  • The legal basis for each use of personal data should be clear from the proposals.
  • Terminology should be defined, with references back to the data protection legislation, in order to avoid inconsistencies.
  • The proposals should sufficiently specify whether they refer to non-personal data, personal data or both. They should also specify that in cases of ‘mixed data sets’ the GDPR applies.