The Commission Nationale de l’Informatique et des Libertés (CNIL), the French data protection authority, has issued a €150 million fine against Google and a €60 million fine against Facebook/Meta for cookie consent violations.

Here are some key takeaways, and their US relevance:

  • It must be as easy to refuse cookies as it is to accept them.
  • An “accept all” button requires one action to accept. A process with a “manage options” button that leads to another screen where you scroll (or unclick) and then click an “accept” button to reject cookies is three actions. Therefore, the consent is not freely given for the purposes of the ePrivacy Directive. (That is important for CCPA/CPRA, where the regulations also say that the process for opting out should be as easy as the one for opting in.)
  • The modalities allowing users to consent or refuse must be presented in a clear and understandable manner. In particular, when the refusal can be manifested by simply closing the window for collecting consent or by not interacting with it for a certain period of time, this possibility must be clearly indicated to users on that window. (CCPA and the FTC also require such clear presentation.)
  • You have to provide the information in a clear and complete way so that the user understands the meaning of the choices.
  • It is counter-intuitive to have to click on a button entitled “Accept cookies” in order to actually refuse them. This encourages users to think that it is ultimately not possible to continue browsing after refusing advertising cookies, since the entire refusal process is based on information referring to the acceptance of cookies.

If you speak French, you can read more here.

For vehicle data, GDPR is just the beginning, the German Brandenburg regional government said in a Q&A. Stay tuned for the Data Governance Act.

Here are some key points:

  • Vehicle manufacturers have to observe GDPR when collecting and processing data. For this purpose, the customer’s consent to the use of their data (e.g. on-board systems such as car apps) should always be obtained when purchasing. They also must abide by GDPR when processing personal data on driving behavior for their own purposes.
  • GDPR is a good foundation for ensuring data protection for data collected, processed and stored by modern vehicles. The legal requirements on “privacy by design” and “privacy by default” are important cornerstones for manufacturers to take into account when developing their products.
  • The large amounts of data accumulating in modern vehicles, and the developments in autonomous and networked driving, repeatedly raise questions about storage, access, disposal and exploitation rights. In addition to the consistent implementation of the regulations, that also makes further legal regulations necessary. For example, the planned EU regulations on data intermediaries in the Data Governance Act should be mentioned here.
  • Consumers must be able to decide individually whether and to whom they grant access to the data sets they have produced. In addition, consumers must be able to delete personal driving data at any time within the framework of the legal limits.

The German Data Protection Conference (DSK) issued guidance on the Federal Act on the Regulation of Data Protection and Privacy in Telecommunications and Telemedia (‘TTDSG’), which went into effect on December 1, 2021.

Some key takeaways:


  • If no personal data is processed, only TTDSG is applicable. If both personal and non-personal data is processed, both TTDSG and GDPR apply. However, for the storage and access of information on/from terminal equipment, TTDSG takes precedence. For the subsequent processing, GDPR applies.
  • Storage in or access from terminal equipment requires consent.
  • This is not just telephony or VoIP, but also cable, WLAN, and IoT connections (including appliances and smart TVs).

Storage and Access:

  • Storage and access includes: access to hardware device identifiers, advertising identification numbers, telephone numbers, SIM card serial numbers (IMSI), contacts, call lists, Bluetooth beacons or SMS communication. For all devices, it also includes the reading of the unique identifiers of the network hardware (MAC addresses) and browser fingerprinting.
  • An access requires a targeted transmission of browser information that is not initiated by the end user. If only information, such as browser or header information, is processed that is transmitted inevitably or due to (browser) settings of the end device when calling up a telemedia service, this is not to be considered “access to information already stored in the end device.” Examples of this are: (1) the public IP address of the terminal device, (2) the address of the called website (URL), (3) the user agent string with browser and operating system version and (4) the set language.
  • You can obtain consent to store and access information and consent for further processing under GDPR Art. 6(1)(a) at the same time if: (a) you inform the users of all purposes (including the subsequent processing), and (b) it is clear to the user that several consents are given in a single action (e.g. the pressing of a single button). Consent by approval of a banner is not consent for both TTDSG and GDPR purposes; it is just consent under the TTDSG.
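The passive/active distinction above can be sketched in code. Here is a hypothetical classification helper — the function and category names are my own, not from the DSK guidance — that flags which operations would count as consent-requiring “access” under the TTDSG:

```javascript
// Hypothetical helper — names are illustrative, not from the DSK guidance.
// Information the browser transmits inevitably with every request (public IP,
// called URL, user agent, language setting) is NOT "access to information
// already stored in the end device". Targeted reads of stored identifiers
// (cookies, localStorage, advertising IDs, fingerprinting) ARE.
const PASSIVELY_TRANSMITTED = new Set([
  'public-ip', 'url', 'user-agent', 'accept-language',
]);

function requiresTtdsgConsent(operation) {
  if (operation.kind === 'header' && PASSIVELY_TRANSMITTED.has(operation.name)) {
    return false; // transmitted by default — no targeted access to the device
  }
  return true; // cookie reads, MAC addresses, fingerprinting, etc.
}

console.log(requiresTtdsgConsent({ kind: 'header', name: 'user-agent' })); // false
console.log(requiresTtdsgConsent({ kind: 'storage', name: 'ad_id' }));     // true
```

The point of the sketch is only the dividing line: the same header can become consent-relevant the moment it is actively harvested for fingerprinting rather than passively received.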


Consent under the TTDSG has the same requirements as consent under the GDPR.

From whom: consent is required from the person who objectively uses the terminal equipment. Ownership of the terminal equipment is basically irrelevant, as is the question of who is the contractual partner in the telecommunications service that is used via the terminal equipment.

Time: consent must be obtained before access to the terminal equipment.


Transparency:

  • All storage and access activities must be transparent and comprehensible.
  • Users must be informed, among other things, about who is accessing the respective terminal equipment, in what form and for what purpose, what the functional duration of the cookies is and whether third parties can gain access to them.
  • Information must also be provided about the fact that a subsequent revocation does not affect the lawfulness of the access or storage that took place before the revocation.
  • Information provided at different points within a telemedia offering must be consistent.
  • If processes take place within the scope of the telemedia offering that fall under both the TTDSG and the GDPR, information must be provided separately about the two legal bases.


Unambiguous:

  • Active action on the part of the end user is always required.
  • Opt-out procedures are always unsuitable for establishing effective consent. The fact that the end user’s browser allows cookies or web storage, e.g., local shared objects (LSOs), cannot constitute consent, regardless of other aspects such as informed consent or certainty.
  • The mere further use of a website or app, e.g. through actions such as scrolling down, surfing through website content, clicking on content or similar actions can also not constitute effective consent to access or store information on a terminal device.
  • The evaluation takes into account how the buttons for giving consent and other options for action are labeled and designed, and what additional information is provided.
  • If consent banners are displayed in telemedia offerings that merely contain an “Okay” button, clicking the button does not constitute an unambiguous declaration. Even the terms “agree,” “I consent,” or “accept” may not be sufficient in individual cases if it is not clear from the accompanying information text what specifically the consent is to be given for.
  • In cases where it is not possible to remain inactive because a consent banner blocks access to some or all of the content of the telemedia service, end users must at least be able to express their rejection without additional clicks (compared to consent).
  • A banner offering an “Accept all” button alongside only buttons labeled “Settings”, “Further information” or “Details” is not compliant. You need to offer an equivalent option, e.g. “Reject all”.


Specific:

  • General or blanket consent for various potential subsequent processing operations is not compliant.
  • End users must then also be able to consent to or reject the different purposes separately.
  • It is possible to design consent banners with multiple layers, i.e. to provide more detailed information only on a second level of the banner, which users can access via a button or link. However, if a button already exists on the first level of the banner, with which consent can be given for various purposes, concrete information on all individual purposes must also be contained on this first level. It would be too vague to merely provide generic, general or vague information on the purposes here, such as “In order to provide you with a better user experience, we use cookies”

Freely Given:

  • In assessing whether consent for access to end user devices was given freely, it must first be clarified whether there was any compulsion at all for the end user to make a declaration, or whether they could have remained inactive.
  • It can be assumed that such compulsion exists if a banner or other graphic element for requesting consent obscures access to the website as a whole or to parts of its content, and the banner cannot simply be closed without a decision.
  • The argument that no one is forced to visit a website whose content is in principle also offered by others on the market cannot be accepted. As the European Data Protection Board (as well as its predecessor institution) has already made clear, consent cannot be regarded as freely given just because there is a choice between a service that includes consent to the use of personal data for additional purposes and a comparable service offered by another controller.

Possibility to revoke consent:

  • It must be as easy to revoke as to give consent.
  • If consent is given directly when using a website, it must also be possible to revoke it in this way. Exclusive revocation options via other communication channels such as e-mail, fax or even by letter do not comply with the requirements. It is also not permissible to refer users to a contact form.


CMPs (consent management platforms): legally compliant consent is by no means automatically obtained through the use of a CMP alone. The responsibility for the effectiveness of the consent obtained remains with the respective provider of the telemedia service.

Service specifically requested:

  • The basic service is to be regarded as the telemedia service desired by users as soon as they deliberately call up a service. However, this action does not automatically mean that the user wants all additional functions of the basic service. The desired range of functions must be assessed in each individual case from the perspective of the average user.
  • The explicit wish of the users with regard to these additional services and functions must therefore be expressed in further actions. In the context of websites, this means that users do not have to accept every access to their terminal equipment.
  • Even for services explicitly requested by end users, only those accesses to the terminal equipment that are technically necessary to provide the requested service are covered by the exception.
  • Storage is only absolutely necessary in a few cases, since many functions that are to be implemented by storing information on and reading it from users’ end devices can be implemented without individualization. For example, it is not considered necessary that a cookie with a unique ID is stored long-term and can be retrieved for storing consent or for load balancing.

Audience measurement

  • In the website and app context, the original reach measurement has evolved into a reach analysis with a non-fixed scope, using numerous, often individualized pieces of information, to which further criteria can be added at will.
  • The purpose for which the reach measurement is used is decisive for answering the question of whether a telemedia service expressly desired by the user can be assumed. Even the simple measurement of visitor numbers is therefore not to be classified per se as a component of the basic service, but depends on the specific purpose pursued in each case.

Legal Basis under GDPR

  • You need a legal basis for the data processing associated with the integration of third-party content on websites (e.g. advertisements, fonts, scripts, city maps, videos, photos or content from social media services), which regularly involves the disclosure of personal data to the operators of the respective third-party servers.
  • Accountability: data controllers must be able to prove that the processing of personal data is lawful. This means that controllers must check and document in advance which legal basis they rely on for the processing.
  • Consent: must be clearly related to data processing processes (and not merely the technical use of cookies or similar). Also, controllers must take appropriate technical and organizational measures to ensure that only personal data that are necessary for the respective specific processing purpose are processed by means of data protection-friendly default settings.

Consent banner:

Not every use of cookies or subsequent tracking requires consent per se, so corresponding consent banners should only be used if consent is actually required. Otherwise, the misleading impression is created that the data subjects have a choice, although this does not exist.

Requirements for banners:

  • Separate HTML element
  • Provide full information, including all actors involved; these may be activated only if selected.
  • While the consent banner is displayed, no further scripts of a website or an app that potentially access the end devices of the users are loaded.
  • Access to the imprint and privacy policy must not be hindered by the consent banner.
  • Information may be stored or read only after consent is given
  • For every option to give consent (button) you need an option to reject.
  • Store the submission of consent so that the banner does not reappear.
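The banner requirements above can be sketched as a small gating module. This is an illustrative sketch under my own naming, not a real CMP API: the banner appears only while no decision is stored, accept and reject are equally easy, and device-accessing scripts load only after consent is granted.

```javascript
// Illustrative sketch, not a real CMP API. In a browser, `storage` would wrap
// localStorage; a plain Map stands in here so the logic is self-contained.
function createConsentGate(storage) {
  return {
    // Show the banner only while no prior decision is stored, so it does
    // not reappear after a choice has been made.
    shouldShowBanner() { return !storage.has('consent'); },
    // One action to accept — and one equally easy action to reject.
    accept() { storage.set('consent', 'granted'); },
    reject() { storage.set('consent', 'denied'); },
    // Scripts that access the end device load only after consent was given.
    mayLoadTrackingScripts() { return storage.get('consent') === 'granted'; },
  };
}

const gate = createConsentGate(new Map());
console.log(gate.shouldShowBanner());        // true — no decision yet
gate.reject();
console.log(gate.shouldShowBanner());        // false — the decision is stored
console.log(gate.mayLoadTrackingScripts());  // false — the rejection is honored
```

Note the design point the guidance implies: the gate must sit in front of script loading, not run alongside it, so that nothing touches the device while the banner is still displayed.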

Legitimate interest

  • In the context of tracking, the requirements of Article 6(1)(f) of the GDPR are met in only a few scenarios in practice.
  • Attention must be paid to whether these service providers also process data of the data subjects for their own purposes (e.g., to improve their own services or to create interest profiles). In this case – and even if the third-party service provider only reserves the right to do so in the abstract – the scope of a commissioned processing pursuant to Art. 28 GDPR is exceeded. For the transfer of personal data – even if it is only the IP address – to these third-party service providers, Article 6(1)(f) of the GDPR can then generally not form an effective legal basis.

International transfers

  • A transfer of personal data to the USA and other third countries without a level of data protection recognized by the EU Commission may therefore only take place subject to suitable guarantees, such as standard data protection clauses, or if an exceptional circumstance exists for certain cases pursuant to Art. 49.
  • The mere conclusion of standard data protection clauses such as the standard contractual clauses adopted by the EU Commission is not sufficient. You need to conduct a TIA and see whether supplementary measures are necessary.
  • Especially in connection with the integration of third-party content and the use of tracking services, however, it will often not be possible to take sufficient supplementary measures. In this case, the services concerned must not be used, i.e., they must not be integrated into the website.
  • Personal data processed in connection with the regular tracking of user behavior on websites or in apps cannot, in principle, be transferred to a third country on the basis of consent pursuant to Art. 49 GDPR. The scope and regularity of such transfers contradict the character of Art. 49 GDPR as an exceptional provision.

A German Court has ordered pain and suffering damages as a result of a data breach, the first decision of its kind in Europe.

According to the judgment, Scalable Capital has to pay the plaintiff, represented by consumer organization EuGD Europäische Gesellschaft für Datenschutz mbH, € 2,500 in damages for non-material damage because he was affected by the Scalable data leak. The plaintiff from southern Germany is one of the 33,200 Scalable Capital customers whose e-mail addresses, copies of ID cards, photos and account numbers ended up on the Darknet between April and October 2020 as a result of a data leak.

While it is possible to sue for such damages in some US states, to my knowledge, no such award has been made in the US either. Most data breach lawsuits are filed for negligence, breach of contract, breach of warranty, breach of fiduciary duty, false advertising and unfair or deceptive trade practices.

Recently, in TransUnion v. Ramirez, the U.S. Supreme Court held that Article III standing requires a concrete injury even in the context of a statutory violation. And in Spokeo, the court said that it is not enough to allege a bare procedural violation, divorced from any concrete harm.

With enforcement on children’s data privacy ramping up around the world, Ireland’s Data Protection Commission has issued a detailed report on the fundamental principles of such data privacy, as well as some helpful suggestions to controllers on how to improve.

The key principles:

  1. FLOOR OF PROTECTION: Online service providers should provide a “floor” of protection for all users, unless they take a risk-based approach to verifying the age of their users.
  2. CLEAR-CUT CONSENT: When a child has given consent for their data to be processed, that consent must be freely given, specific, informed and unambiguous, and by a clear statement or affirmative action.
  3. ZERO INTERFERENCE: Ensure that the pursuit of legitimate interests does not interfere with, conflict with or negatively impact, at any level, the best interests of the child.
  4. KNOW YOUR AUDIENCE: Take steps to identify your users and ensure that services directed at/ intended for or likely to be accessed by children have child-specific data protection measures in place.
  5. INFORMATION IN EVERY INSTANCE: Children, not just their parents, are entitled to receive information about the processing of their own personal data.
  6. CHILD-ORIENTED TRANSPARENCY: Privacy information about how personal data is used must be provided in a concise, transparent, intelligible and accessible way, using clear and plain language that is comprehensible and suitable to the age of the child.
  7. LET CHILDREN HAVE THEIR SAY: Don’t forget that children are data subjects in their own right and have rights in relation to their personal data at any age.
  8. CONSENT DOESN’T CHANGE CHILDHOOD: Consent obtained from children or from guardians/parents should not be used as a justification to treat children of all ages as if they were adults.
  9. YOUR PLATFORM, YOUR RESPONSIBILITY: If a platform uses age verification and/or relies on parental consent for processing, it should go the extra mile in proving that its measures around age verification and verification of parental consent are effective.
  10. DON’T SHUT OUT CHILD USERS OR DOWNGRADE THEIR EXPERIENCE: If your service is directed at, intended for, or likely to be accessed by children, you can’t bypass your obligations simply by shutting them out or depriving them of a rich service experience.
  11. MINIMUM USER AGES AREN’T AN EXCUSE: Theoretical user age thresholds for accessing services don’t displace the obligations of organizations to comply with the controller obligations under the GDPR and the standards and expectations set out in these fundamentals.
  12. A PRECAUTIONARY APPROACH TO PROFILING: Don’t profile children and/or carry out automated decision making in relation to children for marketing/advertising purposes unless you can clearly demonstrate how and why it is in the best interests of the child to do so.
  13. DO A DPIA: Undertake data protection impact assessments (DPIA). The principle of the best interests of the child must be a key criterion in any DPIA and must prevail over the commercial interests of an organization in the event of a conflict.
  14. BAKE IT IN: If you routinely process children’s personal data you should, by design and by default, have a consistently high level of data protection which is “baked in” across your services.

Key recommendations:

  • Data sharing: Do not systematically share a child’s personal data with third parties without clear parental knowledge, awareness and control; build in parental reminders/notifications, where appropriate.
  • Profiling: Turn off identifiers, techniques or settings which allow any tracking of activity online for advertising purposes.
  • Nudge techniques: Avoid the use of nudge techniques that encourage or incentivize children to provide unnecessary information or to engage in privacy disrupting actions.
  • Encourage privacy-enhancing behavior: use push notices/just-in-time notifications emphasizing that one or more options provide a greater level of privacy than the action the child user is about to take.
  • Opt to process personal data on the user’s device, as opposed to transferring the data to the cloud.
  • Avoid the use of personalized auto features, such as autoplay features and reward loops where children’s personal data is used to support these features.
  • Provide parents with an overall view of activity (including any history of activity) and settings that their child has available to them. Consider allowing parents to modify child account controls and settings, where appropriate.
  • Make it visible to the child that their parent(s) can tell which app/website/program etc. they are using or that their parent(s) can later review their activity history.
  • Higher security settings for child account data may be appropriate, including the possibility of isolating or “air gapping” child personal data from adult personal data. Administrator accounts for child data should be flagged or have a specific role so that internal organizational access can be easily distinguished, monitored, audited and altered.
  • Avoid the collection and processing of children’s biometric data.
  • Where a child can share communications, content or data, ensure limited audience selections by default. Contact from others outside of the child’s authorized contacts should not be possible for younger children without parental knowledge, awareness and intervention.
  • Geolocation:
    • Turn off geolocation by default for child users unless the service being provided is necessarily dependent upon it. If this is the case, make it clear to the child (e.g. through the use of symbols/icons) that their location is available to the service or can be seen by other users.
    • Provide clearly visible controls to allow the child to change this at any time or following each session, after a short time period, or once the event or feature requiring location has completed.
    • Significantly reduce the level of accuracy of geolocation data collection except where necessary.
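Reducing the accuracy of geolocation data, as recommended above, can be as simple as coarsening coordinates before they leave the device. A minimal sketch (the function name is my own, not from the report): rounding latitude/longitude to two decimal places limits precision to roughly 1 km.

```javascript
// Illustrative sketch, not from the DPC report: coarsen coordinates before
// storage or transmission. Two decimal places of latitude/longitude
// correspond to roughly 1.1 km of precision at the equator.
function coarsenLocation(lat, lng, decimals = 2) {
  const factor = 10 ** decimals;
  return {
    lat: Math.round(lat * factor) / factor,
    lng: Math.round(lng * factor) / factor,
  };
}

console.log(coarsenLocation(48.85661, 2.35222)); // { lat: 48.86, lng: 2.35 }
```

Where full accuracy is genuinely necessary (e.g. turn-by-turn navigation), the coarsening step would be skipped for that feature only, not for the service as a whole.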

Norwegian regulator Datatilsynet has slapped Grindr, a location-based online dating application, with a $7.1 million fine for sharing data with advertisers without the consent of its users. Here are some of my initial takeaways.


  • The opinion was released in (excellent) English, and this is very important and much appreciated.
  • The opinion is very well written, clear, organized and legally sound. I have recommended reading it verbatim to my team for training purposes.

On compliance, supervisory authorities (SAs), and one-stop shops:

  • Supervisory authorities are expected to follow EDPB guidelines when enforcing the GDPR in concrete cases.
  • Even if your practices are much better than what the industry has been doing, they are not necessarily compliant with the law.
  • The fact that few complaints have been filed by data subjects doesn’t mean a low level of damage was suffered. Few people take the initiative to sue, and many don’t understand complex processing well enough to do so.
  • The controller’s financial situation and the fact that they profited from the infringement (e.g. due to advertising) are aggravating factors.

On Location data:

  • GPS location is particularly revealing of the life habits of individuals, and can be used to infer sensitive information. This is especially sensitive when opting out of location data degrades the functionality of the app.
  • The processing of an individual’s location information can be a highly intrusive act, depending on the circumstances.
  • Even data that is normally only indirectly identifiable can, when it contains online identifiers, potentially be combined with other data collected from other services and from other devices through cross-device tracking, and be re-identified.

On data sharing:

  • You are responsible for controlling/taking responsibility for your own data sharing. If you are only transmitting an opt-out signal (conveying the data subject’s opt out preference together with personal data) and have to rely on the actions of others (users, OS, partners, etc.) to halt the sharing where required, you are in breach of your duties under Art 5(2), 24, 25. Same goes for downstream partners “blinding” App ID.

Read more here.

Who refused the cookies in the cookie jar?

The Commission Nationale de l’Informatique et des Libertés (CNIL) has sent new orders for cookie compliance to 30 additional companies, bringing the total to 90.

The sectors affected include: public institutions, higher education, clothing, transportation, retail and distance selling.

Some key issues:

  • automatic embedding of cookies before consent;
  • banners with no easy rejection option;
  • cookies subject to consent still embedded after refusal expressed by the user.

Read more here.

“Clear is kind. Unclear is unkind,” according to author Brené Brown.

A joint opinion from the European Data Protection Board (EDPB) and European Data Protection Supervisor (EDPS) on the European Union’s proposed digital and data strategies, including the Digital Services Act (DSA), the Digital Markets Act (DMA), the Data Governance Act (DGA) and the Regulation on a European approach for Artificial Intelligence (AIR) – says “legal uncertainty is unkind” and urges EU regulators to fix the proposed legislation accordingly.

“Without further amendments, the proposals will negatively impact the fundamental rights and freedoms of individuals and lead to significant legal uncertainty that would undermine both the existing and future legal framework. As such, the proposals may fail to create the conditions for innovation and economic growth envisaged by the proposals themselves.”

Key concerns:

(1) Lack of protection of individuals’ fundamental rights and freedoms

  • Use of AI systems categorizing individuals from biometrics (such as facial recognition) according to ethnicity, gender, as well as political or sexual orientation, or other prohibited grounds of discrimination – should be banned.
  • Use of AI to infer emotions of a natural person should be prohibited, except for certain well-specified use-cases, namely for health or research purposes, subject to appropriate safeguards, conditions and limits.
  • Any use of AI for an automated recognition of human features in publicly accessible spaces – such as of faces but also of gait, fingerprints, DNA, voice, keystrokes and other biometric or behavioral signals – in any context – should be banned.
  • After a phaseout period – targeted advertising on the basis of pervasive tracking as well as of the profiling of children should be prohibited.
  • The laws should require interoperability, making it easier for people to switch digital providers.
  • The obligation of data protection by design and by default should be paramount, particularly in the context of ‘connected objects’ (e.g. the Internet of Things and the Internet of Bodies), due to the significant risks to the fundamental rights and freedoms of the persons concerned.

(2) Fragmented supervision

  • The laws should provide that inasmuch as personal data is concerned, the relevant competent authorities should be the data protection supervisory authorities.
  • They should also specify what happens in situations of overlapping competence between the data protection supervisory authorities and the new supervisory authorities formed by the new legislation.
  • It should be clear how certificates and codes of conduct under the proposed AI act interface with requirements under the GDPR.
  • The legislative proposals should provide for an explicit legal basis for the exchange of information necessary for effective cooperation and identify the circumstances in which cooperation should take place.
  • The proposals should also enable the competent supervisory authorities under each proposal to share information obtained in the context of any audits and investigations that relate to the processing of personal data with the competent data protection authorities, either upon request or on their own initiative.

(3) Risks of inconsistencies

  • The proposals should clearly state that they shall not affect or undermine the application of existing data protection rules and ensure that data protection rules shall prevail whenever personal data are being processed.
  • The legal basis for each use of personal data should be clear from the proposals
  • Terminology should be defined, with references back to the data protection legislation, in order to avoid inconsistencies
  • The proposals should sufficiently specify whether they refer to non-personal data, personal data or both. They should also specify that in cases of ‘mixed data sets’ the GDPR applies.

If you use a U.S.-based sub-processor (even for data processed in the EU), you lose, the German administrative court of Wiesbaden said in an interim decision.

No transfer. No worries. TIA anyway.

Even if the server may be located in the EU, the U.S. company has access to it and the U.S. CLOUD Act applies.

According to the CLOUD Act, U.S. government agencies can unilaterally request personal data from U.S. companies without a court order and without a mutual legal assistance agreement.

Under the U.S. law, initial suspicion of any criminal act is sufficient, whereas in the EU only suspicion of a serious crime is.
Personal data is therefore at risk of unauthorized access, which constitutes a breach of confidentiality in accordance with Article 32(1)(b) GDPR.

Even if a service only transmits a full, untruncated IP address when it is loaded for the first time, this is still processing that is significant in terms of data protection law.

Under Art. 48 GDPR, a transfer of personal data on the basis of a decision by a foreign court or a foreign administrative authority may in principle only take place if it can be based on an international agreement in force, such as a mutual legal assistance treaty between the requesting third country and the European Union or a member state. No such agreement exists between the EU and the USA.

The Court also considered the Art. 49 derogations, but decided that, on the facts, they were not met.