What does the U.K. Information Commissioner’s Office have to say about what it takes for adtech initiatives to be compliant with data protection?

“There is an opportunity for market participants to move towards developing solutions that incorporate key considerations of data protection compliance. They should also place the interests, rights and freedoms of individuals at the core of their design,” the UK ICO said in a new opinion.

Here are some key points:

  • New initiatives in adtech must address the risks that adtech poses and take account of data protection requirements from the outset.
  • Any proposal that has the effect of maintaining or replicating existing tracking practices is not an acceptable response to the significant data protection risks that the Commissioner has already described.
Adtech proposals must:
  • engineer data protection requirements by default into the design of the initiative,
  • offer users the choice of receiving adverts without tracking, profiling or targeting based on personal data; market participants should evidence a high-privacy, no-tracking-by-default option and demonstrate how user choice can be exercised throughout the data lifecycle,
  • be transparent about how and why personal data is processed across the ecosystem and who is responsible for that processing,
  • articulate the specific purposes for processing personal data and demonstrate how this is fair, lawful and transparent,
  • address existing privacy risks and mitigate any new privacy risks that their proposal introduces.
To be compliant, new initiatives should:
  • move away from current online tracking and profiling practices,
  • improve transparency for individuals and organizations,
  • reduce existing frictions in the online experience,
  • provide individuals with meaningful control and choice over the processing of device information and personal data,
  • ensure valid consent is obtained where required,
  • ensure there is demonstrable accountability across the supply chain.

Organizations must demonstrate that new approaches do not introduce additional privacy threat vectors or lead to increased use of techniques such as fingerprinting.

On online tracking:
  • Online tracking may involve many of the types of processing operations defined in data protection law, depending on the circumstances.
  • Online tracking can therefore be considered as processing activities involving the monitoring of individuals’ actions, especially over a period of time (including the behavior, location or movements of individuals and their devices), in particular to: build profiles about them; take actions or decisions concerning them; offer goods and services to them; evaluate the effectiveness of services they use; and analyze or predict their personal preferences, behaviors and attitudes.
  • PECR (the Privacy and Electronic Communications Regulations) aside, neither the balancing test nor a compatibility assessment would enable the processing to be fair and lawful without consent. This is because of the nature, scope, context and purposes of these processing activities, and the risks they pose to rights and freedoms.
  • Organizations should not assume that there are no compliance requirements under PECR or data protection law merely because third-party cookies (TPCs) are removed (or because they do not currently use them for tracking purposes).
First-party data does not have an inherently lower risk than third-party data:
  • Some uses of first-party cookies may be regarded as carrying a lower privacy risk (e.g. the concept of “first-party analytics”). However, this is not a general rule and does not necessarily apply to first-party cookies alone. The risks ultimately depend on the nature, scope, context and purposes of the processing and how it is implemented.
  • However an organization categorizes personal data, the processing must be done in line with the law.
  • What is relevant for data protection purposes is: (1) whether the data is personal data; (2) the organization(s) responsible for determining the purposes and means of the processing, and for demonstrating compliance; and (3) if the processing involves disclosure to other organizations, clarifying who they are, their roles and responsibilities, and how they will process the data in compliance with the law after they receive it.
  • Similarly, what is relevant for PECR (ePrivacy) purposes is: (1) who is responsible for processing terminal equipment information; and (2) the purposes they want to process it for.
  • Data protection law places obligations on the entity or entities that determine the purposes and means of the processing of personal data. The entity responsible for such decisions is the controller for that processing. This is the case regardless of: (1) where the controller sources the personal data (i.e., direct from an individual, or from elsewhere); and (2) whether the controller is a large technology platform with multiple services, or a single organization that seeks to share personal data with other organizations.
  • The focus should be on the nature of the risks involved, and their likelihood and severity.
Data sharing by large companies:
  • Data protection law does not automatically enable platforms to track individuals across multiple services in an unfettered manner, or otherwise use personal data in ways that smaller market participants may not be able to.
  • While legitimate interests is the most flexible lawful basis for processing, organizations cannot assume it is the most appropriate one. If they do rely on legitimate interests, they take on extra responsibility for considering and protecting people’s rights and interests.
  • Organizations may be able to process data in the context of intra-group transmission or sharing with other organizations if the disclosure is fair and compatible with the original purpose. The disclosing entity needs to justify the disclosure. The receiving entity needs to justify its own processing, taking into account how it received the data. However, data cannot be passed on for a new purpose – internally or externally – if doing so would be incompatible with the original purpose, considering the circumstances.
  • Interpretive guidance in the recitals of the UK GDPR about intragroup transmission for internal administrative purposes does not mean an organization can always rely on legitimate interests for this type of processing.
  • The Commissioner substantively addressed the Transparency and Consent Framework (TCF) in the 2019 report, noting that it was insufficient to ensure transparency, fair processing or free and informed consent. There were also concerns stemming from a lack of clarity about how compliance was monitored and a reliance on contractual controls.
  • Subsequent iterations of the TCF and its use by publishers have not significantly addressed these issues.
On GPC (Global Privacy Control):
  • GPC’s draft specification states that it is intended to convey a “general request” concerning the sale or sharing of personal data, but “is not meant to withdraw a user’s consent to local storage as per the ePrivacy Directive […] nor is it intended to object to direct marketing under legitimate interest.”
  • As such, the GPC does not at this time appear to offer a means by which user preferences can be expressed in a way that fully aligns with UK data protection requirements.
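To illustrate why the GPC signal is narrower than UK requirements: the draft specification expresses the preference as a single `Sec-GPC: 1` request header (mirrored by a `navigator.globalPrivacyControl` property in the browser). The sketch below shows a minimal server-side reading of that header; the policy function and its names are our illustration, not part of the spec or the ICO's opinion.

```python
def gpc_enabled(headers: dict) -> bool:
    """Return True if the request carries an asserted GPC signal.

    Per the draft spec, the only defined header value is the string "1";
    anything else means the signal is not asserted.
    """
    return headers.get("Sec-GPC") == "1"


# Illustrative policy (our assumption, not from the spec): treat GPC as a
# general request not to sell or share personal data. Note the ICO's point
# that GPC does not withdraw ePrivacy consent to local storage, so cookie
# consent must still be collected and handled separately.
def may_share_for_ads(headers: dict, has_explicit_consent: bool) -> bool:
    if gpc_enabled(headers) and not has_explicit_consent:
        return False
    return has_explicit_consent
```

In other words, a GPC signal can narrow what a site does with data it already lawfully holds, but it cannot substitute for the consent that UK PECR requires for the tracking itself.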
On identifier-based solutions:
  • Generally speaking, these solutions do not address the issues raised in the 2019 report regarding transparency, control, consent or accountability.
  • They also introduce a more fundamental question about whether it is necessary, proportionate or fair for individuals to have to provide their personal data in the first place. This is particularly the case if identifier-based solutions only offer an opt-out.
  • It is important to remember that: (1) if terminal equipment information is processed, Regulation 6 of PECR (Article 5 of the ePrivacy Directive) applies whether the information is personal data or not; and (2) the concept of personal data is broader than direct identifiability.
  • Information is personal data when it relates to an identified or identifiable individual.
  • In these solutions, an identifier is created for the purposes of processing information relating to that individual. This is regardless of the extent to which the original email address or other information such as their name can be inferred from it. Depending on the specifics, these approaches may also not result in effective pseudonymisation, particularly if the original email address is also involved.
  • It is also unclear whether these solutions enable individuals to have a general choice about tracking in the first place, and what happens when they make this choice. This may essentially replicate the current issues with tracking walls.
  • These approaches also need to ensure that they do not use dark patterns and nudge techniques to get individuals to “agree” to be tracked in order to access those services.
  • Several elements in the W3C “Self-Review Questionnaire: Security and Privacy” may have application in the context of controllers who need to undertake DPIAs (data protection impact assessments). While the W3C processes are not a replacement for legal requirements such as DPIAs, they may form part of the relevant considerations.
  • Even where the entity proposing the adtech solution is not a controller or processor, it is good practice to undertake a DPIA.
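The earlier point that deriving an identifier from an email address may not amount to effective pseudonymisation can be sketched as follows. This is an illustrative hash-based scheme of our own construction, not any specific vendor's solution.

```python
import hashlib


def email_to_id(email: str) -> str:
    """Derive a stable identifier from an email address (illustrative only)."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


# The derivation is deterministic: any party holding the email address can
# recompute the identifier and link records across services. The output
# therefore still relates to an identifiable individual and remains personal
# data; hashing alone is not effective pseudonymisation (let alone
# anonymisation), particularly where the original email is also processed.
assert email_to_id("Jane.Doe@example.com") == email_to_id(" jane.doe@example.com ")
```

This is why the Opinion treats such identifiers as creating, rather than removing, data protection obligations: the identifier exists precisely to process information relating to that individual.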