I had the pleasure of speaking recently with Jamal Ahmed on the PrivacyPros Podcast about privacy enforcement and privacy career trends.

Among the questions I tried to address:

  • What do the Network Advertising Initiative’s (NAI) new opt-outs for hashed-email targeted advertising mean?
  • Why should everyone read George Orwell’s “1984” and Carol Dweck’s “Mindset”?
  • What does the end of third-party cookies mean for data processing?
  • Why should women overcome the often natural tendency to open with an apology?
  • Why should we care about cookies even under U.S. data protection laws?
  • Why should privacy professionals (and everyone) listen to Brené Brown’s awesome “Dare to Lead” and “Unlocking Us” podcasts?
  • Where are U.S. data protection regulations going?
  • Why do you need to be passionate about what you do to be a successful privacy professional?
  • Why do I intersperse so many “likes” when I speak spontaneously? (Sorry, no good answer to that one. But I’ll accept commiseration.)


Federal Trade Commission authority boost?

H.R. 2668 – The Consumer Protection and Recovery Act – has passed in the U.S. House of Representatives.

The bill amends the Federal Trade Commission Act to provide the FTC with explicit authority to require bad actors to return money earned through illegal activity and to seek both injunctive and monetary relief for consumers in federal courts (reversing the Supreme Court’s recent decision in AMG Capital Management v. FTC, which limited the FTC’s Section 13(b) authority).

The White House issued a press release saying: “The Administration supports House passage of H.R. 2668, the Consumer Protection and Recovery Act. … The Administration applauds this step to expressly authorize the FTC to seek permanent injunctions and pursue equitable relief for all violations of law enforced by the Commission and ensure that the cost of illegal practices falls on bad actors, not consumers targeted by illegal scams.”

Have you been leisurely following California Consumer Privacy Act (CCPA) litigation, thinking, “That’s only for data breaches, not ‘soft’ violations”?

Think again.

California Attorney General Rob Bonta’s office has been busy enforcing CCPA for the past year.

Per a new enforcement report, you had better make sure that:

  • Your privacy policy is easily understood.
  • You have notices at collection, online and offline.
  • Your “Do Not Sell” link is there and works. (Links to DAA/NAI opt-out pages are not enough!)
  • You disclose your financial incentives.
  • Your service provider agreements have the right limitations.

The AG also has added a new reporting tool for individuals to report faulty or missing “Do Not Sell” links.

You can take a deeper dive into the issue by reading this OneTrust DataGuidance article.

The Ohio Personal Privacy Act, also known as House Bill 376, is being considered in the Buckeye State.

Here are a few takeaways:

  • Enforcement by the Attorney General only.
  • Affirmative defense for companies that maintain and comply with a written privacy program that reasonably conforms to the NIST Privacy Framework.
  • “Business” includes non-profits.
  • Similar to Virginia and Colorado, “consent” uses the GDPR formulation of “freely given, specific, informed and unambiguous.”
  • Excludes data in the employment context.
  • Narrow definition of “publicly available” (government records only).
  • “Sale” means an exchange for monetary or other valuable consideration; transfers to affiliates are exempted.
  • Exemptions for GLBA financial institutions, HIPAA covered entities and business associates, higher education institutions, and B2B transactions.
  • A long list of data types, including health-related data, is exempted.
  • Exemption for fraud and identity theft detection.

Consumer rights:

  • Right to know – via a privacy notice, which needs to include, in addition to what we saw in the other laws:
    1. details regarding the business and any affiliate to which personal data is transferred
    2. data retention practices
    3. information security practices
    4. notification of material changes to the policy (this requires affirmative consent or a notice + opt out 60 days in advance, as well as a need to provide direct notification where possible)
  • Right of access (by at least one method out of a provided list) covering the preceding 12 months
  • Right to delete (by at least one method), but exceptions include the written records retention schedule
  • Right to opt out of sale (with verification required); compliance with COPPA is required for the sale of children’s information; the business must notify third parties of the request and ask that they comply.
  • No discrimination
  • Agreement between business and processor is required (but no prescriptive provisions)

Failure to maintain a privacy policy that reflects data privacy practices to a reasonable degree of accuracy is an unfair and deceptive practice (but there is no private right of action).

The European Data Protection Board has issued final guidelines on virtual voice assistants.

The guidelines appear to be largely unchanged from the draft issued in February for public consultation.

The main change is noting that even if VVAs are themselves a software service, they always operate through a physical device such as a smart speaker or smart TV. VVAs use electronic communication networks to access these physical devices that constitute “terminal equipment” in the sense of the e-Privacy Directive.

Therefore, the provisions of Art 5(3) of the e-Privacy Directive apply whenever a VVA stores or accesses information in the physical device linked to it.


I spoke this week on Usercentrics’ Tech That Talks program, taking a look at personalized ad targeting and the future of cookies.

Among the issues we discussed:

  • First-party data isn’t holy water. It doesn’t absolve all data of sin: you still need a proper legal basis (under GDPR), transparency (under all laws), and consent or the ability to opt out.
  • Advising clients on their obligations under the various U.S. data protection laws is a bit like playing Sudoku. You have to puzzle out the grid, figure out which bit applies where and then apply the strictest requirement to each type of processing.
  • 2021 has certainly “raised the line” on privacy awareness and enforcement in the U.S., with new state laws (VA CDPA and CO CPA), dozens of state and federal privacy bills, and a revamped FTC promising more enforcement and more guidance.
  • When it comes to your use of personal information for marketing: trust but verify. Consumers are more likely to share information with brands that they trust. But verify that you really are using the data as you say you are and that it isn’t being compromised by the actions of third parties with whom you share data.
  • Anonymization isn’t easy, and it isn’t a point-in-time exercise. Tell your attorneys what you have done to anonymize the data and how it could be re-identified. Establish policies and procedures to prevent re-identification by you and your downstream service providers, and have a system (preferably automated) to monitor this.
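One way to make that monitoring concrete (a minimal sketch, not a full re-identification risk analysis; the field names and dataset are hypothetical) is a periodic k-anonymity check over the attributes an attacker could link on, flagging any record that is unique on those quasi-identifiers:

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the smallest equivalence-class size (k) over the given
    quasi-identifier columns; k == 1 means at least one record is
    unique on those attributes and potentially re-identifiable."""
    keys = [tuple(r[q] for q in quasi_identifiers) for r in records]
    return min(Counter(keys).values())

# Hypothetical "anonymized" dataset
records = [
    {"zip": "43210", "age_band": "30-39", "gender": "F"},
    {"zip": "43210", "age_band": "30-39", "gender": "F"},
    {"zip": "43215", "age_band": "40-49", "gender": "M"},
]

k = k_anonymity(records, ["zip", "age_band", "gender"])
print(k)  # 1: the third record is unique on these attributes
```

Running a check like this on a schedule, and again whenever new attributes are added, is one way to turn the “policies and procedures” bullet into something auditable.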

“Nothing ever happens in privacy, the team will manage itself”

This statement, which did not survive the test of time, was shared by one of the privacy pros who participated in this month’s International Association of Privacy Professionals’ Women Leading Privacy networking session, which I co-led.

Some pet peeves/needs in leadership that we discussed include:

  • Flexibility and attentiveness in connection with the post-COVID return to workplace, and elsewhere. Leadership should gauge the thoughts, feelings and preferences of the workforce and make educated decisions based on them.
  • Empathy is critical in communications. We discussed “soft skills” (and why “soft skills” is a bad term), as well as Brené Brown’s “Dare to Lead” and “Unlocking Us” podcasts.
  • Courage, and specifically the need for courageous leaders who own decision-making but are also able to ask questions, be curious, and not dictate courses of action.
  • Empowering employees to think for themselves. It is the best way to lead a company to innovate and succeed in the future.
  • Figuring out how to bridge cultural differences and how to approach privacy compliance given the vast variety in the laws.

As always, it was great fun speaking with Future of Privacy Forum’s lovely and knowledgeable mobility guru Chelsey Colbert during Part 2 of OneTrust DataGuidance’s connected vehicles and data protection presentation.

Here are some takeaways from our chat:

  • In the Cold War spy series “The Americans,” characters kept changing their route to and from their house and the dead drop sites in order not to be followed. But now, your car just knows where you are at any given time.
  • Precise geolocation data, helpfully not defined consistently across all laws (is it 500 sq. m., 1750 ft? 1850? a bird? a plane?), is like a potent winter perfume. Spray in the air and walk through, it can be perfectly elegant. Overuse, and you are left with a pesky nuisance (Read: Data risk).
  • Should your car be allowed to gauge your mood and your road rage? There are safety considerations obviously, but there are also mistakes (mood recognition AI is not accurate) and algorithmic biases (darker skin tones are disproportionately considered angrier).
  • Life, liberty and the pursuit of affordable repair. The Massachusetts Right to Repair law of 2020 is currently being litigated, with key issues including: Is the immediate implementation timeline feasible? Is there an increased information security risk? Does this lead to a violation of existing motor vehicle safety laws?
  • Who is in the driver’s seat? You or your car? As cars are becoming more autonomous, the issue of who is the driver becomes more complex, especially in the Level 3 autonomous vehicles.

Children’s data isn’t child’s play.

If you have a product or service that collects information from children, you should:

  • Be transparent. No, really. And figure out the best ways to be transparent for kids, which includes just in time notices, video and audio. It is a good idea to enlist the help of UX/CX experts and to look for upcoming guidance from the Information Commissioner’s Office.
  • Conduct a data protection impact assessment. This is mandatory for companies under the ICO’s Age Appropriate Design Code and also under the new U.S. privacy laws, which classify the information of a “known child” as sensitive information requiring care.
  • Ask parents and kids. It is important to not just ask about how to best sell your product (do that too), but also about how to best protect their information. Make sure they understand your interface.
  • Adopt data protection by design and by default. It is best to turn off geolocation, sharing of information etc. When a child goes to change the default choice to something less safe, alert them to the meaning of this via a pop-up.
  • Don’t. Use. Dark. Patterns. Not just for data collection or opt-outs (those are already prohibited by GDPR, CPRA, CDPA, etc.), but also not for nudging and inappropriately influencing behavior. The FTC and EU regulators are looking at this closely.

CNIL, the Commission Nationale de l’Informatique et des Libertés, France’s data protection authority, has published a framework for post-Schrems II cross-border transfers, following the European Data Protection Board’s final guidelines on supplementary transfer measures:

Step 1
  • Inventory your transfers (involve: DPO, information systems department, purchasing department, operational managers of services, digital service providers).
  • Identify all digital tools used and all vendor contracts. CNIL lists the possible software and tools that could require transfers.
  • Document this in an Excel spreadsheet of the tools and create a data flow map.
Step 2
  • Create an action plan.
  • Carry out risk assessments with respect to personal data flows.
  • Assess whether the transfers have a legal basis.
  • Consider possible solutions following such analysis.
  • Identify who is responsible for the transfers.
  • Identify the transfer tools put in place.
  • Assess the effectiveness of the tool used in relation to the legislation of the country to which the data is being transferred.

Submit the assessment and action plan, including the identified action priorities and the resources available to carry them out, to the relevant organization executive. Then regularly review data flows outside the EU, and their legality, in particular with each new purchase of digital services.
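The Step 1 inventory and the Step 2 triage can be sketched as a simple data structure (a hypothetical illustration of the kind of record CNIL asks you to keep; the field names, vendor, and flagging rule are assumptions, not part of the CNIL framework):

```python
from dataclasses import dataclass, field

# Hypothetical record for CNIL Step 1: one row per vendor/tool
# that may transfer personal data outside the EU.
@dataclass
class TransferRecord:
    tool: str                     # e.g. a SaaS email product
    vendor: str
    destination_country: str
    data_categories: list = field(default_factory=list)
    transfer_tool: str = ""       # e.g. "SCCs", "adequacy decision"
    supplementary_measures: list = field(default_factory=list)

    def needs_assessment(self) -> bool:
        """Flag transfers with no documented transfer tool, or with a
        tool other than an adequacy decision but no supplementary
        measures yet (candidates for the Step 2 action plan)."""
        if not self.transfer_tool:
            return True
        return (self.transfer_tool != "adequacy decision"
                and not self.supplementary_measures)

inventory = [
    TransferRecord("MailBlast", "MailBlast Inc.", "US",
                   ["email", "name"], "SCCs"),
]
flagged = [r.tool for r in inventory if r.needs_assessment()]
print(flagged)  # ['MailBlast']: SCCs but no supplementary measures yet
```

A spreadsheet works just as well, as CNIL suggests; the point is that each transfer carries enough fields to answer the Step 2 questions (legal basis, transfer tool, effectiveness against the destination country’s laws) and can be re-reviewed on each new purchase.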

Read the full CNIL framework.