If at first they don’t consent, try, try again?

A new form of privacy fraud further complicates the relationship between the Ad Tech industry and GDPR.

As Ad Tech vendors struggle to comply with the strict requirements of the EU General Data Protection Regulation (GDPR), especially around the acquisition of freely given, specific, informed and unambiguous user consent for the use of personal data, a new form of privacy fraud called “consent string fraud” has been detected.

What is a GDPR consent string? This is “a series of numbers added to an ad bid request, which identifies the consent status of an ad tech vendor. That means whether or not they have a user’s consent to use their data in order to serve them personalized advertising.”

What is consent string fraud? In this practice, companies, whether knowingly or mistakenly, tamper with the consent string, changing a “0” (no user consent) to a “1” (user consent obtained).
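To make the tampering concrete, here is a toy sketch. Real consent strings (such as the IAB Transparency & Consent Framework format) are base64-encoded bitfields carried in the bid request; the vendor IDs and dictionary structure below are invented purely for illustration.

```python
# Toy model of per-vendor consent bits -- illustration only, not the real
# consent string encoding used in production ad tech.

def has_consent(consent_bits, vendor_id):
    """True only if the recorded consent bit for this vendor is 1."""
    return consent_bits.get(vendor_id, 0) == 1

# Consent as recorded by the user's consent management platform (CMP):
recorded = {42: 0, 77: 1}  # vendor 42: no consent; vendor 77: consent

# A tampered copy passed downstream in a bid request -- the "0" flipped to "1":
tampered = dict(recorded)
tampered[42] = 1

def detect_tampering(recorded, downstream):
    """Flag vendors whose downstream bit claims consent the CMP never recorded."""
    return sorted(v for v, bit in downstream.items()
                  if bit == 1 and recorded.get(v, 0) == 0)

print(detect_tampering(recorded, tampered))  # [42]
```

Detection of this kind is only possible when the original CMP record can be compared against what arrives downstream, which is part of why the fraud is hard to police.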

CPO Magazine has more details.

Jeffrey L. Widman writes:

In 2008, the Illinois legislature enacted the Illinois Biometric Information Privacy Act, 740 ILCS 14/1 et seq. (“BIPA”), to provide standards of conduct for private entities in connection with the collection and possession of “biometric identifiers and information.” BIPA regulates the collection, use, safeguarding, handling, storage, retention and destruction of such biometric identifiers. Biometric identifiers include retina and iris scans, fingerprints, voiceprints, and scans of hands and faces. They do not include writing samples, signatures, photographs, physical descriptions or biological materials used for medical or scientific purposes.

BIPA’s Requirements

Significantly, BIPA does not prohibit the collection or purchase of biometric identifiers. Instead, BIPA requires private entities to develop written policies establishing a retention schedule and guidelines for the destruction of such biometric identifiers. BIPA also imposes a set of guidelines with which entities that possess such biometric identifiers must comply. These include requirements that such entities:

  • Inform individuals in writing that the information is being collected or stored;
  • Inform individuals in writing of the purpose and length of time for which the information is being collected and stored; and
  • Obtain written consent from individuals whose biometric information is collected.

BIPA also prohibits entities that possess biometric identifiers from (i) selling, leasing, trading or otherwise profiting from such identifiers; and (ii) otherwise disclosing or disseminating such information unless the individual consents to such disclosure, the disclosure completes a financial transaction authorized by the individual, the disclosure is required by municipal, state or federal law or the disclosure is required in response to a warrant or subpoena.

The Recent Onslaught of BIPA Class Actions

Although BIPA provides a private right of action to individuals aggrieved by a violation of the Act, plaintiffs’ attorneys essentially ignored BIPA from 2008 through 2016, and few lawsuits were brought on behalf of aggrieved individuals. In the past year, however, more than 30 class actions have been filed in Illinois for purported BIPA violations. Why the trend? For one, BIPA imposes penalties of $1,000 per negligent violation of the Act and $5,000 (or actual damages, whichever is greater) for intentional or reckless violations. Second, BIPA allows for the recovery of reasonable attorneys’ fees and costs, including expert witness fees. Accordingly, BIPA is a prime target for members of the plaintiffs’ bar.

Although there is little case law interpreting BIPA, the Illinois Appellate Court issued its first opinion addressing the Act in December 2017. In Rosenbach v. Six Flags Entertainment Corp., 2017 IL App (2d) 170317, the court, citing several federal court decisions, dismissed a plaintiff’s BIPA claim for failure to state a claim due to her inability to cite actual damages. In so holding, the court focused on whether an individual is “aggrieved” (as required by BIPA) if he or she alleges that biometric information was collected without consent but does not allege actual injury. In dismissing the case, the appellate court found that mere technical violations are not actionable because a plaintiff is not “aggrieved” as the plain language of BIPA requires. While the opinion may deter some cases from being filed, it certainly leaves the door open for claims of actual damage, and we expect BIPA cases to continue to be filed in the near future.

Jeffrey L. Widman is a partner in the firm’s Litigation Department, based in its Chicago office.

In a daylong Privacy Summit at Citizens Bank Park in Philadelphia, the co-chairs of Fox Rothschild’s Privacy & Data Security practice group led a series of panel discussions with leading cybersecurity professionals and government officials.

Elizabeth Litten moderating “Looking Inward: Risk Management Part I”

Fox partner Elizabeth Litten, who serves as Fox Rothschild’s HIPAA Privacy & Security Officer, and partner Mark McCreary, the firm’s Chief Privacy Officer, moderated a two-part panel series examining cyber risk management for protecting company data. The first segment, “Looking Inward: Risk Management Part I,” focused on the best internal company practices, policies and training to combat cyber threats and protect valuable data. “Beyond Company Walls: Risk Management Part II” examined the ways businesses should approach vendor management and cyber insurance to further secure and safeguard their data assets.

Mark McCreary moderating “Beyond Company Walls: Risk Management Part II”

Litigation partner Scott Vernick moderated the panel “Current State of Affairs in Regulation & Enforcement.” The discussion highlighted the domestic and international data privacy and security obligations relevant to U.S. businesses.

The summit closed with a thought-provoking keynote address from Eric O’Neill, a former FBI counterintelligence operative who helped apprehend Robert Philip Hanssen, one of the most notorious spies in U.S. history. O’Neill offered memorable insights on corporate diligence and defense.



Acting Federal Trade Commission (FTC) Chairman Maureen K. Ohlhausen made it clear that she expects the FTC’s enforcement role in protecting privacy and security to encompass automated and connected vehicles. In her opening remarks at a June 28, 2017 workshop hosted by the FTC and National Highway Traffic Safety Administration (NHTSA), she said the FTC will take action against manufacturers and service providers of autonomous and connected vehicles if their activities violate Section 5 of the FTC Act, which prohibits unfair and deceptive acts or practices.

Such concern is warranted as new technologies allow vehicles not only to access the Internet, but also to independently generate, store and transmit all types of data – some of which could be very valuable to law enforcement, insurance companies, and other industries. For example, such data can show not only a car’s precise location, but also whether it exceeded posted speed limits, tailgated other cars or cut them off.

Acting Chairman Ohlhausen noted that the FTC wants to coordinate its regulatory efforts with NHTSA, and envisions that both organizations will have important roles, similar to the way the FTC and the Department of Health and Human Services both have roles with respect to the Health Insurance Portability and Accountability Act (HIPAA).

Traditionally, NHTSA has dealt with vehicle safety issues, as opposed to privacy and data security. The FTC may therefore take a key role on these issues as they apply to connected cars, as it has already been a major player on privacy and data security in other industries.

Acting Chairman Ohlhausen also encouraged Congress to consider data breach and data security legislation for these new industries, but speakers at the workshop (video available here) noted that legislation in this area will have difficulty keeping pace with the rapid change of these technologies.

Specific federal legislation, or even laws at the state level, may be slow in coming given the many stakeholders who have an interest in the outcome. Until then, the broad mandate of Section 5 may be one of the main sources of enforcement. Companies that provide goods or services related to autonomous and connected vehicles should be familiar with the basic FTC security advice we have already blogged about here, and should work with knowledgeable attorneys as they pursue their design and manufacturing plans.

Eric Bixler has posted on the Fox Rothschild Physician Law Blog an excellent summary of the changes coming to Medicare cards as a result of the Medicare Access and CHIP Reauthorization Act of 2015.  Briefly, Centers for Medicare and Medicaid Services (“CMS”) must remove Social Security Numbers (“SSNs”) from all Medicare cards. Therefore, starting April 1, 2018, CMS will begin mailing new cards with a randomly assigned Medicare Beneficiary Identifier (“MBI”) to replace the existing use of SSNs.  You can read the entire blog post here.
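For the technically curious, the sketch below generates an identifier shaped like the new MBI. The position rules reflect the format CMS has published as we understand it: 11 characters mixing digits with a restricted alphabet that drops easily confused letters. The generator itself is purely illustrative and is not CMS’s actual assignment algorithm.

```python
import random

# Sketch of an MBI-shaped identifier generator. Position rules follow the
# format CMS has published as we understand it; illustrative only.

DIGIT_1_9 = "123456789"
DIGIT = "0123456789"
LETTER = "ACDEFGHJKMNPQRTUVWXY"  # A-Z minus the easily confused S, L, O, I, B, Z
EITHER = LETTER + DIGIT

# 11 positions: digit(1-9), letter, either, digit, letter, either,
#               digit, letter, letter, digit, digit
POOLS = [DIGIT_1_9, LETTER, EITHER, DIGIT, LETTER, EITHER,
         DIGIT, LETTER, LETTER, DIGIT, DIGIT]

def random_mbi():
    """Return one randomly generated, MBI-shaped identifier."""
    return "".join(random.choice(pool) for pool in POOLS)

print(random_mbi())  # e.g. 1EG4TE5MK73
```

The point of the format is that, unlike an SSN, the value is random and carries no embedded personal meaning, so a stolen card exposes far less.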

The SSN removal initiative represents a major step in the right direction for preventing identity theft among particularly vulnerable populations.  Medicare provides health insurance for Americans aged 65 and older and, in some cases, for younger individuals with select disabilities.  Americans are told to avoid carrying their Social Security card to protect their identity in the event their wallet or purse is stolen, yet many Medicare beneficiaries still carry their Medicare card, which contains their SSN.  CMS stated that people age 65 or older are increasingly the victims of identity theft, with incidents among seniors increasing from 2.1 million to 2.6 million between 2012 and 2014.  Yet the change took over a decade of formal CMS research and discussions with other government agencies to materialize, in part due to CMS’ estimates of the prohibitive costs associated with the undertaking.  In 2013, CMS estimated that the costs of two separate SSN removal approaches were approximately $255 million and $317 million, including the cost of developing, testing and implementing the modifications that would have to be made to the agency’s IT systems.  (See the United States Government Accountability Office report dated September 2013.)

We previously blogged (here and here) about the theft of 7,000 student SSNs at Purdue University and a hack that put 75,000 SSNs at risk at the University of Wisconsin.  In addition, the Fox Rothschild HIPAA & Health Information Technology Blog discussed (here) the nearly $7 million fine imposed on a health plan for including Medicare health insurance claim numbers in plain sight on mailings addressed to individuals.

On July 23, 2017, Washington State will become the third state (after Illinois and Texas) to statutorily restrict the collection, storage and use of biometric data for commercial purposes. The Washington legislature explained its goal in enacting Washington’s new biometrics law:

The legislature intends to require a business that collects and can attribute biometric data to a specific uniquely identified individual to disclose how it uses that biometric data, and provide notice to and obtain consent from an individual before enrolling or changing the use of that individual’s biometric identifiers in a database.

— Washington Laws of 2017, ch. 299 § 1.  (See complete text of the new law here).

Washington’s new biometrics act governs three key aspects of commercial use of biometric data:

  1. collection, including notice and consent,
  2. storage, including protection and length of time, and
  3. use, including dissemination and permitted purposes.

The law focuses on “biometric identifiers,” which it defines as

data generated by automatic measurements of an individual’s biological characteristics, such as a fingerprint, voiceprint, eye retinas, irises, or other unique biological patterns or characteristics that is used to identify a specific individual.

— Id. § 3(1).

The law excludes all photos, video or audio recordings, or information “collected, used, or stored for health care treatment, payment or operations” subject to HIPAA from the definition of “biometric identifiers.” Id.  It also expressly excludes biometric information collected for security purposes (id. § 3(4)), and does not apply to financial institutions subject to the Gramm-Leach-Bliley Act.  Id. § 5(1).  Importantly, the law applies only to biometric identifiers that are “enrolled in” a commercial database, which it explains means capturing a biometric identifier, converting it to a reference template that cannot be reconstructed into the original output image, and storing it in a database that links the biometric identifier to a specific individual.  Id. §§ 2, 3(5).
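The “enrolled in a database” concept can be illustrated with a minimal sketch. The salted-hash template below is our own simplification, not anything the statute prescribes; real biometric systems use fuzzy feature templates, since two captures of the same trait are never byte-identical, but the one-way property is the same idea.

```python
import hashlib
import os

# Illustration of statutory "enrollment": the raw capture is converted to a
# reference template that cannot be turned back into the original image, and
# only the template is stored against the individual. Salted hashing is our
# own simplification of the one-way property, not a method the law requires.

def enroll(raw_capture):
    """Derive a one-way template from a raw capture; discard the capture."""
    salt = os.urandom(16)
    template = hashlib.sha256(salt + raw_capture).digest()
    return salt, template  # store these, linked to the individual

def matches(raw_capture, salt, template):
    """Check a fresh capture against the stored template."""
    return hashlib.sha256(salt + raw_capture).digest() == template

salt, template = enroll(b"example-feature-vector")
print(matches(b"example-feature-vector", salt, template))  # True
```

Because only the salt and template are retained, the database operator holds something that identifies the individual without being reconstructable into the original biometric image, which is exactly the distinction the Washington law draws.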

Statutory Ambiguity Creates Confusion


Unfortunately, ambiguous statutory language, combined with rapidly advancing technology, virtually guarantees confusion in each of the three key aspects of the new law.

Regarding collection, the new law states that a company may not “enroll a biometric identifier in a database for a commercial purpose” unless it: (1) provides notice, (2) obtains consent, or (3) “provid[es] a mechanism to prevent the subsequent use of a biometric identifier for a commercial purpose.”  Id. § 2(1).  Confusingly, the law does not specify what type of “notice” is required, except that it must be “given through a procedure reasonably designed to be readily available to affected individuals,” and its adequacy will be “context-dependent.”  Id. § 2(2).

If consent is obtained, a business may sell, lease or disclose biometric data to others for commercial use.  Id. § 2(3).  Absent consent, a business may not disclose biometric data to others except in very limited circumstances listed in the statute, including in litigation, if necessary to provide a service requested by the individual or as authorized by other law. Id. However, the new law may ultimately be read by courts or regulators as including a “one disclosure” exception because it says disclosure is allowed to any third party “who contractually promises that the biometric identifier will not be further disclosed and will not be enrolled in a database for a commercial purpose” inconsistent with the new law.  Id.

The new law also governs the storage of biometric identifiers.  Any business holding biometric data “must take reasonable care to guard against unauthorized access to and acquisition of biometric identifiers that are in the possession or control of the person.”  Id. § 2(4)(a).  Moreover, businesses are barred from retaining biometric data for any longer than “reasonably necessary” to provide services, prevent fraud, or comply with a court order.  Id. § 2(4)(b).  Here too the law fails to provide certainty, e.g., it sets no bright-line time limits on retention after customer relationships end, or how to apply these rules to ongoing but intermittent customer relationships.

The Washington legislature also barred companies that collect biometric identifiers from using them for any other purpose “materially inconsistent” with the original purpose for which they were collected, unless they first obtain consent.  Id. § 2(5).  Confusingly, even though notice alone is enough to authorize the original collection, it is not sufficient by itself to authorize a new use.

Interestingly, the new Washington law makes a violation of its collection, storage or use requirements a violation of the Washington Consumer Protection Act (the state analog to Section 5 of the FTC Act).  Id. § 4(1).  However, it specifically excludes any private right of action under the statute and provides for enforcement solely by the Washington State Attorney General, leaving Illinois’s Biometric Information Privacy Act as the only state biometrics law authorizing private enforcement.  Id. § 4(2).

Washington’s new law was not without controversy.  Several state legislators criticized it as imprecise and pushed to more specifically detail the activities it regulates; proponents argued that its broad language was necessary to allow flexibility for future technological advances. Ultimately, the bill passed with less than unanimous approval and was signed into law by Washington’s governor in mid-May.  It takes effect on July 23, 2017.  A similar, but not identical, Washington law takes effect the same day governing the collection, storage and use of biometric identifiers by state agencies.  (See Washington Laws of 2017, ch. 306 here).

An executive order signed by President Trump last week potentially put the six-month-old Privacy Shield in jeopardy. Although it mostly targets immigration and border patrol, the EO, titled “Enhancing Public Safety in the Interior of the United States,” also eliminates privacy protections for foreigners.

Section 14 of the Executive Order reads:

Privacy Act. Agencies shall, to the extent consistent with applicable law, ensure that their privacy policies exclude persons who are not United States citizens or lawful permanent residents from the protections of the Privacy Act regarding personally identifiable information.

The potential consequences of this should be obvious. Excluding non-U.S. citizens and residents from the protections of the Privacy Act could effectively destroy the U.S. safeguards provided by the Privacy Shield regarding the adequacy of protection of the personally identifiable information of EU citizens. This could lead to the invalidation of the Privacy Shield Agreement outright.

In a statement, the European Commission supported the Privacy Shield and downplayed the impact of Trump’s EO. “The U.S. Privacy Act has never offered data protection rights to Europeans,” a spokeswoman for the EC said. This suggests that the EC is taking the position that the Privacy Shield is not contingent on the Privacy Act, which covers only data held by U.S. agencies, and not by private companies.

But others in Europe are less sanguine. European Parliament Member Jan Philipp Albrecht said he fears the EO will undermine the Privacy Shield, tweeting: “If this is true @EU_Commission has to immediately suspend #PrivacyShield & sanction the US for breaking EU-U.S. umbrella agreement.”

Albrecht’s opinion may better reflect the stance of European regulators. For now, though, a comparison of the EO with the Judicial Redress Act suggests that the Privacy Shield and the Umbrella Agreement between the U.S. and EU – which governs transatlantic information sharing by law enforcement – both remain intact.

Still, it seems impossible to think that the EO and other protectionist policies announced by the Trump Administration will not jeopardize the Privacy Shield, which is enforced by the Department of State and the FTC, agencies under Trump’s control. If Trump directs them not to prosecute privacy violations, or if enforcement is reduced, the Privacy Shield is unlikely to survive in the long term. One critical component of the Privacy Shield framework, after Safe Harbor’s invalidation, was increased U.S. enforcement of EU privacy rights: the agreement contains a recognition by the U.S. of the right of Europeans to bring enforcement actions in the U.S. against companies that might not otherwise be reachable in the EU.

Worth remembering, too, is that that the Privacy Shield Agreement must be renewed annually by the U.S. Department of Commerce and the European Commission. A deal that was founded upon U.S. enforcement is unlikely to win renewal by the European Commission if Trump has directed his executive branch not to enforce non-citizen privacy rights.

The question may in the end turn on the FTC and whether it enforces both privacy violations generally, and the Privacy Shield specifically. U.S.-EU diplomacy in other areas may also bleed over into the Privacy Shield debate.

So far, more than 1,500 companies have self-certified under the Privacy Shield, which was approved in July 2016. Self-certifications began in August 2016 in the wake of the invalidation of the Safe Harbor agreement. U.S. companies certified under the Privacy Shield should closely monitor the situation. One smart strategic option is adoption of Model Contract Clauses as a “belt and suspenders” approach to compliance.

On Wednesday, the United States and Switzerland struck a new “Privacy Shield” agreement that mirrors the U.S.-EU Privacy Shield framework. It will allow multinationals to continue to transfer data between the U.S. and Switzerland while complying with Swiss data protection requirements.

The deal replaces an existing safe harbor agreement, which has been in question since the Schrems decision was issued in October 2015. Companies with Swiss Safe Harbor certification may begin certifying under the new U.S.-Swiss Privacy Shield framework on April 12. The 90-day delay is intended to provide companies with time to review the new Swiss principles and the commitments they entail.

Ken Hyatt, the acting Under Secretary of Commerce for International Trade, praised the accord, saying it “will enhance transatlantic data protection and support the continued growth of U.S.-Swiss commercial ties, which included two-way direct investment totaling more than $410 billion in 2015.”

And Swiss officials echoed the sentiment, highlighting that the deal aligns with the U.S.-EU Privacy Shield framework, and imposes stronger obligations on U.S. companies to protect the personal data of Europeans. Like the U.S.-EU framework, this new deal also requires more stringent monitoring and enforcement by the Department of Commerce and the Federal Trade Commission.

Last October, the European Court of Justice invalidated Safe Harbor, throwing a legal wrench into the transatlantic data transfer machinery of thousands of EU and U.S. companies. On Tuesday, the European Commission (EC) provided relief from the digital limbo that has ensued by formally approving and adopting the new Privacy Shield pact, a week after EU member states provided their own seal of approval. The agreement paves the way for new certification and the resumption of EU-U.S. data transfers for commercial purposes.


Privacy Shield was designed and negotiated to ensure an adequate level of protection for the personal data of EU individuals upon and after transfer from the EU to the U.S. Though the EC’s decision takes immediate effect, domestically the framework will first be published in the Federal Register, and companies will be able to self-certify Privacy Shield compliance to the U.S. Department of Commerce beginning August 1.

While the initial draft of the agreement was met with significant pushback in Europe, negotiators have since strengthened the independence and authority of the U.S. ombudsman, clarified what constitutes proper “bulk” data collection (and how it differs from mass surveillance), and added detail to the requirements for corporations. Among these is an obligation to delete personal data that is no longer necessary for processing purposes. Such changes cleared the way for EU member state and EC approval.

Despite the fanfare, the deal has not received universal acclaim. Max Schrems, the Austrian law student whose lawsuit ultimately led to the invalidation of Safe Harbor, has already threatened a new legal challenge. Indeed, the new framework may turn out to be only a short-term solution. If the European Court of Justice eventually considers a challenge to the agreement, there is no guarantee that it will survive. The ECJ could very well find that Privacy Shield contains the same adequacy failings as it found within Safe Harbor – a decision that was based more on U.S. surveillance programs than any business compliance failures.

Nonetheless, Privacy Shield now provides a third option for businesses’ data transfer compliance, alongside binding corporate rules (BCRs) and model contract clauses. The latter two options tend to be more costly and do not provide absolute protection against claims or enforcement actions. Regulators in both the EU and U.S. have made clear, however, that they will not look favorably on a failure to address Safe Harbor’s invalidation. These considerations may lead companies to adopt a multipronged approach to compliance.

What Are the Implications of the Privacy Shield on U.S. Companies?

Both U.S. companies and the federal government will see significant changes as a result of Privacy Shield. As we await publication of the full text, the Department of Commerce and European Commission have provided some further detail and guidance as to requirements for U.S. companies wishing to participate:

  • The Department of Commerce and the Federal Trade Commission will provide oversight and enforcement.
  • Each participating company must register with the Department of Commerce starting August 1, 2016:
    • They must publicly self-certify that they meet and will continue to meet the outlined data protection standards. These include enhanced rights for individuals whose data they collect, limitations on what data can be transferred, and new rules surrounding data retention;
    • They must renew their self-certification every year.
  • Each company must have an adequate privacy policy in place, containing:
    • a statement of its commitment to the Privacy Shield and other required language; and
    • information on individuals’ right to access their personal data and the possibility the company will disclose that data to third parties (including relevant authorities).
  • Each company must establish procedures to collect and address complaints from individuals, including free avenues to resolve disputes (for example, participating in binding arbitration).
  • Each company must institute additional safeguards and notice requirements for data transfers to third parties.