Jeffrey L. Widman writes:

In 2008, the Illinois legislature enacted the Illinois Biometric Information Privacy Act, 740 ILCS 14/1 et seq. (“BIPA”), to provide standards of conduct for private entities in connection with the collection and possession of “biometric identifiers and information.” BIPA regulates the collection, use, safeguarding, handling, storage, retention and destruction of such biometric identifiers. Biometric identifiers include retina and iris scans, fingerprints, voiceprints, and scans of hands and faces. They do not include writing samples, signatures, photographs, physical descriptions or biological materials used for medical or scientific purposes.

BIPA’s Requirements

Significantly, BIPA does not prohibit the collection or purchase of biometric identifiers. Instead, BIPA requires private entities to develop written policies establishing a retention schedule and guidelines for the destruction of such biometric identifiers. BIPA also imposes a set of guidelines with which entities that do possess such biometric identifiers must comply. These include requirements that such entities:

  • Inform individuals in writing that the information is being collected or stored;
  • Inform individuals in writing of the purpose and length of time for which the information is being collected and stored; and
  • Obtain written consent from individuals whose biometric information is collected.

BIPA also prohibits entities that possess biometric identifiers from (i) selling, leasing, trading or otherwise profiting from such identifiers; and (ii) otherwise disclosing or disseminating such information unless the individual consents to such disclosure, the disclosure completes a financial transaction authorized by the individual, the disclosure is required by municipal, state or federal law, or the disclosure is required in response to a warrant or subpoena.

The Recent Onslaught of BIPA Class Actions

Although BIPA provides a private right of action to individuals aggrieved by a violation of the Act, plaintiffs’ attorneys essentially ignored BIPA from 2008 through 2016, and few lawsuits were brought on behalf of aggrieved individuals. However, in the past year, more than 30 class actions have been filed in Illinois for purported BIPA violations. Why the trend? For one, BIPA imposes penalties of $1,000 per negligent violation of the Act and $5,000 (or actual damages, whichever is greater) per intentional or reckless violation. Second, BIPA allows for the recovery of reasonable attorneys’ fees and costs, including expert witness fees. Accordingly, BIPA is a prime target for members of the plaintiffs’ bar.

Although there is little case law interpreting BIPA, the Illinois Appellate Court issued its first opinion addressing the Act in December 2017. In Rosenbach v. Six Flags Entertainment Corp., 2017 IL App (2d) 170317, the court, citing several federal court decisions, dismissed a plaintiff’s BIPA claim for failure to state a claim due to her inability to allege actual damages. In so holding, the court focused on whether an individual is “aggrieved” (as required by BIPA) if he or she alleges that biometric information was collected without consent, but does not allege actual injury. In dismissing the case, the appellate court found that mere technical violations are not actionable, since a plaintiff alleging only such violations is not “aggrieved” as the plain language of BIPA requires. While the opinion may deter some cases from being filed, it certainly leaves the door open for claims of actual damage, and we expect BIPA cases to continue to be filed in the near future.


Jeffrey L. Widman is a partner in the firm’s Litigation Department, based in its Chicago office.

Roger Severino, director of the Department of Health and Human Services’ Office of Civil Rights, told HIMSS18 conference attendees this week that he plans no slowdown in HIPAA enforcement.

“I come from the Department of Justice Office for Civil Rights; I bring that mindset to OCR. We’re still looking for big, juicy egregious cases” for enforcement, Severino said, according to this report in Data Breach Today. That doesn’t mean smaller companies should assume they are off the radar, he added.

He said 2017 was OCR’s second-biggest year for HIPAA settlements, with $19.4 million collected, second only to 2016, in which OCR collected nearly $25 million.

Usernames and passwords were exposed in a number of reported data breaches.

According to the monthly report from the Identity Theft Resource Center, the health care industry suffered more data breaches in January than government, educational and financial sectors combined.

Medical and health care-related data breaches accounted for 26.7 percent of the 116 verified data breaches in early 2018. The report defines a breach as a cybersecurity incident in which personal information, such as emails, medical records, Social Security numbers or driver’s license information, is exposed and made vulnerable to risk.

While the report identifies “Business” as the sector most affected by data breaches, the category broadly encompasses many types of major service providers in retail, hospitality, trade, transportation and other industries.

For more detailed statistics of data breaches by industry, download the ITRC report.

Last year saw multiple high-profile data breaches, enough to place cybersecurity atop any in-house attorney’s 2018 priority list.

But the threat posed by hackers isn’t the only cyber concern on the minds of in-house counsel this year, reports Corporate Counsel magazine.

In the regulatory realm, complying with the European Union’s General Data Protection Regulation, which takes effect in May, is expected to be companies’ top data privacy task of 2018. But it’s not the only one. The Chinese government also plans to impose new, below-the-radar data privacy regs that will make companies jump through another set of legal hoops.

The legal implications of new technologies, such as fitness devices that blur the line between medical and personal data collection, are also expected to challenge corporate counsel. And groundbreaking legal cases could change the law regarding who has standing to sue following a data breach in the U.S. and whether companies can use standard contractual clauses to transfer personal data out of Europe.

On our HIPAA & Health Information Technology Blog, associate Ankita Patel discusses how Millennials’ embrace of newer forms of social media such as Snapchat and Instagram poses HIPAA challenges for health care organizations.

“With just a few taps and swipes, an employee can post a seemingly innocuous disclosure of PHI. Interns and residents of the younger generation may innocently upload a short-term post (be it a picture visible for two seconds or an eight-second-long video) of a busy hospital room or even an innocent ‘selfie’ without realizing that there is visible and identifiable PHI in the corner,” Ankita writes.

It’s an intriguing read exploring the intersection of health care and privacy law, social sharing and the rapid pace of technological change. Read the full post here.

Physicians have their hands full on the best of days. It’s not difficult to imagine why using a voice assistant such as Amazon’s Alexa or Apple’s Siri might be attractive.

In fact, a recent survey showed nearly one in four physicians uses the assistants for work-related purposes, such as researching prescription drug dosing. It’s likely many are unaware of the information security dangers these devices pose.

In an interview with SCG Health Blog, Fox Rothschild attorneys Elizabeth Litten and Michael Kline explain that the labor-saving devices pose a bevy of data privacy and security risks, and offer doctors six helpful tips for protecting their practices.

The Federal Trade Commission is investing nearly $3 million in technology to support an increasing need for e-discovery driven by massive data breaches such as the one disclosed recently by Equifax.

The news comes from the National Law Journal, which reports that the FTC awarded a one-year contract to Innovative Discovery LLC of Arlington, Virginia, for a secure litigation support service. The agency awarded the contract without competitive bids because it “faces unusual and compelling circumstances that require the immediate initiation of this pilot,” the Law Journal reported.

“The FTC is entering into an unprecedented year of investigations and litigation, including its investigation into the Equifax data breach and an unusually high number of forensic data acquisitions in fraud cases,” agency officials wrote. The contract, they added, “is essential to enabling the FTC to successfully conduct investigations and litigation to stop consumer harm, thus enabling the agency to accomplish its mission.”

Elizabeth Litten (Fox Rothschild Partner and HIPAA Privacy & Security Officer) and Mark McCreary (Fox Rothschild Partner and Chief Privacy Officer) will be presenting at the New Jersey Chapter of the Healthcare Financial Management Association on August 30, 2017, from 12:00 to 1:00 p.m. Eastern Time. The presentation is titled “Can’t Touch That: Best Practices for Health Care Workforce Training on Data Security and Information Privacy.”

This webinar is a comprehensive review of information privacy and data security training, with an emphasis on imparting practical know-how and fluency with the terminology surrounding phishing, ransomware, malware and other common threats. We will cover best practices for sensitizing health care industry workers to these threats as part of their ongoing HIPAA compliance efforts and, more generally, for training workers in any business on the proper handling of sensitive data. We will also cover the adoption of policies and a training regimen for the entire workforce, as well as tailored training for those in positions responsible for implementing security policies.

More information and a registration link can be found here.

Acting Federal Trade Commission (FTC) Chairman Maureen K. Ohlhausen made it clear that she expects the FTC’s enforcement role in protecting privacy and security to encompass automated and connected vehicles. In her opening remarks at a June 28, 2017 workshop hosted by the FTC and National Highway Traffic Safety Administration (NHTSA), she said the FTC will take action against manufacturers and service providers of autonomous and connected vehicles if their activities violate Section 5 of the FTC Act, which prohibits unfair and deceptive acts or practices.

Such concern is warranted as new technologies allow vehicles not only to access the Internet, but also to independently generate, store and transmit all types of data – some of which could be very valuable to law enforcement, insurance companies and other industries. For example, such data can show not only a car’s precise location, but also whether it exceeded posted speed limits, tailgated other cars or cut them off.

Acting Chairman Ohlhausen noted that the FTC wants to coordinate its regulatory efforts with NHTSA, and envisions that both organizations will have important roles, similar to the way the FTC and the Department of Health and Human Services both have roles with respect to the Health Insurance Portability and Accountability Act (HIPAA).

Traditionally, NHTSA has dealt with vehicle safety issues, as opposed to privacy and data security. Thus, the FTC may take a key role on these issues as they apply to connected cars, as it has already been a major player on privacy and data security in other industries.

Acting Chairman Ohlhausen also encouraged Congress to consider data breach and data security legislation for these new industries, but speakers at the workshop (video available here) noted that legislation in this area will have difficulty keeping up with the fast pace of change of these technologies.

Specific federal legislation, or even laws at the state level, may be slow in coming given the many stakeholders who have an interest in the outcome. Until then, the broad mandate of Section 5 may be one of the main sources of enforcement. Companies that provide goods or services related to autonomous and connected vehicles should be familiar with the basic FTC security advice we have already blogged about here, and should work with knowledgeable attorneys as they pursue their design and manufacturing plans.

On July 23, 2017, Washington State will become the third state (after Illinois and Texas) to statutorily restrict the collection, storage and use of biometric data for commercial purposes. The Washington legislature explained its goal in enacting Washington’s new biometrics law:

The legislature intends to require a business that collects and can attribute biometric data to a specific uniquely identified individual to disclose how it uses that biometric data, and provide notice to and obtain consent from an individual before enrolling or changing the use of that individual’s biometric identifiers in a database.

— Washington Laws of 2017, ch. 299 § 1.  (See complete text of the new law here).

Washington’s new biometrics act governs three key aspects of commercial use of biometric data:

  1. collection, including notice and consent,
  2. storage, including protection and length of time, and
  3. use, including dissemination and permitted purposes.

The law focuses on “biometric identifiers,” which it defines as

data generated by automatic measurements of an individual’s biological characteristics, such as a fingerprint, voiceprint, eye retinas, irises, or other unique biological patterns or characteristics that is used to identify a specific individual.

— Id. § 3(1).

The law excludes all photos, video or audio recordings, or information “collected, used, or stored for health care treatment, payment or operations” subject to HIPAA from the definition of “biometric identifiers.” Id.  It also expressly excludes biometric information collected for security purposes (id. § 3(4)), and does not apply to financial institutions subject to the Gramm-Leach-Bliley Act.  Id. § 5(1).  Importantly, the law applies only to biometric identifiers that are “enrolled in” a commercial database, which it explains means capturing a biometric identifier, converting it to a reference template that cannot be reconstructed into the original output image, and storing it in a database that links the biometric identifier to a specific individual.  Id. §§ 2, 3(5).
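The statutory notion of “enrollment” – capturing a biometric sample, converting it into a reference template that cannot be reconstructed into the original image, and storing that template keyed to a specific individual – can be pictured with a short sketch. The code below is purely illustrative and not part of the statute: it models the irreversible template with a keyed HMAC digest, and the function and variable names (`enroll`, `matches`, `db`) are hypothetical. Real biometric systems use feature extraction and fuzzy similarity thresholds rather than exact hashes, since no two captures of the same fingerprint are byte-identical.

```python
import hashlib
import hmac
import os

def enroll(database: dict, person_id: str, raw_sample: bytes, key: bytes) -> None:
    """Convert a raw biometric capture into a one-way reference template
    and store it linked to a specific individual (the law's 'enrollment')."""
    template = hmac.new(key, raw_sample, hashlib.sha256).hexdigest()
    database[person_id] = template  # only the irreversible template is stored

def matches(database: dict, person_id: str, raw_sample: bytes, key: bytes) -> bool:
    """Re-derive the template from a fresh capture and compare it to the
    stored reference, without ever reconstructing the original sample."""
    candidate = hmac.new(key, raw_sample, hashlib.sha256).hexdigest()
    return hmac.compare_digest(database.get(person_id, ""), candidate)

db: dict = {}
secret = os.urandom(32)  # per-system key kept out of the template database
enroll(db, "user-42", b"fingerprint-scan-bytes", secret)
print(matches(db, "user-42", b"fingerprint-scan-bytes", secret))  # True
print(matches(db, "user-42", b"someone-else-entirely", secret))   # False
```

The point of the one-way template is the statutory carve-out itself: because the stored value "cannot be reconstructed into the original output image," a breach of the template database does not directly expose the underlying biometric.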

Statutory Ambiguity Creates Confusion


Unfortunately, ambiguous statutory language, combined with rapidly advancing technology, virtually guarantees confusion in each of the three key aspects of the new law.

Regarding collection, the new law states that a company may not “enroll a biometric identifier in a database for a commercial purpose” unless it: (1) provides notice, (2) obtains consent, or (3) “provid[es] a mechanism to prevent the subsequent use of a biometric identifier for a commercial purpose.”  Id. § 2(1).  Confusingly, the law does not specify what type of “notice” is required, except that it must be “given through a procedure reasonably designed to be readily available to affected individuals,” and its adequacy will be “context-dependent.”  Id. § 2(2).

If consent is obtained, a business may sell, lease or disclose biometric data to others for commercial use.  Id. § 2(3).  Absent consent, a business may not disclose biometric data to others except in very limited circumstances listed in the statute, including in litigation, if necessary to provide a service requested by the individual or as authorized by other law. Id. However, the new law may ultimately be read by courts or regulators as including a “one disclosure” exception because it says disclosure is allowed to any third party “who contractually promises that the biometric identifier will not be further disclosed and will not be enrolled in a database for a commercial purpose” inconsistent with the new law.  Id.

The new law also governs the storage of biometric identifiers.  Any business holding biometric data “must take reasonable care to guard against unauthorized access to and acquisition of biometric identifiers that are in the possession or control of the person.”  Id. § 2(4)(a).  Moreover, businesses are barred from retaining biometric data for any longer than “reasonably necessary” to provide services, prevent fraud, or comply with a court order.  Id. § 2(4)(b).  Here too the law fails to provide certainty, e.g., it sets no bright-line time limits on retention after customer relationships end, or how to apply these rules to ongoing but intermittent customer relationships.

The Washington legislature also barred companies that collect biometric identifiers from using them for any other purpose “materially inconsistent” with the original purpose for which they were collected, unless they first obtain consent.  Id. § 2(5).  Confusingly, even though notice alone is enough to authorize the original collection, it is not sufficient by itself to authorize a new use.

Interestingly, the new Washington law makes a violation of its collection, storage or use requirements a violation of the Washington Consumer Protection Act (the state analog to Section 5 of the FTC Act).  Id. § 4(1).  However, it specifically excludes any private right of action under the statute and provides for enforcement solely by the Washington State Attorney General, leaving Illinois’s Biometric Information Privacy Act as the only state biometrics law authorizing private enforcement.  Id. § 4(2).

Washington’s new law was not without controversy.  Several state legislators criticized it as imprecise and pushed to more specifically detail the activities it regulates; proponents argued that its broad language was necessary to allow flexibility for future technological advances. Ultimately, the bill passed with less than unanimous approval and was signed into law by Washington’s governor in mid-May.  It takes effect on July 23, 2017.  A similar, but not identical, Washington law takes effect the same day governing the collection, storage and use of biometric identifiers by state agencies.  (See Washington Laws of 2017, ch. 306 here).