In its second annual review, the European Commission concludes that the Privacy Shield scheme provides adequate protection for personal data, but improvements are still in order.

Highlights include:

  • Since the first annual review, the Department of Commerce (DOC) has referred more than 50 cases to the Federal Trade Commission (FTC) so that it can take enforcement action where necessary.
  • New tools have been adopted to ensure compliance with Privacy Shield Principles, including spot checks, monitoring of public reports about Privacy Shield participants, quarterly checks of companies flagged as potentially making false claims, and subpoenas requesting information from participants.
  • The US is to appoint a Privacy Shield Ombudsperson no later than February 28, 2019, or the Commission will consider taking steps under the GDPR.
  • The Commission is monitoring the following areas to determine if sufficient progress has been made: (i) effectiveness of DOC enforcement mechanisms; (ii) progress of FTC sweeps; and (iii) appointment and effectiveness of complaints handling by the Ombudsperson.

Read the full report

The American Bar Association will hold its 2018 Business Law Section Annual Meeting at the Austin Convention Center in Austin, TX, from September 13 to 15.

Fox partner Matt Kittay will moderate a panel entitled “Lawyer Ethical Issues in M&A Technology,” featuring Haley Altman of Doxly, Steve Obenski of Kira Systems, and James Walker of Richards Kibbe & Orbe. The group will discuss ethical issues facing lawyers who use both emerging and globally accepted technology platforms to execute M&A and private equity transactions. The panel will take place on Friday, September 14, from 3:30 PM to 5:00 PM at the Technology in M&A Subcommittee Meeting of the Mergers & Acquisitions Committee, in the Fairmont Hotel connected to the Convention Center.

For more information and to register to attend the section’s Annual Meeting, please visit the ABA website.

Jeffrey L. Widman writes:

In 2008, the Illinois legislature enacted the Biometric Information Privacy Act, 740 ILCS 14/1 et seq. (“BIPA”), to provide standards of conduct for private entities in connection with the collection and possession of “biometric identifiers and information.” BIPA regulates the collection, use, safeguarding, handling, storage, retention and destruction of such biometric identifiers. Biometric identifiers include retina and iris scans, fingerprints, voiceprints, and scans of hands and faces. The term does not include writing samples, signatures, photographs, physical descriptions or biological materials used for medical or scientific purposes.

BIPA’s Requirements

Significantly, BIPA does not prohibit the collection or purchase of biometric identifiers. Instead, BIPA requires private entities to develop written policies establishing a retention schedule and guidelines for the destruction of such biometric identifiers. BIPA also imposes a set of guidelines with which entities that possess such biometric identifiers must comply. These include requirements, illustrated in the sketch after the list below, that such entities:

  • Inform individuals in writing that the information is being collected or stored;
  • Inform individuals in writing of the purpose and length of time for which the information is being collected and stored; and
  • Obtain written consent from individuals whose biometric information is collected.
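The following is a minimal sketch, in Python, of a record a company might keep to document compliance with these requirements. The structure and field names are purely illustrative assumptions; BIPA mandates the disclosures and the written release, but not any particular format.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class BipaConsentRecord:
    """Purely illustrative record of the disclosures and consent BIPA
    requires before a biometric identifier is collected."""
    subject_name: str
    identifier_type: str          # e.g., "fingerprint", "voiceprint"
    purpose: str                  # stated purpose of collection
    retention_ends: date          # stated length of collection/storage
    written_notice_given: bool    # disclosure delivered in writing
    written_release_signed: bool  # the subject's written consent

    def collection_permitted(self) -> bool:
        # BIPA requires both written notice (purpose and duration)
        # and a written release before collection may occur.
        return self.written_notice_given and self.written_release_signed
```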

BIPA also prohibits entities that possess biometric identifiers from (i) selling, leasing, trading or otherwise profiting from such identifiers; and (ii) otherwise disclosing or disseminating such information unless the individual consents to the disclosure, the disclosure completes a financial transaction authorized by the individual, the disclosure is required by municipal, state or federal law, or the disclosure is required in response to a warrant or subpoena.

The Recent Onslaught of BIPA Class Actions

Although BIPA provides a private right of action to individuals aggrieved by a violation of the Act, plaintiffs’ attorneys essentially ignored BIPA from 2008 through 2016, and few lawsuits were brought on behalf of aggrieved individuals. In the past year, however, more than 30 class actions have been filed in Illinois for purported BIPA violations. Why the trend? For one, BIPA imposes liquidated damages of $1,000 (or actual damages, whichever is greater) per negligent violation of the Act and $5,000 (or actual damages, whichever is greater) per intentional or reckless violation. Second, BIPA allows for the recovery of reasonable attorneys’ fees and costs, including expert witness fees. Accordingly, BIPA is a prime target for members of the plaintiffs’ bar.
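The arithmetic behind that incentive is simple. The following back-of-the-envelope sketch uses an entirely hypothetical class size and violation count to show how quickly statutory damages compound, before attorneys’ fees are even added:

```python
# Back-of-the-envelope BIPA exposure: all figures except the statutory
# per-violation amounts are invented for illustration.
NEGLIGENT = 1_000   # per negligent violation, or actual damages if greater
RECKLESS = 5_000    # per intentional/reckless violation, or actual damages

class_size = 500        # hypothetical number of class members
violations_each = 1     # e.g., one collection without written consent

print(f"Negligent exposure: ${class_size * violations_each * NEGLIGENT:,}")
print(f"Reckless exposure:  ${class_size * violations_each * RECKLESS:,}")
# Negligent exposure: $500,000
# Reckless exposure:  $2,500,000
```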

Although there is little case law interpreting BIPA, the Illinois Appellate Court issued its first opinion addressing the Act in December 2017. In Rosenbach v. Six Flags Entertainment Corp., 2017 IL App (2d) 170317, the court, citing several federal court decisions, dismissed a plaintiff’s BIPA claim for failure to state a claim because of her inability to allege actual damages. In so holding, the court focused on whether an individual is “aggrieved” (as required by BIPA) if he or she alleges that biometric information was collected without consent but does not allege actual injury. In dismissing the case, the appellate court found that mere technical violations are not actionable because a plaintiff is not “aggrieved” as the plain language of BIPA requires. While the opinion may deter some cases from being filed, it certainly leaves the door open for claims of actual damage, and we expect BIPA cases to continue to be filed in the near future.


Jeffrey L. Widman is a partner in the firm’s Litigation Department, based in its Chicago office.

Elizabeth Litten (Fox Rothschild Partner and HIPAA Privacy & Security Officer) and Mark McCreary (Fox Rothschild Partner and Chief Privacy Officer) will be presenting to the New Jersey Chapter of the Healthcare Financial Management Association on August 30, 2017, from 12:00 to 1:00 pm Eastern time. The presentation is titled “Can’t Touch That: Best Practices for Health Care Workforce Training on Data Security and Information Privacy.”

This webinar is a comprehensive review of information privacy and data security training, with an emphasis on imparting practical know-how and fluency with the terminology of phishing, ransomware, malware and other common threats. We will cover best practices for sensitizing health care industry workers to these threats as part of their ongoing HIPAA compliance efforts and, more generally, for training workers in any business on the proper handling of sensitive data. We will cover the adoption of policies and a training regimen for the entire workforce, as well as tailored training for those in positions responsible for implementing security policies.

More information and a registration link can be found here.

Acting Federal Trade Commission (FTC) Chairman Maureen K. Ohlhausen made it clear that she expects the FTC’s enforcement role in protecting privacy and security to encompass automated and connected vehicles. In her opening remarks at a June 28, 2017, workshop hosted by the FTC and the National Highway Traffic Safety Administration (NHTSA), she said the FTC will take action against manufacturers and service providers of autonomous and connected vehicles if their activities violate Section 5 of the FTC Act, which prohibits unfair or deceptive acts or practices.

Such concern is warranted: new technologies allow vehicles not only to access the Internet, but also to independently generate, store and transmit all types of data, some of which could be very valuable to law enforcement, insurance companies and other industries. For example, such data can show not only a car’s precise location, but also whether it exceeded posted speed limits, tailgated other vehicles or cut them off.
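As a concrete illustration, here is a minimal sketch of the kind of telemetry record such a vehicle could log, and how easily it reveals speeding after the fact. The schema is our own hypothetical, not any manufacturer’s actual format:

```python
from dataclasses import dataclass

@dataclass
class TelemetryPoint:
    """One hypothetical GPS/speed sample logged by a connected car."""
    timestamp: float         # seconds since epoch
    latitude: float
    longitude: float
    speed_mph: float         # vehicle-reported speed
    posted_limit_mph: float  # limit for the mapped road segment

def speeding_events(points: list[TelemetryPoint]) -> list[TelemetryPoint]:
    """Return every sample in which the car exceeded the posted limit."""
    return [p for p in points if p.speed_mph > p.posted_limit_mph]
```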

Acting Chairman Ohlhausen noted that the FTC wants to coordinate its regulatory efforts with NHTSA, and envisions that both organizations will have important roles, similar to the way the FTC and the Department of Health and Human Services both have roles with respect to the Health Insurance Portability and Accountability Act (HIPAA).

Traditionally, NHTSA has dealt with vehicle safety issues rather than privacy and data security. Thus, the FTC may take a key role on these issues as they apply to connected cars, as it has already been a major player on privacy and data security in other industries.

Acting Chairman Ohlhausen also encouraged Congress to consider data breach and data security legislation for these new industries, but speakers at the workshop (video available here and embedded below) noted that legislation in this area will have difficulty keeping up with the fast pace of change of these technologies.

Part 1:

Part 2:

Part 3:

Specific federal legislation, or even laws at the state level, may be slow in coming given the many stakeholders who have an interest in the outcome. Until then, the broad mandate of Section 5 may be one of the main sources of enforcement. Companies that provide goods or services related to autonomous and connected vehicles should be familiar with the basic FTC security advice we have already blogged about here, and should work with knowledgeable attorneys as they pursue their design and manufacture plans.

On July 23, 2017, Washington State will become the third state (after Illinois and Texas) to statutorily restrict the collection, storage and use of biometric data for commercial purposes. The Washington legislature explained its goal in enacting Washington’s new biometrics law:

The legislature intends to require a business that collects and can attribute biometric data to a specific uniquely identified individual to disclose how it uses that biometric data, and provide notice to and obtain consent from an individual before enrolling or changing the use of that individual’s biometric identifiers in a database.

— Washington Laws of 2017, ch. 299 § 1.  (See complete text of the new law here).

Washington’s new biometrics act governs three key aspects of commercial use of biometric data:

  1. collection, including notice and consent,
  2. storage, including protection and length of time, and
  3. use, including dissemination and permitted purposes.

The law focuses on “biometric identifiers,” which it defines as

data generated by automatic measurements of an individual’s biological characteristics, such as a fingerprint, voiceprint, eye retinas, irises, or other unique biological patterns or characteristics that is used to identify a specific individual.

— Id. § 3(1).

The law excludes all photos, video or audio recordings, or information “collected, used, or stored for health care treatment, payment or operations” subject to HIPAA from the definition of “biometric identifiers.” Id.  It also expressly excludes biometric information collected for security purposes (id. § 3(4)), and does not apply to financial institutions subject to the Gramm-Leach-Bliley Act.  Id. § 5(1).  Importantly, the law applies only to biometric identifiers that are “enrolled in” a commercial database, which it explains means capturing a biometric identifier, converting it to a reference template that cannot be reconstructed into the original output image, and storing it in a database that links the biometric identifier to a specific individual.  Id. §§ 2, 3(5).
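As a rough sketch of the enrollment sequence the statute describes, consider the following. A salted hash stands in for a real, vendor-specific template algorithm (which the statute does not specify); the point is only that the stored template is a one-way transformation of the biometric linked to a specific individual:

```python
import hashlib
import os

def enroll(raw_scan: bytes, person_id: str, database: dict) -> None:
    """Illustrative 'enrollment' in the statute's sense: capture a
    biometric, convert it to a reference template that cannot be
    reconstructed into the original image, and store it keyed to a
    specific individual. A salted SHA-256 hash is a stand-in for a
    real biometric template algorithm."""
    salt = os.urandom(16)  # per-enrollee salt
    template = hashlib.sha256(salt + raw_scan).hexdigest()
    database[person_id] = {"salt": salt, "template": template}

# Hypothetical usage: the raw bytes would come from a sensor capture.
db: dict = {}
enroll(b"raw-fingerprint-capture-bytes", "employee-4712", db)
```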

Statutory Ambiguity Creates Confusion

Unfortunately, ambiguous statutory language, combined with rapidly advancing technology, virtually guarantees confusion in each of the three key aspects of the new law.

Regarding collection, the new law states that a company may not “enroll a biometric identifier in a database for a commercial purpose” unless it: (1) provides notice, (2) obtains consent, or (3) “provid[es] a mechanism to prevent the subsequent use of a biometric identifier for a commercial purpose.”  Id. § 2(1).  Confusingly, the law does not specify what type of “notice” is required, except that it must be “given through a procedure reasonably designed to be readily available to affected individuals,” and its adequacy will be “context-dependent.”  Id. § 2(2).
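Read literally, the three prerequisites are alternatives, any one of which suffices. Here is a minimal sketch of that reading; the inputs are our own labels, since the statute leaves “notice” and the prevention “mechanism” undefined:

```python
def may_enroll_for_commercial_purpose(notice_given: bool,
                                      consent_obtained: bool,
                                      prevention_mechanism: bool) -> bool:
    """Wash. Laws of 2017, ch. 299, sec. 2(1), read literally: a company
    may not enroll a biometric identifier in a database for a commercial
    purpose unless it satisfies at least one of the three alternatives."""
    return notice_given or consent_obtained or prevention_mechanism
```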

If consent is obtained, a business may sell, lease or disclose biometric data to others for commercial use.  Id. § 2(3).  Absent consent, a business may not disclose biometric data to others except in very limited circumstances listed in the statute, including in litigation, if necessary to provide a service requested by the individual or as authorized by other law. Id. However, the new law may ultimately be read by courts or regulators as including a “one disclosure” exception because it says disclosure is allowed to any third party “who contractually promises that the biometric identifier will not be further disclosed and will not be enrolled in a database for a commercial purpose” inconsistent with the new law.  Id.

The new law also governs the storage of biometric identifiers.  Any business holding biometric data “must take reasonable care to guard against unauthorized access to and acquisition of biometric identifiers that are in the possession or control of the person.”  Id. § 2(4)(a).  Moreover, businesses are barred from retaining biometric data for any longer than “reasonably necessary” to provide services, prevent fraud, or comply with a court order.  Id. § 2(4)(b).  Here too the law fails to provide certainty: for example, it sets no bright-line time limit on retention after a customer relationship ends, and it does not explain how these rules apply to ongoing but intermittent customer relationships.

The Washington legislature also barred companies that collect biometric identifiers from using them for any other purpose “materially inconsistent” with the original purpose for which they were collected unless they first obtain consent.  Id. § 2(5).  Confusingly, even though notice alone is enough to authorize the original collection, it is not sufficient by itself to authorize a new use.

Interestingly, the new Washington law makes a violation of its collection, storage or use requirements a violation of the Washington Consumer Protection Act (the state analog to Section 5 of the FTC Act).  Id. § 4(1).  However, it specifically excludes any private right of action under the statute and provides for enforcement solely by the Washington State Attorney General, leaving Illinois’s Biometric Information Privacy Act as the only state biometrics law authorizing private enforcement.  Id. § 4(2).

Washington’s new law was not without controversy.  Several state legislators criticized it as imprecise and pushed to more specifically detail the activities it regulates; proponents argued that its broad language was necessary to allow flexibility for future technological advances. Ultimately, the bill passed with less than unanimous approval and was signed into law by Washington’s governor in mid-May.  It takes effect on July 23, 2017.  A similar, but not identical, Washington law takes effect the same day governing the collection, storage and use of biometric identifiers by state agencies.  (See Washington Laws of 2017, ch. 306 here).

In one of the best examples we have ever seen that it pays to be HIPAA compliant (and can cost A LOT when you are not), the U.S. Department of Health and Human Services, Office for Civil Rights, issued the following press release about its $2.5 million settlement with CardioNet.  This is worth a quick read and some soul searching if your company has not been meeting its HIPAA requirements.

FOR IMMEDIATE RELEASE
April 24, 2017
Contact: HHS Press Office
202-690-6343
media@hhs.gov

$2.5 million settlement shows that not understanding HIPAA requirements creates risk

The U.S. Department of Health and Human Services, Office for Civil Rights (OCR), has announced a Health Insurance Portability and Accountability Act of 1996 (HIPAA) settlement based on the impermissible disclosure of unsecured electronic protected health information (ePHI). CardioNet has agreed to settle potential noncompliance with the HIPAA Privacy and Security Rules by paying $2.5 million and implementing a corrective action plan. This settlement is the first involving a wireless health services provider, as CardioNet provides remote mobile monitoring of and rapid response to patients at risk for cardiac arrhythmias.

In January 2012, CardioNet reported to the HHS Office for Civil Rights (OCR) that a workforce member’s laptop was stolen from a parked vehicle outside of the employee’s home. The laptop contained the ePHI of 1,391 individuals. OCR’s investigation into the impermissible disclosure revealed that CardioNet had an insufficient risk analysis and risk management processes in place at the time of the theft. Additionally, CardioNet’s policies and procedures implementing the standards of the HIPAA Security Rule were in draft form and had not been implemented. Further, the Pennsylvania-based organization was unable to produce any final policies or procedures regarding the implementation of safeguards for ePHI, including those for mobile devices.

“Mobile devices in the health care sector remain particularly vulnerable to theft and loss,” said Roger Severino, OCR Director. “Failure to implement mobile device security by Covered Entities and Business Associates puts individuals’ sensitive health information at risk. This disregard for security can result in a serious breach, which affects each individual whose information is left unprotected.”

The Resolution Agreement and Corrective Action Plan may be found on the OCR website at https://www.hhs.gov/hipaa/for-professionals/compliance-enforcement/agreements/cardionet

HHS has gathered tips and information to help protect and secure health information when using mobile devices:  https://www.healthit.gov/providers-professionals/your-mobile-device-and-health-information-privacy-and-security

To learn more about non-discrimination and health information privacy laws, your civil rights, and privacy rights in health care and human service settings, and to find information on filing a complaint, visit us at http://www.hhs.gov/hipaa/index.html

A recent bill proposed in the U.S. Senate would require publicly traded companies to increase transparency about cybersecurity threats, risks and breaches. Among the bill’s disclosure standards, a publicly traded company would have to reveal whether anyone on its board of directors has cybersecurity expertise or specialization. Companies would provide this information through U.S. Securities and Exchange Commission investor reports.

The bill stems from an urgency to combat cyber threats in light of investigative findings on the cybersecurity practices of the top 100 financial firms, as well as recent attacks on major publicly traded companies like Sony and Home Depot. If the bill passes, investors and shareholders will be able to monitor how well public companies secure private data and information, motivating companies to enhance their security measures.

Privacy officials in Germany penned a position paper arguing that standard contractual clauses and binding corporate rules do not adequately provide the data protections necessary for legal U.S.-EU data flows. In the officials’ view, these two data transfer alternatives to Safe Harbor are not viable.

The German data protection authority (DPA) recommended a path of informed consent. U.S. companies should provide potential EU partners full disclosure of how U.S. information security and data privacy laws lack protections equivalent to the EU’s laws. Before consenting to data transfers with U.S. organizations, EU companies must be made aware of the U.S. government’s ability to access data and personal information. But it doesn’t stop there. The DPA asserted that discrepancies between individual privacy rights in the U.S. and EU should be clarified, as well as the U.S. government’s shortcomings in abiding by EU privacy standards.

However, the German DPA warned that providing these disclosures may still not be enough considering the U.S. mass surveillance programs brought to light in 2013 by Edward Snowden.

The position paper may be a harbinger of developments in the era beyond the Safe Harbor invalidation. In fact, the Israeli Law, Information and Technology Authority (ILITA) has also disallowed U.S. businesses from conducting Israel-U.S. data transfers under Safe Harbor exceptions. EU countries and allies may follow suit unless the U.S. government agrees to elevated privacy principles or limits its unchecked national surveillance programs.

Freedom from automated calls at random hours of the evening may seem like the true American dream these days, as more and more companies rely on these calls to reach out and communicate with customers. Unfortunately, now that the Federal Communications Commission (“FCC”) has voted to expand the Telephone Consumer Protection Act (“TCPA”) with stringent yet vague restrictions on telemarketing robocalls, it may not be a dream for everyone.

In June of this year, in a 3-2 vote, the FCC adopted a rule under the TCPA that bars companies from using “autodialers” to dial consumers, allows no more than one phone call to a number that has been reassigned to a different customer, and mandates that calls stop at a customer’s request. These restrictions may seem reasonable, but dissenting Commissioner Ajit Pai recognized that the rule’s broad language will create issues because it does not distinguish between legitimate businesses trying to reach their customers and unwanted telemarketers. Some attorneys have further commented that the rule’s use of “autodialer” opens up a can of worms of interpretation: the term can be read to cover any device with even the potential to dial randomly or sequentially generated numbers, including a smartphone, as the sketch below illustrates. Companies using even slightly modernized tactics to reach out to their customer base are now at risk of facing litigation, and it won’t stop there. Businesses that legitimately need to reach out to their customers will be caught between a rock and a hard place: they face a one-call restriction now and may also open themselves up to litigation if a customer decides to take that route.
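The breadth concern is easy to demonstrate: generating numbers sequentially or at random takes only a few lines of code on any general-purpose device, which is why critics say the definition could sweep in an ordinary smartphone. A trivial, purely illustrative sketch (the starting number is a fictitious 555 exchange):

```python
import random

def sequential_numbers(start: int, count: int) -> list[int]:
    """Dialing list generated in sequence: the capability critics say
    could make any ordinary computer or smartphone an 'autodialer'."""
    return [start + i for i in range(count)]

def random_numbers(count: int) -> list[int]:
    """Ten-digit numbers generated at random."""
    return [random.randint(2_000_000_000, 9_999_999_999) for _ in range(count)]

print(sequential_numbers(2015550100, 3))
# [2015550100, 2015550101, 2015550102]
```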

FCC Chairman Tom Wheeler attempted to quash concerns by stating that “Legitimate businesses seeking to provide legitimate information will not have difficulties.”  This statement unfortunately won’t stop plaintiffs’ attorneys from greasing their wheels to go after companies that make even “good faith efforts” to abide by the new rule.  Attorneys who defend businesses have recognized that the rule is riddled with issues that could potentially harm companies that simply do not have the mechanisms to fully control and restrict repeated calls or the technology that makes those calls.  But, long story short, just because this rule has been put in motion does not mean it will stand as is.  Litigation and court action will likely be a natural consequence, and that may result in changes in the future.  For now, businesses that utilize automated phone calls should be wary of the technology used and at least attempt to keep track of the numbers and phone calls made.  When in doubt, talk to an attorney to make sure you are taking the appropriate precautions.