In its second annual review, the European Commission notes that the Privacy Shield scheme provides adequate protection for personal data but improvements are still in order.

Highlights include:

  • Since the first annual review, the Department of Commerce (DOC) has referred more than 50 cases to the Federal Trade Commission (FTC) for enforcement action where necessary.
  • New tools have been adopted to ensure compliance with the Privacy Shield Principles, including spot checks, monitoring of public reports about Privacy Shield participants, quarterly checks of companies flagged as potentially making false claims, and the issuance of subpoenas to request information from participants.
  • The US is to appoint a Privacy Shield Ombudsperson no later than February 28, 2019, or the Commission will consider taking steps under the GDPR.
  • The Commission is monitoring the following areas to determine if sufficient progress has been made: (i) effectiveness of DOC enforcement mechanisms; (ii) progress of FTC sweeps; and (iii) appointment and effectiveness of complaints handling by the Ombudsperson.

Read the full report

Data-rich companies like Facebook have a unique opportunity to capitalize on the recent surge in regulatory scrutiny and turn it to their advantage.

Savvy tech companies are attuned to public opinion and won’t allow others to control the narrative. They are already taking steps to regain the upper hand in the privacy debate.

Facebook demonstrated this during Senate hearings on the Cambridge Analytica “data breach” by announcing it would upgrade privacy features and offer its users protections that mirror those in the EU’s strict General Data Protection Regulation (GDPR). Facebook has also gone out of its way to publicize its efforts to comply with GDPR. Messaging service WhatsApp, too, recently touted its decision to set a minimum age of 16 for EU users.

Some of the major tech companies – Facebook, Google and Apple – could actually benefit from increased data privacy and security regulation if they take the initiative. They have the resources to impose strict compliance requirements on smaller third-party players such as application developers and vendors in the tech ecosystem, portraying themselves as trusted custodians of consumer data.

To gain the advantage, they will need to be proactive because regulators are not sitting back.

Officials at all levels of government are clamoring to get a piece of the data privacy enforcement pie. The SEC recently imposed a first-of-its-kind $35 million fine on Altaba Inc., formerly Yahoo, for failing to disclose a major data breach. The FTC struck a first-of-its-type 20-year consent decree that requires Uber Technologies Inc. to report any future data breach regardless of whether it involves harm to consumers. States are also getting into the act. Arizona and Delaware recently joined the list of states that have toughened their breach notification laws, while attorneys general have stepped up enforcement activities in Massachusetts (Equifax), New York (Facebook), Pennsylvania (Uber) and other states.

Data is the new currency. As a result, antitrust regulators have stepped up scrutiny of M&A deals in relation to the aggregation and control of data. This has already affected proposed deals. The EU delayed Apple’s proposed acquisition of Shazam while it investigated possible adverse effects on other music streaming services.

In this climate, it is no time for major tech companies to lie low. The smarter path – the one that will allow them to regain the initiative – is taking proactive steps to address privacy and data security concerns before regulators do it for them.

Acting Federal Trade Commission (FTC) Chairman Maureen K. Ohlhausen made it clear that she expects the FTC’s enforcement role in protecting privacy and security to encompass automated and connected vehicles. In her opening remarks at a June 28, 2017 workshop hosted by the FTC and the National Highway Traffic Safety Administration (NHTSA), she said the FTC will take action against manufacturers and service providers of autonomous and connected vehicles if their activities violate Section 5 of the FTC Act, which prohibits unfair or deceptive acts or practices.

Such concern is warranted: new technologies allow vehicles not only to access the Internet, but also to independently generate, store and transmit all types of data – some of which could be very valuable to law enforcement, insurance companies and other industries. For example, such data can show not only a car’s precise location, but also whether it violated posted speed limits, tailgated other cars or cut them off.

Acting Chairman Ohlhausen noted that the FTC wants to coordinate its regulatory efforts with NHTSA, and envisions that both organizations will have important roles, similar to the way the FTC and the Department of Health and Human Services both have roles with respect to the Health Insurance Portability and Accountability Act (HIPAA).

Traditionally, NHTSA has dealt with vehicle safety issues rather than privacy and data security. The FTC may therefore take a key role on these issues as they apply to connected cars, as it has already been a major player on privacy and data security in other industries.

Acting Chairman Ohlhausen also encouraged Congress to consider data breach and data security legislation for these new industries, but speakers at the workshop (video available here) noted that legislation in this area will have difficulty keeping up with the fast pace of change of these technologies.


Specific federal legislation, or even laws at the state level, may be slow in coming given the many stakeholders who have an interest in the outcome. Until then, the broad mandate of Section 5 may be one of the main sources of enforcement. Companies that provide goods or services related to autonomous and connected vehicles should be familiar with the basic FTC security advice we have already blogged about here, and should work with knowledgeable attorneys as they pursue their design and manufacture plans.

With 2013 dubbed the “Year of the Mega Breach,” it comes as no surprise that the Federal Trade Commission (“FTC”) published “Start with Security: A Guide for Businesses” on June 30, 2015 to educate and inform businesses on protecting their data.  The FTC is tasked with protecting consumers from “unfair” and “deceptive” business practices and, with data breaches on the rise, it has come to take that job much more seriously.  The lessons in the guide are meant to aid businesses in protecting data, and the FTC cites real examples from its data breach settlement cases to help companies understand each lesson and the real-world consequences some companies have faced.  Here are the lesson headlines:

  1. Start with security;
  2. Control access to data sensibly;
  3. Require secure passwords and authentication;
  4. Store sensitive personal information securely and protect it during transmission;
  5. Segment networks and monitor anyone trying to get in and out of them;
  6. Secure remote network access;
  7. Apply sound security practices when developing new products that collect personal information;
  8. Ensure that service providers implement reasonable security measures;
  9. Implement procedures to help ensure that security practices are current and address vulnerabilities; and
  10. Secure paper, physical media and devices that contain personal information.

Katherine McCarron, a Bureau of Consumer Protection attorney, explained that when evaluating an organization’s conduct, the Bureau “look[s] at a company’s security procedures and determine[s] whether they are reasonable and appropriate in light of all the circumstances.”  It is likely that this guide will become the FTC’s road map for handling future enforcement actions and will help businesses remain on the safe side of the data breach fence.

Whether you run a mom-and-pop shop or a multimillion-dollar company, this guide is a must-read for any business that processes personal information.

Start reading here.

https://www.ftc.gov/tips-advice/business-center/guidance/start-security-guide-business

[Also posted at http://hipaahealthlaw.foxrothschild.com/]

This case has nothing to do with HIPAA, but should be a warning to zealous covered entities and other types of business entities trying to give patients or consumers more information about data privacy than is required under applicable law.  In short, giving individuals more information is not better, especially where the information might be construed as partially inaccurate or misleading.

The Federal Trade Commission (FTC) filed a complaint against Nomi Technologies, Inc., a retail tracking company that placed sensors in clients’ New York City-area retail stores to automatically collect certain data from consumers’ mobile devices as they passed by or entered the stores.  Nomi’s business model was publicized in a July 2013 New York Times article.  The complaint alleged, among other things, that although Nomi’s published privacy policy stated that Nomi would “allow consumers to opt out of Nomi’s [data tracking] service on its website as well as at any retailer using Nomi’s technology,” Nomi actually only allowed consumers to opt out on its website — no opt-out mechanism was available at the clients’ retail stores.

The FTC voted 3-2 to accept a consent order (published for public comment on May 1, 2015) from Nomi under which Nomi shall not:

“[M]isrepresent in any manner, expressly or by implication:  (A) the options through which, or the extent to which, consumers can exercise control over the collection, use, disclosure, or sharing of information collected from or about them or their computers or devices, or (B) the extent to which consumers will be provided notice about how data from or about a particular consumer, computer, or device is collected, used, disclosed, or shared.”

The odd aspect of this complaint and consent order is that Nomi did not track or maintain information that would allow the individual consumers to be identified.  The media access control (MAC) address broadcast by consumers’ mobile devices as they passed by or entered the stores was cryptographically “hashed” before it was collected, creating a unique identifier that allowed Nomi to track the device without tracking the consumer him/herself.  As dissenting Commissioner Maureen Ohlhausen points out, as “a third party contractor collecting no personally identifiable information, Nomi had no obligation to offer consumers an opt out.”  The majority, however, focuses on the fact that the opt out was partially inaccurate, then leaps to the conclusion that the inaccuracy was deceptive under Section 5 of the FTC Act, without pausing to reflect on the fact that the privacy policy and opt out process may not have been required by law in the first place.
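A minimal sketch of how such one-way hashing works follows; the complaint does not specify Nomi’s algorithm, so the use of SHA-256 and the function name here are illustrative assumptions.

```python
import hashlib

def hashed_device_id(mac_address: str) -> str:
    """One-way SHA-256 hash of a device's MAC address (illustrative only)."""
    return hashlib.sha256(mac_address.lower().encode()).hexdigest()

# The same device always yields the same digest, so the hashed MAC is a
# stable pseudonymous identifier: repeat visits can be counted without
# storing the raw MAC address or learning the consumer's identity.
first_visit = hashed_device_id("AA:BB:CC:DD:EE:FF")
later_visit = hashed_device_id("aa:bb:cc:dd:ee:ff")
assert first_visit == later_visit
```

That determinism is precisely what made device-level tracking possible even though the digest itself reveals no personally identifiable information.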

So while many HIPAA covered entities and other businesses may want to give consumers as much information as possible about data collection, the lesson here is twofold:  first, make sure the notice is required under applicable law (and, if it’s not, be sure the benefits of notice outweigh potential risks); and, second, make sure the notice is 100% accurate to avoid FTC deceptive practices claims.

Officials from both the Federal Trade Commission (FTC) and European Union (EU) recently called for enhancements to the Obama administration’s proposed Consumer Privacy Bill of Rights.

The White House’s proposed Consumer Privacy Bill of Rights seeks to provide “a baseline of clear protections for consumers and greater certainty for companies.”  The guiding principles of the draft bill are:  individual control, transparency, respect for context, security, access and accuracy, focused collection and accountability.

But the proposed legislation also seeks to afford companies discretion and flexibility to promote innovation, which some officials argue has led to a lack of clarity.

FTC Chairwoman Edith Ramirez had hoped for a “stronger” proposal and had “concerns about the lack of clarity in the requirements that are set forth.”  However, Chairwoman Ramirez acknowledged the significance of a privacy bill backed by the White House.  FTC Commissioner Julie Brill also expressed concern over weaknesses in the draft, calling for more boundaries.

Likewise, European Data Protection Supervisor Giovanni Buttarelli felt that the proposal lacked clarity and that, as written, “a large majority of personal data would not be subject to any provisions or safeguards.”

To review the administration’s proposed bill, click here.


On December 31, 2014, the Federal Trade Commission announced that it approved a final order settling charges against Snapchat.

In its complaint, the FTC charged Snapchat with deceiving consumers over the amount of personal data that it collected and the security measures in place to protect the data from disclosure and misuse.

The settlement order prohibits Snapchat from misrepresenting the extent to which a message is deleted after being viewed by the recipient and the extent to which Snapchat is capable of detecting or notifying the sender when a recipient has captured a screenshot or otherwise saved the message.  Snapchat is also prohibited from misrepresenting the steps taken to protect against misuse or unauthorized disclosure of information.

Finally, the company will be required to implement a “comprehensive privacy program” and obtain assessments of that program every two years from an independent privacy professional for the next 20 years.

In its press release, the FTC noted that its settlement with Snapchat is “part of the FTC’s ongoing effort to ensure that companies market their apps truthfully and keep their privacy promises to consumers.”  For more information from the FTC on marketing apps, click here.

The Federal Trade Commission recently announced that it settled charges against a health billing company and its former CEO alleging that they misled consumers who had signed up for their online billing portal by failing to inform them that the company would seek detailed medical information from pharmacies, medical labs and insurance companies.

The Atlanta-based medical billing provider operated a website where consumers could pay their medical bills, but in 2012, the company developed a separate service, Patient Health Report, that would provide consumers with comprehensive online medical records.  In order to populate the medical records, the company altered its registration process for the billing portal to include permission for the company to contact healthcare providers to obtain the consumer’s medical information, such as prescriptions, procedures, medical diagnoses, lab tests and more.

The company obtained a consumer’s “consent” through four authorizations presented in small windows on the webpage that displayed only six lines of the extensive text at a time and could be accepted by clicking one box to agree to all four authorizations at once.  According to the complaint, consumers registering for the billing service would have reasonably believed that the authorizations related only to billing.

The settlement requires the company to destroy any information collected relating to the Patient Health Report service.

This case is a good reminder for companies in the healthcare industry looking to offer new online products involving consumer health information that care must always be taken to ensure that consumers understand what the product offers and what information will be collected.


As if compliance with the various federal privacy and data security standards weren’t complicated enough, we may see state courts begin to import these standards into determinations of privacy actions brought under state laws.  Figuring out which federal privacy and data security standards apply, particularly if the standards conflict or obliquely overlap, becomes a veritable Rubik’s cube puzzle when state statutory and common law standards get thrown into the mix.

A state court may look to standards applied by the Federal Communications Commission (“FCC”), the Federal Trade Commission (“FTC”), the Department of Health and Human Services (“HHS”), or some other federal agency asserting jurisdiction over privacy and data security matters, and decide whether the applicable standard or standards preempt state law.  The state court may also decide that one or more of these federal agencies’ standards represent the “standard of care” to be applied in determining a matter under state law.  Or, as shown in a recent Connecticut Supreme Court decision described below, a court may decide that state law is not preempted by federal law or standards in one respect, while recognizing that the federal law or standard may embody the “standard of care” to be applied in deciding a privacy or data security matter under state law.

Fox Rothschild LLP partner Michael Kline posted the following at http://hipaahealthlaw.foxrothschild.com/:

The Connecticut Supreme Court handed down a decision in the case of Byrne v. Avery Center for Obstetrics and Gynecology, P.C., — A.3d —-, 2014 WL 5507439 (2014) that:

[a]ssuming, without deciding, that Connecticut’s common law recognizes a negligence cause of action arising from health care providers’ breaches of patient privacy in the context of complying with subpoenas, we agree with the plaintiff and conclude that such an action is not preempted by HIPAA and, further, that the HIPAA regulations may well inform the applicable standard of care in certain circumstances.

Interestingly, the decision is dated November 11, 2014, the federal holiday of Veterans Day, but was available on Westlaw on November 7, 2014.  The Court rendered its decision 20 months after the case was argued on March 12, 2013.

The decision adds the Connecticut Supreme Court to a growing list of courts that have found that HIPAA’s lack of a private right of action does not necessarily foreclose action under state statutory and common law.  The Byrne case, however, has added significance, as it appears to be the first decision by the highest court of a state that says that state statutory and judicial causes of action for negligence, including invasion of privacy and infliction of emotional distress, are not necessarily preempted by HIPAA.  Moreover, it recognized that HIPAA may be the appropriate standard of care to determine whether negligence is present.

The Byrne case has important implications for HIPAA matters beyond the rights of individuals to sue under state tort law, using HIPAA regulations as the standard of care.  For example, in the area of business associate agreements (“BAAs”) and subcontractor agreements (“SCAs”), as was discussed in a posting in October 2013 on this blog relating to indemnification provisions,

there should be a negation of potential third party beneficiary rights under the BAA or SCA. For example, HIPAA specifically excludes individual private rights of action for a breach of HIPAA – a [p]arty does not want to run a risk of creating unintentionally a separate contractual private right of action in favor of a third party under a[n indemnification] [p]rovision.

A party should, therefore, endeavor to limit the number of persons that may assert a direct right to sue for indemnification resulting from a breach of a BAA or SCA.  Failing to do so can be costly indeed, especially if the number of states that follow the Byrne case principles increases.

Efforts to use HIPAA regulations as standards for causes of action under state law can be expected to rise as a result of the Byrne decision.  Covered entities, business associates and subcontractors should consider acquiring sufficient cybersecurity insurance with expanded coverage and limits.

As a regulatory lawyer, I frequently find myself parsing words and phrases crafted by legislators and agencies that, all too often, are frustratingly vague or contradictory when applied to a particular real-world and perhaps unanticipated (at the time of drafting) scenario.  So when an agency crafting guidance for a regulated industry has advisors on hand who have first-hand knowledge and expertise about particular real-world occurrences, such as data security breaches, it would seem that agency would be in an ideal position to create relevant, clear, and sufficiently detailed guidance that the affected industry could use to prevent certain occurrences and achieve compliance with the agency’s requirements.

As described in posts on our HIPAA, HITECH & HIT blog, the Federal Trade Commission (FTC) has brought numerous enforcement actions against businesses based on its decision that the businesses’ data security practices were “deceptive” or “unfair” under Section 5 of the FTC Act.  When I last checked the FTC’s website, there were 54 cases listed under the “Privacy and Security” topic and “Data Security” subtopic, one of which is the LabMD case filed on August 29, 2013.  Blog readers may have “discerned” (as do smart businesses when reviewing these cases and trying to figure out what the FTC’s data security “standards” might be) that I am intrigued with the LabMD case.  My intrigue arises, in part, from the stark contrast between the ways the FTC and the Department of Health and Human Services (HHS) identify data security standards applicable to regulated entities.  Of course, HHS’s standards apply specifically to the subset of data that is protected health information (PHI) – precisely the type of data involved in the LabMD case – but that hasn’t stopped the FTC from insisting that its own “standards” also apply to covered entities and business associates regulated by HIPAA.

The latest development in the LabMD case is particularly intriguing.  On May 1, 2014, FTC Chief Administrative Law Judge D. Michael Chappell granted LabMD’s motion to compel deposition testimony as to “what data security standards, if any, have been published by the FTC or the Bureau [of Consumer Protection], upon which … [FTC] Counsel intends to rely at trial to demonstrate that … [LabMD’s] data security practices were not reasonable and appropriate.”  The FTC had fought to prevent this testimony, arguing that its “data security standards” were not relevant to the factual question of whether LabMD’s data security procedures were “unreasonable” in light of those standards.

The FTC does publish a “Guide for Business” on “Protecting Personal Information” on its website.  This “Guide” is very basic (15 pages in total, with lots of pictures), and includes bullet points with tips such as “Don’t store sensitive consumer data on any computer with an Internet connection unless it’s essential for conducting your business.”  The “Guide” does not reference HIPAA, and does not come close to the breadth and depth of the HIPAA regulations (and other HHS published materials) in terms of setting forth the agency’s data security standards.

LabMD’s Answer and Defenses to the FTC’s Complaint was filed on September 17, 2013.  In that document, LabMD admits to having been contacted in May of 2008 by a third party, Tiversa, claiming that it had obtained an “insurance aging report” containing information about approximately 9,300 patients.  Tiversa, a privately-held company that provides “intelligence services to corporations, government agencies and individuals based on patented technologies” and can “locate exposed files … and assist in remediation and risk mitigation,” boasts an impressive advisory board.  According to Tiversa’s website, advisory board member Dr. Larry Ponemon “has extensive knowledge of regulatory frameworks for managing privacy and data security including … health care,” and “was appointed to the Advisory Committee for Online Access & Security” for the FTC.

Perhaps the FTC might consult with Dr. Ponemon in crafting data security standards applicable to the health care industry, since Tiversa apparently identified LabMD’s data security breach in the first place.  If (as published by the Ponemon Institute in its “Fourth Annual Benchmark Study on Patient Privacy and Data Security”) criminal attacks on health care systems have risen 100% since the Ponemon Institute’s first study conducted in 2010, the health care industry remains vulnerable despite efforts to comply with HIPAA and/or discern the FTC’s data privacy standards.  Bringing Dr. Ponemon’s real-world experience to bear in crafting clear and useful FTC data privacy standards (that hopefully complement, not contradict, already-applicable HIPAA standards) might actually help protect PHI from both criminal attack and discovery by “intelligence service” companies like Tiversa.