Health Insurance Portability and Accountability Act (HIPAA)

Elizabeth Litten (Fox Rothschild Partner and HIPAA Privacy & Security Officer) and Mark McCreary (Fox Rothschild Partner and Chief Privacy Officer) will be presenting at the New Jersey Chapter of the Healthcare Financial Management Association on August 30, 2017, from 12:00 to 1:00 pm Eastern time. The presentation is titled “Can’t Touch That: Best Practices for Health Care Workforce Training on Data Security and Information Privacy.”

This webinar is a comprehensive review of information privacy and data security training, with an emphasis on imparting practical know-how and fluency with the terminology of phishing, ransomware, malware and other common threats. We will cover best practices for sensitizing health care industry workers to these threats as part of their ongoing HIPAA compliance efforts and, more generally, for training workers in any business on the proper handling of sensitive data. We will also cover the adoption of policies and a training regimen for the entire workforce, as well as tailored training for those in positions responsible for implementing security policies.

More information and a registration link can be found here.

On July 23, 2017, Washington State will become the third state (after Illinois and Texas) to statutorily restrict the collection, storage and use of biometric data for commercial purposes. The Washington legislature explained its goal in enacting Washington’s new biometrics law:

The legislature intends to require a business that collects and can attribute biometric data to a specific uniquely identified individual to disclose how it uses that biometric data, and provide notice to and obtain consent from an individual before enrolling or changing the use of that individual’s biometric identifiers in a database.

— Washington Laws of 2017, ch. 299 § 1.  (See complete text of the new law here).

Washington’s new biometrics act governs three key aspects of commercial use of biometric data:

  1. collection, including notice and consent,
  2. storage, including protection and length of time, and
  3. use, including dissemination and permitted purposes.

The law focuses on “biometric identifiers,” which it defines as

data generated by automatic measurements of an individual’s biological characteristics, such as a fingerprint, voiceprint, eye retinas, irises, or other unique biological patterns or characteristics that is used to identify a specific individual.

— Id. § 3(1).

The law excludes all photos, video or audio recordings, or information “collected, used, or stored for health care treatment, payment or operations” subject to HIPAA from the definition of “biometric identifiers.” Id.  It also expressly excludes biometric information collected for security purposes (id. § 3(4)), and does not apply to financial institutions subject to the Gramm-Leach-Bliley Act.  Id. § 5(1).  Importantly, the law applies only to biometric identifiers that are “enrolled in” a commercial database, which it explains means capturing a biometric identifier, converting it to a reference template that cannot be reconstructed into the original output image, and storing it in a database that links the biometric identifier to a specific individual.  Id. §§ 2, 3(5).
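For readers who think in code, the statute’s “enrollment” concept maps onto a simple pipeline: capture a biometric identifier, convert it to a reference template that cannot be reconstructed into the original image, and store it in a database that links the template to a specific individual. The sketch below is purely illustrative and is not drawn from the statute or any vendor implementation; the function names are ours, and a salted hash stands in for the feature-extraction templates that real biometric systems use.

```python
import hashlib
import os

# Illustrative only: capture -> non-reversible reference template -> record
# linked to a specific individual (the act the Washington law regulates).
# A salted SHA-256 digest is a stand-in for a real biometric template,
# which in practice comes from feature extraction, not hashing.

def make_reference_template(raw_capture: bytes, salt: bytes) -> str:
    """Derive a one-way reference template that cannot be reconstructed
    into the original fingerprint, iris, or voice capture."""
    return hashlib.sha256(salt + raw_capture).hexdigest()

def enroll(database: dict, person_id: str, raw_capture: bytes) -> None:
    """Store the template in a database that ties it to a named individual."""
    salt = os.urandom(16)
    database[person_id] = {
        "template": make_reference_template(raw_capture, salt),
        "salt": salt.hex(),
    }

db: dict = {}
enroll(db, "customer-123", b"<raw fingerprint scan bytes>")
```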

Statutory Ambiguity Creates Confusion


Unfortunately, ambiguous statutory language, combined with rapidly advancing technology, virtually guarantees confusion in each of the three key aspects of the new law.

Regarding collection, the new law states that a company may not “enroll a biometric identifier in a database for a commercial purpose” unless it: (1) provides notice, (2) obtains consent, or (3) “provid[es] a mechanism to prevent the subsequent use of a biometric identifier for a commercial purpose.”  Id. § 2(1).  Confusingly, the law does not specify what type of “notice” is required, except that it must be “given through a procedure reasonably designed to be readily available to affected individuals,” and its adequacy will be “context-dependent.”  Id. § 2(2).

If consent is obtained, a business may sell, lease or disclose biometric data to others for commercial use.  Id. § 2(3).  Absent consent, a business may not disclose biometric data to others except in very limited circumstances listed in the statute, including in litigation, if necessary to provide a service requested by the individual or as authorized by other law. Id. However, the new law may ultimately be read by courts or regulators as including a “one disclosure” exception because it says disclosure is allowed to any third party “who contractually promises that the biometric identifier will not be further disclosed and will not be enrolled in a database for a commercial purpose” inconsistent with the new law.  Id.

The new law also governs the storage of biometric identifiers.  Any business holding biometric data “must take reasonable care to guard against unauthorized access to and acquisition of biometric identifiers that are in the possession or control of the person.”  Id. § 2(4)(a).  Moreover, businesses are barred from retaining biometric data for any longer than “reasonably necessary” to provide services, prevent fraud, or comply with a court order.  Id. § 2(4)(b).  Here too the law fails to provide certainty: it sets no bright-line time limits on retention after a customer relationship ends, and it does not explain how to apply these rules to ongoing but intermittent customer relationships.

The Washington legislature also barred companies that collect biometric identifiers from using them for any other purpose “materially inconsistent” with the purpose for which they were originally collected unless they first obtain consent.  Id. § 2(5).  Confusingly, even though notice alone is enough to authorize the original collection, it is not sufficient by itself to authorize a new use.
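Read together, sections 2(1) and 2(5) create an asymmetry worth spelling out: any one of notice, consent, or an opt-out mechanism authorizes the original enrollment, but only fresh consent authorizes a use that is “materially inconsistent” with the original purpose. The sketch below is simply our shorthand for that logic; the field names are hypothetical, and the “materially inconsistent” test is a crude stand-in for what is in practice a legal judgment rather than a string comparison.

```python
from dataclasses import dataclass

@dataclass
class EnrollmentRecord:
    notice_given: bool        # § 2(1): notice via a "reasonably designed" procedure
    consent_obtained: bool    # § 2(1): affirmative consent
    opt_out_mechanism: bool   # § 2(1): mechanism to prevent later commercial use
    original_purpose: str

def may_enroll(r: EnrollmentRecord) -> bool:
    # Any one of the three paths authorizes the original collection/enrollment.
    return r.notice_given or r.consent_obtained or r.opt_out_mechanism

def may_use_for_new_purpose(r: EnrollmentRecord, new_purpose: str,
                            new_consent: bool) -> bool:
    # § 2(5): a "materially inconsistent" use requires consent; notice alone
    # is not enough, even though it sufficed for the original enrollment.
    materially_inconsistent = new_purpose != r.original_purpose  # crude stand-in
    return (not materially_inconsistent) or new_consent
```

The point of the sketch is the asymmetry in the two return paths, not the particulars: for a materially inconsistent new use, consent is the only input that matters.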

Interestingly, the new Washington law makes a violation of its collection, storage or use requirements a violation of the Washington Consumer Protection Act (the state analog to Section 5 of the FTC Act).  Id. § 4(1).  However, it specifically excludes any private right of action under the statute and provides for enforcement solely by the Washington State Attorney General, leaving Illinois’s Biometric Information Privacy Act as the only state biometrics law authorizing private enforcement.  Id. § 4(2).

Washington’s new law was not without controversy.  Several state legislators criticized it as imprecise and pushed to more specifically detail the activities it regulates; proponents argued that its broad language was necessary to allow flexibility for future technological advances. Ultimately, the bill passed with less than unanimous approval and was signed into law by Washington’s governor in mid-May.  It takes effect on July 23, 2017.  A similar, but not identical, Washington law takes effect the same day governing the collection, storage and use of biometric identifiers by state agencies.  (See Washington Laws of 2017, ch. 306 here).

In one of the best examples we have ever seen that it pays to be HIPAA compliant (and can cost A LOT when you are not), the U.S. Department of Health and Human Services, Office for Civil Rights, issued the press release reproduced below about its $2.5 million settlement with CardioNet.  This is worth a quick read and some soul searching if your company has not been meeting its HIPAA requirements.

FOR IMMEDIATE RELEASE
April 24, 2017
Contact: HHS Press Office
202-690-6343
media@hhs.gov

$2.5 million settlement shows that not understanding HIPAA requirements creates risk

The U.S. Department of Health and Human Services, Office for Civil Rights (OCR), has announced a Health Insurance Portability and Accountability Act of 1996 (HIPAA) settlement based on the impermissible disclosure of unsecured electronic protected health information (ePHI). CardioNet has agreed to settle potential noncompliance with the HIPAA Privacy and Security Rules by paying $2.5 million and implementing a corrective action plan. This settlement is the first involving a wireless health services provider, as CardioNet provides remote mobile monitoring of and rapid response to patients at risk for cardiac arrhythmias.

In January 2012, CardioNet reported to the HHS Office for Civil Rights (OCR) that a workforce member’s laptop was stolen from a parked vehicle outside of the employee’s home. The laptop contained the ePHI of 1,391 individuals. OCR’s investigation into the impermissible disclosure revealed that CardioNet had an insufficient risk analysis and risk management processes in place at the time of the theft. Additionally, CardioNet’s policies and procedures implementing the standards of the HIPAA Security Rule were in draft form and had not been implemented. Further, the Pennsylvania-based organization was unable to produce any final policies or procedures regarding the implementation of safeguards for ePHI, including those for mobile devices.

“Mobile devices in the health care sector remain particularly vulnerable to theft and loss,” said Roger Severino, OCR Director. “Failure to implement mobile device security by Covered Entities and Business Associates puts individuals’ sensitive health information at risk. This disregard for security can result in a serious breach, which affects each individual whose information is left unprotected.”

The Resolution Agreement and Corrective Action Plan may be found on the OCR website at https://www.hhs.gov/hipaa/for-professionals/compliance-enforcement/agreements/cardionet

HHS has gathered tips and information to help protect and secure health information when using mobile devices:  https://www.healthit.gov/providers-professionals/your-mobile-device-and-health-information-privacy-and-security

To learn more about non-discrimination and health information privacy laws, your civil rights, and privacy rights in health care and human service settings, and to find information on filing a complaint, visit us at http://www.hhs.gov/hipaa/index.html

Last week we posted about A Brief Primer on the NIST Cybersecurity Framework.  Our partner and HIPAA/HITECH expert Elizabeth Litten took the NIST Cybersecurity Framework and created a post for the HIPAA, HITECH and Health Information Technology Blog titled “How the NIST Cybersecurity Framework Can Help With HIPAA Compliance: 3 Tips,” which can be read here.  For those facing any HIPAA-related issues, it is a worthwhile read.

I strongly urge every covered entity and business associate faced with a Business Associate Agreement that includes indemnification provisions to read Michael Kline’s “List of Considerations” before signing.  Michael’s list, included in an article he wrote that was recently published in the American Health Lawyers Association’s “AHLA Weekly” and available here, highlights practical and yet not obvious considerations.  For example, will indemnification jeopardize a party’s cybersecurity or other liability coverage?

Data use and confidentiality agreements used outside of the HIPAA context may also include indemnification provisions that are triggered in the event of a privacy or security breach.  Parties to these agreements should take a close look at these “standard” provisions and Michael’s list and proceed carefully before agreeing to indemnify and/or be indemnified by the other party.

Innovative health care-related technology and developing telemedicine products have the potential to dramatically change the way in which health care is accessed.  The Federation of State Medical Boards (FSMB) grappled with some of the complexities that arise as information is communicated electronically in connection with the provision of medical care and issued a Model Policy in April of 2014 to guide state medical boards in deciding how to regulate the practice of “telemedicine,” a term whose definition is likely to become outdated as quickly as the next technology or product is developed.

Interestingly, the development and use of medical devices and communication technology seems to outpace agency definitions and privacy laws as quickly as hackers outpace security controls.  So how can we encourage innovation and adopt new models without throwing privacy out with the bathwater of the traditional, in-person patient-physician relationship?  A first step is to see and understand the gaps in privacy protection and figure out how they can be narrowed.

HIPAA does not protect all information, even when the information is clearly health information and a specific individual can be identified in connection with the health information.  A guidance document issued jointly by the U.S. Department of Health and Human Services (HHS) and the Food and Drug Administration (FDA) on October 2, 2014 (FDA Guidance Document) contains the agencies’ “non-binding recommendations” to assist the medical device industry with cybersecurity.  The FDA Guidance Document defines “cybersecurity” as “the process of preventing unauthorized access, modification, misuse or denial of use, or the unauthorized use of information that is stored, accessed, or transferred from a medical device to an external recipient.”  If my medical device creates, receives, maintains, or transmits information related to my health status or condition, I likely expect that information to be secure and private – but unless and until my doctor (or other covered entity or business associate) interfaces with it, it is not protected health information (PHI) under HIPAA.

The FSMB’s Model Policy appropriately focused on the establishment of the physician-patient relationship.  In general, HIPAA protects information created, received, maintained or transmitted in connection with that relationship.  A medical device manufacturer, electronic health application developer, or personal health record vendor that is not a “health care provider” or other covered entity as defined under HIPAA, and is not providing services on behalf of a covered entity as a business associate, can collect or use health-related information from an individual without abiding by HIPAA’s privacy and security obligations.  The device, health app, or health record may still be of great value to the individual, but the individual should recognize that the information it creates, receives, maintains or transmits is not HIPAA-protected until it comes from or ends up with a HIPAA covered entity or business associate.

The FDA Guidance Document delineates a number of cybersecurity controls that manufacturers of FDA-regulated medical devices should develop, particularly if the device has the capability of connecting (wirelessly or hard-wired) to another device, the internet, or portable electronic media.  Perhaps these controls will become standard features of medical devices, but they might also be useful to developers of other types of health-related products marketed to or purchased by consumers.  In the meantime, though, it’s important to remember that your device is not your doctor, and HIPAA may not be protecting the health data created, received, maintained or transmitted by your medical device.

As a regulatory lawyer, I frequently find myself parsing words and phrases crafted by legislators and agencies that, all too often, are frustratingly vague or contradictory when applied to a particular real-world and perhaps unanticipated (at the time of drafting) scenario.  So when an agency crafting guidance for a regulated industry has advisors on hand who have first-hand knowledge and expertise about particular real-world occurrences, such as data security breaches, it would seem that agency would be in an ideal position to create relevant, clear, and sufficiently detailed guidance that the affected industry could use to prevent certain occurrences and achieve compliance with the agency’s requirements.

As described in posts on our HIPAA, HITECH & HIT blog, the Federal Trade Commission (FTC) has brought numerous enforcement actions against businesses based on its decision that the businesses’ data security practices were “deceptive” or “unfair” under Section 5 of the FTC Act.  When I last checked the FTC’s website, there were 54 cases listed under the “Privacy and Security” topic and “Data Security” subtopic, one of which is the LabMD case filed on August 29, 2013.  Blog readers may have “discerned” (as do smart businesses when reviewing these cases and trying to figure out what the FTC’s data security “standards” might be) that I am intrigued with the LabMD case.  My intrigue arises, in part, from the stark contrast between the ways the FTC and the Department of Health and Human Services (HHS) identify the data security standards applicable to the entities they regulate.  Of course, HHS’s standards apply specifically to the subset of data that is protected health information (PHI) – precisely the type of data involved in the LabMD case – but that hasn’t stopped the FTC from insisting that its own “standards” also apply to covered entities and business associates regulated by HIPAA.

The latest development in the LabMD case is particularly intriguing.  On May 1, 2014, FTC Chief Administrative Law Judge D. Michael Chappell granted LabMD’s motion to compel deposition testimony as to “what data security standards, if any, have been published by the FTC or the Bureau [of Consumer Protection], upon which … [FTC] Counsel intends to rely at trial to demonstrate that … [LabMD’s] data security practices were not reasonable and appropriate.”  The FTC had fought to prevent this testimony, arguing that the FTC’s “data security standards” are “not relevant to” the factual question of whether LabMD’s data security procedures were “unreasonable” in light of the FTC’s standards.

The FTC does publish a “Guide for Business” on “Protecting Personal Information” on its website.  This “Guide” is very basic (15 pages in total, with lots of pictures), and includes bullet points with tips such as “Don’t store sensitive consumer data on any computer with an Internet connection unless it’s essential for conducting your business.”  The “Guide” does not reference HIPAA, and does not come close to the breadth and depth of the HIPAA regulations (and other HHS published materials) in terms of setting forth the agency’s data security standards.

LabMD’s Answer and Defenses to the FTC’s Complaint was filed on September 17, 2013.  In that document, LabMD admits to having been contacted in May of 2008 by a third party, Tiversa, claiming that it had obtained an “insurance aging report” containing information about approximately 9,300 patients.  Tiversa, a privately-held company that provides “intelligence services to corporations, government agencies and individuals based on patented technologies” and can “locate exposed files … and assist in remediation and risk mitigation,” boasts an impressive advisory board.  According to Tiversa’s website, advisory board member Dr. Larry Ponemon “has extensive knowledge of regulatory frameworks for managing privacy and data security including … health care,” and “was appointed to the Advisory Committee for Online Access & Security” for the FTC.

Perhaps the FTC might consult with Dr. Ponemon in crafting data security standards applicable to the health care industry, since Tiversa apparently identified LabMD’s data security breach in the first place.  If, as the Ponemon Institute reported in its “Fourth Annual Benchmark Study on Patient Privacy and Data Security,” criminal attacks on health care systems have risen 100% since the Institute’s first study in 2010, then the health care industry remains vulnerable despite efforts to comply with HIPAA and/or discern the FTC’s data privacy standards.  Bringing Dr. Ponemon’s real-world experience to bear in crafting clear and useful FTC data privacy standards (that hopefully complement, not contradict, already-applicable HIPAA standards) might actually help protect PHI from both criminal attack and discovery by “intelligence service” companies like Tiversa.

Imagine you have completed your HIPAA risk assessment and implemented a robust privacy and security plan designed to meet each criterion of the Omnibus Rule. You think that, should you suffer a data breach involving protected health information as defined under HIPAA (PHI), you can show the Secretary of the Department of Health and Human Services (HHS) and its Office for Civil Rights (OCR), as well as media reporters and others, that you exercised due diligence and should not be penalized. Your expenditure of time and money will help ensure your compliance with federal law.

Unfortunately, however, HHS is not the only sheriff in town when it comes to data breach enforcement. In a formal administrative action, as well as two separate federal court actions, the Federal Trade Commission (FTC) has been battling LabMD for the past few years in a case that gets more interesting as the filings and rulings mount (In the Matter of LabMD, Inc., Docket No. 9357 before the FTC). LabMD’s CEO Michael Daugherty recently published a book on the dispute with a title analogizing the FTC to the devil, with the byline, “The Shocking Expose of the U.S. Government’s Surveillance and Overreach into Cybersecurity, Medicine, and Small Business.” Daugherty issued a press release in late January attributing the shutdown of operations of LabMD primarily to the FTC’s actions.

Among many other reasons, this case is interesting because of the dual jurisdiction of the FTC and HHS/OCR over breaches that involve individual health information.

On one hand, the HIPAA regulations detail a specific, fact-oriented process for determining whether an impermissible disclosure of PHI constitutes a breach under the law. The pre-Omnibus Rule breach analysis involved consideration of whether the impermissible disclosure posed a “significant risk of financial, reputational, or other harm” to the individual whose PHI was disclosed. The post-Omnibus Rule breach analysis presumes that an impermissible disclosure is a breach, unless a risk assessment that includes consideration of at least four specific factors demonstrates there was a “low probability” that the individual’s PHI was compromised.
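The four factors the regulation lists (45 C.F.R. § 164.402) are the nature and extent of the PHI involved, the unauthorized person who used or received it, whether the PHI was actually acquired or viewed, and the extent to which the risk has been mitigated. As a rough mental model only (the names below are ours, and the real determination is a fact-specific, documented legal judgment rather than a boolean computation), the post-Omnibus presumption works like this:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BreachRiskAssessment:
    # The four factors of 45 C.F.R. § 164.402; each flag is True if that
    # factor, on the facts, points toward a low probability of compromise.
    phi_nature_and_extent_low_risk: bool
    recipient_low_risk: bool
    phi_not_actually_acquired_or_viewed: bool
    risk_fully_mitigated: bool

def is_reportable_breach(assessment: Optional[BreachRiskAssessment]) -> bool:
    """Post-Omnibus Rule logic: an impermissible disclosure is presumed to be
    a breach unless a documented risk assessment demonstrates a low
    probability that the PHI was compromised."""
    if assessment is None:
        return True  # no risk assessment performed: the presumption stands
    low_probability = all([
        assessment.phi_nature_and_extent_low_risk,
        assessment.recipient_low_risk,
        assessment.phi_not_actually_acquired_or_viewed,
        assessment.risk_fully_mitigated,
    ])
    return not low_probability
```

The structural point is the default: absent a documented risk assessment demonstrating a low probability of compromise, the impermissible disclosure is treated as a breach.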

In stark contrast to HIPAA, the FTC files enforcement actions based upon its decision that an entity’s data security practices are “unfair”, but it has not promulgated regulations or issued specific guidance as to how or when a determination of “unfairness” is made. Instead, the FTC routinely alleges that entities’ data security practices are “unfair” because they are not “reasonable” – two vague words that leave entities guessing about how to become FTC compliant.

In 2013, in an administrative action, LabMD challenged the FTC’s authority to institute these types of enforcement actions. LabMD argued, in part, that the FTC does not have the authority to bring actions under the “unfairness” prong of Section 5 of the FTC Act. LabMD further argued that there should only be one sheriff in town – not both HHS and the FTC. Not surprisingly, in January 2014, the FTC denied the motion to dismiss, finding that HIPAA requirements are “largely consistent with the data security duties” of the FTC under the FTC Act. The opinion speaks of “data security duties” and “requirements” of the FTC Act, but these “duties” and “requirements” are not spelled out (much less even mentioned) in the FTC Act. How, then, can anyone determine that the standards are consistent? Instead, entities that suffer a data security incident must comply with the detailed breach analysis under HIPAA while navigating the absence of any clear guidance under the FTC Act.

In a March 10, 2014 ruling, the administrative law judge ruled that he would permit LabMD to depose an FTC designee regarding consumers harmed by LabMD’s allegedly inadequate security practices. However, the judge also ruled that LabMD could not “inquire into why, or how, the factual bases of the allegations … justify the conclusion that [LabMD] violated the FTC Act.” So while the LabMD case may eventually provide some guidance as to the factual circumstances involved in an FTC determination that data security practices are “unfair” and have caused, or are likely to cause, consumer harm, the legal reasoning behind the FTC’s determinations is likely to remain a mystery.

In addition to the challenges mounted by LabMD, Wyndham Worldwide Corp. has also spent the past year contesting the FTC’s authority to pursue enforcement actions based upon companies’ alleged “unfair” or “unreasonable” data security practices. On Monday, April 7, 2014, the United States District Court for the District of New Jersey sided with the FTC and denied Wyndham’s motion to dismiss the FTC’s complaint. The Court found that Section 5 of the FTC Act permits the FTC to regulate data security, and that the FTC is not required to issue formal rules about what companies must do to implement “reasonable” data security practices.

These recent victories may cause the “other sheriff” – the FTC – to ramp up its efforts to regulate data security practices. Unfortunately, because it does not appear that the FTC will issue any guidance in the near future about what companies can do to ensure that their data security practices are reasonable, companies must closely monitor the FTC’s actions, adjudications and other signals in an attempt to predict what the FTC views as data security best practices.

We are pleased to announce the launch of our Data Breach 411 App, which is available for free download in the iTunes store at:  https://itunes.apple.com/us/app/data-breach-411/id726115837?mt=8

The Data Breach 411 App is a data breach survival guide designed to tackle a general counsel’s worst nightmare:  the loss or theft of sensitive data.

Features of the app include:

1.  State Breach Notification Statutes:  An alphabetical listing of the 46 states that have data breach notification statutes in place and links to relevant information.

2.  HIPAA/HITECH Statutes:  Breach notification rules and other pertinent information related to the loss or theft of protected health information.

3.  Other Resources:  Links to credit agencies, credit monitoring services and the FTC Website, as well as a section on COPPA — the Children’s Online Privacy Protection Act.

The post below originally appeared on our HIPAA, HITECH & HIT blog on October 1.  It is authored by our partner, Michael Kline.  You can contact Michael at mkline@foxrothschild.com.

 

A party (Party) to a HIPAA Business Associate Agreement (BAA) or Subcontractor Agreement (SCA), whether a covered entity (CE), business associate (BA) or subcontractor (SC), may struggle with the question of whether to agree to, demand, request, submit to, negotiate or permit an indemnification provision (Provision) respecting the counterparty (Counterparty) under a BAA or SCA.  On January 25, 2013, the U.S. Department of Health and Human Services published “Sample Business Associate Agreement Provisions,” which were silent on the matter of indemnification.  Nonetheless, inclusion of Provisions is often a major question for Parties to BAAs and SCAs.

There are a number of common themes that, at a minimum, may help a Party determine in a specific case whether the BAA or SCA should include such a Provision.  Because a breach of HIPAA, especially in the areas of privacy and security, can result in enormous financial liability, humiliating publicity and large monetary penalties, appropriate care should be given to such Provisions.  In addition to the items listed below, the relative bargaining power of the Parties may be a significant factor.  Below are ten items for consideration.

1.         A CE or BA may assert that it has a “standard form” of BAA that includes a Provision running solely for such Party’s benefit.  The Counterparty may legitimately push back and demand that such Provision be removed, or at least that the BAA be revised to include a reciprocal Provision for its benefit.  (A Party may also ask its Counterparty whether the Counterparty has ever previously executed a BAA or SCA that does not contain such a Provision.)

2.         Before a Party agrees to any Provision whereby it is indemnifying the Counterparty, it should find out from its own liability insurance carrier whether such a Provision is permitted under such Party’s insurance policy or if agreeing to such a Provision will have any adverse impact on its insurance coverage.

3.         If a Provision is to be included (and perhaps as a general rule), there should be a negation of potential third party beneficiary rights under the BAA or SCA.  For example, HIPAA specifically excludes individual private rights of action for a breach of HIPAA – a Party does not want to run the risk of unintentionally creating a separate contractual private right of action in favor of a third party under a Provision.

4.         A Party should endeavor to limit its own maximum dollar amount exposure for indemnification.  For this reason alone, a Provision should be viewed as not standard.

5.         A Party should endeavor to limit the time period for indemnification under the Provision.

6.         If the BAA or SCA includes a Provision, a Party may desire to limit its monetary liability for any and all breaches under the BAA or SCA solely to the indemnification obligations under the Provision.

7.         A Party should consider expressly limiting its monetary liability under the Provisions to events directly and proximately caused by a material breach of the BAA and only to the extent that the material breach of such Party caused damages to the Counterparty.

8.         Where a BA or SC is a lawyer or law firm that is counsel (or another licensed person who has professional and ethical obligations) to a Counterparty, consider whether there are professional responsibilities of attorneys (or such other licensed person) respecting the negotiation of the Provision, including notifying the Counterparty that it should consider retaining separate counsel to advise it on the Provision (and other terms).

9.         If a regulatory authority exacts a monetary penalty from a Party in connection with a HIPAA breach or such Party is found to have been involved in a HIPAA breach, the right to indemnification of such Party by the Counterparty under a Provision may be limited or not enforceable at all as a matter of public policy.

10.       If a Provision is to be included, attention should be given to its impact on corollary matters, such as limitation on recovery of consequential, special, punitive and other damages and attorneys’ fees and legal expenses.

In light of the above and other potential considerations, careful thought should be given to whether a Provision is appropriate in a specific case and is worth what could become a serious and potentially irresolvable stumbling block to the underlying business relationship.  In extreme cases, the matter of indemnification and its complexities and consequences could even result in termination of the business relationship between the Parties.