The Centers for Medicare & Medicaid Services (CMS) recently published proposed rules setting forth the “Stage 2” criteria that eligible providers (EPs), eligible hospitals (EHs), and critical access hospitals (CAHs) (collectively, “providers”) would be required to meet in order to qualify for Medicare and/or Medicaid incentive payments for the use of electronic health records (EHRs) (the “Stage 2 Proposal”). The Stage 2 Proposal is a small-font, acronym-laden, tediously detailed 131-page document that modifies and expands upon the criteria in the “Stage 1” final rule published on July 28, 2010, and is likely to be of interest primarily to providers seeking to receive, or continue receiving, added payments from CMS for adopting and “meaningfully using” EHR.
The Stage 2 Proposal is not, at first glance, particularly relevant reading for those of us generally interested in issues involving the privacy and security of personal information — or even for those of us more specifically interested in the privacy and security of protected health information (PHI). Still, two new provisions caught my attention, because they measure the meaningful use required for provider incentive payments based not simply on the providers’ use of EHR, but on their patients’ use of it.

One provision of the Stage 2 Proposal would require a provider to give at least 50% of its patients the ability to timely “view online, download, and transmit” their health information (“timely” meaning within 4 business days after the provider receives it, and subject to the provider’s discretion to withhold certain information). Moreover, it would require that more than 10% of those patients (or their authorized representatives) actually view, download, or transmit the information to a third party. There is an exception for providers that conduct a majority (more than 50%) of their patient encounters in a county that does not have 50% or more of “its housing units with 4Mbps broadband availability as per the most recent information available from the FCC” (whew!) as of the first day of the applicable EHR reporting period.

For a continuation of this post, please refer to our sister blog at http://hipaahealthlaw.foxrothschild.com/
By Elizabeth Litten

The widely publicized pre-Christmas breach of confidential data held by Stratfor Global Intelligence Service (“Stratfor”), a company specializing in data security, reminded me that very little (if any) electronic information is truly secure. If Stratfor’s data can be hacked into, and if the health information of nearly 5 million military health plan (TRICARE) members maintained by multi-billion-dollar Department of Defense contractor Science Applications International Corporation (SAIC) can be accessed (the subject of a five-part series of postings — Parts 1, 2, 3, 4 and 5 — on Fox Rothschild’s HIPAA, HITECH and HIT Blog), can we trust that any electronically transmitted or stored information is really safe?

I had the pleasure of having lunch yesterday with my friend Al, an IT guru who has worked in hospitals for years. Al understands and appreciates the need for privacy and security of information, and has the technological expertise to know where and how data can be hacked into or leaked out. Perhaps not surprisingly, Al does not do his banking online, and tries to avoid making online credit card purchases.

Al and I discussed the proliferation of iPhones and other mobile technology among physicians and staff in hospitals and other settings, a topic recently discussed in a newsletter published by the American Medical Association. Quick access to a patient’s electronic health record (EHR) is convenient and may even be life-saving in some circumstances, but these mobile devices create additional portals for access to personal information that should be protected and secured. Encryption technology — and, perhaps most significantly, the actual use of that technology — barely keeps pace with the exponential rate at which we create and transmit data electronically.

On the other hand, trying to reverse the exponential growth of electronic communications and transactions would be futile and probably counterproductive. The horse is out of the gate, and expecting it to stop mid-stride and return on a false-start call is irrational. The horse will race ahead just as surely as my daughter will text and check her Facebook page, my son will recharge his iPad, and I will turn around and head back to my office if I forget my iPhone. We want and need technology, but we seem to forget, or fail to fully understand, the vast, unprotected, and ever-expanding universe into which we send information when we use it.

If we expect breaches — or at least question our assumption that personal information will be protected — perhaps we will get better at discerning how and when we disclose it. An in-person conversation or transaction (for example, when Al goes to his bank in person, or when a physician speaks directly to another physician about a patient’s care) is less likely to be accessed and used inappropriately than an electronic one. We can better assess the risks and benefits of communicating information electronically when we appreciate the security frailties inherent in electronic communication and storage.

Perhaps Congress should take the lead in enacting laws that help protect against data breaches that could compromise “critical infrastructure systems” (as proposed in the “PRECISE Act” introduced by Rep. Daniel E. Lungren (R-CA)). But more comprehensive, potentially expensive, and/or use-impeding cybersecurity laws might have the effect of tripping the racehorse mid-lap rather than controlling its pace or keeping it safely on course.