Despite their distrust of tech giants and their lack of confidence in those companies’ privacy practices, people are unlikely to go out of their way to safeguard their information, according to a survey of nearly 4,000 people across generations.

Per the survey:

  • 33 percent of respondents claim to read end user license agreements (EULAs)
  • 66 percent either skim through or ignore EULAs entirely
  • 47 percent know which permissions their applications have
  • 53 percent use password managers
  • 29 percent reuse the same passwords across websites; among Millennials, that figure rises to 37 percent

Details from Dark Reading.

In the age of digitization, personal information your business holds about your customers (or your customers’ customers) has become a strategic enterprise asset and should be treated as such.

Privacy considerations should be incorporated into your go-to-market strategies.

Gartner offers some tips:

  • Customer-facing policies and communications should clearly explain what information is collected and why, as well as any applicable customer rights.
  • Policies should be readily accessible and understandable for customers — and should be reinforced internally.
  • Managers and senior leaders should echo the standards in small team discussions, all-company meetings and other forms of messaging.
  • There should be a coherent approach to working with third parties. Codify what third parties can and can’t do with user data, and define consequences for failure to comply. Make sure to follow through and monitor compliance.
  • Compare your customers’ privacy appetite to your organization’s overall risk appetite — and be prepared to manage any gaps between the two.

Details from the International Association of Privacy Professionals.

2019 presents businesses with new cybersecurity and privacy challenges: rapid advances in technology, sophisticated new cyberattacks and stricter privacy regulations here and around the world, just to name a few. Businesses that fail to plan risk significant financial and reputational damage.

Those at the front of the fight, but out of the headlines, will:

  • Afford users and consumers true “data self-determination” and transparent control over data while providing a frictionless digital experience.
  • Master what data they collect, who has access to it and how long they have it: “Cradle-to-grave” control over data will win the day.
  • Master baseline data privacy and security, whether defined by statutory schemes, best practices or voluntary industry standards.
  • Remain battle-ready for the critical infrastructure breach (financial, utility and/or transportation).
  • Deploy robust methods to repel business email compromise attacks.
  • Implement tested response plans for digital deep fakes (false video and audio recordings) and other disinformation campaigns.
  • Master vendor and supply chain data security.

Keep your passwords close…and complex, and encrypted and unique, and ever-changing.

In the wake of recent data breaches involving passwords, the French data protection authority, the CNIL, has published guidelines for adequate passwords.

Some highlights include:

  • If you use a password as your sole method of authentication, it must be at least 12 characters long and include uppercase letters, lowercase letters, numbers and special characters.
  • If you use additional measures of protection, the password may be less complex.
  • A passphrase is better than a password, and the CNIL developed a tool for producing passwords from sentences.
  • Your authentication function must (i) use a public algorithm deemed strong and (ii) have a software implementation that is free of known vulnerabilities.
  • NEVER store passwords in cleartext, and require and allow periodic renewal of passwords.

For details, see the full guidelines.
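A minimal sketch of these rules in Python (the character-class checks, scrypt parameters and helper names below are illustrative assumptions, not text from the CNIL guidelines):

```python
import hashlib
import hmac
import os
import re


def meets_cnil_baseline(password: str) -> bool:
    """Password-only authentication: at least 12 characters, mixing
    uppercase letters, digits and special characters."""
    return (
        len(password) >= 12
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[0-9]", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
    )


def hash_password(password: str) -> tuple[bytes, bytes]:
    """Never store passwords in cleartext: keep only a random salt and a
    hash derived with scrypt, a publicly specified, memory-hard algorithm."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest


def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)
```

The scrypt cost parameters shown are a common baseline, not a CNIL requirement; real deployments should tune them to their hardware.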

A number of employers in Illinois are involved in pending class action litigation regarding violations of the Illinois Biometric Information Privacy Act, 740 ILCS 14/1, et seq. (the “BIPA”). The BIPA, which was enacted in 2008, addresses the collection, use and retention of biometric information by private entities. Any information that is captured, stored, or shared based on a person’s biometric identifiers, such as fingerprints, iris scans, or voiceprints, is considered “biometric information.” The Illinois Legislature enacted the BIPA because biometric information is unlike any other unique identifier: it can never be changed once it has been compromised.

The BIPA requires that, before a private entity can obtain and/or possess an individual’s biometric information, it must first inform the individual, or the individual’s legally authorized representative, in writing of the following: (1) that biometric information is being collected or stored; (2) the specific purpose for the collection, storage, and use of the biometric information; and (3) the length of time for the collection, storage, and use of the biometric information. Furthermore, before collecting any biometric information, the private entity must receive a written release for the collection of the biometric information from the individual or the individual’s legally authorized representative after the above notice has been given.

The BIPA additionally requires the private entity to develop a written policy that establishes a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information. That policy must be made available to the public. The collected information must be destroyed once “the initial purpose for collecting or obtaining such information has been satisfied or within 3 years of the individual’s last interaction with the private entity, whichever occurs first.” 740 ILCS 14/15. In the pending cases, the private entity employers failed to obtain informed written consent prior to the collection, storage, and use of fingerprints and other biometric information. The employers also failed to publish any data retention and deletion policies for the biometric information.

The BIPA also restricts a private entity’s right to sell, lease, trade or otherwise profit from a person’s biometric identifier or biometric information. An employer who adheres to the requirements of the BIPA will be able to avoid class action litigation on this issue and maintain compliance with industry standards.

On Tuesday, November 7th from 2:00 to 6:30, Fox Rothschild and Kroll will be presenting the CLE: Staying One Step Ahead: Developments in Privacy and Data.  The CLE will take place at Fox Rothschild’s offices at 353 N. Clark Street in Chicago.  The speakers are Bill Dixon from Kroll, and Dan Farris and Mark McCreary from Fox Rothschild.  Cocktails and networking will follow the presentations.

If you are in the Chicago area on November 7th, I hope you will join us.  Click here to register for this free event.

Elizabeth Litten (Fox Rothschild Partner and HIPAA Privacy & Security Officer) and Mark McCreary (Fox Rothschild Partner and Chief Privacy Officer) will be presenting at the New Jersey Chapter of the Healthcare Financial Management Association on August 30, 2017, from 12:00-1:00 pm Eastern Time.  The presentation is titled: “Can’t Touch That: Best Practices for Health Care Workforce Training on Data Security and Information Privacy.”

This webinar is a comprehensive review of information privacy and data security training, with an emphasis on imparting practical know-how and a fluency with the terminology involving phishing, ransomware, malware and other common threats. We will cover best practices for sensitizing health care industry workers to these threats as part of their ongoing HIPAA compliance efforts and, more generally, for training workers in any business on the proper handling of sensitive data. We will cover the adoption of policies and a training regimen for the entire workforce, as well as tailored training for those in positions responsible for implementing security policies.

More information and a registration link can be found here.

Eric Bixler has posted on the Fox Rothschild Physician Law Blog an excellent summary of the changes coming to Medicare cards as a result of the Medicare Access and CHIP Reauthorization Act of 2015.  Briefly, Centers for Medicare and Medicaid Services (“CMS”) must remove Social Security Numbers (“SSNs”) from all Medicare cards. Therefore, starting April 1, 2018, CMS will begin mailing new cards with a randomly assigned Medicare Beneficiary Identifier (“MBI”) to replace the existing use of SSNs.  You can read the entire blog post here.

The SSN removal initiative represents a major step in the right direction for preventing identity theft of particularly vulnerable populations.  Medicare provides health insurance for Americans aged 65 and older, and in some cases to younger individuals with select disabilities.  Americans are told to avoid carrying their social security card to protect their identity in the event their wallet or purse is stolen, yet many Medicare beneficiaries still carry their Medicare card, which contains their SSN.  CMS stated that people age 65 or older are increasingly the victims of identity theft, as incidents among seniors increased to 2.6 million from 2.1 million between 2012 and 2014.  Yet the change took over a decade of formal CMS research and discussions with other government agencies to materialize, in part due to CMS’ estimates of the prohibitive costs associated with the undertaking.  In 2013, CMS estimated that the costs of two separate SSN removal approaches were approximately $255 million and $317 million, including the cost of efforts to develop, test and implement modifications to the agency’s IT systems (see the United States Government Accountability Office report dated September 2013).

We previously blogged (here and here) about the theft of 7,000 student SSNs at Purdue University and a hack that put 75,000 SSNs at risk at the University of Wisconsin.  In addition, the Fox Rothschild HIPAA & Health Information Technology Blog discussed (here) the nearly $7 million fine imposed on a health plan for including Medicare health insurance claim numbers in plain sight on mailings addressed to individuals.

The “new age” of the internet and dispersed private data is not so new anymore, but that doesn’t mean the law has caught up.  A few years ago, plaintiffs’ cases naming defendants like Google, Apple, and Facebook were at an all-time high, but now plaintiffs’ firms aren’t interested anymore.  According to a report in The Recorder, a San Francisco-based legal newspaper, privacy lawsuits against these three digital behemoths dropped from upwards of thirty cases in the Northern District of California in 2012 to fewer than five in 2015.  Although some have succeeded monumentally—with Facebook writing a $20 million check to settle a case over its use of users’ images, without their permission, in its “sponsored stories” section—this type of payout is not the majority.  One of the issues is that much of the law in this arena hasn’t developed yet.  Since there is no federal privacy law directly pertaining to the digital realm, many complaints depend on older laws such as the Electronic Communications Privacy Act and Stored Communications Act (1986), as well as the Video Privacy Protection Act (1988).  The internet and its capabilities were likely not the target of these laws—they were instead meant to prohibit behavior such as tapping a neighbor’s phone or collecting someone’s videotape rental history.

Further, it now seems unavoidable that personal data will end up somewhere, somehow.  Privacy lawsuits seeking class action status face the same difficulty that data breach class actions do: plaintiffs must prove concrete harm.  In a case later this year, Spokeo v. Robins, the Supreme Court may change this area of law, because it will decide whether an unemployed plaintiff can sue Spokeo for violating the Fair Credit Reporting Act after Spokeo stated that he was wealthy and held a graduate degree.  The issue will turn on proving actual harm.  Companies that regularly handle private information should protect themselves by developing privacy policies that, at the very least, may limit their liability.  The reality is that data is everywhere, and businesses will constantly find creative and profitable ways to use it.

To keep up with the Spokeo v. Robins case, check out the SCOTUSblog case page: http://www.scotusblog.com/case-files/cases/spokeo-inc-v-robins/

New innovations come hand in hand with new privacy issues.  Privacy policies may seem like a last-minute add-on to some app developers, but they are actually an important aspect of an app.  Data breaches are an ever-present risk, and a business’s first defense against potential problems is a privacy policy.

Fordham University in New York hosted its Ninth Law and Information Society Symposium last week, where policy and technology leaders came together to discuss current privacy pitfalls and solutions.  Joanne McNabb, the California attorney general’s privacy education director and a leader in policies affecting the privacy agreements of companies such as Google and Apple, emphasized in a panel that she “wants to make the case for the unread privacy policy.”  She noted that the policy mainly promotes “governance and accountability [and] it forces an organization to be aware of their data practices to some degree, express them and then therefore to stand behind them.”  The privacy policy still matters because it protects businesses from the risks associated with holding large volumes of data.  It is especially necessary for businesses that depend solely on private information, because they are at a higher risk of breach.

The Federal Trade Commission (FTC) has suggested an approach called “Privacy By Design,” a method of embedding privacy protections into the infrastructure of the app.  This approach removes the concern of implementing privacy policies post-development.  Another method of simplifying the privacy policy is the alert prompt that some apps have employed to consistently give consumers notice of when and where their information is used.  McNabb and her fellow panelists found this method of “short, timely notices” helpful in closing the gap between unread privacy policies and the claimed “surprise” of consumers who blame an app for the dissemination of their information.

As the industry moves forward, privacy will become an even greater part of the equation.  Whether a privacy policy is read matters less than the protections it puts in place for all parties involved.  As apps and technologies become more connected to the private preferences of consumers, businesses with a leg up on privacy protections will thrive against the backdrop of those who view privacy as a second-tier requirement.

For more information on “Privacy By Design” click here.