Show me the money and I’ll show you my data.

“How much would you charge a marketer to use your personally identifiable information for general advertising purposes?”

About 60 percent of 2,000 U.S. adults polled in November 2018 were willing to share personal data for a price. A majority (57 percent) said it was worth a minimum of $10, while 43 percent valued it at less than $10 (28 percent) or would share it without compensation (15 percent).
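The quoted shares can be reconciled with a quick arithmetic check; the variable names below are mine, and the figures are simply those reported in the poll.

```python
# Survey shares as reported above (percentages of respondents).
at_least_10 = 57   # valued their data at $10 or more
under_10 = 28      # valued it at less than $10
no_charge = 15     # would share it without compensation

assert under_10 + no_charge == 43               # the 43 percent valuing it below $10
assert at_least_10 + under_10 + no_charge == 100
```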

The higher a respondent’s income, the more they tended to demand for their data.

This trend in how individuals regard their data may become even more interesting in the coming year: the California Consumer Privacy Act (CCPA), which comes into effect in 2020, will allow companies to provide individuals with financial incentives for their information if certain conditions are met.

Details on the survey are available from MarTech Today.

Data rights > data ownership?

That’s the position taken by Privacy International in its response to the recent op-ed by the artist will.i.am in The Economist, which called for tech giants to pay individuals for their data:

  • Data rights offer a system of control and protection that is much more comprehensive than ownership, and these rights continue to exist even after you share your data with others. They apply to data that others collect about you with or without your knowledge and they also apply to the insights and conclusions that they make about you.
  • Existing data protection laws, like the EU General Data Protection Regulation (GDPR), put a strong data rights system in place. Now is the time to focus efforts on making that system easy to use and widely adopted.
  • As powerful as data rights are, they are not a silver bullet. Market dominance and other distortions are a growing concern which should be addressed as well.

Read Privacy International’s Full Argument.

The Illinois Supreme Court’s Ruling

On January 25, 2019, the Illinois Supreme Court issued its long-awaited opinion in Rosenbach v. Six Flags Entertainment Corp., ruling that the Illinois Biometric Privacy Act, 740 ILCS 14/1 et seq. (“BIPA”), does not require an actual injury for a plaintiff to be considered “aggrieved” under the Act. The ruling, anticipated based on the court’s comments during oral argument, is widely expected to open the floodgates on class actions brought under BIPA, given the statutory damages available to plaintiffs. Indeed, in the first week since the ruling, at least 10 new BIPA class actions have been filed.

Under BIPA, parties that possess biometric identifiers (e.g., fingerprints, retina scans and voiceprints) are prohibited from (i) selling, leasing, trading or otherwise profiting from such identifiers; and (ii) otherwise disclosing or disseminating such information unless the individual consents to such disclosure. BIPA imposes penalties of $1,000 per negligent violation of the Act and $5,000 per intentional or reckless violation (or actual damages, whichever is greater). In addition, BIPA allows for the recovery of reasonable attorneys’ fees and costs, including expert witness fees.
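The per-violation exposure described above can be sketched schematically. This is an illustration only, not legal advice; the function and parameter names are mine, and it assumes the greater-of-liquidated-or-actual-damages reading of the statute described in the preceding paragraph.

```python
def bipa_damages(actual_damages: float, intentional_or_reckless: bool) -> float:
    """Return the greater of liquidated and actual damages for one violation."""
    liquidated = 5000.0 if intentional_or_reckless else 1000.0
    return max(liquidated, actual_damages)

# A negligent violation with no provable actual harm still yields $1,000 --
# which is why statutory damages make BIPA class actions so attractive.
print(bipa_damages(0, intentional_or_reckless=False))    # 1000.0
print(bipa_damages(7500, intentional_or_reckless=True))  # 7500.0
```

Multiplied across a class of thousands of employees or customers, even the negligent-violation figure explains the anticipated surge in filings.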

What Next?

The court’s ruling stands at odds with the Northern District of Illinois’ recent decision in Rivera v. Google, in which that court ruled that a party that has not suffered an actual injury does not satisfy the “injury in fact” requirement of Article III standing to pursue a BIPA claim in federal court. Consequently, expect future BIPA cases to be filed in Illinois state courts.

While the Illinois Supreme Court’s ruling opens the door for an onslaught of BIPA litigation, certain defenses to such actions remain untested and will surely be litigated. For one, expect the issue of whether a plaintiff has consented to the use of his or her biometric information to be hotly contested. For plaintiffs who are employees, that likely means arguing over a company’s policies contained in a handbook or employment agreement. Indeed, employers would be well served to review their policies and agreements to specifically address their potential collection of employees’ biometric information.

Another line of defense may rest in a defendant’s ability to remove a case to federal court and then have it dismissed. A defendant could avoid liability to a plaintiff who has suffered no actual injury if it can successfully invoke diversity jurisdiction to remove the case and then argue that the plaintiff lacks Article III standing.

One thing is for sure – expect Illinois state courts to become a hotbed of BIPA litigation.

The European General Data Protection Regulation (GDPR) comes into force on May 25, 2018.  This gives companies only two months to prepare for and comply with the GDPR. Companies should be conducting data mapping to identify all cross-border transfers of personal data so that they can determine the best way to comply with the GDPR requirements.

The GDPR has been perhaps the most widely discussed privacy regulation since its approval by the EU Parliament on April 14, 2016, because of the sweeping changes it will bring to how the global digital economy processes personal data. The GDPR will apply to all EU-based companies, irrespective of whether personal data is processed inside or outside of the EU. It will also apply to companies outside the EU that offer goods or services to individuals in the EU and/or that monitor or track the online behavior or activities of individuals in the EU.

Any transfer of personal data to a third country can take place only if certain conditions are met by the data exporter and the data importer. If a company is transferring EU personal data outside of the EU, that company must identify a valid transfer mechanism to legally transfer that personal data.  The most widely used transfer mechanisms are: (1) transfers within the EU and adequacy rulings; (2) appropriate safeguards; and (3) derogations.

Transfers Within the EU and Adequacy Rulings

Under GDPR, personal data can be moved between EU member states (and Norway, Liechtenstein, and Iceland) without restriction.

Cross-border transfers may also take place without a need to obtain further authorization if the European Commission determines that the third country’s body of national law ensures an adequate level of protection for personal data. The European Commission considers several factors when determining if the country has an adequate level of protection, including the specific processing activities, access to justice, international human rights norms, the general and sectoral law of the country, legislation concerning public security, defense and national security, public order and criminal law.

Appropriate Safeguards

In the absence of an adequacy determination, cross-border personal data transfers are permitted if the controller and processor use EU-approved safeguards. The most widely used transfer mechanisms are binding corporate rules, model contractual clauses, and certification mechanisms (e.g., Privacy Shield).

Binding corporate rules (BCRs) are internal codes of conduct adopted by multinational companies to allow transfers between different branches of the organization. BCRs are a favored mechanism because of their flexibility, ability for tailored customization, and a lower administrative burden once implemented.

Model contractual clauses are legal terms contained in a template data processing agreement approved by the European Commission. Model contractual clauses can be burdensome because companies are required to enter new model contractual clauses to cover each new third party and each new purpose for processing or transfer.

Because the European Commission does not recognize the U.S. as an adequate third country, U.S. companies can comply by certifying under the EU-U.S. Privacy Shield that they meet the high data protection standards set out in the Privacy Shield.  The Privacy Shield remains subject to the same criticism that ultimately resulted in the downfall of its predecessor (Safe Harbor), that it does not fully protect the fundamental rights of individuals provided under EU privacy laws.

Derogations

In the absence of either an adequacy decision or the implementation of an appropriate safeguard, a cross-border transfer can still take place in limited circumstances, where an exception applies. These circumstances include situations where the individual explicitly consents after having been informed of the risks of data transfer in the absence of an adequacy decision and appropriate safeguards, the transfer is necessary for the performance of a contract between the parties, or if the transfer is necessary for important reasons of public interest. The permitted derogations are fact-specific and are generally not intended to be relied upon as a company’s primary transfer mechanism.

Guidance for GDPR Compliance

Transferring personal data out of the EU without a valid transfer mechanism can result in significant fines and increased regulatory oversight.  Beginning on May 25, 2018, compliance with the GDPR will be essential for companies engaging in cross-border transfers of personal data.

To comply with the GDPR, companies should first identify and map all cross-border data flows.  Companies should then examine and assess for each of these flows whether the receiving country is in the EU (and Norway, Liechtenstein and Iceland) or is otherwise deemed adequate.  If not, the company should consider whether any appropriate safeguards have been put in place, and/or whether any specific derogations apply.
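The assessment steps above amount to a simple decision flow, which can be sketched as follows. The helper name, parameters, and strings are illustrative only, not drawn from the Regulation itself; an actual compliance analysis is fact-specific and requires counsel.

```python
from typing import Optional

# Non-EU EEA countries to which transfers are unrestricted, per the text above.
EEA_EXTRAS = {"Norway", "Liechtenstein", "Iceland"}

def transfer_basis(destination: str,
                   in_eu: bool,
                   has_adequacy_decision: bool,
                   safeguard: Optional[str] = None,
                   derogation: Optional[str] = None) -> str:
    """Walk the decision flow: intra-EEA, adequacy, safeguards, then derogations."""
    if in_eu or destination in EEA_EXTRAS:
        return "intra-EEA transfer: no restriction"
    if has_adequacy_decision:
        return "adequacy decision"
    if safeguard:      # e.g. BCRs, model clauses, Privacy Shield certification
        return f"appropriate safeguard: {safeguard}"
    if derogation:     # e.g. explicit informed consent, contract performance
        return f"derogation: {derogation}"
    return "no valid mechanism: transfer not permitted"

print(transfer_basis("United States", False, False, safeguard="Privacy Shield"))
# appropriate safeguard: Privacy Shield
```

Note the ordering: derogations come last because, as discussed above, they are fact-specific exceptions and not intended as a company’s primary transfer mechanism.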

On July 20, 2015, in Remijas v. Neiman Marcus Group, LLC, No. 14-3122 (7th Cir. 2015), the Seventh Circuit held that the United States District Court for the Northern District of Illinois wrongfully dismissed a class action suit brought against Neiman Marcus after hackers stole its customers’ data and debit card information.  The District Court originally dismissed the plaintiffs’ claims because they had not alleged sufficient injury to establish standing.  The District Court based its ruling on a United States Supreme Court decision, Clapper v. Amnesty Int’l USA, 133 S.Ct. 1138 (2013), which held that to establish Article III standing, an injury must be “concrete, particularized, and actual or imminent.”

However, the Seventh Circuit clarified that Clapper “does not, as the district court thought, foreclose any use whatsoever of future injuries to support Article III standing.”  Rather, “injuries associated with resolving fraudulent charges and protecting oneself against future identity theft” are sufficient to confer standing.

In Remijas, the Seventh Circuit explained that there is a reasonable likelihood that the hackers will use the plaintiffs’ information to commit identity theft or credit card fraud.  “Why else would hackers break into a store’s database and steal consumers’ private information?” – the Seventh Circuit asked.  The Seventh Circuit held that the plaintiffs should not have to wait until the hackers commit these crimes to file suit.

The Seventh Circuit also considered that some of the plaintiffs have already paid for credit monitoring services to protect their data, which it held is a concrete injury.  Neiman Marcus also offered one year of credit monitoring services to its customers affected by the breach, which the Seventh Circuit considered an acknowledgment by the company that there was a likelihood that their customers’ information would be used for fraudulent purposes.

Ultimately, this decision may serve to soften the blow dealt by Clapper to data breach plaintiffs.  Specifically, based on this ruling, plaintiffs who have not incurred any fraudulent charges, but have purchased credit monitoring services, or have spent time and money protecting themselves against potential fraud may argue that they have standing.

In response to a data breach in 2014, employees of University of Pittsburgh Medical Center filed a two-count class action complaint against UPMC for (1) negligence and (2) breach of an implied contract for failing to protect their personal data. The employee plaintiffs alleged that their Social Security numbers, names, addresses, birthdates, W2 information and salaries were stolen and used to file fraudulent tax returns and open fraudulent bank accounts.

In dismissing the class action, Judge R. Stanton Wettick Jr. ruled that Pennsylvania law does not recognize a private right of action to recover actual damages resulting from a data breach. Judge Wettick stated that creating such a cause of action in the data breach context would overwhelm the state courts and require businesses – themselves victims of criminal activity – to spend substantial resources responding to these claims. Judge Wettick noted that, to date, the only obligation the Pennsylvania General Assembly has imposed upon businesses is to provide notification of a data breach, and he refused to interfere with the legislature’s direction in this area of the law.

This decision confirms that, under Pennsylvania law, plaintiffs will continue to have difficulty bringing claims against businesses that suffer data breaches.

The case is Dittman et al. v. The University of Pittsburgh Medical Center, Case No. GD-14-003285 in the Court of Common Pleas of Allegheny County, Pennsylvania.

Officials from both the Federal Trade Commission (FTC) and European Union (EU) recently called for enhancements to the Obama administration’s proposed Consumer Privacy Bill of Rights.

The White House’s proposed Consumer Privacy Bill of Rights seeks to provide “a baseline of clear protections for consumers and greater certainty for companies.”  The guiding principles of the draft bill are:  individual control, transparency, respect for context, security, access and accuracy, focused collection and accountability.

But the proposed legislation also seeks to afford companies discretion and flexibility to promote innovation, which some officials argue has led to a lack of clarity.

FTC Chairwoman Edith Ramirez had hoped for a “stronger” proposal and had “concerns about the lack of clarity in the requirements that are set forth.”  However, Chairwoman Ramirez acknowledged the significance of a privacy bill backed by the White House.  FTC Commissioner Julie Brill also expressed concern over weaknesses in the draft, calling for more boundaries.

Likewise, European Data Protection Supervisor Giovanni Buttarelli felt that the proposal lacked clarity and that, as written, “a large majority of personal data would not be subject to any provisions or safeguards.”

To review the administration’s proposed bill, click here.

The Supreme Court of the United States has ruled in Federal Communications Commission, et al. v. AT&T Inc., et al. (slip opinion – PDF link) that business entities have no personal privacy rights under the Freedom of Information Act (FOIA) (PDF link).  The ruling was unanimous and arose from a Third Circuit decision.

There are several exemptions built into the FOIA, whereby federal agencies do not have to make certain information available when requested.  Exemption 7(C) pertains to law enforcement records that, if disclosed, “could reasonably be expected to constitute an unwarranted invasion of personal privacy.” 5 U.S.C. § 552(b)(7)(C).  The issue addressed was whether corporations have “personal privacy” for purposes of Exemption 7(C).

AT&T was investigated by the Federal Communications Commission in connection with AT&T’s participation in the FCC’s E-Rate (Education-Rate) program for schools and libraries.  As a result, AT&T disclosed to the FCC that it may have overcharged the Government for its services in connection with the E-Rate program.  During the resulting investigation, AT&T disclosed various information to the Government, including billing information, name and job descriptions of employees involved and AT&T’s conclusion regarding wrongdoing by its own employees.  The matter was resolved in December 2004 and AT&T paid $500,000 and instituted a plan to ensure the incorrect billing did not occur again.

CompTel, “a trade association representing some of AT&T’s competitors,” submitted a FOIA request in connection with the E-Rate program investigation.  The FCC’s Enforcement Bureau did withhold some competitive information, as well as names and other personal information related to AT&T’s employees.  However, the Enforcement Bureau did not apply exemption 7(C) to AT&T itself because “businesses do not possess ‘personal privacy’ interests as required by the exemption.”

AT&T took the position that the root term “person” in the phrase “personal privacy” refers to “persons” as defined under the Administrative Procedure Act, whose definition of “person” includes several types of business entities, specifically corporations.  The FCC concluded that AT&T’s position that it is “a ‘private corporate citizen’ with personal privacy rights that should be protected from disclosure that would ‘embarrass’ it . . . within the meaning of Exemption 7(C)” was “at odds with established [FCC] and judicial precedent,” and held that “Exemption 7(C) has no applicability to corporations such as [AT&T].”

The Court of Appeals for the Third Circuit agreed with AT&T.  The FCC petitioned the United States Supreme Court for review, and the Third Circuit’s holding was overturned.

Chief Justice Roberts delivers a thoughtful analysis of why the terms “person” and “personal” should not be read to give business entities “personal privacy rights,” which you can read in detail in the opinion (PDF link).  In a final wink, nudge and affirmation of his reasoning, Chief Justice Roberts concludes the analysis by stating that “[w]e trust that AT&T will not take it personally.” 

With the ever-growing popularity of social networking sites, and with so many employees exercising poor judgment online, it’s easy to understand why employers are concerned about the messages and images that their employees are disseminating on these websites.

For employers, the costs are real: Poor choices by their employees can bring with it not only bad publicity but the loss of confidential information and the risk that the employer and employee will be sued by a third party for a wide range of legal claims, including defamation, invasion of privacy, negligence, discrimination, false light publicity, public disclosure of private facts, infliction of emotional distress and violations of state and federal data breach laws.

Employees seem to comprehend the potential effect of their online rants. According to the 2009 Deloitte Ethics and Workplace Survey, 74 percent of employees believe it is easy to damage a company’s reputation on social media sites. Yet many conduct themselves as if they had a right to do so. Fifty-three percent of the employees surveyed believe that an employee’s social networking page is not their employer’s business, and nearly one third said they never consider what their boss would think before posting material online.

Social media content is also becoming a new source of evidence in employment cases. Employers view such material as a unique way to identify false statements employees make in these cases.  Employees, however, often view their employer’s interest in such content as an invasion of their privacy.

These divergent viewpoints are creating new tensions in the workplace and new issues for the courts to address.  I have written an article in the New Jersey Law Journal this week discussing these issues and trends.   To view the article, click this link.

A complaint (PDF link) seeking class action status on behalf of all high school students at Harriton High School and Lower Merion High School (the “High Schools”) in the Lower Merion School District (the “School District”) in suburban Philadelphia was filed on February 16th.

Apparently, the School District maintains a program whereby all high school students at the High Schools are provided with a laptop in connection with their educational endeavors. Like most modern laptops, these laptops include a webcam embedded in the laptop bezel.

The Complaint alleges that students and parents were never told that the School District (and its agents) had the ability to remotely activate the webcams, or that it would do so. The Plaintiffs cite all documentation provided with the laptop and on the School District’s online resources as further support that they were never told of this remote activation/capture ability. Once activated, the School District can apparently view and capture whatever is happening within the view of the webcam. Plaintiffs point out that this activity occurs regardless of whether anyone is sitting in front of the webcam, and captures the entire viewing area of the webcam.

Continue Reading Pennsylvania School District Sued After Allegedly Remotely Activating Student Laptop Webcam