Technology companies are rushing to join the fight against the global COVID-19 pandemic by developing applications that can aid in contact tracing and assist public health authorities in containing the spread of the virus. These efforts, while promising, often involve processing massive amounts of personal location and health data, which carries significant privacy risks.

On April 9, the U.S. Senate held a paper hearing on enlisting big data in the fight against the coronavirus, covering recent uses of consumer data, ways to protect consumer privacy, and how to handle COVID-19-related data once the pandemic ends. The Senate received testimony from the Network Advertising Initiative, the Interactive Advertising Bureau, and the Future of Privacy Forum, among others. Big data refers to the huge volume of consumer data collected from sources such as smartphones, mobile apps, fitness trackers, connected cars and other connected products. It also refers to the advanced computing capabilities used to analyze large data sets to identify trends and make predictions.

Government officials and health-care professionals have turned to “big data” to help fight the global pandemic on various fronts. Big data powers contact-tracing apps that rely on self-reported health status and location data to inform the public of virus hotspots and alert those who may have come into contact with someone who has the virus. Big data is also used to manage resources and supply chains for distributing needed medical equipment, to track the effectiveness of social distancing and stay-at-home guidelines, and to monitor public spaces in an effort to shape public policy. A global team of infectious disease epidemiologists is working with tech companies to update public officials on how well social distancing interventions are working, relying on anonymized, aggregated mobility data sets and providing analytical support to help interpret them.
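
To make the idea concrete, the kind of aggregation described above can be sketched in a few lines of Python. This is purely illustrative and does not reflect any particular company's pipeline; the record layout, region labels and field names are assumptions.

    from collections import defaultdict
    from datetime import datetime

    # Hypothetical raw mobility records: (device_id, ISO timestamp, coarse region).
    raw_pings = [
        ("device-001", "2020-04-01T08:15:00", "County-A"),
        ("device-001", "2020-04-01T17:40:00", "County-A"),
        ("device-002", "2020-04-01T09:05:00", "County-B"),
        ("device-003", "2020-04-02T12:30:00", "County-A"),
    ]

    def aggregate_mobility(pings):
        # Collapse per-device pings into (region, day) -> unique-device counts,
        # so only aggregate totals, not device identifiers, are shared onward.
        devices_seen = defaultdict(set)
        for device_id, timestamp, region in pings:
            day = datetime.fromisoformat(timestamp).date().isoformat()
            devices_seen[(region, day)].add(device_id)
        return {key: len(ids) for key, ids in devices_seen.items()}

    print(aggregate_mobility(raw_pings))
    # {('County-A', '2020-04-01'): 1, ('County-B', '2020-04-01'): 1, ('County-A', '2020-04-02'): 1}

Aggregates like these convey how movement patterns are changing without exposing any individual device's trail.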

The use of big data, however, has created privacy risks. Risks include overly granular heat maps or case reporting that may make it possible to associate a positive coronavirus status with identifiable people. Symptom trackers may also pose privacy risks if they collect personal information. Witnesses also expressed concerns about cookies in mobile apps that inevitably connect IP addresses and other mobile data to individuals.
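
One common safeguard against overly granular heat maps is a minimum-count threshold: any map cell with fewer than k reported cases is withheld before publication, so the map cannot be narrowed down to a handful of identifiable households. The sketch below is only an illustration; the threshold of 10 and the ZIP-code cells are assumptions, not values discussed at the hearing.

    MIN_CELL_COUNT = 10  # assumed suppression threshold; the real value is a policy decision

    def suppress_small_cells(case_counts, k=MIN_CELL_COUNT):
        # Drop heat-map cells whose case count falls below k before the map is published.
        return {cell: count for cell, count in case_counts.items() if count >= k}

    # Hypothetical counts keyed by coarse map cell.
    counts = {"ZIP-10001": 42, "ZIP-10002": 3, "ZIP-10003": 17}
    print(suppress_small_cells(counts))  # ZIP-10002 is withheld: {'ZIP-10001': 42, 'ZIP-10003': 17}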

Witnesses strongly encouraged a privacy-by-design framework as companies race to address the pandemic with technology solutions. First and foremost, privacy should be embedded into the design and operation of the product, network infrastructure and business practices. Witnesses also suggested using privacy risk assessment frameworks to identify and then mitigate or eliminate risks.

The witnesses agreed that the current pandemic only highlights the need for a comprehensive national privacy law. Until one is enacted, they suggested several ways to limit privacy risks, including contractual controls like the following:

  • Purpose and time limitation. The purpose for data collection should be specified at or before the time of collection, and subsequent use should be limited to fulfilling a specific and well-defined purpose. Any collection and use of personal data to address COVID-19 should be limited in time. Companies should also use contractual controls to limit downstream secondary uses and onward sharing of data.
  • Data minimization. Public health officials should not be provided with troves of individualized location and behavioral data with unrestricted discretion over how to use it. Rather, the types of data exchanged should be agreed upon before the exchange, and only the data necessary and proportionate to fulfill that purpose should be shared.
  • Retention and deletion policies. Institutions should consider the objective of sharing personal information and provide contractual controls that specify appropriate retention and deletion policies. This should include appropriate use of decentralized storage and processing.
  • Consumer choice. Even if a public health crisis does not require the level of consent that would be obtained in normal circumstances, companies should still provide for appropriate consent where possible, as well as other consumer controls. Companies should limit unexpected uses of sensitive data and provide strong consumer transparency and consumer rights, including the ability to control or opt out of the disclosure of data, particularly location data.
  • Strict accountability measures. Put simply, history reveals that it is difficult to discontinue irresponsible privacy practices started in an emergency. Organizations should establish an exit strategy up front to protect against emergency practices continuing after the crisis has passed.
  • Other measures. Consider the use of privacy-enhancing technologies (PETs), such as synthetic data, or contractual controls that require the use of statistics that are aggregated and cannot be associated with specific individuals; a simplified illustration follows this list.
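
As one simplified illustration of the privacy-enhancing technologies mentioned in the last bullet, synthetic data can be generated by sampling each field independently from the values observed in the real data, which roughly preserves column-level statistics while breaking the link between any output row and a real person. The field names below are assumptions, and production synthetic-data tools are considerably more sophisticated (for example, they try to preserve relationships between fields).

    import random

    # Hypothetical symptom-tracker records; no real individuals are represented.
    real_records = [
        {"age_band": "20-29", "zip3": "100", "symptomatic": True},
        {"age_band": "30-39", "zip3": "112", "symptomatic": False},
        {"age_band": "60-69", "zip3": "104", "symptomatic": True},
        {"age_band": "20-29", "zip3": "104", "symptomatic": False},
    ]

    def make_synthetic(records, n, seed=0):
        # Sample each field independently from its observed values, so synthetic
        # rows mimic the marginal distributions but correspond to no real person.
        rng = random.Random(seed)
        fields = list(records[0].keys())
        columns = {f: [r[f] for r in records] for f in fields}
        return [{f: rng.choice(columns[f]) for f in fields} for _ in range(n)]

    for row in make_synthetic(real_records, n=3):
        print(row)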