At COVID-19 'paper hearing,' industry, government struggle with balancing privacy, public health

Seven witnesses submitted statements and answered questions, leaving Senators with promising tech ideas, legislative priorities and words of caution.
By Jonah Comstock

With the COVID-19 pandemic preventing large gatherings of any kind, the Senate Committee on Commerce, Science and Transportation last week was forced to hold its hearing on big data and privacy protections in response to the pandemic as a “paper hearing.” Participants submitted written testimony only. Lawmakers sent those experts questions, and the panel responded with written answers.

Despite the unusual format, the hearing was packed with insights about which tracking and tracing technologies have the best chance of being helpful in the fight against COVID-19, and which constitute too high a risk to Americans’ civil liberties. 

Most witnesses were representatives from industry associations, but the list also included Ryan Calo, a professor of law at the University of Washington, and Kinsa CEO Inder Singh.

Other countries’ examples

Part of the impetus for this discussion is the relative success some other countries seem to have had with digital contact tracing – using cell phone activity, GPS and/or Bluetooth data to monitor the path of infection and quarantine people accordingly.

For instance, lawmakers pointed to Taiwan, South Korea, Singapore and Israel as countries that have managed to control the disease. But witnesses questioned whether those cases were truly illustrative, as well as whether that level of tracking could work in the United States.

“In responses like Taiwan’s, the availability of high-quality and complete data sets helped enable a policy response that effectively stopped the spread of COVID-19,” Graham Dufault, senior director for public policy at ACT | The App Association, wrote in his remarks. “However, the ready availability of an extraordinarily complete picture about individuals’ movements to a government authority is not generally a feature of American policy, which tends to avoid such invasive surveillance and enforcement without due process.”

The question at hand is whether voluntary tracking can be as effective as mandatory tracking. To that point, one of Israel’s initiatives could be a serviceable model.

“[A] program launched by the Ministry of Health has been supported by leading privacy academics in Israel,” said Stacey Gray, senior counsel for the Future of Privacy Forum. “This program involves an app, ‘HaMagen,’ which individuals may use voluntarily, and leverages GPS data, Wi-Fi data, Google Timeline history (upon separate consent) and Bluetooth data to enable alerts to users who have been in the proximity of a known infected person. Alerts trigger a recommendation for users to voluntarily self-quarantine. HaMagen is open source, voluntary and according to the Ministry of Health has been adopted by approximately 1.4 million people, or 25% of the desired population.”

University of Washington’s Ryan Calo and Michelle Richardson, director of the Privacy and Data Project at the Center for Democracy and Technology, both noted that it’s impossible to know for sure whether results in other countries are replicable here.

“To the extent that technology-based contact tracing has been effective in these jurisdictions, they have not been voluntary, self-reported, or involved self-help,” Calo said. “Rather, public officials have forced compliance and dispatched investigators to interview and, if necessary, forcibly quarantine exposed individuals. I see it as an open question whether Americans would be comfortable with this level of state expenditure and intervention. At any rate, the experiences of these nations are not a ready analogy.”

“Even though location and proximity tracing apps have been deployed in other countries,” Richardson added, “their impact has not been disentangled from contemporaneous efforts like widespread testing, compulsory quarantines, public information on the movement of infected individuals and other responses.”

Calo pointed out that voluntary self-reporting apps sound good in theory from a privacy perspective, but they introduce a new danger: bad actors co-opting the platform.

“It is not hard to imagine nefarious use cases as well,” he wrote. “A foreign operative who wished to sow chaos, an unscrupulous political operative who wished to dampen political participation, or a desperate business owner who sought to shut down the competition, all could use self-reported instances of COVID-19 in an anonymous fashion to achieve their goals.”

How much are privacy and security at odds?

Many senators framed their questions to the panel in terms of the balance between privacy and security.

“I want governments and businesses to be mindful that, in a complex world where absolutes like total anonymity and privacy are rare, we have to balance the value of privacy with other core values, and that the quest for that equilibrium is a constant challenge,” Leigh Freund, president and CEO of the Network Advertising Initiative, wrote. “I am optimistic that we can, collectively, retain a strong belief in the value of data for both societal and commercial benefit, and that its use can be governed by respect, rather than fear.”

Most witnesses suggested that, with the right technology, it was possible to have both privacy and an effective response to the virus. But to do so, companies and governments will have to choose the right approaches and carry them out correctly.

“As one can see, there are a number of ways big data processing can advance the coronavirus response without unduly risking individual privacy,” Richardson said. “Some of this data does not reflect personal information at all – such as state level statistics that are aggregated and cannot be associated with specific individuals. But there are also uses of data that are riskier. For example, if heat maps or case reporting become too granular, it may be easy to associate a positive coronavirus status with identifiable people. Symptom trackers may also pose privacy risks if they collect personal information.”

In general, the conflict comes from the use of location tracking to trace the spread of the disease. In order to do so effectively, the government would have to collect an uncomfortable amount of data about individuals.

“Contact tracing apps collect and combine two highly sensitive categories of information: location and health status,” said Calo. “It seems fair to wonder whether these apps, developed by small teams, will be able to keep such sensitive information private and secure. To the extent digital contact tracing – or any private, technology-driven response to the pandemic – involves the sharing of healthcare data with private parties, there is also the specter of inadequate transparency or consent.”

Many members of the panel pointed out that “de-identifying” data is not a good solution, as location data is especially hard to de-identify and easy to re-identify. But there are other ways to anonymize data, such as processing data locally and sending only aggregate information to companies or governments.
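As a toy illustration of that aggregation idea (the record shape and the minimum-cell threshold here are assumptions for the example, not any agency's actual rules), a sketch that reports only region-level counts and suppresses any count small enough to risk re-identification:

```python
from collections import Counter

def aggregate_reports(reports, min_cell=10):
    # Collapse individual records to per-region counts, then drop any
    # region whose count is so small it could point to specific people.
    counts = Counter(r["region"] for r in reports)
    return {region: n for region, n in counts.items() if n >= min_cell}
```

Only the aggregate dictionary would leave the device or clinic; the individual records stay local.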

“In an era of big data, super computers and highly sophisticated hackers, even using sophisticated anonymization techniques cannot completely prevent the possibility of anonymized data being associated with an individual,” Freund said. “For this reason, it is necessary to also incorporate technical and administrative controls that protect against this unintended outcome, like strict data usage limitations, data minimization practices, employee training and data retention restrictions.”

And privacy and safety may not be the only values being traded. Many witnesses noted that if these systems aren’t carefully designed, they can also contribute to health inequalities.

“We are alarmed by the early reports of COVID-19-related death disparities in African American communities,” said Gray. “Understanding how and why these disparities exist is only possible with the collection of sensitive data combined with health information reflecting racial demographics. For example, voluntary contact tracing apps must be adopted by sufficient numbers of app users within high-risk populations, including those who cannot afford the latest mobile technology. To the extent possible, mobile apps should be designed so they are not unduly limited to users of only the newest or more sophisticated devices that can accommodate the recent updates to iOS and Android operating systems.”

Ultimately, Calo urged the legislators to be transparent about the trade-offs they are comfortable making.

“The American people through their representatives may decide that these extraordinary times call for invasive measures in order to slow and contain the spread of coronavirus,” he said. “For example, some Americans may embrace testing and reporting requirements, mandatory quarantine, and ‘badges’ that indicate who is free of coronavirus or possesses antibodies against it. I am not an elected official and so it is hard for me to speak on anyone’s behalf but my own.”

But, along with several other witnesses, he was quick to warn the government that whatever extraordinary power they choose to use to fight the virus, they must commit now to giving it up when the threat has passed.

“To paraphrase the late Justice Robert Jackson, a problem with emergency powers is that they tend to kindle emergencies,” Calo wrote. “My hope is that policymakers will expressly ensure that any accommodations privacy must concede to the pandemic will not outlive the crisis.”

Privacy-friendly solutions

Two approaches to using big data to fight COVID-19 seem promising when it comes to ensuring both safety and privacy.

One, Bluetooth-based contact tracing, was developed by researchers at MIT and has since formed the basis for the Apple-Google partnership.

“Contact tracing in particular presents some privacy challenges because it involves associating a positive COVID-19 diagnosis with a specific device,” Dufault explained. “But MIT’s system would avoid some of the privacy risks by anonymously associating a positive COVID-19 diagnosis to certain Bluetooth identifiers (or ‘chirps’), which make it more difficult for a human to associate the diagnosis with the person who owns the device. In practice, the MIT method would have participants’ smartphones send out periodic chirps through Bluetooth.

“If a chirp determined that the participant’s smartphone was close to someone with a positive COVID-19 diagnosis, then it would be known that that person was exposed to infection. But the system itself would not do anything more than match the positive COVID-19 diagnosis to the anonymous string of numbers associated with an individual’s Bluetooth connection, so theoretically it would protect participants’ privacy.”
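For illustration only, a minimal Python sketch of the chirp-matching idea Dufault describes (the class, its names and the single-chirp rotation are invented for this example; the real MIT and Apple-Google protocols use cryptographically derived rolling identifiers and many additional safeguards):

```python
import secrets

def new_chirp() -> str:
    # A random, periodically rotated identifier -- carries no name or location.
    return secrets.token_hex(16)

class Phone:
    """Tracks the chirps this phone broadcasts and the chirps it has heard."""
    def __init__(self):
        self.my_chirps = [new_chirp()]  # rotated over time in a real system
        self.heard = set()

    def encounter(self, other: "Phone"):
        # Each phone records the other's current chirp; the exchange
        # contains no identity or GPS data.
        self.heard.add(other.my_chirps[-1])
        other.heard.add(self.my_chirps[-1])

    def exposed(self, positive_chirps: set) -> bool:
        # A health authority publishes the chirps of confirmed cases;
        # the match is computed entirely on the device.
        return bool(self.heard & positive_chirps)
```

In this scheme a positive user uploads only their own anonymous chirps, and every other phone checks locally whether it has heard any of them.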

The buy-in of those two tech giants may solve one of the initial problems with the approach, Dufault said: it would otherwise be hard to achieve adoption broad enough for the app to be useful.

Freund added that a number of privacy protections are built into that system: it requires explicit user consent, doesn’t collect personally identifiable information or user location data, the list of people the user has been in contact with never leaves their phone, people who test positive are not identified to other users, and it will only be used for contact tracing by public health authorities for COVID-19 pandemic management, according to the terms of use.

“While these developments are promising, it’s important to note that contact tracing relies on the availability of testing,” David F. Grimaldi, Jr., executive chair of public policy at the Interactive Advertising Bureau, wrote. “If the availability of testing is limited, the ability to rely on contact tracing is limited as well. Apps should also address risks related to potential misuse (trolling or other false alerts) or abuse (spoofing). Public health authorities, rather than tech companies, must lead the way in helping shape the development of these apps. Healthcare professionals should also play a role in approving the triggering of alerts for individuals to self-quarantine.”

Calo echoed the call for the government to lead, rather than follow, in the development of these platforms.

“I worry that some government officials seem to be calling upon technology companies such as Google, Apple and Facebook to come up with unspecified solutions, rather than describing a specific government need,” he wrote. “By way of contrast, the government has identified a clear need for ventilators and so has directed an auto manufacturer to carry through on plans to build ventilators. Government officials did not simply charge the auto industry with coming up with whatever technologies it felt would help in the pandemic.”

The other privacy-friendly approach was Kinsa’s smart thermometer system, which has already seen success in forecasting areas where the virus will break out next.

“This is an early warning system,” Kinsa CEO Inder Singh wrote. “Think of it as a flashlight going off, illuminating a geography and saying, ‘send the test kits in, because something unusual is happening.’ This real-time information on where and when illness is starting, and where community spread is occurring, is vital in appropriately allocating limited supplies and manpower to the areas most in need of early intervention. It is possible to stem the spread of an epidemic or pandemic such as COVID-19 with this kind of immediate intervention.”
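A schematic sketch of the kind of baseline comparison such an early-warning system might perform (the threshold, county names and data shapes here are invented for illustration; Kinsa's actual forecasting models are more sophisticated):

```python
def flag_outbreak_counties(observed_rate, baseline_rate, ratio=1.5):
    # Flag counties whose current fever rate runs well above their
    # seasonal baseline -- the "flashlight" Singh describes.
    return sorted(
        county for county, rate in observed_rate.items()
        if county in baseline_rate and rate > ratio * baseline_rate[county]
    )
```

Counties whose observed illness runs well ahead of the expected seasonal curve would be the ones to receive test kits and resources first.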

Singh pushed Congress to draft his own company into service to fight the pandemic.

“It would take but 4.4 million additional smart thermometers – at a cost of less than $100 million – to effectively detect COVID-19 spread on a county-by-county basis,” he wrote. “I believe this system is essential infrastructure today and will continue to be essential as our country faces worsening flu seasons and the possibilities of additional epidemics and pandemics. By knowing where and when outbreaks are occurring in real time, we can help our communities, the health care system and public health agencies direct resources effectively. Such an investment would most importantly save lives, while creating a significant return on investment, both in reopening the domestic and global economies and mitigating the impact of future outbreaks.”

A legislative need

In addition to advising Congress on how to face the current crisis, many of the witnesses took the opportunity to advocate for a national privacy framework, an idea that’s been kicking around Congress for some time but has yet to make it to law.

“The United States does not have a comprehensive privacy law to protect Americans’ personal information,” Richardson said. “Instead, we have a patchwork of federal, state and local laws that regulate specific sectors or data sets like education records, financial records, children’s information and health records if they are held by certain entities. This has led to the explosion of risky and exploitive data-driven behaviors in the vast unregulated space in between. It has reduced public trust in technology companies, and as a result, may discourage people from using legitimate services or waste precious time and resources on untested products. This in turn may inhibit the coronavirus response.”

“COVID-19 underscores the need for privacy legislation that protects consumers, allows for responsible uses of data and protects innovation,” Freund added.

The paper hearing contained several other suggestions for longer-term government action, including the creation of a National Infectious Disease Forecasting Center modeled on the National Weather Service (an idea out of Johns Hopkins championed by Sen. Ted Cruz), and an arm of the CDC tasked with working with the private sector, especially startups.

“As it stands today, only a handful of federal agencies, and an even smaller number of teams within those federal agencies, are staffed, equipped, and funded enough to engage with new kinds of data or innovations,” wrote Singh. “This risks that many promising data sets and technologies will be overlooked.”

