Technologies that track a person’s location and temperature were on the table as part of the University of Maryland’s COVID-19 response, but they were ultimately dismissed due to privacy concerns, according to a member of the school’s health, safety and risk management task force.
The university’s response to the coronavirus pandemic has been heavy on technology, including estimating the density of people on campus through wireless device connections and requiring virtual daily symptom monitoring. University staff have also been told that swipe card and Wi-Fi usage data have been used to track compliance with other reopening requirements, according to university emails. And as more technology solutions become available, privacy concerns are increasingly an issue.
“I am a broad pessimist on where privacy rights are going in this country, and I think almost regardless of the efficacy of centralized trackers, we’re going to see that at least trying to be pushed onto us,” said John Dickerson, a professor in the computer science department.
Public health authorities in Maryland are expected to adopt software that would notify someone if they were in contact with another person who voluntarily disclosed that they tested positive for COVID-19. A former professor at this university, Bill Pugh, is working with Apple to develop the program.
The software is different from beaconing technology, in which people carry unique identification tags that emit signals received by devices called beacons. Beaconing technology, which was considered at this university, would allow an institution to track a person’s specific location within a certain area.
“As a public health person we would love these data, but there is a justice concern about who is managing access to those data,” said Neil Sehgal, an assistant professor in the public health school.
Sehgal was a member of two groups that worked on plans for reopening the campus, including the health, safety and risk management task force. Sehgal said there was “a very strong push” to require people to use a temperature monitoring app created by a company that University of Maryland, Baltimore researchers had a relationship with. However, the university did not adopt the technology because the company could have sold personal data to commercial entities.
“We didn’t want students who spiked fevers in their dorms to all of a sudden get direct mail about Lysol or Clorox wipes,” Sehgal said.
Both Sehgal and Dickerson said that widespread adoption of a given technology depends on people’s trust in the organization behind it.
Dickerson cited a study co-led by Elissa Redmiles, a recent graduate of this university who founded the research consultancy firm Human Computing Associates, that found roughly two-thirds of Americans would be willing to install an app that would collect information about their health and location. The study found those surveyed were more likely to install an app provided by a public health agency or their health insurance provider than one provided by the government. Sehgal suggested that lower trust in a product created by the federal government may stem from its handling of the COVID-19 crisis.
“The federal response forced a degree of libertarianism on everybody where you were left to fend for yourself,” Sehgal said.
Aaron Gorman, a junior computer science major, said he feels the university’s intentions regarding technology privacy are mostly good, though he is uncomfortable that his contact information is available in the campus directory to anyone with a university login.
“I lived on campus for two years and I never felt like there was someone peering over my shoulders all the time,” Gorman said.
Still, Gorman identified some concerns about this university’s use of technology this semester. He said he has low confidence in the results from the daily symptom monitoring form and that the amount of email communication related to COVID-19 feels excessive.
“I completely glaze over them at this point. They could be saying really important stuff,” Gorman said.
But according to Sehgal, those daily reminders of COVID-19 have a purpose. While Sehgal said he understands the potential for alert fatigue, he thinks those reminders also likely keep people more mindful that the virus is a persistent threat.
“There is an added layer of vigilance that a repeated task confers,” Sehgal said.
Sehgal and Dickerson expressed concern about other issues surrounding the use of technology to deal with COVID-19, including the lack of open-source software, how to ensure voluntary participation in programs that capture private data, and transparency about how data will be managed and utilized.
Both said they are concerned about the impact of using automated technology on marginalized communities that typically have less access to and knowledge about technology. Sehgal pointed out that many of the university’s housekeeping and facilities staff members who work on the campus — even during the height of the pandemic — are more likely to come from marginalized communities and have less internet access at work and home.
Dickerson cited a study he worked on with Michelle Mazurek, another professor in the computer science department, that found people with a poor understanding of technology are less likely to think badly of it. Taken a step further, Dickerson said, marginalized communities may be more likely to agree to use automated technologies that could put them at higher risk of being further marginalized by that technology.
Sehgal also said he thinks the most important innovation to come during this pandemic won’t be an app, but instead an inexpensive COVID-19 test that can be used daily. Sehgal anticipates COVID-19 will remain a problem into next year.
“I don’t think technology is the solution to the pandemic,” Sehgal said. “I think technology may help us better manage our lives around the pandemic, but I think we’re probably going to be dependent on lower technology solutions that we have today to manage population spread going forward.”