Safety or surveillance: drones and the COVID-19 pandemic
Police departments in some states have started using drones to deal with the COVID-19 crisis. In California, Florida and New Jersey, officials have used drones to get messages to homeless communities or to warn people to stay apart and respect physical distancing. But some drone companies are making even bigger claims: that their drones can remotely detect a fever, or use facial recognition to tell whether someone is wearing a mask.
I spoke with Ryan Calo, a professor of law at the University of Washington who studies emerging tech and policy. He said it’s good that people are thinking creatively, but there is cause for concern. The following is an edited transcript of our conversation.
Ryan Calo: I think that the use of a drone is going to make people feel uncomfortable. Where I really think it's problematic is coupling drone technology with artificial intelligence to try to figure out whether people are far enough away from one another, or whether people are sick. Even if it's technically lawful, with the right license, to use drones to keep people apart, it contributes to an already anxious environment. It can be a distraction, and I especially wouldn't extend it to surveillance.
Molly Wood: When you say drones that claim they can tell if people are sick, is this the thing where companies claim their drones can sense the body temperature of people they’re hovering over?
Calo: There’s no shortage of people out there selling technical snake oil at any time. These systems have already been deployed in places like grocery stores and airports. What these devices are detecting is unlikely, really, to correlate with what you’re looking for, and is likely to lead to many false positives and false negatives. Again, you already have a group of people who are anxious, who are concerned, and then to put some inscrutable flying robot in the mix just strikes me as profoundly unwise.
Wood: However, even if it’s not drones, we do seem to be seeing a steady march toward surveillance. How much are you worried that this is going to be an excuse to implement all kinds of surveillance?
Calo: I’m very worried. It’s interesting, because there are a number of efforts to create an app that you can just download on your phone that’s supposed to keep you safe. The idea would be that if you went near anybody who had tested positive for coronavirus, you’d get a ping, and that somehow this is going to be a tool that can get us out of our houses without risking infection. The trouble is that these apps can miss in both directions: you might be reassured that there’s no instance of coronavirus in your community when in fact there is, or conversely you might be cowering at home because every couple of days you get a ping telling you that you need to self-quarantine. I think there’s a big difference between using tech as part of the solution and falling back on tech solutionism.
Wood: However, you’ve got to think that the latter is an opportunity for tech companies, which have come out in force and said, “We have an app for that. We have a camera, we have an artificial intelligence algorithm that can more accurately predict the movements of crowds.” Should we ignore all of that, or are you saying some of it could be good, but we’ve got to be careful?
Calo: I think some companies are looking to their own data and asking how they can be useful. For example, Kinsa, the connected thermometer that purports to tell you where there’s a spike in fevers. They want to be useful with that data, according to the testimony of their CEO at a recent Senate Judiciary Committee hearing that I also testified at, and it could be a tool. It has some problems and some perils. Conversely, you have situations where it seems, at least to me personally, that companies are just responding to social pressure, especially from the White House, to do something: “You’re innovative, you have all this data, you have all this attention of the public, do something.” These companies are then doing whatever they can to discharge their obligation and get rid of the pressure to do something, without actually confronting any of the hard problems. I think both are going on, even sometimes within the same company. The mobility reports that Google was doing were useful, but conversely, when Google teamed up with Apple to create a framework, or an [application programming interface], for contact-tracing apps, they really did kick a lot of the hard issues down the road. They created an architecture that might be privacy-friendly, since it’s decentralized, but it doesn’t solve a lot of the hard problems that trying to use apps to automate contact tracing creates. Sometimes companies are legitimately trying to look at the data and ask how they can help. Other companies are trying to do something so that they look like they’re doing something, and still others are selling snake oil. That whole ecosystem is out there.
Related links: More insight from Molly Wood
The ACLU and other civil rights advocates say that temperature-sensing drones are in some ways just the start of health-surveillance efforts. Thermal cameras are being installed in some courthouses. Reuters reported that Amazon bought heat-sensing cameras from a Chinese company that’s been accused of helping China track and detain Muslim citizens. And the FDA says it won’t necessarily intervene if companies buy or use thermal-sensing technologies that aren’t FDA-approved, apparently because there’s a huge shortage of temperature-sensing tech. A piece in the Telegraph last month called the trend “fever surveillance.”
However, good news in the drone department. Apparently CVS and UPS will use drones to start delivering prescription medicine to at least one retirement community in Florida. More life-saving med drops, fewer flying panopticons. Everybody wins.
Also watching
In Amazon’s ongoing labor disputes, a vice president and internal activist quit the company and wrote an open letter saying Amazon was firing whistleblowers in order to “create a climate of fear.”
Apple and Google put out rules for developers about the contact-tracing tools they’ve co-designed, now called exposure notification. Among the reassuring details: a requirement that anyone developing an app using the tools be a government public health authority, that apps cannot access any location data on users’ devices, and that ideally there will be only one app per country, so people aren’t confused about which one to use and we don’t end up with an inefficient patchwork that helps no one.
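To make “decentralized” concrete, here is a minimal sketch of the general idea behind systems like this, in Python. To be clear, this is an invented illustration, not Apple and Google’s actual cryptographic specification; the function names, the HMAC-based derivation and the key sizes are assumptions chosen for clarity.

```python
# Simplified illustration of decentralized exposure notification.
# NOT the real Apple/Google protocol; derivation details are invented.
import os
import hmac
import hashlib

def rolling_ids(daily_key: bytes, intervals: int = 144) -> list:
    """Derive short-lived broadcast identifiers from a secret daily key.

    A phone broadcasts a fresh identifier every ~10 minutes (144 per day),
    so listeners can't link the broadcasts back to a person or a place.
    """
    return [
        hmac.new(daily_key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
        for i in range(intervals)
    ]

# Each phone keeps its daily keys secret and remembers identifiers it hears.
alice_key = os.urandom(16)
bob_heard = set(rolling_ids(alice_key)[40:43])  # Bob was near Alice briefly.

# If Alice tests positive, only her daily keys are published: no locations,
# no contact list. Every phone re-derives her identifiers locally and
# checks for overlap with what it actually heard.
published_keys = [alice_key]
exposed = any(
    rid in bob_heard
    for key in published_keys
    for rid in rolling_ids(key)
)
print("Bob gets an exposure ping:", exposed)  # True
```

The privacy-friendly part is that a server only ever sees the published keys of people who test positive; who was near whom is computed on each phone. The hard problems Calo points to, such as a ping triggered through an apartment wall, sit outside this cryptography.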
However, with nearly three in five Americans saying they’re suspicious of such an app or unwilling to use one, there is still a large PR battle to be fought.