Learning from the People: Responsibly Encouraging Adoption of Contact Tracing Apps
While significant effort went into developing privacy protocols for these apps, relatively little attention was given to understanding why users might, or might not, adopt them. Yet for these technological solutions to benefit public health, users must be willing to adopt them. In this talk I showcase the value of taking a descriptive ethics approach to setting best practices in this new domain. Descriptive ethics, a concept from moral philosophy, determines best practices by learning directly from users: observing people's preferences and inferring best practice from that behavior, instead of relying exclusively on experts' normative decisions. The talk presents an empirically validated framework of the decision inputs users weigh when adopting COVID-19 contact tracing apps, including app accuracy, privacy, benefits, and mobile costs. Using predictive models of users' likelihood to install COVID-19 apps based on quantifications of these factors, I show how high the bar is for achieving adoption. I conclude by discussing a large-scale field study in which we put our survey and experimental results into practice, helping the state of Louisiana advertise its COVID-19 app through a series of randomized controlled Google Ads experiments.
Human-centric Privacy Design and Engineering
Privacy is ultimately about people. User studies and experiments provide insights into users' privacy needs, concerns, and expectations, which are essential for understanding a system's actual privacy issues from a user perspective. Drawing on the speaker's research on privacy notices and controls in different contexts, from cookie consent notices to smart speakers, this talk discusses how and why privacy controls are often misaligned with user needs, how public policy aimed at protecting privacy often falls short, and how a human-centric approach to privacy design and engineering can yield usable and useful privacy protections that more effectively meet users' needs and may also benefit companies.
Zoom meeting link: https://newcastleuniversity.zoom.us/j/84890082823?pwd=TEJTKzEvVDJPZy9mYU1GUzNORTRKdz09 (Meeting ID: 848 9008 2823, Passcode: 944316)
YouTube live stream: https://youtu.be/8WBlfTLoO2k
“Do This! Do That! And nothing will happen”: Do specifications lead to securely stored passwords? (ICSE ’21)
Does the act of writing a specification (how the code should behave) for a piece of security-sensitive code lead developers to produce more secure code? We asked 138 developers to write a snippet of code to store a password: half were asked to write down a specification of how the code should behave before writing the program; the other half were asked to write the code without being prompted to write a specification first. We find that explicitly prompting developers to write a specification has a small positive effect on the security of the implemented password storage approaches. However, developers often fail to store passwords securely, despite claiming to be confident and knowledgeable about their approaches, and despite considering an appropriate range of threats. We find a need for developer-centered, usable mechanisms for telling developers how to store passwords: lists of what they must do are not working.
Usage Patterns of Privacy-Enhancing Technologies [CCS ’20]
Steady reports of privacy invasions online paint a picture of an Internet growing into a more dangerous place. This is supported by reports of the potential scale of online harms facilitated by the mass deployment of online technology and by the data-intensive web. While Internet users often express concern about their privacy, only some report taking action to protect it online.
We investigate the methods and technologies that individuals employ to protect their privacy online. In two studies (N=180 and N=907), we elicit individuals' use of privacy methods in the US, the UK, and Germany. We find that non-technology methods are among the most used methods in all three countries. We identify distinct groupings of privacy-method usage in a cluster map: together with non-technology methods of privacy protection, simple privacy-enhancing technologies (PETs) that are integrated into services form the most used cluster, whereas more advanced PETs form a separate, least used cluster. In a third study with N=183 participants, we further investigate users' perceptions and their reasoning for mostly using one set of PETs. We find no difference in perceived competency in protecting privacy online between users of advanced and simpler PETs. We compare perceptions of use between advanced and simpler PETs and report on users' reasons for not using advanced PETs, as well as the support they would need in order to use them. This paper contributes to privacy research by eliciting use and perceptions of use across 43 privacy methods, including 26 PETs, in three countries, and provides a map of PETs usage. The cluster map offers a systematic and reliable point of reference for future user-centric investigations across PETs. Overall, this research provides a broad understanding of use and perceptions across a collection of PETs and can inform future research on scaling the use of PETs.