

This paper is included in the Proceedings of the Twelfth Symposium on Usable Privacy and Security (SOUPS 2016).

June 22–24, 2016 • Denver, CO, USA

ISBN 978-1-931971-31-7

Open access to the Proceedings of the Twelfth Symposium on Usable Privacy

and Security (SOUPS 2016) is sponsored by USENIX.

An Inconvenient Trust: User Attitudes toward Security and Usability Tradeoffs for

Key-Directory Encryption SystemsWei Bai, Doowon Kim, Moses Namara, and Yichen Qian, University of Maryland, College Park; Patrick Gage Kelley, University of New Mexico; Michelle L. Mazurek,

University of Maryland, College Park

https://www.usenix.org/conference/soups2016/technical-sessions/presentation/bai


An Inconvenient Trust: User Attitudes Toward Security and Usability Tradeoffs for Key-Directory Encryption Systems

Wei Bai, Doowon Kim, Moses Namara, Yichen Qian, Patrick Gage Kelley,∗ and Michelle L. Mazurek
University of Maryland, ∗University of New Mexico

{wbai, doowon, mnamara, yqian1, mmazurek}@umd.edu, ∗[email protected]

ABSTRACT
Many critical communications now take place digitally, but recent revelations demonstrate that these communications can often be intercepted. To achieve true message privacy, users need end-to-end message encryption, in which the communications service provider is not able to decrypt the content. Historically, end-to-end encryption has proven extremely difficult for people to use correctly, but recently tools like Apple's iMessage and Google's End-to-End have made it more broadly accessible by using key-directory services. These tools (and others like them) sacrifice some security properties for convenience, which alarms some security experts, but little is known about how average users evaluate these tradeoffs. In a 52-person interview study, we asked participants to complete encryption tasks using both a traditional key-exchange model and a key-directory-based registration model. We also described the security properties of each (varying the order of presentation) and asked participants for their opinions. We found that participants understood the two models well and made coherent assessments about when different tradeoffs might be appropriate. Our participants recognized that the less-convenient exchange model was more secure overall, but found the security of the registration model to be "good enough" for many everyday purposes.

1. INTRODUCTION
As important communications become primarily digital, privacy becomes an increasingly critical concern. Users of communication services (e.g., email and chat) risk breaches of confidentiality due to attacks on the service from outsiders or rogue employees, or even government subpoenas. The only way to truly assure confidentiality is to use encryption so that the communication service has no access to the content. Despite considerable evidence of and front-page reporting about content breaches [3, 5, 9, 21, 24], encryption has generally not been widely adopted for person-to-person communications such as email and chat [20].

Researchers have given considerable thought to the reasons for this lack of adoption. More than 15 years of research have identified major usability problems with encryption tools, ranging from poorly designed user interfaces to the fundamental challenges of safe and scalable key distribution [16, 33, 35, 43].

Copyright is held by the author/owner. Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee.
Symposium on Usable Privacy and Security (SOUPS) 2016, June 22–24, 2016, Denver, Colorado.

Recently, however, progress toward better usability and thus wider adoption has been made. Apple applied seamless end-to-end encryption to its iMessage and FaceTime services [2, 25]. By centrally distributing public keys, Apple ensures the encryption is transparent to users, bringing end-to-end encryption to millions of iPhone, iPad, and Mac users. This design, however, leaves open the possibility that Apple itself could carry out a man-in-the-middle attack to break its users' privacy, for example at the request of law enforcement authorities [8, 42]. Popular messaging app WhatsApp has also implemented end-to-end encryption for text, voice, and video communications [27]. As with iMessage, WhatsApp centrally distributes public keys; however, users can optionally verify each other's keys manually or via QR code. Google and Yahoo! are currently developing similar approaches, with an added monitoring protocol that allows users and third parties to audit the key directory for consistency and transparency [19, 30, 37]. Some privacy experts have suggested that given this potential man-in-the-middle attack, these services should not be recommended to end users. As just one example, one security researcher suggests that "iMessage remains perhaps the best usable covert communication channel available today if your adversary can't compromise Apple. ... If one desires confidentiality, I think the only role for iMessage is instructing someone how to use Signal¹" [42].

In a sense, the issue comes down to whether the benefit from many more people adopting encrypted communications is outweighed by the reduced security inherent in the central key distribution model. While security experts are best positioned to understand the technical differences between models, end users will ultimately be faced with the choice of which platforms and products to install and use. Researchers have considered the needs of some highly privacy-sensitive users, such as journalists and activists [18, 28]. To our knowledge, however, no one has asked average users for their opinions about these tradeoffs. This means that although security researchers may understand the risks and benefits of different tools, as a community we do not understand how an average user will weight different factors in deciding whether to adopt or ignore various encrypted communication technologies.

To understand how non-expert users feel about these tradeoffs, we undertook a 52-person lab study. We introduced participants to two encryption models: an exchange model in which participants manually exchange keys (analogous to traditional PGP) and a registration model in which participants sign up with a central service that distributes keys (analogous to iMessage). For each model, we asked them to complete several encrypted communication tasks; we also gave them a short, high-level explanation of each model's

¹ An encryption tool: https://whispersystems.org/. Last accessed on 05/16/2016.


security properties. (We varied the order of presentation to account for biases.) We then asked participants to comment on the security and usability of each model, as well as their overall opinion of the tradeoffs involved. The experiment was designed, insofar as possible, to avoid comparisons based on user-interface design and focus instead on the underlying properties of each encryption model.

We found that participants understood the two models fairly well and expressed nuanced insights into the tradeoffs between them. As predicted, participants found the registration model considerably more convenient than the exchange model. More interestingly, while the exchange system was considered more secure overall, the difference was slight: both general trust that large email providers would not risk their reputations by cheating and reasonable concerns about participants' own ability to correctly implement the exchange model mitigated this difference. Separately, we asked about half of our participants to evaluate the auditing model proposed in CONIKS [29], which is similar to that in development by Google and Yahoo!, and we found that for many users it provides a meaningful additional degree of confidence in the registration model's privacy.

Overall, our results suggest that users recognize the benefit of the exchange model for very sensitive communications, but find the more-usable registration model sufficient for the majority of everyday communications they engage in. While there are risks to this model, some of which can be alleviated by auditing, we argue that the marginal benefit of broad adoption will outweigh these risks. Historically, encryption schemes that require significant user effort have never gained broad popularity. Trying to convince average users to exclusively use more complicated schemes, when they often don't see a need for the added protection, may instead keep them away from using any encryption at all. Rather than spreading undue alarm about the risks of registration models, or forcing users into only exchange models, we recommend that policymakers and designers present tradeoffs clearly and encourage adoption of usable but imperfect security for the many scenarios where it may be appropriate.

2. BACKGROUND AND RELATED WORK
We briefly discuss the history of public-key-encrypted email systems and encryption usability studies.

2.1 A brief history of encrypted email
Diffie and Hellman proposed public-key cryptography in 1976, suggesting that a public directory would allow anyone to send private messages to anyone else; in 1978, the RSA algorithm made the idea practical [11]. In 1991, Phil Zimmermann developed PGP, which supported sending public-key encrypted email. In the second version, to alleviate the key verification problem, he proposed a "web of confidence" (later known as web of trust) for establishing key authenticity [44]. In a web of trust, users can sign each other's keys to endorse their authenticity, and can choose to accept keys that come with signatures from "trusted introducers." Despite this, key verification has remained problematic for many years.
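The web-of-trust acceptance rule can be sketched in a few lines. This is an illustrative toy, not code from PGP or this paper; the names and data are invented:

```python
# Toy web-of-trust check: accept a key if we verified it ourselves,
# or if a trusted introducer has signed it.

# Keys we verified directly, e.g. by manual exchange.
directly_trusted = {"alice"}

# signatures[owner] = set of users who signed that owner's key.
signatures = {
    "bob": {"alice"},      # Alice endorsed Bob's key
    "carol": {"bob"},      # Bob endorsed Carol's key
    "mallory": set(),      # nobody vouches for Mallory
}

def key_accepted(owner, trusted_introducers):
    """Accept a key if directly verified or signed by a trusted introducer."""
    if owner in directly_trusted:
        return True
    return bool(signatures.get(owner, set()) & trusted_introducers)
```

With Alice as the only trusted introducer, Bob's key is accepted (Alice signed it), but Carol's is not: Bob signed it, and we never designated Bob an introducer. This illustrates why trust decisions, not just signatures, remained a burden on users.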

In 1999, RFC 2633 defined Secure/Multipurpose Internet Mail Extensions (S/MIME), which takes a centralized approach to key distribution: all users have public-key certificates signed by a certification authority (CA), which are distributed along with any signed emails sent by that user [31]. S/MIME allowed straightforward integration of encryption into email clients like Microsoft Outlook and Netscape Communicator and was adopted by some corporate organizations with the capability to manage keys hierarchically, but was not adopted broadly by consumers.

More recently, several researchers and companies have explored ways to split the difference between completely decentralized and completely centralized key management. Gutmann proposed applying key continuity management, in which keys are trusted on first use but key changes are detected, to email [22]. In Apple's iMessage, private keys are generated on users' devices and the corresponding public keys are uploaded to Apple's proprietary directory service. To send a message to a user with multiple devices, the message is encrypted once for each device [1, 25]. WhatsApp uses a related approach based on the Signal Protocol, but allows users to confirm the authenticity of each other's keys if they choose to [27]. A recently reported vulnerability in the iMessage encryption mechanism points to the importance of validating the security of any end-to-end-messaging system [17]; however, this is orthogonal to our consideration of the underlying key exchange model.
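The key-continuity (trust-on-first-use) idea Gutmann proposed can be sketched as a pinning store: the first key seen for a contact is remembered, and any later change is flagged. This is a minimal illustration of the concept, not Gutmann's design:

```python
# Trust-on-first-use sketch: pin the first key observed per contact,
# and flag any subsequent change for the user to investigate.

pinned = {}  # contact -> first key observed

def check_key(contact, key):
    """Return 'new' on first use, 'ok' if unchanged, 'CHANGED' if it differs."""
    if contact not in pinned:
        pinned[contact] = key  # trust on first use
        return "new"
    return "ok" if pinned[contact] == key else "CHANGED"
```

The scheme trades the up-front cost of verifying keys for a weaker guarantee: an attacker present at first contact goes undetected, but a later substitution is caught.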

In certificate transparency, publicly auditable append-only logs can be used to determine whether rogue certificates have been signed using a stolen CA key [26]. Ryan extended this approach for end-to-end email encryption [34]. CONIKS extends certificate transparency to allow users to efficiently monitor their own key entries and to support privacy in the key directory [29]. Google and Yahoo! are adopting a variation of certificate transparency for their end-to-end encryption extension [19, 30]. Each of these approaches trades off a different amount of security for convenience.
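The append-only property at the heart of these systems can be illustrated with a hash chain, where each log head covers everything before it, so rewriting history changes every later head. (Real certificate transparency uses Merkle trees with efficient consistency proofs; this linear chain is a deliberate simplification.)

```python
import hashlib

# Append-only log sketch: each head hashes the previous head plus the
# new entry, so any retroactive edit is detectable by anyone who
# recorded an earlier head.

def append(log_heads, entry):
    prev = log_heads[-1] if log_heads else b"\x00" * 32
    head = hashlib.sha256(prev + entry.encode()).digest()
    log_heads.append(head)
    return head

heads = []
h1 = append(heads, "alice -> lock_A")
h2 = append(heads, "bob -> lock_B")

# An auditor who saved h1 can verify that h2 honestly extends it
# by replaying the claimed new entry.
replayed = hashlib.sha256(h1 + "bob -> lock_B".encode()).digest()
```

If the provider tried to silently swap Alice's entry after the fact, every subsequent head would change and the replay check would fail.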

Other researchers have considered alternatives to standard public-key encryption that are designed to be more usable. Fahl et al. proposed Confidentiality as a Service (CaaS) [13], which operates on a registration model mostly transparent to users. This approach uses symmetric cryptography and splits trust between the communications provider and the CaaS provider. Neither individually can read private messages, but if the two collude they can.
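The split-trust property can be demonstrated with XOR secret sharing: the message key is split into two shares, one per provider, and either share alone is indistinguishable from random noise. (Fahl et al.'s actual protocol differs; this sketch only shows why collusion is required to recover the key.)

```python
import secrets

# Split a key into two XOR shares. Each share alone reveals nothing;
# combining both (i.e., collusion) recovers the key.

def split(key):
    share1 = secrets.token_bytes(len(key))          # uniformly random
    share2 = bytes(a ^ b for a, b in zip(key, share1))
    return share1, share2

def combine(share1, share2):
    return bytes(a ^ b for a, b in zip(share1, share2))

key = secrets.token_bytes(16)   # the message key to protect
s1, s2 = split(key)             # one share per provider
```

Because `share1` is chosen uniformly at random, `share2` is also uniformly distributed, so neither provider learns anything about the key on its own.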

2.2 The usability of encrypted email
In 1999, Whitten and Tygar published the now-seminal Why Johnny Can't Encrypt: A Usability Evaluation of PGP 5.0 [43]. This paper evaluated the interface for PGP 5.0 and found that most users (two-thirds) were unable to successfully sign and encrypt an email in the 90-minute session. This led to a series of follow-on papers: evaluating PGP 9 (key certification is still a problem) [35], S/MIME and Outlook integration (KCM seems promising) [16], Facebook encryption (using CaaS) [14], and several others (e.g., [33, 39]). These studies largely ask users to do tasks they are unfamiliar with and focus on success rates (key pair generation and collection, sending and decrypting messages, etc.). They provide valuable insight into how effectively novices can learn a particular system, how specific user interface design choices impact users, and where the difficulties lie. However, users are rarely presented with multiple potential encryption infrastructure models. Ruoti et al. compared the usability of three email encryption systems using pairs of novice users [32], but this work did not consider the security tradeoffs of the systems users were evaluating.

Tong et al. re-evaluated the test of Johnny with a different set of terms and documentation, including using a lock-and-key metaphor for public and private keys [40]. In preliminary results, they found that the metaphors aided understanding. We adopt the lock metaphor in our study, as detailed below.

Researchers have also studied social and cultural norms that lead to aversion to encryption. Often users believe that they have no reason to encrypt their email because they have "nothing to hide," or because they cannot imagine anyone being interested in the messages they are sending [36]. In an interview study at an unnamed


non-violent, direct-action organization (which one might expect to be more interested in and aware of the benefits of encryption), Gaw et al. found that employees believed "routine use of encryption [was] paranoid [behavior]" [18]. In this work, we do not directly address social norms regarding encryption, but several participants did discuss paranoia and suggested using different systems to accommodate different levels of privacy concern.

McGregor et al. considered the specific security and encryption needs of journalists protecting confidential sources, finding that adoption is frequently driven by source preferences and that existing models do not meet some important journalistic needs, such as verifying the authenticity of sources [28]. Considering the needs and preferences of users with critical privacy sensitivity, such as activists and journalists, is an important topic, but is orthogonal to our emphasis on general users.

3. METHODOLOGY
We used a within-subjects lab study to examine participants' concerns and preferences regarding the usability and security of end-to-end email encryption. Each participant was introduced to two general models for key management: exchange and registration. For both models, we described a public key as a public lock. This approach, inspired by Tong et al., avoids overloading the term "key" and was used to provide a more intuitive understanding of how public-key pairs operate [40].
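The lock metaphor maps directly onto textbook public-key encryption: anyone may use the public lock to seal a message, and only the holder of the private key can open it. As a toy illustration (textbook RSA with tiny primes, entirely insecure and not anything used in the study):

```python
# Toy "public lock / private key" demo using textbook RSA with tiny
# primes. For intuition only; real systems use large keys and padding.

p, q = 61, 53
n = p * q                           # modulus, part of both lock and key
e = 17                              # public exponent: the "lock"
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent: the "key"

def seal(m):
    """Anyone can seal a message with the public lock (n, e)."""
    return pow(m, e, n)

def unseal(c):
    """Only the private key d opens a sealed message."""
    return pow(c, d, n)
```

Sealing requires only the public values, so locks can be handed out freely; the whole difficulty studied in this paper is making sure you have the *right* lock for your correspondent.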

In the exchange model, similar to traditional PGP, participants generate a key pair and then distribute the public locks to people they want to communicate with. We offered participants several methods for exchanging locks: the same email account they would use for encrypted communication, a secondary email account, posting the public lock on Facebook or sending it via Facebook Messages, or using a simulated "key server" to upload their lock to a public directory. (These options were presented to each participant in a random order.) Simulated correspondents (played during the study by a researcher) sent back their own public locks via the same mechanism the participant chose, or via the mechanism the participant requested.

In the registration model, participants again generate a key pair. In this case, they "register" their public lock with a simulated key directory service; correspondents' locks were pre-installed to simulate automatically retrieving them from the directory. Participants were thus able to send and receive encrypted email from all simulated correspondents immediately upon creating and registering their own keys. In iMessage, the key generation step itself is completely transparent to users, who may never realize a key was created; we chose instead to make key generation explicit to help users understand the process.

Within each model, participants were asked to complete a series of simulated tasks, such as exchanging encrypted emails in a role-playing scenario (see details below); they were also introduced to a brief, non-technical review of the security properties of each model. Participants were asked to give their opinions about each model immediately after completing the tasks and security learning for that model. We also conducted an exit interview regarding the overall usability and security of each model, whether participants would use it themselves or recommend it to others, and in what circumstances it might or might not be appropriate.

We chose a within-subjects study because we were primarily interested in how participants would understand and value the tradeoffs among the options. As shown in Table 1, we varied the order of activities to account for ordering effects. Participants were assigned round-robin to one of these four possible orders of activities.

First activity                        Second  Third  Fourth
ET (Exchange, Tasks)                  ES      RT     RS
ES (Exchange, Security learning)      ET      RS     RT
RT (Registration, Tasks)              RS      ET     ES
RS (Registration, Security learning)  RT      ES     ET

Table 1: The order of activities varied across participants. Each participant worked with either the Exchange (E) or the Registration (R) model first. Within each model, participants either completed the encryption Tasks (T) first or learned about Security properties (S) first. Throughout the paper, participants are labeled by first activity; e.g., participant RT3 completed encryption tasks for the registration model first.

3.1 Encryption tasks
The set of encryption-related tasks for each model is shown in Table 2. In both models, participants were asked to generate a key pair locally. In the exchange model, participants then exchanged public locks with simulated friend Alice, including both sending Alice their lock and importing the lock received in return. In the registration model, participants registered with a simulated central service and had their public lock automatically "uploaded" and others' locks automatically "imported." After the locks were exchanged or the participant registered, participants composed and sent an encrypted email to Alice. A researcher, posing as Alice, sent an encrypted response. As a slightly more complex task, participants were asked to send an encrypted email to a group of two recipients. This task was designed to get participants to consider how the two models scale. Finally, we asked participants to consider how they would handle several other situations, including communicating with larger groups of people and various possible errors related to losing or publicizing one's own private key or losing other users' public locks. The possible errors were specific to each model and are shown in Table 2. In the interest of simplicity, we did not include any email signing (or signature verification) tasks.
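The scaling question behind the group tasks has a standard answer in deployed systems: the body is encrypted once under a fresh session key, and only that small key is wrapped once per recipient. The sketch below illustrates the shape of that design; real systems wrap the session key with each recipient's public lock, and the XOR "wrap" here is a dependency-free stand-in:

```python
import secrets

# Group-email scaling sketch: one body ciphertext, N small key wraps.

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def send_group(body, recipient_locks):
    session_key = secrets.token_bytes(16)
    ciphertext = xor(body.encode().ljust(16), session_key)  # body encrypted once
    # One small wrap per recipient (stand-in for public-lock encryption).
    wrapped = {name: xor(session_key, lock) for name, lock in recipient_locks.items()}
    return ciphertext, wrapped

locks = {name: secrets.token_bytes(16) for name in ["alice", "bob", "carl"]}
ct, wrapped = send_group("hi", locks)

# Bob recovers the session key with his lock's counterpart, then the body.
bob_key = xor(wrapped["bob"], locks["bob"])
```

Adding a tenth recipient (task 8) thus costs one more key wrap, not another copy of the message; what does *not* scale automatically in the exchange model is obtaining all ten locks in the first place.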

Encryption tasks were completed using a Gmail account created especially for the study and a Chrome browser extension based on Mailvelope.² We modified Mailvelope to remove its branding, change the labels to match our lock/key metaphor, and reduce the interface to include only those features relevant to the study tasks. Figure 1, right, shows a screenshot of sending encrypted email with our extension. As in Mailvelope, users of our extension compose an email and then use an "Encrypt" button to select recipients. Upon receiving encrypted email, users are prompted to enter their password to decrypt it (with the option to save the password and avoid future prompting).

We created two versions of our extension, one for exchange and one for registration, taking care to make them as similar as possible. The only two visible differences were (1) changing the "Generate lock/key pair" menu item and subsequent screen (exchange model, Figure 1, left) to read "Register" (registration model) and (2) a lock import screen (Figure 1, center) that was only relevant in the exchange model.

We also provided participants with detailed instructions to help them use the Chrome extension. By simplifying the interface, keeping it consistent, and providing detailed instructions, we hoped participants' reactions would better reflect the inherent properties of

² https://www.mailvelope.com/. Last accessed on 05/16/2016.


Figure 1: To use our extension, participants first generated (or registered) a key pair. Participants using the exchange model then needed to import recipients' locks. Finally, when composing encrypted emails, they clicked the Encrypt button (shown in the lower right of Step 3) to bring up a modal dialog to select recipients.

Task #  Exchange Model                                Registration Model
1       Generate public lock/private key pair         Register public lock/private key pair
2       Exchange public locks with Alice              N/A
3       Send encrypted email to Alice                 Send encrypted email to Alice
4       Decrypt received email from Alice             Decrypt received email from Alice
5       Exchange public locks with Bob and Carl       N/A
6       Send encrypted email to Bob and Carl          Send encrypted email to Bob and Carl
7       Decrypt received email from Bob and Carl      Decrypt received email from Bob and Carl
8       Imagine sending encrypted email to 10 people  Imagine sending encrypted email to 10 people
9       Consider misconfigurations:                   Consider misconfigurations:
        a. Lose Alice's public lock                   N/A
        b. Lose own private key                       b. Lose own private key
        c. Publicize own private key                  c. Publicize own private key

Table 2: The encryption-related tasks completed by participants. The tasks differed slightly in the two models.

each model rather than idiosyncrasies of a particular interface.

3.2 Description of security properties
We provided participants with short, non-technical descriptions of possible attacks on each model.

Exchange model
For the exchange model, we described a man-in-the-middle attack in which the attacker could intercept or replace keys during the exchange process: "For example, when you try to get the public lock from Dave, the attacker secretly switches the public lock to his own. You think you have Dave's public lock, but in fact you have the attacker's. ... As a result, the attacker can read your email. The attacker will then use Dave's public lock and send the encrypted email to Dave, so that neither you nor Dave realize the email has been read." We also showed participants the illustration in Figure 2.

We decided not to include an option for key signing in our exchange model, both because we thought it would add unnecessary complexity to our explanations and because it does not change the underlying requirement to trust some keys that are manually exchanged.

Registration model
For the registration model, we primarily described a man-in-the-middle attack enabled by the key directory service: "When you try to send encrypted emails to Dave, you think the database will

Figure 2: Possible attacks on the exchange model

return Dave's public lock to you. But in fact, it returns the attacker's lock, so the attacker can read your email. Therefore, you need to trust the email provider in this system." We showed participants the illustration in Figure 3.
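The attack described to participants can be reduced to a one-line substitution in a toy directory. The names below are invented for the sketch; the point is that the sender has no local information with which to distinguish the honest answer from the malicious one:

```python
# Toy key directory illustrating the registration-model attack:
# whoever runs the directory chooses which lock the sender receives.

directory = {"dave": "lock_dave"}

def honest_lookup(name):
    return directory[name]

def malicious_lookup(name, attacker_lock="lock_attacker"):
    # The provider silently answers every query with the attacker's lock;
    # mail sealed with it is readable by the attacker, who can re-seal
    # and forward it so neither party notices.
    return attacker_lock
```

Both functions have the same interface and return a plausible-looking lock, which is exactly why the registration model requires trusting the provider.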

In addition, we described two variations on the basic key directory approach: the Confidentiality as a Service (CaaS) variation [13, 14], and an auditing model similar to the one proposed by Google and CONIKS [19, 29]. Because these approaches are not currently in wide use the way the iMessage-analogous system is, they were treated as secondary options. The auditing model was added (to the end of the interview, to maintain consistency with earlier interviews) during recruiting, and was therefore presented only to 24


Figure 3: Possible attacks on the registration model

participants.

The security of the CaaS variation was described as follows: "There is a third-party service (not the email provider) as an intermediary. In this version, neither the third-party service nor your email provider can read your email themselves. However, if your email provider and the third-party service collaborate, they can both read your email. Therefore, you need to trust that the two services are not collaborating."

We described the auditing variation as follows: "The email provider stores all users' public locks, just like [the primary registration model]. But there are other parties (auditors) who audit the email provider, to ensure it is giving out correct public locks. These auditors may include other email providers, public interest groups, and software on your devices. If the email provider gives you a public lock that doesn't belong to the recipient, or gives someone else the wrong public lock for you, these auditors will notify you. You (or someone else) may use the wrong lock temporarily (for an hour or a day) before you are notified. In this model, you don't need to trust your email provider, but you need to trust the auditors and/or the software on your device. Because there are several auditors, even if one auditor does not alert you, another one probably will."
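The auditing idea described to participants can be sketched as a cross-check: auditors query the directory and compare its answer with the lock the owner says is genuine. This is a simplified illustration of the concept, not the CONIKS protocol (which uses Merkle proofs rather than out-of-band comparison):

```python
# Auditing sketch: a substituted lock is caught as soon as any honest
# auditor compares the directory's answer to the owner's genuine lock.

def audit(directory, owner, genuine_lock, auditors):
    """Return the list of auditors that raise an alert."""
    return [a for a in auditors if directory.get(owner) != genuine_lock]

honest = {"dave": "lock_dave"}
tampered = {"dave": "lock_attacker"}
```

Against the honest directory no alerts fire; against the tampered one every auditor that checks raises an alert, which is why multiple independent auditors give the redundancy described above.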

3.3 Participant feedback
Participants were asked questions after completing tasks for each model and at the end of the process. After completing tasks and learning about security for each model, participants were asked for their agreement (on a five-point Likert scale) with the following statements:

• The task was difficult (for each task).
• The task was cumbersome (for each task).
• The system effectively protected my privacy.

The first two questions were repeated for each task in Table 2. Before answering, participants were reminded that difficult tasks would require intellectual effort or skill, while cumbersome tasks would be tedious or time-consuming. After each Likert question, we asked participants to briefly explain their answer choice (free response).

After completing all tasks and learning about all security models, participants were asked several summative questions, including:

• Willingness to use each system, on a five-point Likert scale, and why.
• Willingness to recommend each system, on a five-point Likert scale, and why.
• What the participant liked and disliked about each system.

3.4 Recruitment

We recruited participants 18 or older who were familiar with Gmail and Chrome and who send and receive email at least 3 times per week. We placed flyers around our university campus and the surrounding area, advertised via email listservs for the university, and advertised on web platforms like Craigslist. All interviews were conducted in person at our university campus; interviews were video recorded with the explicit consent of participants. Participants were paid $20 for a one-hour study and were reimbursed for parking if utilized. Our study protocol was approved by the university's Institutional Review Board.

Participants took part in the study in multiple batches between August 4, 2015 and February 5, 2016. For context, all of the participants engaged in the study well after Edward Snowden revealed details of the National Security Agency's broad surveillance of digital communications [12], but before Apple publicly fought the Federal Bureau of Investigation to not weaken the security of a locked and encrypted iPhone [4]. We include this to note that average users' views of security, privacy, and, here specifically, encryption are a moving target. Future events may continue to shift public opinion on the importance of encrypted communications.

3.5 Data analysis
We used statistical analysis to investigate participants' responses to the exchange and registration models. To account for our within-subjects design, we used the standard technique of including random effects to group together responses from each participant. We used a cumulative-link (logit) mixed regression model (CLMM), which fits ordinal dependent variables like the Likert scores we analyzed [23]. We included three covariates: whether the participant performed tasks or learned about security first, whether the encryption model she was evaluating was seen first or second, and the encryption model itself (exchange or registration). This approach allows us to disentangle the ordering effects from the main effects we are interested in. For each encryption model, we tested regression models with and without the obvious potential interaction of encryption type with order of exposure to that type, selecting the regression model with the lower Akaike information criterion (AIC) [6].
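The AIC selection rule is AIC = 2k − 2·ln(L), where k is the number of fitted parameters and L the maximized likelihood; the model with the lower AIC is kept. A minimal sketch (the log-likelihood values below are invented for illustration, not taken from the paper's fits):

```python
# AIC model-selection sketch: an extra parameter must buy more than one
# unit of log-likelihood to lower the AIC.

def aic(k, log_likelihood):
    return 2 * k - 2 * log_likelihood

# Hypothetical fits: with vs. without the interaction term.
without_interaction = aic(k=5, log_likelihood=-120.0)
with_interaction = aic(k=6, log_likelihood=-119.6)

# Here the interaction improves fit by only 0.4 log-likelihood units,
# less than its 1-unit parameter cost, so the simpler model is selected.
chosen = "without" if without_interaction <= with_interaction else "with"
```

This penalty on parameter count is what keeps the procedure from always preferring the richer model with the interaction term.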

Qualitative data was independently coded by two researchers using textual microanalysis [38]. After several iterative rounds of developing a coding scheme, the researchers each independently coded the full set of participant responses, with multiple codes allowed per response. The researchers originally agreed on more than 94% of the codes, then discussed the instances of disagreement until consensus was reached. Where appropriate, we report prevalence for the final qualitative codes to provide context.

3.6 Limitations

Our methodology has several limitations. Our lab study participants had only limited exposure to the different encryption models, and their opinions might change after working with the models for a longer period. Participants also only imagined their responses to misconfigurations, rather than actually handling them. Nonetheless, we argue that first impressions like the ones we collected influence whether people will try any tool for long enough to develop more-informed opinions. It is well known that study participants may rate tools they examine more favorably (acquiescence bias) [41], which may explain the high rate of participants reporting they wanted to use or recommend each model. Because we are primarily interested in comparing results between models, we believe this has limited impact on our overall results; however, the absolute ratings should be interpreted as a ceiling at best.




In order to provide participants with any understanding of the security properties of each model, we had to prime them with descriptions of possible attacks. While this priming was unavoidable, we endeavored to keep the security descriptions as neutral as possible so that priming would affect both models approximately equally.

To avoid overwhelming participants, we evaluated a limited subset of possible encryption models and possible tasks; in particular, we left out key signing as well as any email signing or signature verification tasks. We did this because we believe signing to be the most difficult aspect of cryptography for non-experts to understand (see e.g., [43]), but including it might have provided a broader spectrum of user opinions.

Our registration model, unlike for example iMessage, was not completely invisible to participants. We believe it was necessary to give participants something to do other than just sending a normal email, in order to help them think through the tradeoffs involved. While presumably using a fully transparent variation would only have increased the convenience gap between the two models, prior work indicates that taking any steps at all increases feelings of security [33]. This may have contributed to the small observed security gap between the two models, but we argue that a version with no intervention required would lead to underestimations of security.

Because we added the auditing model late, we were not able to get as much feedback about it or to compare it quantitatively to the other models we examined. In addition, because all participants encountered it last, their responses may reflect some ordering effects. Nonetheless, we believe the qualitative data we collected does provide interesting insights. Future work can examine all these alternatives in more detail.

As with many lab studies, our participants do not perfectly reflect the general population, which may limit the generalizability of our results.

4. PARTICIPANTS

A total of 96 people completed our pre-screening survey. We interviewed the first 55 who qualified and scheduled appointments. Three participants were excluded for failing to understand or respond coherently to any directions or questions.

Demographics for the 52 participants we consider are shown in Table 3. Among them, 60% were male and 80% were between the ages of 18-34, making our sample somewhat more male and younger than the general American population. Almost 85% of participants reported “primarily” growing up in the United States, South Asia, or East Asia. 40% of participants reported jobs or majors in computing, math, or engineering.

Despite this high rate of technical participants, most had little experience with computer security. We measured security expertise using a slightly adapted version of the scale developed by Camp et al. [7]. Higher scores indicate security expertise; the maximum score is 5.5 and the minimum score is zero. Only two of our participants scored 3 or higher.

Using a Kruskal-Wallis omnibus test, we found no significant differences among our four conditions in age, gender, country of origin, or security expertise (p > 0.05).

5. RESULTS AND ANALYSIS

We present participants’ reactions to the convenience and security of each model, followed by a discussion of their overall preferences among the models.

5.1 Registration is more convenient

ID     Gend.  Age    Occupation       Expertise  Where grew up

ET1    F      25-34  Other            0          United States
ET2    F      45-54  Education        0.5        United States
ET3    M      21-24  Education        1.5        United States
ET4    M      25-34  Education        2          Middle East
ET5    M      21-24  Computers/math   1          South Asia
ET6    M      25-34  Engineering      2          East Asia
ET7    M      45-54  Life Sciences    2          United States
ET8*   M      18-21  Engineering      0.5        East Asia
ET9*   F      21-24  Computers/math   1          South Asia
ET10*  F      35-44  Computers/math   2          United States
ET11*  M      35-44  Transportation   0.5        United States
ET12*  M      21-24  Health Care      1.5        United States
ET13*  M      21-24  Social Service   0.5        Western Europe

ES1    M      35-44  Engineering      0          United States
ES2    M      21-24  Sales            0.5        United States
ES3    F      25-34  Health Care      0.5        United States
ES4    M      21-24  Computers/math   4          South Asia
ES5    M      21-24  Computers/math   1          East Asia
ES6    M      25-34  Computers/math   1.5        South Asia
ES7    F      21-24  Education        0.5        United States
ES8*   M      25-34  Engineering      0.5        East Asia
ES9*   F      21-24  Engineering      1          South Asia
ES10*  M      25-34  Engineering      1          United States
ES11*  F      45-54  Business         0.5        United States
ES12*  F      21-24  Communications   0          United States
ES13*  F      25-34  Education        0.5        Latin America

RT1    M      25-34  Computers/math   3          East Asia
RT2    F      25-34  Sales            0.5        United States
RT3    M      21-24  Engineering      2.5        South Asia
RT4    F      21-24  Engineering      1.5        United States
RT5    M      21-24  Business         2          East Asia
RT6    F      25-34  Other            1.5        United States
RT7    F      25-34  Health Care      0          United States
RT8*   F      18-20  Sales            0.5        United States
RT9*   M      18-20  Education        0.5        United States
RT10*  M      25-34  Engineering      2          Middle East
RT11*  F      35-44  Admin. Support   0.5        United States
RT12*  M      35-44  Admin. Support   0.5        United States
RT13*  M      21-24  Production       0          United States

RS1    M      21-24  Other            1          East Asia
RS2    M      25-34  Life Sciences    1.5        Middle East
RS3    M      21-24  Computers/math   0          Africa
RS4    M      21-24  Computers/math   0.5        South Asia
RS5    M      25-34  Life Sciences    2          Middle East
RS6    M      25-34  Other            0.5        United States
RS7    F      25-34  Health Care      0          United States
RS8*   F      45-54  Sales            0          United States
RS9*   F      25-34  Engineering      1.5        East Asia
RS10*  M      21-24  Engineering      1          United States
RS11*  M      25-34  Architecture     0.5        United States
RS12*  F      25-34  Life Sciences    0.5        United States
RS13*  M      25-34  Construction     0          United States

Table 3: Participant demographics. The columns show: participant identifiers (coded by activity order), gender, age, occupation, security expertise, and place where the participant grew up. The * indicates participants who were exposed to the auditing model.

Unsurprisingly, our participants found the registration system considerably more convenient, rating the exchange system as significantly more cumbersome and more difficult. Figure 4 and Tables 4 and 5 show the results of the CLMM for cumbersome and difficult, respectively, for Task 8: imagining sending email to a group of 10 people. In reading the CLMM tables, the exponent of the coefficient indicates how much more or less likely participants were to move up one step on the Likert scale of agreement.




Figure 4: Participants’ ratings of difficulty and cumbersomeness (Task 8) as well as whether participants thought the model protected their privacy. Labels indicate which model participants evaluated along with whether they saw that model first or second; e.g., “Exchange, First” indicates ratings for the exchange model among those who saw it first, which includes ET and ES participants.

For cumbersomeness, the exchange model was associated with almost a 20x increase in likelihood of indicating more agreement. The exchange model was also about 5x more likely to be perceived as more difficult.

Factor         Coef.   Exp(coef)  SE     p-value
tasks first     0.010  1.010      0.589  0.986
second model   -0.166  0.847      0.393  0.673
exchange        2.978  19.656     0.567  <0.001*

Table 4: Regression table for cumbersomeness, Task 8. The non-interaction model was selected. Non-significant values are greyed out; significant values are indicated with an asterisk.

Factor         Coef.   Exp(coef)  SE     p-value
tasks first    -0.282  0.754      0.747  0.706
second model   -0.726  0.484      0.460  0.115
exchange        1.674  5.333      0.520  0.001*

Table 5: Regression table for difficulty, Task 8. The non-interaction model was selected. Non-significant values are greyed out; significant values are indicated with an asterisk.
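The Exp(coef) columns in the regression tables are simply the exponentiated coefficients, and the odds-ratio reading can be checked directly; small discrepancies against the printed tables come from the coefficients being rounded to three decimals.

```python
import math

# Odds-ratio interpretation of a cumulative-link (logit) model:
# exp(coefficient) is the multiplicative change in the odds of moving up
# one step on the Likert agreement scale. Coefficients are taken from the
# "exchange" rows of Tables 4 and 5.
for outcome, coef in [("cumbersome (Task 8), exchange", 2.978),
                      ("difficult (Task 8), exchange", 1.674)]:
    print(f"{outcome}: exp({coef}) = {math.exp(coef):.2f}")
```

This reproduces the roughly 20x (cumbersome) and roughly 5x (difficult) odds increases reported for the exchange model.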

Participants’ comments generally supported this finding: the exchange model was dramatically more cumbersome and somewhat more difficult. Within the exchange model, the most tedious task was manually exchanging locks, and the most commonly mentioned reason was waiting for a correspondent’s public lock. ES9 was concerned that the exchange model was “time-consuming, especially sending urgent emails. I have no choice but to wait for” the correspondent’s public lock. RS5 agreed, saying “There are so many steps to exchange locks.” RS13 mentioned that the cumbersomeness of exchanging locks was mainly related to initialization: “If their locks are already there, it would not be cumbersome. But if I have to ask them to send me locks person by person, it’s more cumbersome.” One participant (ET10) worried it would be additionally cumbersome to use the exchange model on a phone.

Several participants expressed concern that users with low digital literacy might have trouble with the exchange model or prefer the registration model. For example, RS12 recommended the registration model “especially to people that don’t know very well how to use a computer . . . old people, like my father.” ET2 agreed that the registration model is “easy to teach others to use.”

While few participants considered any of the tasks very difficult, choosing a mechanism for exchanging locks was considered the most difficult step by a few participants, such as RS4, who mentioned having to “think about a safe way to exchange public locks,” and RS10, who was concerned about making an exchange error while multitasking.

Other concerns related to the general issue of convenience included scalability and misconfiguration. As RT9 said, “When I send to more people, I have to be very careful, especially when I choose to send them my public locks separately. I need to control every step is correct.” ET13 said, “When I exchange locks with ten people, I can send my lock, which is kind of easy. But I have to get ten replies for their locks. I can easily get lost. And if I exchange with 100 people, it’ll be a nightmare.” A few participants were concerned about the difficulty of recovering from misconfiguration, and ET10 was particularly worried that others’ mistakes could cause additional hassle for her: “If other people lose their private keys and send me new public locks, I will be overwhelmed.” RS12 agreed that “if accidents or mistakes happen, it bothers both parties to do extra steps.”

The inconvenience of the exchange model could potentially be mitigated somewhat by posting the key publicly or semi-publicly (on a key server or Facebook profile), rather than sending it individually to different recipients. About a third of our participants chose this option: 34 used the primary email, 20 used the secondary email, 10 used Facebook chat, five posted to the Facebook profile, and 13 used the key server. (Some participants chose multiple methods during different tasks.) However, few of the participants who used the public or semi-public methods mentioned the added convenience as a reason for their choice. RT12 said exchanging locks is “not too cumbersome, it’s manageable through the lock server to exchange locks.” On the other hand, a few participants chose the key server because they thought it was more secure than other choices we provided.

5.2 The perceived security gap is small

We found that participants understood and thought critically about the security properties we explained to them for each model. Surprisingly, they found the exchange model to be only marginally more secure than the registration model, for a variety of reasons.




Exchange: Manual effort may lead to vulnerability

Most participants believed the exchange model was most secure overall, with 48 (out of 52, 92.3%) agreeing or strongly agreeing that this model protected their privacy. Nonetheless, participants also expressed concern that managing key exchange themselves would create vulnerabilities. More than half (27 out of 52) of participants were concerned about the security of the medium used to exchange locks. ET4 worried that “the key server [could] be manipulated or compromised.” RT7 had several such concerns, including that an attacker could break into her Facebook account to post an incorrect public lock, or that public Wi-Fi in a coffee shop could be unsafe for transmitting locks. Overall, she said, “There are too many exchanges between different people. Exchanging [locks] to many people may go wrong.” Others, like RS5, worried that their internet service provider could “sit between my recipient and me” and switch locks to execute a man-in-the-middle attack. ET7 was one of several participants who noted that “If I send the public locks and encrypted emails using the same email provider, it’s not very secure.” RT9 thought the ability to choose from different mechanisms to exchange locks provided added security, but worried that “people may choose a particular way in real life. It’s their habits, so that attackers may anticipate” their choices and take advantage of their known routine. ES10 asked his recipients to send back his public lock, both through Facebook and via email, so he could verify for himself that the received public locks were not altered.

Other participants were concerned about making a mistake during the ongoing responsibility of managing keys. As ET10 put it, “Every time when I send or get a public lock ... there is a probability, even though not high, that my privacy is compromised. Then when I exchange public locks with many people, this probability will increase exponentially.” RS12 worried that “I don’t know what I actually need to do when I lose or publicize my private key. I am not confident about my answers. Non-tech experts may make mistakes.”

Other participants mentioned that careless or compromised users could ruin the security of a whole group. ES12 said, “If I send to Alice, and she decrypts and goes away, then other people can see the email or even copy that email.” ET8 said that “Within a company, if one person is hacked, then the whole company is hacked. It’s hard to track the source, just like rotten food in the refrigerator.” ET4 agreed that “There can be attacks on users with weak security, which may impair the whole user system.”

Registration: Some concern but generally trusted

As expected, many participants were concerned about the need to trust email providers in the registration model. As ES5 said, having the email provider store “all public locks ... is not very comfortable.” Despite this, however, most participants (38 out of 52) trusted the system to protect their privacy. Also, the CLMM results in Table 6 and Figure 4 indicate that the order in which the models were introduced played a significant role. Participants who saw the registration model first were more comfortable with it: 9 of 26 who saw registration first strongly agreed that the model protected their privacy, compared to only 3 of 26 who had already heard about the more-secure exchange model. None of the participants who saw registration first disagreed that the model protected their privacy, while 3 did so after seeing the exchange model first.

This general confidence in the registration model reflects many participants’ belief that even though email providers could compromise the security of the primary registration model, they would be unlikely to. Ten participants mentioned that they trust their own email provider (presumably if they didn’t they would switch services). ET11 mentioned that his email provider “knows me, I have my name there,” and ET12 said that “All public locks are stored in a database, and I trust the database. This database provides extra security.” ET13 provided a slightly different view: that some email providers are untrustworthy in well-known ways. “Everyone knows the Gmail potential vulnerabilities. And some people who are particularly hiding some information from the U.S. government, they will choose Yandex email from Russia, because they’d rather be intercepted by the Russian government, instead of the U.S. government. . . . If you are an activist in US, and you don’t want the U.S. government to know what you are up to, so I will choose some email services I feel comfortable with.”

Several (7 participants) were specific about which kind of providers they would trust: RT8 would trust “certain big companies, not small companies,” because big companies must protect their reputations. RT10 felt similarly, with an important caveat, mentioning that big companies like “Google and Yahoo! don’t do such things [violate users’ privacy] usually, unless the government forces them to do so. In general, it’s secure.” ET11 would choose an email provider with many users since “the more people using it, the more reliable.” RT2, on the other hand, preferred to trust institutions like universities that “own their own email server” to better protect her privacy.

Also contributing to the general comfort level with the registration model is that participants do not believe most or any of their communication requires high security. RT4 said “encryption is not necessary for me,” and RS8 agreed, saying “If I have some private information, I won’t put it on the Internet.”

CaaS and auditing: Some additional perceived security for registration

Twenty-two participants preferred the CaaS variation to the primary registration model, and 12 preferred the primary model to CaaS; the rest rated the two variations the same. The most popular explanation for preferring CaaS was a belief that different companies would not collude. RS7 said that the two parties would not collude because they do not “even trust each other.” ES12 was cautiously positive, saying “This separation makes me feel good. However, [the two parties] still can possibly collaborate.” Relatedly, ES8 suggested that the CaaS approach was more secure because “If one party is screwed up, you have another one to protect [your email]. You are still safe.” These comments have implications for the auditing model as well; belief that different parties are unlikely to collude and recognition that distributing trust spreads out possible points of failure would also point to more trust in the auditing model.

On the other hand, four users thought the primary registration model was more secure than the CaaS variation because adding more intermediate systems and providers reduces overall security. RS1,

Factor                     Coef.   Exp(coef)  SE     p-value
tasks first                -0.332  0.717      0.527  0.528
second model               -1.684  0.186      0.699  0.016*
exchange                   -0.288  0.750      0.670  0.668
second model :: exchange    2.818  16.740     1.124  0.012*

Table 6: Regression table for privacy. The interaction model was selected. Non-significant values are greyed out; significant values are indicated with an asterisk.




for example, said that “involving more systems may complicate the system, so it is less trustful.” A few users said that the possibility of collaboration invalidates the entire model: for example, ES13 said “I don’t trust that much the whole system. I am afraid they may collaborate.” Two participants (ET4, RS13) were afraid that in CaaS the two parties might collaborate for a sufficiently large gain. For example, RS13 said “They will not collaborate for one or two person’s email, but for many, a group of people.”

Other participants were concerned about whether the third party in CaaS was trustworthy. ET11 worried that “the third party service is not verified,” and RT9 said his opinion “depends on who the two entities are. If the two companies are big names, like Gmail and Facebook, it seem more secure. Also if they do different types of services [from each other], it’s more secure.”

The 24 participants who were briefly exposed to the auditing variation gave generally positive feedback. ES9 was happy that “somebody is supervising” lock distribution and watching for problems, and ET13 said “Obviously it’s extra secure. Other parties are verifying it, like an anti-virus system telling me if something goes wrong.” ET8 appreciated that “if something goes wrong, I will be notified.” The presence of many auditors reassured participants that collusion was unlikely; for example, RT10 commented that “it’s less likely that all auditors [would] do something bad,” and RS12 appreciated that “there are many auditors who can notify me.”

Several participants, however, were concerned about the reliability of the auditors: RS9 said, “I want to know who these auditors are, . . . their reputations, and whether they are truly independent.” Similarly, RT13 said, “Am I able to choose auditors? This is a big question. The principle is good . . . but I want to know who they are and how to choose them, because I need to trust them.” One user (ET10) was concerned that auditors from competing companies might have incentives to lie about each others’ behavior, making it hard to know who to trust. According to ET11, involving more parties reduced the overall trustworthiness: “Putting trust to only one party is better.”

Ten participants expressed concern about the time lag for notification, noting that “a lot emails have already been sent” with even an hour’s delay (ES10). RT11 said “It should be immediate notification. Even an hour is too late. . . . Something bad has already happened.” Others, however, were more pragmatic: “Immediate notification is ideal, but I don’t expect immediateness in reality” (RT9). ET13 said the time lag “is a vulnerability. It depends on how often I send encrypted emails. If I use it very often, then it’s vulnerable.” Similarly, RT12 pointed out that “If I don’t send the email, it doesn’t matter, but in this case, I don’t receive the wrong locks. . . . Notification happens after the fact that I already received the wrong lock.”

5.3 Overall comparison between systems

After exposing them to both models, we asked participants whether they would use or recommend the exchange model, the primary registration model, or the CaaS registration model. Figure 5 shows that the exchange model and CaaS variation were slightly preferred to the primary registration model. The number of participants who agreed or strongly agreed to use or recommend each model were 27, 23, and 28 (use) and 29, 21, and 28 (recommend). The CLMM results (Tables 7 and 8), which take the exchange model as a baseline, show no significant difference between exchange and either variation of registration for would-use, but do show that the primary registration was recommended less frequently than the exchange model. The 95% confidence intervals for each model indicate no

Figure 5: Participants’ ratings of whether they would use or recommend each model.

significant differences between the primary and CaaS registration models in either case.

The regression models also indicate that participants who completed the encryption tasks before hearing about security properties were less likely to use or recommend any model than those who heard about security properties first. We hypothesize that participants who used the encryption extension before hearing about security anchored on the inconvenience of the tool rather than its privacy benefits. While this does not provide useful insight about comparing the different systems, it does underline the need for careful consideration about how new encryption tools are presented to the public.

Factor                   Coef.   Exp(coef)  SE     p-value
tasks first              -0.606  0.546      0.308  0.049*
second model             -0.026  0.975      0.291  0.930
registration (primary)   -0.376  0.687      0.358  0.294
registration (CaaS)      -0.077  0.926      0.360  0.823

Table 7: Regression table for whether participants would use each model. The non-interaction model was selected. Exchange is the base case for model type. Only whether participants completed tasks first or heard about security first was significant.

Factor                   Coef.   Exp(coef)  SE     p-value
tasks first              -0.678  0.508      0.303  0.025*
second model             -0.198  0.820      0.291  0.496
registration (primary)   -0.915  0.401      0.368  0.013*
registration (CaaS)      -0.490  0.613      0.366  0.180

Table 8: Regression table for whether participants would recommend each model to others. The non-interaction model was selected. Exchange is the base case for model type. Primary registration is significant (less recommended vs. exchange), while CaaS is not significantly different from exchange. Participants who completed the encryption tasks before hearing about security properties were significantly less likely to recommend any model.

We asked participants why they would or would not use each system, and categorized each participant’s self-reported most important reason as related to security, usability, or both. (Details of participants’ usability and security opinions for each system were discussed in Sections 5.1 and 5.2 respectively.) For participants who would not use a system, we also included having no need for




Figure 6: The most significant reason why participants would and would not use each model. We note here that while the number of participants who would use each system is similar, their reasoning varies. For example, prospective users of the exchange model uniformly cite security, while prospective users of the two registration models cite a mixture of security and usability.

encryption as a separate category. Figure 6 shows the results. Some participants gave more than one answer; a few did not give meaningful responses.

Unsurprisingly, the perception of better security attracted participants to the exchange model, while poor usability drove them away. Participants’ reactions to the two registration models were more complicated. In both cases, insufficient security was the most common reason for rejecting the systems; however, participants who said they would use the primary registration model were evenly split between whether its usability or its security was more important. Participants who said they would use the CaaS model largely but not uniformly cited its security properties.

Participants who said they would use the exchange model generally described using it for high-security information only, or only at a small scale. ES6 exemplified this trend, saying the exchange model is “the safest one. I want to use it in small scale, like one or two people, ... like private and personal things. But I don’t want to use it every day.” RS9 felt similarly: “I think this system is more effective with fewer people, maybe under ten. I would use it when I send my credit card information to my Mom, instead of QQ or Wechat [two instant messaging services].” ES10 said he would use the exchange model for client projects, which should be kept secret until they are finished. Among the 27 participants who agreed they would want to use the exchange model, none mentioned using it with a large group; 16 said they would use it for very private information while only one said she would use it for general or everyday emails.

In contrast, participants who said they would use either variation of the registration model mentioned “contacting a large number of customers” for payroll or financial information (ET6) as well as “party and meeting announcements” (ET9, RS13). RT8 said she would use the registration model for information that was “overall private, but would not be a disaster if disclosed, e.g., my daughter is sick.” ES7, a teacher, said she would use the exchange model only for “extremely sensitive information, such as SSNs,” while she would use the registration model to send “location information or grade information.” In total, 15 participants who wanted to use either variation of the registration model mentioned general email or large-scale communications.

These results suggest that although most participants said they would use both systems at least sometimes, quite a few wanted encryption only in specific circumstances. Between the exchange and registration models, however, our participants found the registration model useful in a broader variety of circumstances.

Using vs. recommending

As expected, most participants (44) who said they would use a system also said they would recommend it to others, and vice versa, but a few gave interesting reasons for answering differently. ET4 said he would not use the exchange model because it was too cumbersome, but would recommend it to others who have stronger privacy requirements. Similarly, RT4 said that “encryption is not necessary for me,” but recommended the CaaS variation of the registration model because it is “easier to use [than the exchange model] and more secure than the vanilla [primary] registration system.”

Registration vs. no encryption

We did not explicitly ask participants to compare these encryption models to unencrypted email. However, 5 participants who had concerns about the security of the registration model (14 total rated it less than 4) also mentioned that it does provide a valuable security improvement over unencrypted email. ET7 said “The email is not raw, which is another layer of security. ... Doing encryption gives me a security sense that I lock my door myself.” RT12, explaining why he would use the primary registration model, noted that “I have to trust the email provider, which is problematic, but . . . it’s better than raw email.”

In line with findings from prior work [33], for some participants the process of taking any manual steps (such as generating a key pair in either model) increased their confidence that protection was occurring; for example, RS6 said “extra steps give me a security sense.”

Auditing model

We asked participants who heard about the auditing model whether they would use it; overall, it proved popular. Of the 24 participants who were introduced to the auditing model, 15 said they would like to use it. Of these, 10 preferred it to any other model discussed. For example, ES11 said, “It’s best among all systems mentioned in the experiment, because somebody else is policing them, just like watchdogs. If someone is reading your email, they might be caught.” RT8 preferred the auditing model to any other option because “unlike the other models . . . instead of using [the attacker’s] public lock blindly, I will get the update, ‘Oh, that’s the wrong public lock, you should not use this.’”

Four found the auditing model superior to the other registration models, but preferred the exchange model in at least some circumstances. RS10 said he would send personal information including banking data using the auditing model, but “if I worked in a government department, I would still use the exchange model.” RT12 said the audit model is “slightly better than [the primary] registration model . . . because in [the primary registration] model I don’t know if wrong locks happened. But overall, the lock exchange system has extra steps, extra layers of security, so I like it best among all the systems.” Several of these 15 participants noted the possible time lag in notification as an important disadvantage, but were willing to use the model anyway. This generally positive reaction, combined with the preference to split risk among different parties in the CaaS model, suggests that the auditing model has strong potential to meet with user approval.

Eight participants said they would not use the auditing model (one was unsure). One of these (RS11) preferred it to all other models but believed he had no need to encrypt his email, and three found it worse than the exchange model. Four said it was worst among all models discussed, either because they did not trust the auditors or because the time lag was too great.

USENIX Association 2016 Symposium on Usable Privacy and Security 123

5.4 Participant understanding

Despite receiving only a short introduction to each encryption system, most of our participants demonstrated thoughtful understanding of key concepts for each, suggesting that they provided credible opinions throughout the experiment.

Handling misconfiguration

We asked participants to consider how they would handle various possible misconfigurations in each model. Our primary goal was to prompt them to consider usability issues related to longer-term key maintenance, but this section of the interviews also offered a chance to evaluate participants’ understanding of the different security models. Most participants were capable of reasoning correctly about these error scenarios.

Participants were presented with five different misconfiguration scenarios across the two models (see Table 2). Thirty-nine of 52 participants (75%) responded to all five scenarios with a straightforwardly correct answer, such as asking Alice to resend a lost public lock (task 9a, exchange) or generating a new lock-key pair and redistributing the lock to all correspondents (task 9b, exchange). Seven additional participants (13.5%) provided such answers to at least three of the scenarios. One participant (RS13) mentioned recovering keys from a backup (such as a USB drive) rather than generating a new key pair.

We note several interesting misconceptions among those participants who got at least one scenario wrong. Four participants responding to task 9c (accidentally publicizing their own private key, in either model) suggested changing their password within the encryption extension; the password unlocks access to the private key, but a new password would not help if the key has already been exposed. Another participant (RS7) suggested for 9c that “I will send my email to a third person I trust, and ask that person to encrypt the email for me and send to my recipients. Similarly, he will decrypt the [response] email for me and forward it to me.” This shows interesting security thinking but misses the potential for the message to be captured during the first step. Other common answers included getting tech support from the company that developed the encryption extension (see footnote 3) and simply “I don’t know.”

Overall, participants were largely able to engage with these misconfiguration scenarios, demonstrating working understanding of the encryption tools; remaining misconceptions highlight areas in which more education, clearer directions in the tools, and more frequent use of encryption may be helpful.

Thinking about security

Our participants made several thoughtful points about encryption, security, and privacy that apply across models. ES4 mentioned that an extra benefit (of any encryption model) is a reduction in targeted ads: The “email provider can collect data through my emails, and then present ads. ... I don’t want that. [Using this tool] the ads will not appear.”

ES10 expressed concern that an email encryption provider (in either model) might collect your private key, especially if you are using Apple email on an Apple device or Google email in Chrome, etc. One participant (RS9) was concerned about using public computers. This is potentially a problem for both encryption models, which assume the private key is securely stored on the user’s local device. She was also concerned that the act of sending a lock might itself catch the interest of attackers; another participant (RS11) liked the sense of security provided by both encryption models but thought it might seem paranoid to worry about others reading his emails. Similar concerns were raised in [18]. ES12 expressed concern that the centralized nature of the registration model would provide a juicier target for an attacker than many individuals participating in the exchange model. ET10 worried that encryption would bypass an email provider’s virus-detection system.

Footnote 3: While completely reasonable in practice, this answer does not demonstrate understanding of the encryption model’s security properties and so was not counted as “correct” for this purpose.

Eleven participants liked that the exchange model allowed them to explicitly control who would be able to send them encrypted email. ES2 said he would “know the person whom I sent the public locks to,” and RT3 liked that “who can send me encrypted emails [is] controlled by myself.” RS13 said that “if I communicate with a group of people, it’s easy to kick someone out of the group.” A similar level of control can be implemented in a registration model; our findings suggest this is a feature at least some users value.

Although many participants understood and reasoned effectively about the security properties we presented, some retained incorrect mental models that have implications for the ongoing design of encryption systems. RS1 incorrectly believed that since he could not understand an encrypted message, no one else (including his email provider) would be able to either. Others were concerned about keeping their public locks secret in the exchange model; three split their locks across different channels in an effort to be more secure. For example, RS2 sent half of his public lock through the secondary email account and posted the other half on the key server. RT7 thought it would be insecure to store public locks: “After I send my lock to other people, others may not delete my public lock. ... I may also forget to do so after I import others’ locks. The fewer people know my public lock, the safer.” Relatedly, ES13 worried that in the auditing model, the auditors “are scanning my lock. It sounds like more people are watching me besides the email provider, and I don’t feel good.”

Several participants also had concerns and misconceptions about how keys are managed across multiple devices, regardless of model. System designers may want to provide help or information on these points.

Evaluating tradeoffs

In deciding which system(s) they preferred, participants explicitly and deliberately made tradeoffs among their security and usability features. For example, ES13 said he would use the exchange model because “Exchanging locks makes it more private for me,” despite the fact that “it takes time to exchange locks.” ES10 also preferred the exchange model: “Having something better than baseline is one approach. But if I compare to perfect security I am trying to get, it’s another approach. ... When you want to use it, you really want it to be very well protected.”

On the other hand, RT13, who said he would not use the exchange model, commented that “The negotiating process maybe gives me safer feelings, more protection. But on the other hand ... the disadvantage is it is time consuming, cumbersome, tedious, more complicated, and this is the price I have to pay for more protection.”

RS7 said she would use the primary registration model because it is “easy to use, and I think most of us trust our email provider,” although she understood that “there are some possible threats.” ET8, in contrast, would not use the primary registration model because “It’s easy to send encrypted emails, especially to many people. But security concern is the reason I don’t want to use it.” According to ES12, the exchange model “is more straightforward. Only I and the other person [recipient] get involved in the communication, and no others.” These comments and others demonstrate that participants understood pros and cons of the different models and thought carefully about how to balance them.

6. DISCUSSION AND CONCLUSION

We conducted the first study examining how non-expert users, briefly introduced to the topic, think about the privacy and convenience tradeoffs that are inherent in the choice of encryption models, rather than about user-interface design tradeoffs.

Our results suggest that users can understand at least some high-level security properties and can coherently trade these properties off against factors like convenience. We found that while participants recognized that the exchange model could provide better overall privacy, they also recognized its potential for self-defeating mistakes. Similarly, our participants acknowledged potential security problems in the registration model, but found it “good enough” for many everyday purposes, especially when offered the option to audit the system and/or split trust among several parties. This result is particularly encouraging for approaches like CONIKS and Google’s End-to-End extension, which spread trust among many potential and actual auditors. It is important to note that understanding the identities and motivations of third-party auditors was important to several of our participants, so making this auditing process as open and transparent as possible may prove important to its success.

We believe our results have important implications for designers of encryption tools as well as researchers, policymakers, journalists, and security commentators. First, our results suggest that it may be reasonable to explain in clear language what the high-level risks of a given encryption approach are and trust users to make decisions accordingly. The Electronic Frontier Foundation’s Secure Messaging Scorecard, which tracks the security properties of encryption tools, provides an excellent start in this direction [15]. Of course, participants in our study were directly instructed to read the materials we gave them; real users often have neither the time nor the motivation to seek out this kind of information. This magnifies the role of journalists, security commentators, and other opinion-makers whose recommendations users often rely on instead.

As a result, alarmed denunciations of tools that do not offer perfect privacy may only serve to scare users away from any encryption at all, given that many users already believe encryption is either too much work or unnecessary for their personal communications. Instead, making clear both the marginal benefit and the risk can support better decision making. This also underscores the critical importance of making risks explicit up front, in plain non-technical language; users who are misled into a false sense of security may misjudge tradeoffs to their detriment.

We do, however, advise some caution. Although most participants understood the encryption models and their security properties at a high level, there were some smaller misunderstandings that impacted their ability to make informed decisions. Despite years of effort from the security community, effectively communicating these subtleties remains difficult; however, we believe our findings demonstrate the benefits of continuing to try. Continued education, discussions in the media, and more frequent engagement with encryption tools in daily life may all assist this effort. Our own educational materials were improved through early pilot testing but not rigorously developed into an ideal or standard format; there is room to develop better materials for those users who are interested in learning more about encryption.

As end-to-end encryption is increasingly widely deployed, designers and companies must make choices about which models to adopt. We believe our results can provide some additional context for making these decisions, relative to the targeted use cases and user population. Further work in this area—for example, testing how a completely transparent registration model affects decision making and perception of security, examining an auditing model in greater detail and with reference to specific trusted auditors and notification lags, and comparing different approaches to framing security properties for non-experts—can provide further insight into how to optimize these choices.

7. ACKNOWLEDGMENTS

The authors wish to thank Elaine Shi and Christopher Soghoian for discussions that helped lead to this work; Nikos Kofinas and Yupeng Zhang for contributing to an early version of this study; members of the University of Maryland HCI Lab for their helpful feedback; and Cody Buntain for suggesting the title.

8. REFERENCES

[1] Apple. iOS security guide, iOS 9.0 or later. https://www.apple.com/business/docs/iOS_Security_Guide.pdf, Sept. 2015. (Last accessed on 05/16/2016).

[2] Apple. The most personal technology must also be the most private. https://www.apple.com/privacy/approach-to-privacy/, Mar. 2016. (Last accessed on 05/16/2016).

[3] J. Ball. GCHQ views data without a warrant, government admits. The Guardian, Oct. 2014. http://www.theguardian.com/uk-news/2014/oct/29/gchq-nsa-data-surveillance (Last accessed on 05/16/2016).

[4] D. Bisson. A timeline of the Apple-FBI iPhone controversy. The State of Security, Mar. 2016. http://www.tripwire.com/state-of-security/government/a-timeline-of-the-apple-fbi-iphone-controversy/ (Last accessed on 05/16/2016).

[5] O. Bowcott. Facebook case may force European firms to change data storage practices. The Guardian, Sept. 2015. http://www.theguardian.com/us-news/2015/sep/23/us-intelligence-services-surveillance-privacy (Last accessed on 05/16/2016).

[6] K. P. Burnham and D. R. Anderson. Multimodel inference: Understanding AIC and BIC in model selection. Sociological Methods & Research, 33(2):261–304, Nov. 2004.

[7] L. J. Camp, T. Kelley, and P. Rajivan. Instrument for measuring computing and security expertise. Technical Report TR715, Indiana University, Feb. 2015.

[8] C. Cattiaux and gg. iMessage privacy. http://blog.quarkslab.com/imessage-privacy.html, Oct. 2013. (Last accessed on 05/16/2016).

[9] C. A. Ciocchetti. The eavesdropping employer: A twenty-first century framework for employee monitoring. American Business Law Journal, 48(2):285–369, 2011.

[10] K. Conger. Google engineer says he’ll push for default end-to-end encryption in Allo, May 2016. http://techcrunch.com/2016/05/19/google-engineer-says-hell-push-for-default-end-to-end-encryption-in-allo/ (Last accessed on 06/02/2016).

[11] W. Diffie and M. E. Hellman. New directions in cryptography. IEEE Transactions on Information Theory, 22(6):644–654, Nov. 1976.

[12] K. Elliott and T. Rupar. Six months of revelations on NSA. Washington Post, June 2013. http://www.washingtonpost.com/wp-srv/special/national/nsa-timeline/m/ (Last accessed on 05/16/2016).

[13] S. Fahl, M. Harbach, T. Muders, and M. Smith. Confidentiality as a Service – usable security for the cloud. In Trust, Security and Privacy in Computing and Communications (TrustCom), 2012 IEEE 11th International Conference on, pages 153–162, June 2012.

[14] S. Fahl, M. Harbach, T. Muders, M. Smith, and U. Sander. Helping Johnny 2.0 to encrypt his Facebook conversations. In Proceedings of the Eighth Symposium on Usable Privacy and Security, SOUPS ’12, pages 11:1–11:17. ACM, 2012.

[15] Electronic Frontier Foundation. Secure messaging scorecard, 2016. https://www.eff.org/secure-messaging-scorecard (Last accessed on 05/16/2016).

[16] S. L. Garfinkel and R. C. Miller. Johnny 2: a user test of key continuity management with S/MIME and Outlook Express. In Proceedings of the 2005 Symposium on Usable Privacy and Security, SOUPS ’05, pages 13–24. ACM, 2005.

[17] C. Garman, M. Green, G. Kaptchuk, I. Miers, and M. Rushanan. Dancing on the lip of the volcano: Chosen ciphertext attacks on Apple iMessage, 2016.

[18] S. Gaw, E. W. Felten, and P. Fernandez-Kelly. Secrecy, flagging, and paranoia: Adoption criteria in encrypted email. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’06, pages 591–600, New York, NY, USA, 2006. ACM.

[19] Google. Google End-To-End wiki. https://github.com/google/end-to-end/wiki, Dec. 2014. (Last accessed on 05/16/2016).

[20] Google. Email encryption in transit. Transparency report, Mar. 2016. https://www.google.com/transparencyreport/saferemail/?hl=en (Last accessed on 05/16/2016).

[21] B. Greenwood. The legality of eavesdropping in the workplace. Chron, Dec. 2012. http://work.chron.com/legality-eavesdropping-workplace-15267.html.

[22] P. Gutmann. Why isn’t the Internet secure yet, dammit. In AusCERT Asia Pacific Information Technology Security Conference 2004; Computer Security: Are we there yet?, May 2004.

[23] D. Hedeker. Mixed models for longitudinal ordinal and nominal outcomes, 2012. http://www.uic.edu/classes/bstt/bstt513/OrdNomLS.pdf (Last accessed on 05/16/2016).

[24] C. Johnston. NSA accused of intercepting emails sent by mobile phone firm employees. The Guardian, Dec. 2014. http://www.theguardian.com/us-news/2014/dec/04/nsa-accused-intercepting-emails-mobile-phone-employees (Last accessed on 05/16/2016).

[25] G. Kumparak. Apple explains exactly how secure iMessage really is. TechCrunch, Feb. 2014. http://techcrunch.com/2014/02/27/apple-explains-exactly-how-secure-imessage-really-is/ (Last accessed on 05/16/2016).

[26] B. Laurie, A. Langley, and E. Kasper. Certificate transparency. RFC 6962, RFC Editor, June 2013.

[27] N. Lomas. WhatsApp completes end-to-end encryption rollout. http://techcrunch.com/2016/04/05/whatsapp-completes-end-to-end-encryption-rollout/, Apr. 2016.

[28] S. E. McGregor, P. Charters, T. Holliday, and F. Roesner. Investigating the computer security practices and needs of journalists. In 24th USENIX Security Symposium (USENIX Security 15), pages 399–414. USENIX Association, 2015.

[29] M. S. Melara, A. Blankstein, J. Bonneau, E. W. Felten, and M. J. Freedman. CONIKS: Bringing key transparency to end users. In 24th USENIX Security Symposium (USENIX Security 15), pages 383–398. USENIX Association, Aug. 2015.

[30] I. Paul. Yahoo Mail to support end-to-end PGP encryption by 2015. PCWorld, Aug. 2015. http://www.pcworld.com/article/2462852/yahoo-mail-to-support-end-to-end-pgp-encryption-by-2015.html (Last accessed on 05/16/2016).

[31] B. Ramsdell. S/MIME version 3 message specification. RFC 2633, RFC Editor, June 1999.

[32] S. Ruoti, J. Anderson, S. Heidbrink, M. O’Neill, E. Vaziripour, J. Wu, D. Zappala, and K. Seamons. “We’re on the same page”: A usability study of secure email using pairs of novice users. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’16, pages 4298–4308, New York, NY, USA, 2016. ACM.

[33] S. Ruoti, N. Kim, B. Burgon, T. van der Horst, and K. Seamons. Confused Johnny: when automatic encryption leads to confusion and mistakes. In Proceedings of the Ninth Symposium on Usable Privacy and Security, SOUPS ’13, pages 5:1–5:12. ACM, July 2013.

[34] M. D. Ryan. Enhanced certificate transparency and end-to-end encrypted mail. In 21st Annual Network and Distributed System Security Symposium, NDSS ’14, 2014.

[35] S. Sheng, L. Broderick, C. A. Koranda, and J. J. Hyland. Why Johnny still can’t encrypt: evaluating the usability of email encryption software. In Proceedings of the Second Symposium on Usable Privacy and Security, SOUPS ’06, 2006.

[36] D. J. Solove. ‘I’ve got nothing to hide’ and other misunderstandings of privacy. San Diego Law Review, 44:745, 2007.

[37] S. Somogyi. Making end-to-end encryption easier to use. Google online security blog, June 2014. http://googleonlinesecurity.blogspot.com/2014/06/making-end-to-end-encryption-easier-to.html (Last accessed on 05/16/2016).

[38] A. L. Strauss and J. M. Corbin. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. Sage Publications, Inc, Thousand Oaks, CA, USA, 1998.

[39] M. Sweikata, G. Watson, C. Frank, C. Christensen, and Y. Hu. The usability of end user cryptographic products. In 2009 Information Security Curriculum Development Conference, InfoSecCD ’09, pages 55–59. ACM, 2009.

[40] W. Tong, S. Gold, S. Gichohi, M. Roman, and J. Frankle. Why King George III can encrypt. http://randomwalker.info/teaching/spring-2014-privacy-technologies/king-george-iii-encrypt.pdf, 2014.

[41] M. Viswanathan. Measurement Error and Research Design. Sage Publications, 2005.

[42] N. Weaver. iPhones, the FBI, and Going Dark. Lawfare, Aug. 2015. https://www.lawfareblog.com/iphones-fbi-and-going-dark (Last accessed on 05/16/2016).

[43] A. Whitten and J. D. Tygar. Why Johnny can’t encrypt: A usability evaluation of PGP 5.0. In Proceedings of the 8th Conference on USENIX Security Symposium, SSYM ’99, 1999.

[44] P. Zimmermann. PGP version 2.6.2 user’s guide. ftp://ftp.pgpi.org/pub/pgp/2.x/doc/pgpdoc1.txt, Oct. 1994.

APPENDIX

This appendix contains the full survey and instructional instrument used in our research.

• Section A introduces the task and role-play to the participant.

• Section B contains the introduction and explanation of the Exchange Model.

• Section C contains the introduction and explanation of the Registration Model.

• Section D contains the post-task survey instrument.

• Section E contains the demographic questionnaire.

A. OVERALL INTRODUCTION

Welcome to our experiment. Today you will use two systems. These systems are developed to encrypt your emails so that your emails can be protected from being read by email providers (such as Google and Yahoo!), governments (e.g., the NSA), as well as malicious attackers.

In this experiment, pretend you are Henry, and you want to send and receive encrypted emails with some people. Below are the email addresses you may use in this experiment.

• Henry: [email protected]

• Henry2: [email protected]

• Alice: [email protected]

• Bob: [email protected]

• Carl: [email protected]

B. EXCHANGE MODEL

Below is how the Lock Exchange System works.

1. Every user can get a public lock and a private key.

2. Users have to exchange their public locks in some way.

3. You can send encrypted emails with others’ public locks, so that others can read the emails with their private keys.

4. Similarly, you can read any encrypted emails that were encrypted to you, using your private key.
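For readers who prefer code to prose, the four steps above can be sketched as a toy simulation. This is purely illustrative: the string reversal stands in for real public-key encryption, and the `User` class and its fields are our own hypothetical constructs, not part of the actual browser extension.

```python
import secrets

class User:
    """Toy participant in the Lock Exchange System (illustration only;
    a real tool would use PGP-style public-key cryptography)."""
    def __init__(self, name):
        self.name = name
        # Step 1: every user gets a public lock / private key pair.
        self.public_lock = secrets.token_hex(8)
        self.private_key = self.public_lock  # toy pairing: matching tags
        self.known_locks = {}

    def import_lock(self, name, lock):
        # Step 2: locks must be exchanged manually, out of band.
        self.known_locks[name] = lock

    def encrypt_to(self, recipient, message):
        # Step 3: encrypt with the *recipient's* public lock.
        return {"lock": self.known_locks[recipient],
                "body": message[::-1]}  # reversal stands in for encryption

    def decrypt(self, envelope):
        # Step 4: only the matching private key opens the envelope.
        if envelope["lock"] != self.private_key:
            raise ValueError("not encrypted to me")
        return envelope["body"][::-1]

henry, alice = User("Henry"), User("Alice")
henry.import_lock("Alice", alice.public_lock)  # the manual exchange step
envelope = henry.encrypt_to("Alice", "What is your favorite color")
plain = alice.decrypt(envelope)
print(plain)  # -> What is your favorite color
```

Note that once the envelope is locked for Alice, even Henry cannot open it; only Alice’s private key does.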

Task Instructions:




• Click the extension in the upper right corner of the toolbar in Chrome.

• Click “Options” for configuration.

1. Generate a public lock/private key pair.
Go to “Generate Lock and Key” to generate a public lock/private key pair. Note: The password is only for this study, and is NOT your email password. DON’T use the real passwords associated with any of your accounts in real life.

2. Exchange Public Locks with Alice.

(a) Go to “Display Lock/Key Pair” and click the lock/key pair you just generated. Then export your public lock to Alice.
The public lock will start with “-----BEGIN PGP PUBLIC LOCK BLOCK-----” and end with “-----END PGP PUBLIC LOCK BLOCK-----” (Note: there are FIVE “-” at the beginning and at the end).
You can send your public lock by one, or a combination, of the ways that we provide you.

(b) Then you will receive Alice’s public lock.

(c) Import Alice’s public lock into the extension.

3. Send an encrypted email to Alice.
In the email interface, first click the encryption icon to write “What is your favorite color” to Alice. If the icon doesn’t show up, please refresh the website.
Note: you need to encrypt for Alice.

4. Decrypt the received email from Alice.
Move your mouse to the email body. When a lock icon appears, click on the icon. You need your password (created in step 1) to decrypt the email.
Next you will send encrypted email to two recipients, Bob and Carl.

5. Exchange public locks with Bob and Carl.
You can use the same way, or a different way from those provided in step 2, to exchange public locks with Bob and Carl.

6. Send encrypted email to Bob and Carl.
Imagine that you are a financial secretary in your department, and you want to send the payroll reports to Bob and Carl by encrypted email. For simplicity, you can simply write “Here is your biweekly payroll summary: Salary is $888.88, Tax is $88.88. Your subtotal: $800.00.” in the email body. You can refer to previous steps to send encrypted email.

7. Decrypt the received email from Bob and Carl.

8. Imagine that you are still the financial secretary in your department, and you will send the payroll reports to 10 people by encrypted email. What will you do? Please specify the steps.

9. Misconfiguration

(a) If you accidentally delete or lose Alice’s public lock, what will you do if you want to send/receive encrypted email to/from Alice?

(b) If you accidentally delete or lose your own private key, what will you do if you want to send/receive encrypted email to/from other recipients?

(c) If you accidentally publicize your own private key, what will you do if you want to send/receive encrypted email to/from other recipients?

Possible Threats for Lock Exchange System:

These systems are developed to encrypt your emails so that your emails can be protected from being read by email providers (such as Google and Yahoo!), governments (e.g., the NSA), as well as malicious attackers.

The threat may happen when you exchange public locks with others. When you try to get the public lock from Dave, Mallet (who can be any type of attacker from above) secretly switches the public lock to his own. You think you got Dave’s public lock, but in fact you got Mallet’s.

Then when you send encrypted email to Dave, you actually use Mallet’s public lock. As a result, Mallet can read your email. Mallet will consequently use Dave’s public lock and send the encrypted email to Dave, so that neither you nor Dave realizes the email has been read.

This threat does not happen often, because it requires Mallet to have substantial power and resources.
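The lock-substitution threat described above can be made concrete with a short, self-contained sketch. As before, this is illustrative only: the reversal transform and the function names are our own stand-ins, not real cryptography.

```python
import secrets

def new_pair():
    # Toy lock/key pair: a matching random tag (no real cryptography).
    tag = secrets.token_hex(8)
    return {"lock": tag, "key": tag}

def encrypt(lock, message):
    return {"lock": lock, "body": message[::-1]}  # stand-in for encryption

def decrypt(pair, envelope):
    # Opens only if the envelope was locked with this pair's lock.
    return envelope["body"][::-1] if envelope["lock"] == pair["key"] else None

dave, mallet = new_pair(), new_pair()

# Henry asks for Dave's lock, but Mallet secretly switches in his own.
lock_henry_got = mallet["lock"]

envelope = encrypt(lock_henry_got, "Here is your biweekly payroll summary")
mallet_read = decrypt(mallet, envelope)   # Mallet can read the email
dave_read = decrypt(dave, envelope)       # Dave cannot (None)

# Mallet re-encrypts with Dave's real lock, so neither side notices.
forwarded = encrypt(dave["lock"], mallet_read)
dave_finally_read = decrypt(dave, forwarded)
```

The last two lines are why the attack is invisible: Dave still receives a readable message, so neither side sees anything wrong.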

Please give your feedback about Lock Exchange System:




Note: We are evaluating these systems. We are not testing you. These systems are not developed by us. Please leave your feedback as honestly as you can. Your honest feedback, positive or negative, will help with our research.

For the first two questions, please note the difference between difficulty and cumbersomeness. Difficult tasks are intellectually challenging and need effort or skills to accomplish. Cumbersome tasks are tedious and need an unnecessarily long time to accomplish.

Please rate your agreement with the following statements.

1. The following tasks were difficult.

(a) Generate the public lock and private key pair

(b) Exchange public lock with Alice

(c) Send encrypted email to Alice

(d) Decrypt email from Alice

(e) Exchange public locks with Bob and Carl

(f) Send encrypted email to Bob and Carl

(g) Decrypt email from Bob and Carl

(h) Send and receive encrypted emails to 10 people

2. The following tasks were cumbersome.

(a) Generate the public lock and private key pair

(b) Exchange public locks with Alice

(c) Send encrypted email to Alice

(d) Decrypt email from Alice

(e) Exchange public locks with Bob and Carl

(f) Send encrypted email to Bob and Carl

(g) Decrypt email from Bob and Carl

(h) Send and receive encrypted emails to 10 people

3. This system effectively protected my privacy.

C. REGISTRATION MODEL

Instruction: Below is how the Registration System works.

1. Every user gets a public lock and a private key when they register.

2. Every user’s public lock will be automatically stored in a cloud database that is run by the email provider.

3. You can send encrypted emails with others’ public locks, so that others can read the emails with their private keys. The cloud database will return others’ public locks for you.

4. Similarly, you can read any encrypted emails that were encrypted to you, using your private key.
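The registration flow above differs from an exchange model only in who supplies the lock: a provider-run directory instead of a manual exchange. A minimal sketch follows (placeholder transform and hypothetical names; not the actual extension):

```python
import secrets

directory = {}  # the cloud database run by the email provider

def register(address):
    # Steps 1-2: registration creates a pair and auto-publishes the lock.
    tag = secrets.token_hex(8)
    directory[address] = tag          # public lock goes to the directory
    return {"lock": tag, "key": tag}  # toy pairing: matching tags

def send_encrypted(to_address, message):
    # Step 3: the directory, not the user, supplies the recipient's lock.
    return {"lock": directory[to_address],
            "body": message[::-1]}    # reversal stands in for encryption

def decrypt(pair, envelope):
    # Step 4: only the matching private key opens the email.
    return envelope["body"][::-1] if envelope["lock"] == pair["key"] else None

alice = register("alice.encrytest@gmail.com")
envelope = send_encrypted("alice.encrytest@gmail.com",
                          "What is your favorite color")
plain = decrypt(alice, envelope)
```

Both the convenience (no manual exchange) and the threat (you must trust whatever the directory returns) fall out of the single `directory[to_address]` lookup.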

Task Instructions:

• Click the extension in the upper right corner of the toolbar in Chrome.

• Click “Options” for configuration.

1. Register.
Go to “Register” to register your email account with the email provider server. The registration will give you a public lock and a private key.

2. Send an encrypted email to Alice.
In the email interface, first click the encryption icon to write “What is your favorite color” to Alice. If the icon doesn’t show up, please refresh the website.
Note: you need to encrypt for Alice.

3. Decrypt the received email from Alice.
You need your password (created in step 1) to decrypt the email.
Next you will send encrypted email to two recipients, Bob and Carl.




4. Send encrypted email to Bob and Carl.
Imagine that you are a financial secretary in your department, and you want to send the payroll reports to Bob and Carl by encrypted email. For simplicity, you can simply write “Here is your biweekly payroll summary: Salary is $888.88, Tax is $88.88. Your subtotal: $800.00.” in the email body. You can refer to previous steps to send encrypted email.

5. Decrypt the received email from Bob and Carl

6. Imagine that you are still the financial secretary in your department, and you will send the payroll reports to 10 people by encrypted email. What will you do? Please specify the steps.

7. Misconfiguration

(a) If you accidentally delete or lose your own private key, what will you do if you want to send/receive encrypted email to/from other recipients?

(b) If you accidentally publicize your own private key, what will you do if you want to send/receive encrypted email to/from other recipients?

Possible Threats for Registration System:

These systems are developed to encrypt your emails so that your emails can be protected from being read by email providers (such as Google and Yahoo!), governments (e.g., the NSA), as well as malicious attackers.

There are two prototypes for the Registration System. For the first prototype (Model 1), the possible threats are as follows.

The threat may happen when you send encrypted emails to others. For example, when you try to send encrypted emails to Dave, you think the email provider database will return Dave’s public lock to you. But in fact it returns Mallet’s, so that Mallet can read your email. Therefore, you need to trust the email provider in this system.

In the second prototype (Model 2), there is a third-party service (not the email provider) as an intermediary. In this prototype, neither the third-party service nor your email provider can read your email themselves. However, if your email provider and the third-party service collaborate, they can both read your email. Therefore, you need to trust that the two services are not collaborating.
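One simple way to realize Model 2's split trust (our illustration, not necessarily the prototype's actual mechanism) is to XOR-split a secret between the two services: either share alone is indistinguishable from random, but the two together reconstruct the secret.

```python
import secrets

def split_secret(secret: bytes):
    """XOR secret sharing: neither share alone reveals the secret,
    but the two shares together reconstruct it exactly."""
    share_provider = secrets.token_bytes(len(secret))
    share_third_party = bytes(a ^ b for a, b in zip(secret, share_provider))
    return share_provider, share_third_party

key = b"henry-private-key"  # hypothetical secret to protect
s1, s2 = split_secret(key)

# Either service alone holds only an opaque share...
assert s1 != key and s2 != key
# ...but if the two services collaborate, the secret is exposed.
recombined = bytes(a ^ b for a, b in zip(s1, s2))
assert recombined == key
```

This is the sense in which Model 2 asks you to trust only that the two services are not collaborating, rather than trusting either one individually.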

Please give your feedback about Registration System:

Note: We are evaluating these systems. We are not testing you. These systems are not developed by us. Please leave your feedback as honestly as you can. Your honest feedback, positive or negative, will help with our research.

For the first two questions, please note the difference between difficulty and cumbersomeness. Difficult tasks are intellectually challenging and need some effort or skills to accomplish. Cumbersome tasks are tedious and need an unnecessarily long time to accomplish.

Rate your agreement with the following statements.

1. The following tasks were difficult.

(a) Register
(b) Send encrypted email to Alice
(c) Decrypt email from Alice
(d) Send encrypted email to Bob and Carl
(e) Decrypt email from Bob and Carl
(f) Send and receive encrypted emails to 10 people

2. The following tasks were cumbersome.

(a) Register
(b) Send encrypted email to Alice
(c) Decrypt email from Alice
(d) Send encrypted email to Bob and Carl
(e) Decrypt email from Bob and Carl
(f) Send and receive encrypted emails to 10 people

3. This system effectively protected my privacy.

D. OVERALL FEEDBACK

Please give your overall feedback about these two systems:

Note: Again, please give your honest feedback to help with our research.

1. Please rate your willingness to use these two systems in the future.

(a) I would like to use Lock Exchange System.
(b) I would like to use Registration System with Model 1.
(c) I would like to use Registration System with Model 2.

2. Below please rate your willingness to recommend these systems to others.

(a) I would like to recommend Lock Exchange System to others.

(b) I would like to recommend Registration System with Model 1 to others.

(c) I would like to recommend Registration System with Model 2 to others.

3. Please rate your agreement with the following statements.

(a) I think that I would need the support of a technical person to be able to use Lock Exchange System.

(b) I think that I would need the support of a technical person to be able to use Registration System.


(c) I would imagine that most people would learn to use Lock Exchange System very quickly.

(d) I would imagine that most people would learn to use Registration System very quickly.

(e) I would need to learn a lot of things before I could get going with Lock Exchange System.

(f) I would need to learn a lot of things before I could get going with Registration System.

4. What do you like or dislike for each system? Why?

In Model 3, the email provider will still store all users’ public locks, just like Model 1. But there are other parties (auditors) who audit the email provider, to ensure that the email provider is giving out correct public locks. These auditors may include other email providers, public interest groups, and software on your devices. If the email provider gives you a public lock that doesn’t belong to the recipient, or gives someone else the wrong public lock for you, these parties will notify you. You (or someone else) may use the wrong lock temporarily (for an hour or a day) before you are notified.

In this model, you don’t need to trust any email provider, but you need to trust the auditors and/or the software on your device. Because there are several auditors, even if one auditor does not alert you, another one probably will.
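The Model 3 auditing idea can also be sketched in toy form. Again, this is illustrative only and not the study’s implementation; the auditor count, record format, and function names are assumptions:

```python
# Toy sketch of Model 3: several independent auditors each keep their
# own record of users' public locks. When the provider serves a lock
# that disagrees with an auditor's record, that auditor raises an alert.
# (Illustrative only; real designs use verifiable logs, not plain dicts.)

expected = {"dave": "dave-lock"}

def make_auditor(records):
    def audit(user, served_lock):
        # Alert when the served lock disagrees with this auditor's record.
        return records.get(user) != served_lock
    return audit

# Three independent auditors, each with its own copy of the records.
auditors = [make_auditor(dict(expected)) for _ in range(3)]

def anyone_alerts(user, served_lock):
    # Even if one auditor misses a substitution, another probably won't.
    return any(audit(user, served_lock) for audit in auditors)

print(anyone_alerts("dave", "dave-lock"))    # False: correct lock
print(anyone_alerts("dave", "mallet-lock"))  # True: substitution detected
```

This mirrors the description above: the provider is no longer trusted, but detection is after the fact, which is why a wrong lock may be used “for an hour or a day” before an alert arrives.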

E. DEMOGRAPHICS

1. Which of the following best describes your current occupation?

(a) Healthcare Practitioners and Technical Occupations
(b) Office and Administrative Support Occupations
(c) Production Occupations
(d) Farming, Fishing, and Forestry Occupations
(e) Computer and Mathematical Occupations
(f) Community and Social Service Occupations
(g) Life, Physical, and Social Science Occupations
(h) Management Occupations
(i) Legal Occupations
(j) Installation, Maintenance, and Repair Occupations
(k) Food Preparation and Serving Related Occupations
(l) Architecture and Engineering Occupations

(m) Arts, Design, Entertainment, Sports, and Media Occupations

(n) Building and Grounds Cleaning and Maintenance Occupations

(o) Healthcare Support Occupations
(p) Construction and Extraction Occupations
(q) Education, Training, and Library Occupations
(r) Protective Service Occupations
(s) Sales and Related Occupations
(t) Business and Financial Operations Occupations
(u) Transportation and Materials Moving Occupations
(v) Other (please specify)

2. Where did you grow up (primarily)?

(a) United States
(b) Other North America
(c) South or Central America
(d) Western Europe

(e) Eastern Europe
(f) Africa
(g) South Asia (India, Bangladesh, Pakistan, etc.)
(h) East Asia (China, Japan, Korea, etc.)
(i) Central Asia
(j) The Middle East
(k) Australia / Oceania
(l) Other: [please specify]

(m) I prefer not to answer

3. What is your age?

(a) 18-20
(b) 21-24
(c) 25-34
(d) 35-44
(e) 45-54
(f) Above 54
(g) I prefer not to answer

4. What is your gender?

(a) Male
(b) Female
(c) I prefer not to answer

5. Please tell us whether you have the following experiences (yes or no).

(a) I have attended a computer security conference in the past year.

(b) I have taken or taught a course in computer security before.

(c) Computer security is one of my primary job responsibilities.

(d) I have used SSH before.
(e) I have configured a firewall before.
(f) I have a degree in an IT-related field (e.g., information technology, computer science, electrical engineering, etc.).
(g) I have an up-to-date virus scanner on my computer.
