

Security Day-to-Day: User Strategies for Managing Security as an Everyday, Practical Problem

Paul Dourish*, Rebecca E. Grinter+, Brinda Dalal+,

Jessica Delgado de la Flor* and Melissa Joseph*

*Institute for Software Research

School of Information and Computer Science

University of California, Irvine

Irvine, CA 92697-3425, USA

[email protected]

+Palo Alto Research Center (PARC)

3333 Coyote Hill Road

Palo Alto, CA 94304, USA

[email protected]

ISR Technical Report # UCI-ISR-03-5, June 2003

ABSTRACT

Effective security solutions depend not only on the mathematical and technical properties of those solutions, but also on people's ability to understand them and use them as part of their work. As a step towards solving this problem, we have been examining how people experience security as a facet of their daily life, and how they routinely answer the question, "is this system secure enough for what I want to do?" We present a number of findings concerning the scope of security, attitudes towards security, and the social and organizational contexts within which security concerns arise, and point towards emerging technical solutions.


1 Introduction

The security and integrity of computer and digital communication systems has always been a significant concern. The practical application of digital computing to real-world tasks in military, commercial and even academic settings has always required that storage, communication, and computation be protected by reliable security mechanisms. However, in recent years, the nature of the security problem has changed, due to the emergence and proliferation of online banking, electronic commerce, widespread electronic communication, and other activities associated with the rapid expansion of the Internet into daily life. Security is no longer merely a concern for system administrators, nor is it restricted to the relatively small, technically adept community of academics and scientists who, thirty years ago, had access to electronic communication systems. The pervasive spread of the Internet, and its decentralized nature, means that the security and integrity of computer and communication systems is now a practical, day-to-day problem for everyday, casual users of computer systems. Anyone who sends an email message, pays a bill online, or purchases a book from amazon.com must, on some level, be concerned with the security of the infrastructure that they use in carrying out those activities.

We believe that this transition implies a radical change in the way that security should be understood and studied. Specifically, we distinguish between two aspects of security as they arise in different research communities. First, we use the term "theoretical security" to describe the range of concerns that have traditionally been topics of investigation for the computer and communication security community. Theoretical security is the study of the mathematical basis of forms of protection and attack – the nature of cryptographic protocols, the problems of key exchange, the potential "leakage" of information via secondary channels, etc. Second, we use the term "effective security" to refer to the degree of security that is practically achievable in real settings; in other words, how much and what kind of security people can actually use and understand. Theoretical security sets an upper bound for effective security, but, for a host of practical reasons, effective security typically lags theoretical security, often considerably. Consider two examples:

• Turning it off. The Placeless Documents project (Dourish et al., 2000) created an environment for associating active code with operations in a document repository, code that would travel with the documents. Clearly this required a strong security mechanism. An informal scheme had been initially developed, but was replaced by a more comprehensive, academically rigorous system. However, in daily use, this system proved too rigorous – not only did it interfere with development activities, but the performance degradation that it imposed was judged too onerous. Most people, therefore, ran the system without the security features turned on – rendering it, effectively, less secure than it had been when the simple, ad hoc scheme was in place.

• One-time Pad. At one point, a research lab secured its firewall using a password mechanism based on one-time pads. Local software was provided to generate new one-time pads, protected by a password. The effectiveness of this solution hinged on the integrity of this password, and users were urged never to run the program and enter the password on anything other than their local computer, inside the firewall, to ensure that the password would never be broadcast over an insecure channel. However, this proved troublesome, since many users had difficulty interpreting what might be meant by "their local computer." Users of older workstations reconfigured as X terminals, for example, naturally thought of the computer on their desk as "their local computer," despite the fact that it was running no client software; even users of regular workstations would have difficulty distinguishing connections that might traverse an insecure network from those that might not, while, when people connected to their desktop machines from remote locations, the fact that they were making use of a network was sometimes hard to notice. Despite their PhDs in technical topics, many users found the terms of reference too difficult to interpret. (A speculative sketch of how software might check this distinction follows this list.)
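The question that defeated these users – "is this my local computer, or does my typing cross a network?" – is, in part, one that software could check on their behalf. The sketch below is purely speculative and was not part of the lab's system; the heuristics it relies on (SSH environment variables, a remote X11 DISPLAY) are illustrative assumptions and would catch only the common cases described above.

```python
import os
import sys

def session_looks_remote() -> bool:
    """Heuristically decide whether keystrokes in this session may traverse a network.

    These checks are illustrative, not exhaustive: they detect ssh sessions
    and X displays pointing at another host, the situations described above.
    """
    # An ssh session implies at least one network hop for every keystroke.
    if os.environ.get("SSH_CONNECTION") or os.environ.get("SSH_TTY"):
        return True
    # A DISPLAY of the form "otherhost:0" (rather than ":0", "unix:0", or
    # "localhost:0") suggests X traffic to another machine, as with the
    # reconfigured X terminals described above.
    host = os.environ.get("DISPLAY", "").split(":")[0]
    return host not in ("", "localhost", "unix")

if __name__ == "__main__":
    if session_looks_remote():
        sys.exit("Refusing to prompt for the pad password: this session "
                 "appears to involve a network hop.")
    print("Session appears local; prompting for the password here is reasonable.")
```

Even such a check only reduces the interpretive burden; it cannot identify every insecure path, which is exactly the judgment these users were being asked to make unaided.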

It is important and instructive to note that these are not cases involving naïve and non-technical end-users; rather, they both occurred in high-tech environments populated by well-qualified scientific personnel.

One response to the disparity between theoretical security and effective security is to dismiss it as a matter of the implementation of policy. It is ascribed to a lack of training, perhaps, or even to a willful stubbornness on the part of ill-educated users who, at best, don't understand their own best interests or, at worst, driven by petty spite, actively conspire to be revenged upon their employers for the apparent ignominies of daily working life by deliberately flouting rules and procedures. However, we believe that to regard this disparity as purely a matter of procedure is to ignore some important issues – issues that, in fact, become more pressing as the reach of the Internet grows. We believe that it is important to treat the disparity between effective and theoretical security as a research problem.

Our working approach is this. Rather than focusing on the mathematical foundations of theoretical security, we want to examine the practical foundations of effective security. As they use computers and networks, people are continually confronted with the question, "is this system secure enough for what I want to do now?" Whether their task is to file a tax return, submit class grades, send email to a family member, or make a purchase, they must determine whether or not the configuration of technologies available to them meets the needs that they associate with this task. We regard every occasion on which someone enters a password or credit card number – or every occasion on which they decide not to – as an occasion on which this question has been answered (not always conclusively). What is particularly notable about most solutions is that, in their attempts to provide security seamlessly and transparently, they deprive people of the resources they might need to answer this question. Our research goal is to examine just how people come to an answer.

In this paper, we report on a series of investigations into security as an everyday, practical problem, routinely encountered and resolved by computer users as they go about their activities. We have looked at how a range of users, in different organizational settings and with different degrees of technical skill, have addressed this question, and some of the strategies they use to resolve it. We begin by discussing related research. We then introduce the methodological approach that we have taken. Next, we present the results of our studies, and then conclude by considering their implications for the design of applications and infrastructures.

2 Background and Related Work

Security research has long acknowledged the role of human, social, and organizational factors in creating effective solutions.


In some cases, the complexity of making security work is as much a matter of interface design as anything else. Whitten and Tygar (1999) present a usability analysis of PGP 5.0, demonstrating the difficulties that users have in completing experimental tasks (in their user study, only 3 out of 12 test subjects successfully completed a standard set of tasks using PGP to encrypt and decrypt email). The problems that they uncovered were largely problems of interface design, and in particular the poor matching between user needs and the structure of the encryption technology provided to meet these needs. Others such as Yee (2002) or Zurko and Simon (1996) have similarly explored the relationship between interaction design and security, and attempted to draw out a set of general principles concerning the interaction of the two.

In a series of studies, researchers at University College, London have explored some of the interactions between usability and security (Adams, Sasse and Lunt, 1997; Adams and Sasse, 1999). They have focused particularly on user-visible elements of security systems, such as passwords. Although many information systems professionals regard users as being uninterested in the security of their systems (and, indeed, likely to circumvent it by choosing poor passwords, etc.), Adams and Sasse's investigations demonstrate that users are certainly motivated to support the security of the system, but often unable to determine the security implications of their actions. The specific problems that they identify with passwords have also led to interesting design alternatives (Brostoff and Sasse, 2000; Dhamija and Perrig, 2000).

One area at the intersection of usability and security that has received some attention is the role of access control in interactive and collaborative systems. For example, Dewan and Shen (Shen and Dewan, 1992; Dewan and Shen, 1998) have explored the use of access control and meta-access control models as a basis for describing and controlling degrees of information access and management in collaborative systems. This is not simply a technical matter, since the structure and behavior of these "internal" components can have a significant effect on the forms of interactivity and collaboration they can support (Greenberg and Marwood, 1994).

Many collaborative systems involve privacy issues and need to provide users with control over the disclosure of information to the other parties using the system. This has spurred a number of researchers to explore the development of privacy control systems that are tailored to the needs of end users. For instance, Dourish (1993) describes the relationship between three different security mechanisms for similar multimedia communication systems, each of which reflects assumptions and requirements of the different organizations in which they were developed. Bellotti and Sellen (1993) draw on experiences with multimedia and ubiquitous computing environments to identify the source of a number of potential privacy and security problems. Their primary concepts – disembodiment and dissociation – are both visibility problems, related to the disconnection between actors and actions that renders either actors invisible at the site of action, or actions invisible to the actor.

Based on their investigations of privacy problems in online transactions, Ackerman and colleagues propose the idea of privacy critics, semi-autonomous agents that monitor online action and can inform users about potential privacy threats and available countermeasures (Ackerman et al., 1999; Ackerman and Cranor, 1999). Again, this mechanism turns on the ability to render invisible threats visible.

Given our concern with how users manage security as an ongoing concern, one important related topic is control over the degree of security available. One of our criticisms of traditional security systems has been their "all or nothing" approach. However, there has been some work that attempts to characterize degrees of security provision, as embodied by the idea of "quality of security service" (Irvine and Levin, 2001; Spyropoulou et al., 2000). This builds on earlier work establishing a taxonomy of security service levels (Irvine and Levin, 1999). The fundamental insight is that organizations and applications need to trade off different factors against each other, including security of various forms and degrees, in order to make effective use of available resources (Thomsen and Denz, 1997; Henning, 1999). While this work is directed towards resource management rather than user control, it begins to unpack the "security" black box and characterize degrees and qualities of security.

While the findings of these investigations are valuable and instructive, our goal here is somewhat different. We are not concerned with specific security technologies or techniques, such as passwords or quality of service, but rather with how users manage security as a practical, day-to-day problem. In this area, Friedman et al. (2002) have investigated users' perceptions of security in Web-based interactions, and note that even in this restricted domain, problems arise in how people assess the security of settings they encounter. Rimmer et al. (1999) and Sheeran et al. (2001) discuss the mental models that end users develop of network structure and behavior, and illustrate the impacts that these have on system use. Weirich and Sasse (2001) present a preliminary investigation of users' mental models of security; their investigation is similar to ours, although they focus largely on technologically sophisticated users. In addition, our concern here is not with "mental models" per se, but with the practices through which these models operate – practices whose operation may be social rather than cognitive.

3 Methodological Issues

Our approach is broadly ethnographic in nature, based largely on semi-structured interviews that have been subjected to a qualitative analysis, drawing on the grounded theory approach (Glaser and Strauss, 1967). Grounded theory provides a set of procedures for developing analytic accounts of qualitative data, based on the iterative generation, validation, and refinement of coding schemes. A qualitative approach is more appropriate than a quantitative one at this stage, given that our goal is not to provide definite answers to definite questions, but rather to determine what questions we might want to ask in the first place. In particular, the goal of our investigations is not simply to document what users do, but rather to understand their experience of security as they encounter it. It is this concern – to be able to see security as it appears to the users we are studying – that drives our methodological approach. From this perspective, we gain more by understanding the experience of a small number of users in depth and detail than we would from a broader statistical account of the activities of a larger number of people.

The results presented here have been collected from a number of investigations conducted at three sites over the past twelve months or so.

Site A is an academic institution. Our interview subjects here are drawn from two pools – administrative staff members of both an academic department and a research institute, and graduate students in a management program. We were interested in these people because of their range of institutional affiliations and responsibilities. In addition, previous research (Sheehan, 2002) suggests that new graduate students should be an interesting group of people, not least because they are likely to have recently reconsidered their infrastructure arrangements due to relocation. We interviewed a total of eleven participants at Site A.

Site B is an industrial research lab. At Site B we were particularly interested in finding people who, in addition to any end-user security problems, also had security needs that related to their jobs. In particular, we were interested in what additional kinds of security needs these people had because their jobs required certain levels of confidentiality, resulting from institutional (Federal Government) and organizational (the corporate owner of the site) rules and policies. At Site B we conducted nine interviews with various members of staff in jobs that included media relations, human resources, executive administration, and legal.

Site C is a law firm. The motivation for interviewing lawyers came from a belief that they would be highly sensitized to issues of privacy, confidentiality, and trust, and that because of this they would have interesting security practices. We conducted one interview in this firm.

In what follows, we discuss our findings from these sites. We will discuss these findings in three clusters of related topics. The first deals with the "scope" of security, or the range of concerns that manifest themselves when end users think about security issues. The second concerns the range of attitudes that people display towards security problems. Finally, we examine relevant aspects of the social and organizational contexts within which people encounter and solve security problems.

4 The Scope of Security

Although the mathematical and technical foundations of security systems delimit the scope of "security" for the research community, end users see their encounters with security quite differently and set the scope of concerns more broadly.


4.1 Security as a Barrier

One feature of our interview data that immediately stood out was the set of issues that arose under the auspices of security. Our questions were oriented around problems of security and information protection. However, we found that respondents would persistently turn to other issues that, to them, are intimately related to information security. Of these, perhaps the most prevalent is unsolicited email (spam). For our subjects, security and spam are two aspects of the same problem; as practical problems, viruses, network scanners, password sniffers, and unsolicited email form a "natural class," even though they may be technically quite different. What is more, conventional users not only regard these as the same problem, but also think of the same technologies as providing solutions; firewalls, for example, are described both as technologies to keep out unwelcome visitors and as technologies to keep out unwelcome email messages. In other words, people seem both to imagine and to seek unitary solutions to these problems. Drawing on the real-world experiences on which their understanding is based, people think of security as a barrier, akin to a gate or a locked door. Security is, generically, something to "keep things out," and so the various threats – the things that are being kept out – become co-constructed as the common entities against which security protects.

There are three immediate implications of this observation. The first is that a security solution that solves only one problem but not others (e.g. a virus scanner) is likely to be seen as inadequate, and potentially to be rejected for that reason. Conversely, a second observation is that a technology deployed to solve one problem may be mistakenly interpreted as providing protection against the others; for example, one user at Site A talked of being protected from viruses by a new filtering system installed by the network service (although, in fact, this was a spam filter with no virus detection facilities). Third, the focus on barriers or "choke-points" diverts attention from channels, as exemplified, for instance, by users who install advanced firewalls but then run unencrypted 802.11b wireless networks.

4.2 Online and Offline

The relationship between online and offline experience is a complex one, but it is also centrally important. Online conduct seems to be continually shaped by aspects of the offline world. This happens in a number of ways.

One is that a range of experiences in the offline world provide metaphors and analogies by which people understand the online world. The brand identity of large organizations, for example, is a significant factor in how people treat online entities; in particular, people felt more comfortable when dealing with organizations that had a physical presence. Institutional arrangements are perhaps even more important; banks, for instance, are seen as inherently more concerned about security, and therefore inherently more trustworthy.

A second relationship, one of greater import to our subjects, was the potential leakage of information between online and offline settings. While our initial expectation was that people would relate Internet security problems to Internet-based fraud (e.g. forging email, identity theft, unauthorized financial transactions), a much more immediate concern for a significant number of our subjects was the possibility that inadvertent information disclosure online could create a threat offline. Most frequently, these were problems of personal security. Stalkers were an especially commonly reported threat, and much of people's attention to online information disclosure concerned information that might result in direct personal threat. We were struck by the regularity with which this issue arose, especially amongst women.

Online and offline come together in a third way also: in the practical management of space and security. Computers inhabit not just an electronic world that needs to be protected but also a physical world that needs to accommodate them. Arranging the offline world to support practices in the online world was a constant source of practical security for a number of different people.

This relationship was made particularly clear to us by two separate people at Site B. The first person worked with many sensitive hard copy legal documents. A practice of locking away these hard copy documents was enforced. However, at the same time, many of these documents would be required for day-to-day online processing activities. The solution was to use a library cart, which was loaded with all the documents that would be required in the day's work and then wheeled into and out of the locked area each day. The very presence of a cart full of documents (as many as 400 sheets of paper) was an indication of the relationship between online security and offline security.

The second person at Site B worked with employee data, but also received many employees as visitors to her office. She described several relationships between online and offline security. First, she positioned the screen to face her and, critically, away from the first point of entry into her office. If someone showed up to turn in a document or start a conversation, they could not see what was on her screen – potentially (and often) data about an employee.

The office layout more generally was used as a means of protecting information. She had arranged her office (which she alone occupied) into two distinct sections. On entering the office, the first section was where office visitors would sit and talk with her. All the spare seats were in this part of the office, and there was a barrier (in the form of a desk) between the public visiting part of the office and the second – her private – part of the office. The front part was recognizable by the lack of paper of any sort. As she explained, the desk barrier was always kept clear, and the chairs were placed so that they would not make either the offline or online information visible to visitors.

By contrast, the back part of the office was covered in papers, her outstanding work assignments. With no spare chair in the back, along with the difficulty of walking around the desk, she had created a private part of the office. Although, in theory, someone who wanted to access the back of the office could, the social conventions of seating arrangements and barriers were used to regulate visitors' access to certain parts of the office and, consequently, certain types of data. Of course, both of these strategies were backed up by the presence of a locked file cabinet where files that were not currently in use were placed; these were the strategies that she used to protect data while it was being worked on.

The monitor and physical layout were not the only parts of her office that needed this kind of protection. Her online work typically involved processing and referring to offline documents. For example, processing data about an employee's immigration status requires not just working on online documents but referral to physical copies of Immigration and Naturalization Service materials that need to be kept at the side of the monitor for easy reach. At the same time, however, she could not have those documents seen by other employees who might potentially come into her office (since the immigration status of an employee is legally protected). Instead, she had devised a complex system of folders that allowed her to store documents away from the gaze of others and without revealing their contents by having to label them.

More than simply hiding the documents away, the folders were also used to sort the work out without making the sorting criteria visible to others through the use of written words. She used different colored folders for processing different types of information. For example, there were two colors associated with immigration (one for full-time employees, one for interns – which require different kinds of processing work), and other colors for other types of confidential work. The colored folders serve two purposes simultaneously. First, they protect information from other people's sight, and second, they allow her to immediately understand what outstanding activities she has to work on without compromising the integrity of the data contained within the folders.

A fourth and final aspect of the relationship between online and offline aspects of security is particularly marked when people must deal with the physical manifestation of their networking service – the cables, routers, modems and other pieces of equipment through which their connection is maintained. Wireless networking technologies are perhaps especially interesting here due to their combination of tangible and intangible elements. Wireless networks offer people the same infrastructure interfaces that they conventionally associate with wired or point-to-point networks, which conventionally carry with them obvious properties of physical security and accessibility. At the same time, however, they make the actual infrastructure of service provision intangible. One user we encountered spent some time trying to diagnose an apparent problem with a networked printer that refused to accept print jobs, before noticing that, in fact, he was connected through his neighbor's access point rather than his own (and so was connecting from an unauthorized IP network). Another informant had resorted to wrapping his access point in aluminum foil in order to deal with apparent interference problems, which turned out, in fact, to be intermittent DSL failures. The very intangibility of network infrastructure in these cases makes it harder for people to relate online experiences to offline manifestations of technology.

In other words, what we see from our observations is that, for everyday users, security is not purely an online matter; it extends into the physical world. The information which is to be protected, the resources to be managed, and the work to be carried out all exist in a physical setting too. Security practices may draw as much on the arrangement of physical spaces as on the arrangement of technical resources and, again, providing people with technical solutions that cannot be understood or integrated into what people see as the "whole problem" will reduce their effectiveness.

4.3 Hackers, Stalkers, Spammers and Marketers

Our subjects have varying degrees of familiarity with technology, work in very different settings (physically and organizationally), and perform very different kinds of work. Nonetheless, a similar range of situations is classed as threats, almost coextensively. We found four broad classes of threats that people brought up in discussion: hackers, stalkers, spammers, and marketers.

"Hackers" are individual threats who fit the expected image – people out to cause mischief and harm, generally highly skilled, but motivated by the same randomly violent impulse which leads to vandalism (rather than conducting targeted attacks). Hackers are broadly recognized as a threat, although they are not, in general, seen as a danger, but rather as a nuisance. The very image of hackers as "maladjusted nerds" seems to detract from concerns about identity theft, criminal intent, and the like (not that these concerns are absent, but they are less frequently associated with particular social groups).

"Stalkers" are those who, as described above, might use information gleaned online to pursue an offline threat. For many people, the protection of their online identity and information is an extension of the protection of their person and their property. The possibility of offline consequences of online activity – including but not limited to physical danger to one's own person or to one's friends and family – is a remarkably prevalent concern.

"Spammers" are organizations and individuals that advertise through unsolicited messaging, wasting people's time and using up organizational resources through an implicit denial of service. As we related earlier, unsolicited email is a major concern to people and, critically, they see it as a security problem.

"Marketers" are people who invade individual privacy by surreptitiously collecting information about activities, purchasing patterns, and so forth. We found it interesting that marketers are defined as threats in pretty much the same way as hackers, stalkers and spammers. People reported maintaining false identities, falsifying information, refusing to disclose identifiers, and other strategies by which they explicitly attempted to evade such tracking. To an extent, it seemed that older people were more likely to trust organizations and regard renegade individuals as threats; younger people were more likely to see organizations as potential threats. Age difference effects are noted by Sheehan (2002); we will have more to say about them in a moment.

5 Attitudes Towards Security

Clearly, experiences with security and systems vary between individuals. Our data suggest a range of attitudes that people display towards security.

5.1 Security as an Obstacle

We have already noted differences between younger and older participants in our study. A further interesting separation between our older and younger participants relates to this issue of experience. In general, our younger subjects, with a relatively longer exposure to computer systems (and in particular, it seems, childhood exposure), express a much greater confidence in their abilities with computer systems. In particular, they seem to have been more likely to encounter situations in which security services proved problematic, hindering rather than helping their activities. Getting files through firewalls, for instance, had been problematic for some, who found that they had to turn off or circumvent security technologies in order to get their work done. They were more likely to talk of security in terms of its costs as well as its benefits, and to frame technical security measures as ones that can interfere with the practical accomplishment of work. This is, of course, the "barrier" argument when seen from the other side.

A similar example occurs in a related study of teen use of SMS (text messaging via GSM's Short Message Service) reported elsewhere (Grinter and Eldridge, 2003). The teens studied never intentionally turned off their phones, which meant that they rarely if ever used their password to log back onto the phone after a reboot. This meant that when they accidentally let the battery run out, the teenagers described having to take their mobile phone to the nearest service center to get the password reset before they could resume receiving SMS messages. This delay, in addition to the inconvenience of having to go to the service center via public transportation, also caused considerable frustration when the teenagers realized just how many messages – and potentially activities – they would miss in the few hours or days it would take to get the phone fixed.

In other cases, security appears as an obstacle in other ways. For much day-to-day use, security is not a primary concern for end users; people rarely boot their computer in order to deal with security configurations. The persistence of virus checkers, intrusion detectors, and other similar systems in interrupting current work in order to insist that something be done (new rules installed, ports blocked, or even just a button pressed) seemed to be problematic. This is, perhaps, another case of the difficulty of an "all-or-nothing" approach – security is either something unmentioned, or it is something to be dealt with suddenly and immediately.

5.2 Pragmatism

In broad terms, and in line with the previous observation, the younger respondents seemed more pragmatic about their security needs, expressing more nuance about the situations in which they might need security. For instance, they discussed using known insecure technologies in settings where they felt that the risks were justified (e.g. a machine that was known to be chock full of viruses, but was otherwise unused, so it didn't matter). This pragmatic orientation in younger subjects is in line with previous findings (Sheehan, 2002). Pragmatic users see security as a trade-off, one that must be continually struck as one balances immediate needs against potential dangers. For pragmatic users, then, systems need to be both flexible and translucent, so that these trade-offs can be made effectively.

5.3 Futility

However, even amongst those who expressed more confidence about their abilities and a more pragmatic orientation towards security, there is an overwhelming sense of futility in people's encounters with technology. This corroborates a similar observation made by Weirich and Sasse (2001) in their investigations. Our subjects make repeated reference to the unknown others (hackers, stalkers, etc.) who will always be one step ahead, and whose skill with technologies will mean that there are always new attacks to be diverted. As a result, they talk repeatedly of security lying not so much in technology as in vigilance: the continual, active defense against new and evolving threats.

The results of this sense of futility vary depending on the setting and the forms of threat. With respect to the broad Internet, it certainly contributes to frustration and the sense that one is continually "running to stay in the same place"; it creates a fictive norm of adequate protection, against which people continually find themselves wanting. In organizational settings, it becomes manifest mainly as a concern with "due diligence" – the visible demonstration that one has done enough. As in the cases discussed earlier where security moves out of the computer and into the physical environment, the demonstration that one has taken due care to manage information and activities securely becomes important, even though subjects may not feel that these measures are likely to survive an assault.

6 Social Context

When we focus on security as a practical problem that people encounter and solve, we need also to consider the context within which it is encountered and solved. Frequently, the social and organizational context surrounding user activity is important.

6.1 Delegating Security

Unsurprisingly, most people, in the course of their daily work, have neither the time nor inclination to be continually vigilant for new threats; they are focused on getting their work done. One particularly interesting issue, then, is the various modalities by which people delegate responsibility for security.


Security is, to some extent, turned into someone else's problem, or at least, external resources are marshaled to deal with the problem. Four forms of delegation are identifiable in our interviews.

The first is to delegate to technology, which involves relying on some form of technology for protection. So, people might rely on SSL encryption for data connections, ssh tunneling for their email, or trust that switched Ethernet is more secure than a traditional common medium. These are, of course, the solutions that the technical community is used to providing. Interestingly, though, this was perhaps one of the least common ways of managing security that we encountered. It is also interesting to observe that this delegation is an investment of trust, and we speculate that it depends on the visible presence of the technology to be trusted, which questions the idea of security as an invisible or transparent facet of a system. Arguably, the use of physical arrangements to embody security concerns is a case of delegation to technology (albeit less advanced).

The second mode of delegation is to delegate to another individual, such as a knowledgeable colleague, family member, or roommate. Often, this might be someone who set the computer up in the first place; their knowledge and skill is cited as one element of a person's defense against potential threats. For people who feel limited in their ability to assess technology, known individuals may be more trustworthy.

The third mode is to delegate to an organization; like delegation to an individual, this delegates to others, but the others are organizationally defined and may not even be known personally. Essentially, this is the "we have a very good support group" argument. The skills and especially the vigilance of the organization are where people place their trust. In some cases, again due to the fictive norm associated with vigilance, more trust may be accorded to external organizations; and so, facilities run by a central service rather than by a local group, or facilities managed through an outsourcing arrangement, are seen as more secure.

Finally, we also found a mode in which people would delegate to institutions. So, our earlier example, in which financial institutions are seen as inherently more trustworthy because they are presumed to have a primary concern with security, is an instance of this trust in institutional arrangements and archetypes. Again, this is an online/offline relationship; impressions of the banks' concern with physical security (locked vaults and armed security guards) are carried over to online security, even though, of course, online interactions with a bank depend on a complex of intermediate technologies outside of any bank's control.

There is an important temporal aspect to this process of delegation. Essentially, once responsibility has been delegated, it becomes almost invisible; it seemed to be rare for these issues to be revisited. Individuals to whom responsibility had been delegated when they set up the computer sometimes disappeared from view (the former roommate or colleague, or the son who had left for college), and yet they were still invoked as the guarantors of security. In organizational settings, we found that, over time, some newer employees would not have any recollection of what kinds of access controls, for example, would be on their file systems. Delegation to the support group would have occurred several years prior to their arrival, and they could not articulate what kinds of privileges existed. This is interesting for two reasons. First, the work practices of groups often "grow over" the underlying security while taking advantage of what it provides, until the security decisions are lost to conscious memory. Second, no one concerned themselves with this. Between the initial security decision and the supporting work practices, the day-to-day configuration had disappeared but was still being enacted correctly.

6.2 Socially-Defined Security Needs

Another feature for several of the people that we interviewed was that security was not just defined as the means of securing the information and its transmission with respect to legal codes, but also with respect to how others perceived the relative sensitivity of the information. Decisions about how to handle and manage online (and even offline) data were sometimes made based on how someone else felt that the data should be treated.

For example, one person described how, for certain information requests (where there were no clear institutional guidelines such as Federal laws), she would ask a number of people and then base her decision on the most conservative response. Deciding whether to grant electronic file access to someone, or to email documents, was not just a decision about which technical security parameters to set, but also a matter of determining which expectations to apply to a situation.

6.3 Security as a Practice

The people we interviewed had a number of methods for managing online security, some of which involved using security protocols in what may seem like unusual ways, and others of which appear to involve no security at all, but illustrate how people think about security in technology.

Whitten and Tygar's (1999) analysis of email discovered that users had incredible difficulties using PGP to secure their communications. However, what is clear is that people use email to communicate all the time, even when they have information that needs to be protected. So, we wondered what they did to "secure" the information. We discovered two common strategies.

First, people use institutional means to secure communications. We see this each time we receive an email from someone that has an attached signature file stating the legal and illegal uses of the contents of the message. Rather than securing the communications, the purpose of these statements is to defend the contents should they later become incriminating – the inverse, perhaps, of the idea that it is easier to seek forgiveness than permission; it is easier to enforce through sanction than through prevention. In other words, corporations often attempt to mitigate the risks of information leaks by securing the consequences of those leaks, that is, by marking the messages.

Although it may not be secure technically, it is not surprising that this approach is used. Marking documents has long been a means by which corporations sort and prioritize the contents of presentations, reports and so forth. The securing of email messages appears to coincide with the migration of email from an informal chatting technology to a formal means of corporate communications.

Second, we also found cases where people were using context to secure their email messages. By this we mean that we found cases where people described sending email that did not explicitly state what the subject was in the actual email itself, but used a shared working context to express the new information. For example, an email message that says, "I took the actions you requested" could refer to many types of activity, including processing sensitive data such as updating someone's immigration status. Moreover, by removing any sense of time from the contents (other than the date stamp), no specific temporal information could be deduced.

Crucially, this arose not simply as happenstance; rather, it was an explicit strategy adopted to support secure communication as a part of a broader pattern of work. Using cryptic email rather than encrypted email offered two advantages. First, it was simply a lot easier to do than using a security tool to encrypt the information. By easier, though, we do not just mean usability in Whitten and Tygar's sense of the usability of encryption software; we mean that context-based "encryption" may simply be the more visible form of security measure for many people working with email. The fact that it can be accomplished as part and parcel of the working activities, rather than as a separate and parallel activity, also makes it a more convenient feature of work practice (Smetters and Grinter, 2002).

The visibility of security systems (their presence and utility), let alone their usability, is also illustrated by the use of media switching as a security measure. In our interviews, we found several occasions where people switched communications media based on security decisions. In particular, a number of people at Site B described suggesting in email a switch to the telephone for the most private or sensitive discussions. Several interviewees said that they trusted the telephone for their most secure conversations, and introduced a media switch from email to the telephone when the most sensitive of topics came up.

Perhaps what is most surprising about this ability to rate technological media for security is that the same phenomenon is reported by teenagers (Grinter and Palen, 2002). Teenagers using Instant Messaging technologies also reported suggesting a media switch to the telephone for the most confidential of conversations. The telephone offers two related advantages. First, it is potentially a probabilistically more secure medium than email: although it can be tapped and listened in to, this may be statistically less likely. Second – and the advantage more commonly articulated – the medium is ephemeral, in the sense that nothing that is said is readily recorded and transmittable. Unlike electronic text, as the teenagers observed with clarity, it is much harder to record what was said on the telephone and to convince anyone else of it with absolute certainty. In other words, the confidentiality and privacy of information can more likely be guaranteed in a medium that is not as readily recorded; understanding the difference between electronic conversation and electronic text, the spoken word seemed more secure than the written one.

Earlier, we discussed encryption and the need for secure communications and its relationship to media choice. In that and previous discussions, current security systems seemed almost to fail our users, in the sense that they did not fit work practices. Moreover, they competed ineffectively with other alternatives, such as simply switching media (something that was once available only to office workers is now something that teenagers at home can consider). However, we also found some occasions where security measures had been incorporated into working practices.

One example of this concerns access control to shared directories. In this case, two legal staff explained how they used the access control settings for online directories as a means of communication! Specifically, they both worked on files that, once they had finished with their own notes, needed to be sent to a centralized legal body for further processing. Rather than using email to do this, they used a system of shared online directories. While they worked together and locally on the files, they put them in an online directory that only they could access. When they had finished working on a file, they would simply move it to another directory, one whose access controls were set to allow other people to access it from a remote site. One advantage that the two local legal staff found with this scheme was that they did not have to know specifically who they had to send the files to (unlike email).
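A minimal sketch of this kind of arrangement appears below. The directory names, the use of plain POSIX permission bits, and the mapping of the central legal body onto the "others" category are all assumptions made for illustration; our data do not describe the firm's actual file server or access-control machinery.

```python
import os
import shutil
import stat

# Hypothetical layout: "drafting" is readable only by its owner and group
# (standing in for the two local staff), while "outbox" is also readable by
# others (standing in for the central legal body). Moving a file between
# them is at once the act of protection and the act of communication.
DRAFTING = "casework/drafting"   # local pair only
OUTBOX = "casework/outbox"       # visible to the central legal body

def set_up_directories() -> None:
    """Create the two directories with their contrasting access settings."""
    os.makedirs(DRAFTING, exist_ok=True)
    os.makedirs(OUTBOX, exist_ok=True)
    # drafting: read/write/search for owner and group only
    os.chmod(DRAFTING, stat.S_IRWXU | stat.S_IRWXG)
    # outbox: additionally allow others to list and read its contents
    os.chmod(OUTBOX, stat.S_IRWXU | stat.S_IRWXG | stat.S_IROTH | stat.S_IXOTH)

def hand_off(filename: str) -> str:
    """Finish local work on a file by moving it to where the central body can see it."""
    return shutil.move(os.path.join(DRAFTING, filename),
                       os.path.join(OUTBOX, filename))

if __name__ == "__main__":
    set_up_directories()
    # Work on a file privately, then move it; the move is the notification.
    with open(os.path.join(DRAFTING, "case-notes.txt"), "w") as f:
        f.write("notes for the central legal body\n")
    print("Handed off to:", hand_off("case-notes.txt"))
```

The point is that the move itself is the communication: making the file readable by the central body and signalling that it is ready are one and the same act.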

6.4 Managing Identity

In the interviews we discovered another challenge for security: that of identity. This manifested itself in two ways – the production of identity, and the interpretation of identity.

First, we found that our informants were very conscious of the ways in which they presented themselves online. Many, for instance, maintain multiple virtual identities (email addresses, online personas, etc.) as a way of controlling their visibility. In addition, we found some evidence for the use of partial identities; by controlling which specific aspects of information (social security number, home zip code, etc.) they gave to different entities, some respondents attempted to maintain control over the degree to which they could be identified and tracked (whether they were successful in these efforts is, of course, entirely a different matter). When identity is seen as the sum of these factors, disclosing only a subset seems secure.

The second issue is the interpretation of identity. In many of the solutions proposed, an individual secures him- or herself individually. Personal firewalls, personal encryption, passwords, and so forth all place a primacy on the individual. However, we found a number of cases where seemingly individual people were in fact groups of people, and because of that, security decisions and solutions became more problematic.

This problem was most clearly exhibited by individuals who had personal assistants. In many cases, the email address of an executive does not equate to the person themselves: it refers to them and their personal assistant. When we talked with assistants and with people who emailed executives, we discovered a mismatch in expectations and difficulties with this arrangement.

Although turning an executive's email address into a group distribution list (one that goes to both the executive and their assistant) is a common arrangement, we found cases where people were surprised by this distribution. The metaphor implied by email is that the message goes to the individual identified by the handle that it is sent to. By contrast, group distribution lists are also recognizable because of the meaning conveyed in the handle used. For example, an address built from a person's name is expected to go to that person alone, while an address referring to football could be read as a group list for people interested in football.


In our interviews we found that people were surprised when individual email addresses turned out to be group lists that contained a reader who was not initially expected. In some cases, the revelation of an extra person raised security concerns for the surprised correspondent. We found cases where individuals who had other people read and respond to their email had to reassure surprised recipients by explaining that the other reader was a trusted individual.

In other cases, we discovered occasions where these "extra" readers had been removed from the message. While this may have secured the information content, we also came to understand that when this involved scheduling, the trade-off between security and organization was difficult. For example, we found that on occasion people would attempt to communicate with executives individually. Sometimes this created problems for their assistants: communications kept private from the assistant also meant that the people who needed to make travel, dining, or simply meeting arrangements had no idea that commitments had been arranged.

The issue of identity management is a complex and challenging one for security systems. As Palen and Dourish (2003) observe, people act continually and simultaneously in multiple capacities – as individuals, as representatives of organizations or professional groups, as family members or members of some occupational group, etc. The ways in which they act – the information that they choose to disclose, and how, when, and to whom they disclose it – mark them as affiliated with one or another set of people. Conventional separation into "roles" fails to capture the fluid and especially the simultaneous nature of these capacities in which one acts.

7 Dialectic Nature of Security and Privacy

The central element of our approach has been to examine security as a practical problem, as it manifests itself every day for the users of networked information systems, and as it is routinely solved by them in the course of their work. Across a range of users, a range of technologies, and a range of settings, we find a number of commonalities. One particularly relevant one, which relates to ongoing research into the nature of privacy in both online and offline settings, is the dialectic nature of security and privacy management (Altman, 1975; Palen and Dourish, 2003).

Conventional discussions of privacy have a number of common properties. Privacy is normally conceived of as a state of social withdrawal, for instance; and it is normally a normative issue. The dialectic model, though, suggests, first, that privacy may be better conceived of as a dynamic process of boundary management, characterized as much by a pressure for disclosure as by a pressure for withdrawal, and ultimately operating as a dynamic resolution between the two, managed according to circumstances and immediate needs; and second, that this dynamic process should be seen within a temporal context which is oriented both towards a history of past actions, through which norms of conventional practice are defined, and towards a future of potential needs and expectations.

We can clearly see security emerging in a similar light in our data. The very definition of what counts as "secure" is a contingent matter; "security" depends on the circumstances. Rather than being predefined and absolute, security is relative and must be continually adapted and managed in the course of system use. It is both responsive to social practice (which sets the conventions and patterns by which situations will be interpreted) and contributes to it (by creating new patterns that are shared within social groups and organizations). This perspective suggests a number of relevant design considerations.

8 Technical Responses

Clearly, as demonstrated by the data we have presented here, security is a significant concern for end users. Most importantly, though, it is a concern of a sort not typically explored by security research. Where security research has typically focused on theoretical and technical capabilities and opportunities, for end users carrying out their work on computer systems the problems are more prosaic. They are concerned with getting their work done and with acting in ways appropriate to the settings in which they find themselves; they are concerned with communicating with colleagues; they are concerned with the details of their immediate tasks and generally not with more abstract concerns such as the specification, formulation, and configuration of security mechanisms. Indeed, one of the things that we have learned is that it may be inherently implausible for typical users to specify, in advance of particular circumstances, what their security needs might be; those needs arise only as a result of specific encounters between people, information, and activities. Our goal in taking this broad look at security "as it happens" is not to specify a set of requirements for the redesign of specific security systems such as password systems, encryption mechanisms, or VPNs. Such usability-based explorations are valuable, but our concern is somewhat broader. Our investigations suggest to us that the problem of effective security will not be fixed by correcting design errors in existing technologies, but rather by respecifying the relationship between security facilities and everyday work.

As examples, consider the following implications for secure system design.

Our data points to the dialectic nature of security: that information protection and information sharing mutually define each other, and that the process of managing security is the dynamic process of managing that balance. This suggests that protection and sharing of information are two parts of the same task; they are always carried out together. Interestingly, though, most systems separate these two tasks. Typically, information sharing is a primary task, while information protection is a subsidiary task, specified in advance through control panels, configuration controls, abstract rule specification, etc. Our investigations suggest that we need to think of these as being the same task; I should use the same mechanisms to share information as to protect it, and, critically, they should be available to me at the same time. A split in which sharing information (adding files to a server, sending an attachment, logging into an IM service) is separated from the work of protecting it (specifying access control, encrypting the information, specifying availability) is ineffective. Essentially, practices such as maintaining multiple email addresses or AIM screen names are practical solutions that users have forged, allowing them to conjoin information sharing and information protection as a unified task.
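To make this concrete, consider the following minimal, purely hypothetical sketch (in Python; none of the names below come from the systems we studied). It illustrates an interface in which protection choices are parameters of the act of sharing itself, rather than a separate configuration task carried out elsewhere or in advance:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SharedItem:
        name: str
        recipients: List[str]
        encrypted: bool
        expires_days: int

    @dataclass
    class Workspace:
        items: List[SharedItem] = field(default_factory=list)

        def share(self, name: str, recipients: List[str],
                  encrypted: bool = True, expires_days: int = 30) -> SharedItem:
            # Protection (encryption, expiry, audience) is decided here, at the
            # moment of sharing, rather than in a separate control panel.
            item = SharedItem(name, list(recipients), encrypted, expires_days)
            self.items.append(item)
            return item

    if __name__ == "__main__":
        ws = Workspace()
        ws.share("q3-budget.xls", ["[email protected]"], expires_days=7)
        ws.share("lunch-menu.txt", ["[email protected]"], encrypted=False)
        for item in ws.items:
            print(item)

The point of the sketch is not the particular parameters, but that sharing and protection are expressed in a single step, at the same time and through the same mechanism.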

One critical issue that arises out of many of our observations is the extent to which people are able to monitor and understand the potential consequences of their actions. Since security requirements depend on the specific circumstances of action and are subject to continual reflection and revision, it is necessary to provide people with the means to understand the security implications of the current configuration of technologies at their disposal. Rather than being “transparent,” then, security technologies need to be highly visible – available for inspection and examination seamlessly as a part of work. Interactive system design emphasizes that available functionality and courses of action should be continually available at-a-glance; security configuration should be available in just the same way. This, critically, is quite different from being able to “call up” security information when it is needed; the point is that it is needed as part and parcel of every activity in the system. Related research explores the technical infrastructure to develop these ideas (Dourish and Redmiles, 2002).
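As a rough illustration of this idea (again a hypothetical sketch, not the event-monitoring infrastructure described by Dourish and Redmiles), a system might maintain a short, always-visible summary of the security state of the current activity, rendered alongside the work rather than hidden behind a dialog the user must call up:

    from dataclasses import dataclass

    @dataclass
    class SharingState:
        peer: str                 # who the information is being shared with
        encrypted: bool           # whether the channel is encrypted
        identity_verified: bool   # whether the peer's identity has been verified
        others_with_access: int   # how many other people can currently see this

    def status_line(state: SharingState) -> str:
        # Render the current security configuration at a glance, so that it is
        # continuously visible as part of the work rather than only on demand.
        channel = "encrypted" if state.encrypted else "NOT encrypted"
        identity = "verified" if state.identity_verified else "unverified"
        return (f"Sharing with {state.peer} ({identity}), {channel}; "
                f"{state.others_with_access} others can currently see this.")

    if __name__ == "__main__":
        print(status_line(SharingState("fileserver.example.org", True, False, 4)))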

Taking this one step further, we note that security is a mutual achievement of multiple parties. Like the people sending cryptic email to maintain the security of their information, people achieve security in the context of the activities that they carry out together. The security consequences of my actions depend not only on what I do but also on what my colleagues do. The scope of security, then, is a collaborative scope; it extends beyond the individual. Recognizing that the correct unit of analysis for security systems is groups rather than individuals, and that visualization approaches such as those suggested above might apply to groups, significantly changes how we conventionally think of security systems and security interfaces. Security is a collective accomplishment, an outcome of shared practices as well as shared configuration and technology.

9 Conclusions

While the research community has been extremely successful in developing the mathematical foundations of secure computing and communication services, we have, perhaps, been less successful in the more practical task of making day-to-day systems effectively secure. Any technology for secure communication is only as secure as the settings within which it is deployed. We have argued that a major obstacle to the development of more effective security strategies is that these systems often match poorly to the ways in which people need to make use of them.

A small but growing group of researchers has begun to examine the usability of security technologies, and has noted a range of problems that interfere with the effective use of technologies currently employed for security in day-to-day settings. However, our argument here is broader. We believe that effective solutions will not come solely from repairing the usability problems associated with existing technologies, because the very nature of those technologies – the ways in which they conceive of the problems of security – is a source of trouble. When we look at those areas in which HCI can be said to have led to radical improvements in usability – such as the development of graphical user interfaces and the development of the Internet into the Web – it is instructive to note that they did not arise through the incremental modification of existing technologies (e.g. by systematically improving the usability of each command-line UNIX program). Similarly, we believe that effective security will require that we examine the conceptual models on which our systems are built.

We have been exploring the conceptual foundations of users’ experiences of security. Critically, we find these to be embedded in working practice, in physical settings, and in social and organizational arrangements. This suggests an alternative approach to security – one that makes the security implications of actions visible and accessible in the same way that everyday actions are visible and accessible in the everyday physical and social environment. The physical and social worlds are organized in ways that are perceptible and rationalizable, while our security solutions tend to be invisible, unpredictable, and obscure for end users. Rather than making security about preventing users from doing things, the goal of our work is to make it about enriching the experience of what they can do.

Acknowledgements

We would like to thank Tom Berson, Leysia Palen, David Redmiles, and Diana Smetters for their contributions to our thinking about these issues. We also gratefully acknowledge the patience and help of our interview subjects. This work has been supported in part by National Science Foundation award IIS-0133749.

References

[1] Ackerman, M. and Cranor, L. 1999. Privacy Critics: UI Components to Safeguard Users’ Privacy. Adjunct Proceedings of CHI’99 (Short Papers), 258-259.

[2] Ackerman, M., Cranor, L., and Reagle, J. 1999. Privacy in E-Commerce: Examining User Scenarios and Privacy Preferences. ACM Conf. on Electronic Commerce, 1-8. ACM.

[3] Adams, A. and Sasse, M.A. 1999. Users Are Not The Enemy: Why users compromise security mechanisms and how to take remedial measures. Comm. ACM, 42(12), 40-46.

[4] Adams, A., Sasse, M.A., and Lunt, P. 1997. Making Passwords Secure and Usable. In Thimbleby, H., O’Connaill, B., and Thomas, P. (eds), People and Computers XII: Proceedings of HCI’97, 1-19. Springer.

[5] Altman, I. 1975. The Environment and Social Behavior: Privacy, Personal Space, Territory and Crowding. Monterey, CA: Brooks/Cole Publishing Co. Inc.

[6] Bellotti, V. and Sellen, A. 1993. Design for Privacy in Ubiquitous Computing Environments. Proc. European Conf. Computer-Supported Cooperative Work ECSCW’93, 77-92. Kluwer.

[7] Brostoff, S. and Sasse, M.A. 2000. Are Passfaces more usable than passwords? A field trial investigation. In S. McDonald, Y. Waern & G. Cockton (Eds.): People and Computers XIV - Usability or Else! Proceedings of HCI 2000, 405-424. Springer.

[8] Dewan, P. and Shen, H. 1998. Flexible Meta Access-Control for Collaborative Applications Primitives for Building Flexible Groupware Systems. Proceedings of ACM Conference on Computer-Supported Cooperative Work CSCW’98, 247-256. ACM.

[9] Dhamija, R. and Perrig, A. 2000. Deja Vu: A User Study Using Images for Authentication. In Proceedings of the 9th USENIX Security Symposium, Denver, Colorado.

[10] Dourish, P. 1993. Culture and Control in a Media Space. Proc. European Conf. Computer-Supported Cooperative Work ECSCW’93, 125-137. Kluwer.

[11] Dourish, P., Edwards, K., LaMarca, A., Lamping, J., Petersen, K., Salisbury, M., Terry, D. and Thornton, J. 2000. Extending Document Management Systems with User-Specific Active Properties. ACM Transactions on Information Systems, 18(2), 140-170.

[12] Dourish, P. and Redmiles, D. 2002. An Approach to Usable Security based on Event Monitoring and Visualization. Proc. ACM New Security Paradigms Workshop NSPW 2002 (Virginia Beach, VA). New York: ACM.

[13] Friedman, B., Hurley, D., Howe, D., Felten, E., and Nissenbaum, H. 2002. Users’ Conceptions of Web Security: A Comparative Study. Short paper presented at ACM Conf. Human Factors in Computing Systems CHI 2002 (Minneapolis, MN).

[14] Glaser, B. and Strauss, A. 1967. The Discovery of Grounded Theory: Strategies for Qualitative Research. Chicago: Aldine.

[15] Greenberg, S. and Marwood, D. 1994. Real-Time Groupware as a Distributed System: Concurrency Control and its Effect on the Interface. Proc. ACM Conf. Computer-Supported Cooperative Work CSCW’94, 207-218. ACM.

[16] Grinter, R. and Palen, L. 2002. Instant Messaging in Teen Life. Proc. ACM Conf. Computer-Supported Cooperative Work CSCW 2002 (New Orleans, LA), 21-30. New York: ACM.

[17] Grinter, R. and Eldridge, M. 2003. Wan2tlk? Everyday Text Messaging. Proc. ACM Conf. Human Factors in Computing Systems CHI 2003 (Ft. Lauderdale, FL). New York: ACM.

[18] Henning, R. 2000. Security Service Level Agreements: Quantifiable Security for the Enterprise? Proc. New Security Paradigm Workshop (Ontario, Canada), 54-60. ACM.

[19] Irvine, C. and Levin, T. 1999. Towards a Taxonomy and Costing Method for Security Services. Proc. 15th Annual Computer Security Applications Conference. IEEE.

[20] Irvine, C. and Levin, T. 2001. Quality of Security Service. Proc. ACM New Security Paradigms Workshop, 91-99.

[21] Palen, L. and Dourish, P. 2003. Unpacking “Privacy” for a Networked World. Proc. ACM Conf. Human Factors in Computing Systems CHI 2003 (Ft. Lauderdale, FL). New York: ACM.

[22] Rimmer, J., Wakeman, I., Sheeran, L., and Sasse, M.A. 1999. Examining Users’ Repertoire of Internet Applications. In Sasse and Johnson (eds), Human-Computer Interaction: Proceedings of Interact’99.

[23] Sheehan, K. 2002. Towards a Typology of Internet Users and Online Privacy Concerns. The Information Society, 18, 21-32.

[24] Sheeran, L., Sasse, A., Rimmer, J., and Wakeman, I. 2001. How Web Browsers Shape Users’ Understanding of Networks. The Electronic Library, 20(1), 35-42.

[25] Shen, H. and Dewan, P. 1992. Access Control for Collaborative Environments. Proc. ACM Conf. Computer-Supported Cooperative Work CSCW’92, 51-58. ACM.

[26] Smetters, D. and Grinter, R. 2002. Moving from the Design of Usable Security Technologies to the Design of Useful Secure Applications. Proc. ACM New Security Paradigms Workshop NSPW 2002 (Virginia Beach, VA). New York: ACM.

[27] Spyropoulou, E., Levin, T., and Irvine, C. 2000. Calculating Costs for Quality of Security Service. Proc. 16th Computer Security Applications Conference. IEEE.

[28] Thomsen, D. and Denz, M. 1997. Incremental Assurance for Multilevel Applications. Proc. 13th Annual Computer Security Applications Conference. IEEE.

[29] Weirich, D. and Sasse, M.A. 2001. Pretty Good Persuasion: A first step towards effective password security for the Real World. Proceedings of the New Security Paradigms Workshop 2001 (Sept. 10-13, Cloudcroft, NM), 137-143. ACM Press.

[30] Whitten, A. and Tygar, J.D. 1999. Why Johnny Can’t Encrypt: A Usability Evaluation of PGP 5.0. Proc. Eighth USENIX Security Symposium.

[31] Yee, K.-P. 2002. User Interaction Design for Secure Systems. Proc. 4th International Conf. Information and Communications Security (Singapore).

[32] Zurko, M.E. and Simon, R. 1996. User-Centered Security. Proc. New Security Paradigms Workshop. ACM.