
Privacy

Acknowledgement: Jason Hong, CMU

Overview of Privacy

• Why care?
• Why is it hard?
• Thinking about and Designing for Privacy

– Specific HCI issues and designs

• Why privacy might not matter

• Very broad look at privacy
– Social aspects, legal aspects, philosophical, user interface

Why Care About Privacy?

• Protection from spam, identity theft, mugging
• Discomfort over surveillance

– Lack of trust in work environments

– Might affect performance, mental health

– May contribute to feeling of lack of control over life

• Starting over
– Something stupid you did as a kid

• Creativity and freedom to experiment
– Protection from total societies

– Room for each person to develop individually

• Lack of adoption of tech

Why Care About Privacy? End-User Perspective

Risks range from everyday to extreme:

• Stalkers, muggers
– Well-being, personal safety
• Employers
– Over-monitoring, discrimination, reputation
• Friends, family
– Over-protection, social obligations, embarrassment
• Government
– Civil liberties

The Fundamental Tension

• More information can be used for good and for bad
• Facebook

– Keeping in touch with friends

– But embarrassing photos or breakups recorded for all time?

• Google search reveals a significant amount of information, especially over time and across applications


• People Finder
– Okayness checking and coordination

– But also stalking, monitoring at work, or embarrassment

• Amazon (or any e-commerce site)
– Can improve search results, personalized content

– Price discrimination, selling your info to others, not keeping your info safe from hackers

Why is Privacy Hard?

• Characteristics
– Real-time, distributed

– Invisibility of sensors

– Potential scale

– What data? Who sees it?

• Design Issues
– No control over system

– No feedback, cannot act appropriately

• You think you are in one context, actually in many

– No value proposition

Why is Privacy Hard?

• Devices becoming more intimate
– Call records, SMS messages

– Calendar, Notes, Photos

– History of locations, People nearby, Interruptibility

– With us nearly all the time

• Portable and automatic diary
– Accidental viewing, losing device, hacking

• Protection from interruptions
– Calls at bad times, other people’s (annoying) calls

• Projecting a desired persona
– Accidental disclosures of location, plausible deniability

Why is Privacy Hard? Definition Problem

• Your stories / thoughts?

• Hard to define until something bad happens
– “Well, of course I didn’t mean to share that”

• Risks not always obvious up front
– Burglars went to airports to collect license plates

– Credit info used by kidnappers in South America

Why is Privacy Hard? Social Perspective

• Expectations and levels of comfort change with time and/or experience
– Both individual and societal

– Many people objected to having phones in their homes because it “permitted intrusion… by solicitors, purveyors of inferior music, eavesdropping operators, and even wire-transmitted germs”


The appearance of Eastman’s cameras was so sudden and so pervasive that the reaction in some quarters was fear. A figure called the “camera fiend” began to appear at beach resorts, prowling the premises until he could catch female bathers unawares.

One resort felt the trend so heavily that it posted a notice: “PEOPLE ARE FORBIDDEN TO USE THEIR KODAKS ON THE BEACH.” Other locations were no safer. For a time, Kodak cameras were banned from the Washington Monument. The “Hartford Courant” sounded the alarm as well, declaring that “the sedate citizen can’t indulge in any hilariousness without the risk of being caught in the act and having his photograph passed around among his Sunday School children.”

Why is Privacy Hard? Individual Perspective

• Cause and effect may be far in time and space
– Think politicians and actions they did when young

– Video might appear on YouTube years later

• Privacy is highly malleable depending on situation
– Still use credit cards to buy online

– Benefit outweighs cost

• Power or social imbalances
– Employees may not have many choices

• Easy to misinterpret
– Went to drug rehabilitation clinic, why?

Why is Privacy Hard? Technical Perspective

• Easier to capture data
– Video cameras, camera phones, microphones, sensors

– Break “natural” boundaries of physics

• Easier to store and retrieve data
– LifeLog technologies

– Googling a potential date


• Easier to share data
– Ubiquitous wireless networking

– Blogs, wikis, YouTube, Flickr, Facebook

• Inferences and Machine Learning
– Humidity to detect presence (see sketch below)
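To make the inference risk concrete: a minimal sketch, with made-up readings and a made-up threshold, of how an “innocuous” humidity stream can reveal whether anyone is in a room. The function and data are hypothetical illustrations.

```python
# Hypothetical sketch: inferring occupancy from an "innocuous" humidity
# stream. Readings and thresholds are invented for illustration.

def occupied(humidity, baseline=40.0, bump=2.5):
    """People exhale moisture, so a sustained rise above the room's
    baseline humidity suggests someone is present."""
    recent = humidity[-5:]                          # last few samples
    return sum(recent) / len(recent) > baseline + bump

readings = [40.1, 40.3, 43.2, 43.8, 44.0, 44.1, 43.9]
print(occupied(readings))  # True -- yet the sensor never "saw" anyone
```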

Why is Privacy Hard? Organizational Perspective

• Bad data can be hard to fix
– Sen. Ted Kennedy on TSA no-fly list

• Market incentives not aligned well
– More info lets companies market better

– Can sell your info

• Many activities are hidden
– What are credit card companies, Amazon doing?

– What is the NSA doing?

Why is Privacy Hard? Purely HCI Perspective

• Few tools
• Few evaluation techniques
• Lack of clear metrics

Why is Privacy Hard? Meta-Research Perspective

• Privacy is a large umbrella term
– Lots of different groups and schools of thought that don’t always interact or agree with each other
– Tools and methods for one school of thought don’t necessarily work well for others

• Privacy as anonymity
– Cypherpunks, database researchers, machine learning

• Privacy as a rational process for organizations
• Privacy as organic process / personal privacy
– A lot of HCI, CSCW, CMC work falls here
• Ubicomp 2003
– Workshop on Privacy (mostly men)
– Workshop on Intimate Computing (mostly women)


What is Privacy?

• No standard definition, many different perspectives
• Different kinds of privacy

– Bodily, Territorial, Communication, Information

• Many different philosophical views on info privacy
– Different views -> different values -> different designs

– Note: next few slides not mutually exclusive

Principles vs Common Interest

• Principled view -> Privacy as a fundamental right
– Embodied by constitutions, longstanding legal precedent

– Government not given right to monitor people

• Common interest -> Privacy with respect to the common good
– Emphasizes positive, pragmatic effects for society

• Examples:
– National ID cards, mandatory HIV testing

Self-determination vs Personal Privacy

• Self-determination (aka data protection)
– Arose due to the increasing number of databases in the 1970s

– “Privacy is the claim of individuals, groups or institutions to determine for themselves when, how, and to what extent information about them is communicated to others” (Westin)

– Led to Fair Information Practices (more shortly)

– More about the individual with respect to governments, organizations, and commercial entities

• Personal privacy
– How I express myself to others and control access to myself

– More about the individual with respect to other individuals

Self-determination vs Personal Privacy

• Examples:
– Cell phone communication
• Data protection view: telecoms keep records of who I called; how long do they keep the data?
• Personal privacy view: Caller ID; what I choose to say on the phone
– Instant messaging
• Data protection view: store messages? (Google Talk); privacy policy
• Personal privacy view: who your buddies are; invisible mode; logs
– Facebook

Privacy as Solitude / Isolation

• “The right to be let alone”
• People tend to devise strategies “to restrict their own accessibility to others while simultaneously seeking to maximize their ability to reach people” (Darrah et al. 2001)

– Protection from interruptions and undesired social obligations

• Examples:
– Spam protection

– Do-not-call list, not answering mobile phone

– Invisible mode, ignoring an IM

– iPod cocooning on public transit

Privacy as Anonymity

• Hidden among a crowd
• Examples:

– Web proxy to hide actual web traffic

– “Someone in this room who is over 30 and once broke his right arm” vs “a female”

– Location k-anonymity (see sketch below)

• This view is highly popular among technical people
– Measurable
– Limitations?
• Crowd
• Not “Turag”
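A rough sketch of location k-anonymity via spatial cloaking, assuming a toy region-doubling scheme and invented coordinates (not any particular published algorithm): the service reports a region containing at least k people rather than an exact position.

```python
# Toy sketch of location k-anonymity via spatial cloaking. The doubling
# scheme and coordinates are invented; real systems use quadtrees etc.

def cloak(user, others, k=5, cell=0.01):
    """Return a bounding box around `user` containing at least k people."""
    x, y = user
    size = cell
    while True:
        box = (x - size, y - size, x + size, y + size)
        inside = [p for p in others + [user]
                  if box[0] <= p[0] <= box[2] and box[1] <= p[1] <= box[3]]
        if len(inside) >= k:
            return box        # report the box, not the exact point
        size *= 2             # too few people inside: widen the region

me = (40.4433, -79.9436)
crowd = [(40.4431, -79.9441), (40.4445, -79.9420),
         (40.4410, -79.9460), (40.4470, -79.9400)]
print(cloak(me, crowd, k=5))
```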

Privacy as Projecting a Desired Persona

• People see you the way you want them to see you
• Examples:

– Cleaning up your place before visitors

– Putting the right books and CDs out

– Having “desirable” Facebook groups, hobbies, politics, etc. on your profile

Privacy as a Process

• Controlled, rationalistic process
– Bank and web site privacy policies

– Many rules governing how personal information is gathered and used

• Organic and fluid process
– Adjusting window blinds

– Opening or closing my office door

– Choosing what I do or don’t disclose during a conversation

Privacy as Protection of Self vs Others

• Protecting Self
• Protecting Others?

– Mandatory privacy, wearing clothes

– Cell phones going off in theaters

Overview of Privacy

• Why care?
• Why is it hard?
• Thinking about and Designing for Privacy

– Specific HCI issues and designs

• Why privacy might not matter

Legal Differences for Privacy

• America tends to have sector-by-sector privacy laws– HIPAA, CALEA, COPPA, FERPA, finance, video rentals

– Much of the legal ruling on privacy happens in the judiciary
• Wiretapping, advanced sensing tech

– Cynically, wait until a disaster happens, then try to fix

• Europe has comprehensive privacy laws
– European Union Data Protection Directive

– Stronger focus on prevention

– Working party issues rulings on biometrics, privacy policies, etc.

Fair Information Practices (FIPs)

• Many laws based on Fair Information Practices
– Set of principles stating how organizations should handle personal information

• Based on Self-determination / Data Protection view
– “Privacy is the claim of individuals, groups or institutions to determine for themselves when, how, and to what extent information about them is communicated to others” (Westin)

• Note: many variants of FIPs
– Will discuss the Organization for Economic Cooperation and Development (OECD) version, one of the strictest sets

Fair Information Practices (FIPs)

• Collection limitation
• Data quality
• Purpose specification
• Use limitation
• Reasonable security
• Openness and transparency
• Individual participation
• Accountability
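Two of these principles translate directly into code. A minimal sketch, assuming a hypothetical record layout, of purpose specification and use limitation: the purpose is recorded at collection time, and any other use is refused.

```python
# Sketch of two FIPs: purpose specification (record why the data was
# collected) and use limitation (refuse uses beyond that purpose).
# The record layout is hypothetical.

records = []

def collect(value, purpose):
    records.append({"value": value, "purpose": purpose})

def use(record, purpose):
    if purpose != record["purpose"]:
        raise PermissionError(f"collected for {record['purpose']!r}, "
                              f"not {purpose!r}")
    return record["value"]

collect("alice@example.com", purpose="order receipts")
print(use(records[0], "order receipts"))   # OK
# use(records[0], "marketing")             # raises PermissionError
```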

Implications for Design

• Data protection perspective on privacy
– Organizations collecting lots of data

– Hospitals, financial institutions, etc

• However, few tools for helping organizations “do the right thing” in HCI or elsewhere

SPARCLE

• Can author privacy policies in natural language
• Parses those policies
• Attaches those policies to data collected
• Enforced by some policy engine
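A toy illustration of the parsing step, assuming one heavily constrained sentence pattern; SPARCLE's actual grammar and enforcement engine are far richer than this stand-in.

```python
# Stand-in for the SPARCLE idea: turn a constrained natural-language
# policy sentence into a structured, machine-enforceable rule.
import re

PATTERN = re.compile(
    r"(?P<who>[\w ]+) can (?P<action>\w+) (?P<data>[\w ]+) for (?P<purpose>[\w ]+)")

def parse_policy(sentence):
    m = PATTERN.match(sentence)
    return m.groupdict() if m else None

print(parse_policy("billing staff can use account numbers for payment processing"))
# {'who': 'billing staff', 'action': 'use',
#  'data': 'account numbers', 'purpose': 'payment processing'}
```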

Privacy Policies

• Evidence strongly suggests people don’t read privacy policies (unless assigned as homework)
– Carlos Jensen et al., CHI 2004

• Problems with privacy policies?
– Too hard to read

– Privacy policy changed, can I challenge?

– This policy can change at any time, come back often

– Cover your @$$

– No market or perhaps legal interest

Multi-Level Privacy Policies

• http://www.pg.com/privacy/english/privacy_notice.html

Multi-Level Privacy Policies

• Idea from EU working group on privacy
– Short: a few sentences, for a mobile phone

– Condensed: half-page summary

– Full: details

Platform for Privacy Preferences Protocol (P3P)

• A machine-readable way for web sites to state their privacy policies

• One of the original scenarios:
– Users could define what info they're willing to share with web sites

• Name, address, email, etc

– Browser could download the site's P3P policy
• What they collect, why they collect it, who they share it with, etc.

– Web browser could then share or not share info (see sketch below)

• Thoughts?
– Incentives for people to participate, adoption

– Like buying insurance, don’t want to do it until have to

– Vendors have to create P3P policies, users have to create preferences
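A sketch of the original scenario, with invented field and rule names rather than the real P3P vocabulary: the browser releases a field only when the site's declared practices are consistent with the user's preferences.

```python
# Toy P3P-style matching; field and rule names are invented, not the
# actual P3P vocabulary.

user_prefs = {"email": {"share_with_third_parties": False},
              "name":  {"share_with_third_parties": True}}

site_policy = {"email": {"share_with_third_parties": True},   # site resells email
               "name":  {"share_with_third_parties": False}}

def releasable(field):
    """Release a field only if the site's practices meet the user's rules."""
    for rule, allowed in user_prefs[field].items():
        if site_policy[field].get(rule, True) and not allowed:
            return False
    return True

for field in user_prefs:
    print(field, "->", "share" if releasable(field) else "withhold")
```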

Segmenting Users

• Westin and others have been running surveys over the past few years looking at individuals with respect to organizations

• Don’t care (~10%)
– I’ve got nothing to hide

– We’ve always adapted

– "You have zero privacy anyway. Get over it."

• Fundamentalist (~25%)
– Don’t understand the tech

– Don’t trust others to do the right thing

• Pragmatist (~65%)
– Cost-benefit

– Communitarian benefit to society as well as individual

Segmenting Users

• Best to focus your designs on the “don’t care” and pragmatist segments

• Huge caveat: this only applies to individuals with respect to organizations, not individuals to individuals
– There still is not a good survey instrument for this

– Literature suggests that Westin survey does not have strong correlation with what people share with others

– May also mean that privacy studies have unknown bias in them

Contextual Instant Messaging

• Facilitate coordination and communication by letting people request contextual information via IM
– Interruptibility (via SUBTLE toolkit)

– Location (via Place Lab WiFi positioning)

– Active window

• Developed a custom client and robot on top of AIM
– Client (Trillian plugin) captures and sends context to robot

– People can query the imbuddy411 robot for info
• “howbusyis username”

– Robot also contains privacy rules governing disclosure (see sketch below)

• Web-based specification of privacy preferences
– Users can create groups and put screennames into groups

– Users can specify what each group can see
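A sketch of what such group-based disclosure rules might look like, with invented groups and screennames; this illustrates the design described above, not the actual imbuddy411 implementation.

```python
# Invented group-based disclosure rules in the style described above.

groups = {"close_friends": {"alice99", "bob_k"},
          "classmates":    {"carol22"}}

visibility = {"close_friends": {"interruptibility", "location", "active window"},
              "classmates":    {"interruptibility"},
              "default":       set()}        # strangers see nothing

def disclose(requester, context_type):
    group = next((g for g, members in groups.items() if requester in members),
                 "default")
    return context_type in visibility[group]

print(disclose("alice99", "location"))     # True
print(disclose("stalkerbot", "location"))  # False: falls to default rule
```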

Control – Setting Privacy Policies

• Coarse grain controls plus access to privacy settings

Control – System Tray

Feedback – Notifications

Feedback – Social Translucency

Feedback – Offline Notification

Feedback – Summaries

Feedback – Audit Logs

Evaluation

• Recruited fifteen people for four weeks
– Selected people highly active in IM (i.e., undergrads)

– ~120 buddies, ~1580 messages / week (sent and received)

– ~3.3 groups created per person

• Notified other parties of the imbuddy411 service
– Updated AIM profile to advertise

– Would notify other parties at start of conversation

Results of Evaluation

• 321 queries
– ~1 query / person / day

– 61 distinct screennames, 15 repeat users

– 67 interruptibility, 175 location, 79 active window

• Added Stalkerbot near end of study
– A stranger making 2 queries per person per day

Results – Controls

• Controls easy to use (4.5 / 5, σ=0.7)
“I really liked the privacy settings the way they are. I thought they were easy to use, especially changing between privacy settings.”

“I felt pretty comfortable with using it because you can just easily modify the privacy settings.”

• However, can be lots of effort
“It’s time consuming, if you have a long buddylist, to set up for each person.”

• Asked for more location disclosure levels
– Around or near a certain place

Results – Comfort Level

• Comfort level good (4 / 5, σ=0.9)
– 12 participants noticed stalkerbot, 3 didn’t until debriefing

– However, no real concerns

– Reasoned that our stalkerbot was a buddy or old friend

– Also confident in their privacy control settings

“I know they won’t get any information, because I set to the default so they won’t be able to see anything.”

Results – Appropriateness of Disclosures

• Mostly appropriate (2.47 / 5, where 3 is appropriate)
– Useful information for requester? Right level of info?

– Two people increased privacy settings, one after experimentation, the other after too many requests from a specific person

• However, more complaints about accuracy– Ex. Left a laptop in a room to get food, person wasn’t there

Results – Usefulness of Feedback

• Bubble notification, 1.6 / 6 (σ=0.6)
• Disclosure log, 1.8 (σ=1.3)
• Mouse-over notification, 3.7 (σ=1.0)
• Offline statistic notification, 4.0 (σ=1.4)
• Social translucency Trillian tooltip popup, 4.8 (σ=1.1)
• Peripheral red-dot notification, 5.4 (σ=0.7)

Discussion

• Disclosure log not used heavily
– Though people liked knowing that it was there just in case

• Surprisingly few concerns about privacy
– No user expressed strong privacy concerns

– Feature requests were all non-privacy related

– If usage was low, it was due to not enough utility, not due to privacy

• Does this mean our privacy is good enough, or is this because of users’ attitudes and behaviors?

Understanding Adoption

• Need to tie attitudes and behavior with adoption models

Teens

Optimistic vs Pessimistic Privacy

• How to tell computer when and when not to disclose information to others?

• Most privacy controls pessimistic, prevent bad
– Setting up rules

• An alternative design is optimistic
– Assume bad things rare

– Detect and fix after the fact

• Pros and cons? (see sketch below)
– Pessimistic: have to predict disclosures in advance

– Optimistic: can’t fix everything after the fact
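The contrast in miniature, as a hypothetical API: the pessimistic path default-denies unless a rule was set up in advance, while the optimistic path default-allows but logs and notifies so misuse can be detected and fixed afterward.

```python
# Hypothetical sketch contrasting the two control styles.

rules = {("alice", "location"): True}        # pessimistic allow-list
audit_log = []

def pessimistic_request(who, what):
    return rules.get((who, what), False)     # default deny

def optimistic_request(who, what, notify):
    audit_log.append((who, what))            # every access recorded
    notify(f"{who} viewed your {what}")      # owner sees each access
    return True                              # default allow

print(pessimistic_request("bob", "location"))        # False: no rule yet
print(optimistic_request("bob", "location", print))  # allowed, but visible
```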

Several UIs Combine Aspects of Both

• AT&T Find Friends
– Add friends (pessimistic)

– They have unlimited queries but you also see each query (optimistic)

• Interactive mode possible too

[Example permission dialog for a request from “Bozo”]

• Ask (pessimistic)
• Always allow (optimistic)
• Allow today (limited optimistic)
• Just this once

Conjecture: Concerns Relax Over Time

Privacy Placebos?

• Privacy policies
• IMBuddy audit logs
• CareNet display

Is Privacy always Good?

• Reputation management
• Can be used as a shield for abusive behavior
• Supermarket loyalty cards

– Gauge effect of marketing, effects of price and demand

– Market to best customers

• Can streamline economic transactions
– Easy credit

• EU – “Regulators prosecuted an animal rights activist who published a list of fur producers and a consumer activist who criticized a large bank on a Web page that named the bank’s directors.”


Lessig’s Framework

• Lessig: behavior is regulated by four forces – law, norms, markets, and architecture
• Most of the HCI privacy work falls under understanding Norms and building Architecture

• Tom Erickson argues that we should help facilitate norms through “translucent” systems

Social Translucency

• Make participants and their activities apparent to others

• Ex. Alice is unlikely to repeatedly query for Bob’s location if she knows Bob can see each request
– Erickson is implicitly arguing for optimistic privacy

Plausible Deniability

• Another example of supporting a norm
• If I don’t answer my phone:

– Busy, shower, driving, bozo

– Ambiguity is good here

• How to build into systems?
– Natural part of most asynchronous communication systems

– Unclear in general

– How reliable should our systems be? (see sketch below)
• Spam filters
• Location granularity
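One way to keep ambiguity is to disclose location only at a coarse granularity. A minimal sketch, assuming hypothetical precision levels implemented by coordinate rounding:

```python
# Hypothetical granularity levels via coordinate rounding.

def coarsen(lat, lon, level):
    """Report location at block ('fine'), neighborhood ('medium'),
    or city ('coarse') granularity."""
    digits = {"fine": 3, "medium": 2, "coarse": 1}[level]
    return round(lat, digits), round(lon, digits)

exact = (40.44331, -79.94362)
print(coarsen(*exact, "coarse"))  # (40.4, -79.9): "somewhere in the city"
```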

Panopticon

• Subtle way of controlling prisoners
– Idea first proposed by Jeremy Bentham

– Popularized by Michel Foucault

• Modern day versions?

Subtle Control

“[The Active Badge] could tell when you were in the bathroom, when you left the unit, and how long and where you ate your lunch. EXACTLY what you are afraid of.”

- allnurses.com

One Taxonomy

• Before: rules, testing rules, anonymizing your data
• During: go / no-go decision, better interactive feedback
• After: detection, audits
