TRANSCRIPT
1 D.J.SchianoCHI2004 Tutorial
Usability…and Beyond!
Understanding Usefulness, Usability & Use
CHI 2004 Tutorial, April 2004
Diane J. Schiano <Abbreviated for PARC>
2 D.J.SchianoCHI2004 Tutorial
Page
Introduction 15
A Primer on User Exp Research Design 30
  Core UE Research Design Principles 37
  Key Pragmatic Issues 56
Methods, Measures & More 67
  Key Research Design Decisions 68
  Methods: Self-Report & Observational 69
  Measures & More: Quantitative, Qualitative 128
More On Communicating Results 148
Appendix 155
Table of Contents
3 D.J.SchianoCHI2004 Tutorial
This tutorial will provide a general understanding of: Usefulness, usability and use studies: what they are, how and why they are done,
and the kinds of information they yield, with extended examples and resources available in the tutorial notes.
User research design principles, and the pragmatic challenges of attaining validity, reliability, generalizability and robustness in user studies.
The major self-report and observational methods available for user studies, and how to choose and apply them appropriately.
Approaches to dealing with qualitative, quantitative and hybrid data, and to summarizing results and making design recommendations from them
Pragmatic considerations in applying procedures, discussed from logistical, organizational, and personal/privacy perspectives
Also provided: • Guidelines, readings, other resources for conducting studies and evaluating
findings, both in class and in the tutorial notes
• Opportunities for expert and peer feedback on practical problems participants are invited to bring in for consideration during the User Research Design Clinic.
Course Learning Objectives
4 D.J.SchianoCHI2004 Tutorial
Abstract
Digital products are growing increasingly complex, encompassing interactions among humans as well as between humans and technology. Careful methods are required to understand user experience of these products, and to use this understanding to inform iterative design. The goal of this tutorial is to provide a practical understanding of the principles and procedures used to assess product usefulness, usability and use. Participants are given guidance and grounding in general user research design principles and procedures to aid them in choosing methods, conducting studies, evaluating results, and making recommendations effectively, even under constrained conditions. A principled yet pragmatic approach is advocated, consolidating the best of classic usability engineering and ethnographic methods—and applied appropriately for the research question and context at hand. Useful exercises, extended examples and extensive references and other resources are provided. Finally, attendees are invited to bring actual problems to discuss in the ‘User Research Design Clinic’ for peer and expert feedback.
5 D.J.SchianoCHI2004 Tutorial
Usability…and Beyond!
Understanding…Usefulness, Usability &
Use
6 D.J.SchianoCHI2004 Tutorial
Usability…and Beyond!
Understanding…User Experience!
Focusing on the Principles Underlying the Procedures
7 D.J.SchianoCHI2004 Tutorial
Introduction
8 D.J.SchianoCHI2004 Tutorial
• Depends critically on effective User Research.
• “If we build it they will come… NOT!!!”
• What’s the use… of designing products that aren’t…
Useful,
Usable,
Used???!!!
Creating Useful, Usable & Used Products…
9 D.J.SchianoCHI2004 Tutorial
Core User Experience Research Concepts
• Usefulness
  • Why--and how--could the product be useful to people? Design (& marketing) implications from current practice?
• Usability
  • How easily--and well--can the product be learned and used? Implications for re-design?
• Use
  • How do people actually use the product? Implications for re-design?
10 D.J.SchianoCHI2004 Tutorial
User-Centered Product Research & Design
Usefulness --> Use <----------------------> Usability
From UsabilityNet
11 D.J.SchianoCHI2004 Tutorial
Usefulness, Usability, Use in Product Cycle
Usefulness
Usability
Use
Planning +
Feasibility
Requirements
Design
Implementation
Test + Measure
Post-Release
12 D.J.SchianoCHI2004 Tutorial
Usefulness, Usability, Use in Product Cycle
Usefulness
Usability Use
From UsabilityNet
13 D.J.SchianoCHI2004 Tutorial
We’ll Focus on Research Design Principles
• Primarily the WHYs behind choosing and implementing user research methods.
• For more on HOW-TOs, see…
  • UsabilityNet (http://www.usabilitynet.org)
  • Appendix references & recommendations
14 D.J.SchianoCHI2004 Tutorial
UsabilityNet: An Excellent Resource!
• High Quality, Free
• How-Tos, Mini-Tutorials
• http://www.usabilitynet.org
15 D.J.SchianoCHI2004 Tutorial
Core User Experience Research Concepts
• Usefulness
  • Why--and how--could the product be useful to people? Design (& marketing) implications from current practice?
• Usability
  • How easily--and well--can the product be learned and used? Implications for re-design?
• Use
  • How do people actually use the product? Implications for re-design?
16 D.J.SchianoCHI2004 Tutorial
• Ethnography (Usefulness & Use)
• Human Factors Engineering (Usability)
Two Classic Approaches to UE Research
17 D.J.SchianoCHI2004 Tutorial
The Ethnographic Approach
Traditional Emphases:
• Usefulness & Use (motivations, practice)
• Self-report w/ contextualized observation
• Naturalistic context, no (or low) control
• “Why? How?” questions
• Qualitative data & deliverables
18 D.J.SchianoCHI2004 Tutorial
The Human Factors Engineering Approach
Traditional Emphases:
• Usability
• Observation (task performance)
• Lab context, high control
• “How often/fast/much?” questions
• Quantitative data & deliverables
19 D.J.SchianoCHI2004 Tutorial
These Approaches are Now Converging…
• Self-report & observation are complementary
• “Naturalistic” observations are becoming increasingly common (esp. on the Internet)
• Using converging methods is more informative and cost-effective.
20 D.J.SchianoCHI2004 Tutorial
These Approaches are Now Converging…
• So it is becoming increasingly important to understand the core principles underlying ALL user experience research.
• And that’s why we’re here!
21 D.J.SchianoCHI2004 Tutorial
A Primer on User Research Design
22 D.J.SchianoCHI2004 Tutorial
A Primer on User Research Design
‘Science is the elucidation of common sense.’
Francis Bacon (attrib)
‘There are the hard sciences, and then there are the difficult sciences.’
Gregory Bateson
23 D.J.SchianoCHI2004 Tutorial
The Art & Science of UE Research Design
The creative use of ….
research principles & pragmatics
…to construct, conduct & communicate research to effectively inform product design.
24 D.J.SchianoCHI2004 Tutorial
• Prioritize. Focus on what you want to learn.
• Design your research project using appropriate methods based on:
- Research principles & pragmatic considerations taken together.
• Conduct the research appropriately.
• Analyze & interpret findings responsibly.
- Use caution; qualify as needed.
• Communicate your findings effectively.
Overview of the Research Process
25 D.J.SchianoCHI2004 Tutorial
• Methods (What you can do):
Ask < ----------- > Observe (Self-Report) (Behavior)
• Context (How--& where--you do it):
Naturalistic < ----------- > Controlled
• Data analyses & deliverables:
Qualitative < ----------- > Quantitative
Your Key Research Design Decisions
26 D.J.SchianoCHI2004 Tutorial
My Advice…
KISS: Keep it simple, s’il vous plait!
>> Focus on what you really need to learn.
  • Your questions, goals, deliverables
>> Prioritize!
  • Design principles & pragmatic considerations
>> What evidence would convince YOU?
  • Use common sense
  • Be your own best critic. Challenge yourself!
27 D.J.SchianoCHI2004 Tutorial
Core Research Design Principles
• Validity
• Reliability
• Generalizability
28 D.J.SchianoCHI2004 Tutorial
Core Research Design Principles
• Validity (Am I really studying what I think I am?)
  • aka “internal validity”
• Reliability (Will my findings be repeatable?)
  • aka “statistical significance”
• Generalizability (Do my findings apply appropriately?)
  • aka “external validity”
29 D.J.SchianoCHI2004 Tutorial
Validity & Reliability
30 D.J.SchianoCHI2004 Tutorial
Validity & Reliability
31 D.J.SchianoCHI2004 Tutorial
Validity & Reliability
High Reliability, High Validity: Consistent & ON-Target
32 D.J.SchianoCHI2004 Tutorial
Validity & Reliability
33 D.J.SchianoCHI2004 Tutorial
Validity & Reliability
High Reliability, Low Validity: Consistent but OFF-Target
34 D.J.SchianoCHI2004 Tutorial
Validity & Reliability
35 D.J.SchianoCHI2004 Tutorial
Validity & Reliability
Low Reliability, Low Validity: NOT Consistent & OFF-Target
36 D.J.SchianoCHI2004 Tutorial
Generalizability
37 D.J.SchianoCHI2004 Tutorial
Generalizability, aka “External Validity”
External Applicability: “Throughput” to Related Target(s)
38 D.J.SchianoCHI2004 Tutorial
Core Research Design Principles
• Validity (Am I really studying what I think I am?)
  • Confounds & controls, errors & biases
  • Look for disconfirming evidence
• Reliability (Will my findings be repeatable?)
  • Statistical significance v chance
  • Sample size (# participants, observations)
• Generalizability (Will my findings apply appropriately?)
  • Representativeness (of sample, context)
39 D.J.SchianoCHI2004 Tutorial
Core Research Design Principles
• Validity
• Reliability
• Generalizability
• These 3 principles may seem simple at first glance, but they are profound…and the issues involved can become quite complex.
• They provide a foundation for evaluating ALL research—qualitative or quantitative, naturalistic or controlled, market research or usability testing.
40 D.J.SchianoCHI2004 Tutorial
Example: IRC’s LambdaMOO Project
41 D.J.SchianoCHI2004 Tutorial
Goals: Assess “Hype-otheses”, Characterize Community
• Ask (Self-Report) < ----------- > Observe Behavior
Survey, Interviews < ----------- > Logfile Analysis
• Naturalistic < ----------- > Controlled Context
LambdaMOO < ------> Lab?
• Qualitative < ----------- > Quantitative Data
E.g., Transcripts < ----------- > # Hours Logged On
Convergent Methods Approach
LambdaMOO Project Overview
42 D.J.SchianoCHI2004 Tutorial
LambdaMOO Project Methods
• Survey (Self-Report)
  • 1-Week Call upon Login; 581 Respondents
  • ~30 Questions, Various Formats, Online
• Interviews (Self-Report)
  • 12+ Real-Life, Long-Term Participants (Many IVR, etc.)
  • 1.5-2 hr In-Depth, Semi-Structured Interviews & Maps & Follow-ups
• Logging Studies (Naturalistic Observed Behavior)
  • Who/Where/When @ ~1-min Intervals, 24 hr/day, ~2 wks
  • Privacy Respected
  • Data on ~4,000 Users Obtained (Twice, ~6-Mo. Interval)
• And Much More...
  • Participant observations, attending BayMOO mtgs, comparison studies, etc.
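The logging scheme above samples presence at roughly one-minute intervals, so connect time can be estimated by counting the snapshots in which a user appears. A minimal sketch in Python; the user names, snapshot data, and scaling constants below are all invented for illustration, not taken from the study:

```python
# Sketch (hypothetical data): estimating connect time from presence samples.
# The logger records who is connected at ~1-minute intervals, so each sample
# in which a user appears contributes roughly one minute of connect time.
from collections import Counter

SAMPLE_INTERVAL_MIN = 1.0   # assumption: one snapshot per minute
STUDY_DAYS = 14             # assumption: ~2-week logging window

# Each snapshot is the set of user IDs present at that moment (toy data).
snapshots = [
    {"mink", "bonny"},
    {"mink"},
    {"mink", "bonny", "zak"},
    {"bonny"},
]

minutes = Counter()
for present in snapshots:
    for user in present:
        minutes[user] += SAMPLE_INTERVAL_MIN

# Scale per-user minutes to hours/week for comparison with self-report.
hours_per_week = {
    user: (mins / 60.0) * (7.0 / STUDY_DAYS)
    for user, mins in minutes.items()
}
```

Note that this counts connect time, not attention: multi-tasking and idle time inflate it, which is exactly the caveat raised in the addiction findings later.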
43 D.J.SchianoCHI2004 Tutorial
Q: Addiction to LambdaMOO?
• Self-Report Use Estimates Very High
• Previous research papers, popular books & press:
• 80 hrs/wk “not uncommon”
• Our interview findings not inconsistent
• Our Logfile Observations Differed Greatly
• Mean ~ 8 hrs/wk (with multi-tasking & idle time!)
• Less than 5% users on for 20 or more hrs/wk
44 D.J.SchianoCHI2004 Tutorial
LambdaMOO Use Data from Logfiles
[Bar chart: mean hours/day logged on, by experience quartile (Low, MedLow, MedHigh, High) and modal presenting gender (females, males, others); y-axis 0.00-1.80 hrs/day.]
Mean = 1.13 hrs/day; ~8 hrs/wk
45 D.J.SchianoCHI2004 Tutorial
Q: Are you convinced by the logfile data? Why or
why not? Can you explain divergent results?
>>Validity(Am I really studying what I think I am?)
>>Reliability (Will my findings be repeatable?)
>>Generalizability (Will my findings apply appropriately?)
Q: Addiction to LambdaMOO?
46 D.J.SchianoCHI2004 Tutorial
Key Pragmatic Issues
• Robustness (How strong is this effect?)
• Impact (How important is this finding?)
• Convergence (More is better!)
47 D.J.SchianoCHI2004 Tutorial
Key Pragmatic Issues
• Robustness (How strong is this effect?)
  • Large in magnitude (effect size)
• Impact (How important is this finding?)
  • High in priority, severity
• Convergence (More is better!)
  • Multiple modest methods can be most informative and cost-effective.
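Effect size, the magnitude of a difference independent of sample size, is the usual way to quantify robustness. A sketch of Cohen's d (the standard pooled-variance formula) using invented hours-per-week numbers, not the study's raw data:

```python
# Sketch: quantifying robustness as effect size (Cohen's d), with
# made-up hours-per-week samples for illustration only.
import statistics

def cohens_d(a, b):
    """Standardized mean difference between two samples."""
    na, nb = len(a), len(b)
    # Pooled variance across both samples
    pooled_var = ((na - 1) * statistics.variance(a) +
                  (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.mean(a) - statistics.mean(b)) / pooled_var ** 0.5

self_report = [70, 80, 90, 85, 75]   # hypothetical claimed hrs/wk
logged      = [7, 8, 9, 8, 8]        # hypothetical observed hrs/wk
d = cohens_d(self_report, logged)    # enormous by any convention (d > 10)
```

By convention d around 0.8 already counts as a "large" effect, so a gap like 80 v 8 hrs/wk dwarfs the usual thresholds.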
48 D.J.SchianoCHI2004 Tutorial
Q: Addiction to LambdaMOO?
• Robustness (How strong is this effect?)
  • Huge effect size (difference): 80 v 8 hrs/wk
• Impact (How important is this finding?)
  • High potential social import, impact
• Convergence (More is better!)
  • Self-Report & Observation; Qualitative & Quantitative
  • Triangulating “why”, “how”, “how often/fast/much” questions from various perspectives.
49 D.J.SchianoCHI2004 Tutorial
Other Pragmatic Considerations…
• Complexity of Product/Design/System
• Your Deliverables
  • Design recommendations? Presentation? Paper?
• Time Frame
  • Product deadlines, readiness, design cycle
• Cost
  • Time, money, personnel
• Other Resources & Constraints
  • Availability of participants, prototypes, tools
  • Your skills, expertise & interests
  • Organizational & political priorities
  • Etc.
50 D.J.SchianoCHI2004 Tutorial
Example: LambdaMOO Project
• Extremely Complex System, User Community
• Our Deliverables
  • Business & design recommendations
• Time-Frame
  • System complete but evolving; cohort effects
• Cost
  • High
• Other Resources & Constraints
  • Research ideal: Many resources, few constraints
51 D.J.SchianoCHI2004 Tutorial
Pragmatic Suggestions from UsabilityNet….
52 D.J.SchianoCHI2004 Tutorial
1) Given Limited Time & Resources…
53 D.J.SchianoCHI2004 Tutorial
2) Given No Direct Access to Users…
54 D.J.SchianoCHI2004 Tutorial
3) Given Limited Skill/Expertise…
55 D.J.SchianoCHI2004 Tutorial
Methods, Measures & More…
56 D.J.SchianoCHI2004 Tutorial
• Methods (What you can do):
Ask < ----------- > Observe (Self-Report) (Behavior)
• Context (How--& where--you do it):
Naturalistic < ----------- > Controlled
• Data analyses & deliverables:
Qualitative < ----------- > Quantitative
Key Research Design Decisions
57 D.J.SchianoCHI2004 Tutorial
Methods
58 D.J.SchianoCHI2004 Tutorial
Methods Overview
Self-Report (e.g., Surveys, Interviews)
• Explanations: Meaning, salience, satisfaction
• Feelings, opinions, preferences, priorities
• Other otherwise-unobservables (w/ caution!)

Observations (e.g., Logfile & Clickstream Analyses, Task Performance)

• “Naturalistic” behavior
• Task performance
59 D.J.SchianoCHI2004 Tutorial
Taxonomy of Common Methods (after Nielsen, 1993)
• Heuristic Evaluation: Self-Report (of Experts)
• Performance Measures: Observed Behavior
• Verbal Protocols: Self-Report
• Observation: Observed Behavior
• Questionnaires: Self-Report
• Interviews: Self-Report
• Focus groups: Self-Report (in Groups)
• Logging actual use: Observed Behavior
• User feedback: Self-Report
60 D.J.SchianoCHI2004 Tutorial
• Methods (What you can do):
Ask < ----------- > Observe (Self-Report) (Behavior)
• Context (How--& where--you do it):
Naturalistic < ----------- > Controlled
• Data analyses & deliverables:
Qualitative < ----------- > Quantitative
Key Research Design Decisions
61 D.J.SchianoCHI2004 Tutorial
Self-Report Methods
62 D.J.SchianoCHI2004 Tutorial
Self-Report Methods
• Surveys, Questionnaires
• Interviews
• And More…
63 D.J.SchianoCHI2004 Tutorial
Surveys, Questionnaires
64 D.J.SchianoCHI2004 Tutorial
• What you can do: Ask (for Self-Report) < ----------- > Observe Behavior
• How (& where) you do it:
Naturalistic < ----------- > Controlled Context
• Data analyses & deliverables:
Qualitative < ----------- > Quantitative
Surveys
65 D.J.SchianoCHI2004 Tutorial
• What you can do: Ask (for Self-Report) < ----------- > Observe Behavior
• How (& where) you do it:
Naturalistic < ----------- > Controlled Context
• Data analyses & deliverables:
Qualitative < ----------- > Quantitative
Surveys
66 D.J.SchianoCHI2004 Tutorial
Surveys
What’s wrong with this picture? - - >
67 D.J.SchianoCHI2004 Tutorial
01. How often do you use <Feature X>?
02. How good a user of MS Word do you consider yourself?
03. Which version of Word do you usually use?
04. When you need to find out how to do something using < Feature X >, what do you usually do first?
a) Run through the menus
b) Run through the toolbars
c) Read the manual
d) Access the easy-to-use help index
…
10. How do you feel about < Feature X >?
a) It's one of the best things Microsoft ever did
b) I like it
c) Indifferent
d) I don't like it
e) I hate it so much I turned it off
11. Why?
Survey
68 D.J.SchianoCHI2004 Tutorial
Surveys
• + Broad demographics, background information
• + Specific questions, often quantitative
• + Fairly cheap & easy to conduct & analyze
• + Useful pre- & post- task comparisons
• - Limited, not natural context
• - Subject to biases, memory effects
69 D.J.SchianoCHI2004 Tutorial
Surveys: General Issues & Guidelines
• Be Brief, Clear, Specific, Consistent & Easy
• Question Design:
  • Specific > General
  • Forced-Choice > Agree/Disagree
  • Offer No Opinion or N/A & Comment options
  • Use rating scales for measuring intensity
• Ease of Using Surveys Comes At a Price:
  • Measurement errors & biases. Sampling issues.
  • Open-ended Qs harder to analyze, but more valid?
• ALWAYS Pilot Test & Iterate on Questions!
70 D.J.SchianoCHI2004 Tutorial
Example: LambdaMOO Survey
• Survey
• 1 Week Call upon Login; 581 Respondents
• ~ 30 Questions, Various Formats, Online
• A relatively quick and easy way to ask for a fairly large amount of self-report information from a large population. The survey was structured, and response formats were designed for ease of coding, analysis and comparison.
• Note: Quantitative survey results are often more valid, useful, and interesting when considered comparatively rather than absolutely.
71 D.J.SchianoCHI2004 Tutorial
• Ask (Self-Report) < ----------- > Observe Behavior
• Naturalistic < ----------- > Controlled Context
• Qualitative < ----------- > Quantitative Data
Example: LambdaMOO Survey
72 D.J.SchianoCHI2004 Tutorial
Example: Sociality in LambdaMOO?
• Survey Responses
73 D.J.SchianoCHI2004 Tutorial
User Surveys for Design I (From UsabilityNet)
Usefulness
74 D.J.SchianoCHI2004 Tutorial
User Surveys for Design II
75 D.J.SchianoCHI2004 Tutorial
User Surveys for Design III
76 D.J.SchianoCHI2004 Tutorial
Subjective Assessment Surveys I
Usability & Use
77 D.J.SchianoCHI2004 Tutorial
Subjective Assessment Surveys II
78 D.J.SchianoCHI2004 Tutorial
Subjective Assessment Surveys III
79 D.J.SchianoCHI2004 Tutorial
Interviews
80 D.J.SchianoCHI2004 Tutorial
• What you can do: Ask (for Self-Report) < ----------- > Observe Behavior
• How (& where) you do it:
Naturalistic < ----------- > Controlled Context
• Data analyses & deliverables:
Qualitative < ----------- > Quantitative
Interviews
81 D.J.SchianoCHI2004 Tutorial
• What you can do: Ask (for Self-Report) < ----------- > Observe Behavior
• How (& where) you do it:
Naturalistic < ----------- > Controlled Context
• Data analyses & deliverables:
Qualitative < ----------- > Quantitative
Interviews
82 D.J.SchianoCHI2004 Tutorial
Interviews
• + Less limited, more naturalistic
• + Greater depth of understanding
• - Difficult to collect & analyze results
• - Subject to biases, memory effects
83 D.J.SchianoCHI2004 Tutorial
Interviews: General Issues & Guidelines
• Context choices:
  • Conversational style or structured interview?
  • At your facility or theirs?
• ALWAYS use a guide sheet.
• Be prepared!
  • Consider logistics, how to record data & take notes, AND how to analyze & present results.
84 D.J.SchianoCHI2004 Tutorial
Example: Sociality in LambdaMOO?
• Previous reports emphasized the importance of sociality in LambdaMOO.
• “Great good place” >>> Club/pub analogy
• Our survey and logfile data was mixed:
• Survey suggested most time spent socializing.
• Logfile analysis showed most time spent alone!
• Modal # characters in rooms together ~ 1.
85 D.J.SchianoCHI2004 Tutorial
Q: Sociality in LambdaMOO?
• Interviews explained how BOTH were correct:
• Most people discussed spending most time: SOCIALIZING ALONE!
  • Using remote messaging
    – MOOmail, paging (presaging IM)
  • Typically (& increasingly) from “home”
    – For security & multi-tasking
86 D.J.SchianoCHI2004 Tutorial
Interviews I (From UsabilityNet)
Usefulness, Usability, Use
87 D.J.SchianoCHI2004 Tutorial
Interviews II
88 D.J.SchianoCHI2004 Tutorial
More Self-Report Methods…
• See UsabilityNet for a variety of other self-report—and hybrid—methods.
• If you’re interested—and if there’s time--we can discuss some of these methods later in the tutorial.
Observational Methods
90 D.J.SchianoCHI2004 Tutorial
Observational Methods
‘You can learn a lot from looking.’
Yogi Berra (attrib.)
91 D.J.SchianoCHI2004 Tutorial
• What you can do: Ask (for Self-Report) < ----------- > Observe Behavior
• How (& where) you do it:
Naturalistic < ----------- > Controlled Context
• Data analyses & deliverables:
Qualitative < ----------- > Quantitative
Observations
92 D.J.SchianoCHI2004 Tutorial
• What you can do: Ask (for Self-Report) < ----------- > Observe Behavior
• How (& where) you do it:
Naturalistic < ----------- > Controlled Context
• Data analyses & deliverables:
Qualitative < ----------- > Quantitative
Observations
93 D.J.SchianoCHI2004 Tutorial
Naturalistic Observations
• Direct
  • Outside observer “in context”, or
  • Participant Observer
• Indirect (Recorded)
  • Video, audio recordings
  • Clickstream and logfile data
---> “Notes & Quotes”
94 D.J.SchianoCHI2004 Tutorial
Ethnographic Naturalistic Observations
Not hypothesis driven

Fernando’s filing system (from Nardi)
95 D.J.SchianoCHI2004 Tutorial
Ethnographic Observations
• Typically, detailed and extended observation of behavior and artifacts in context. Often in conjunction with interviews & participant observation.
• Ethnographic core concepts:
  • Holism
  • Natives’ point(s) of view
  • Natural context
  • History
96 D.J.SchianoCHI2004 Tutorial
Classic Example: Ethnographic Research
• Schiano, Nardi, Gumbrecht & Swartz. (CHI2004 Short paper submission). “Blogging by the Rest of Us”.
97 D.J.SchianoCHI2004 Tutorial
• Ask (for Self-Report) < ----------- > Observe Behavior
• Naturalistic < ----------- > Controlled Context
• Qualitative < ----------- > Quantitative Data
Example: LambdaMOO Log Observations
98 D.J.SchianoCHI2004 Tutorial
LambdaMOO Logfile Studies
• Logged information
  • State of each object (who/where/when) in system
  • Recorded @ ~1-min intervals, 24 hr/day, ~2 wks
  • Data on ~4,000 users obtained
  • Twice, with ~6-month interval btwn studies
• Huge, rich database of “objective” information on use patterns in naturalistic context.
• Extensive data re-coding and analysis was required.
• The findings are very compelling.
99 D.J.SchianoCHI2004 Tutorial
Q: Addiction to LambdaMOO?
• Self-Report Use Estimates Very High
• Previous research papers, popular books & press:
• 80 hrs/wk “not uncommon”
• Our interview findings not inconsistent
• Our Logfile Observations Differed Greatly
• Mean ~ 8 hrs/wk (with multi-tasking & idle time!)
• Less than 5% users on for 20 or more hrs/wk
100
D.J.SchianoCHI2004 Tutorial
[Bar chart (repeated): mean hours/day logged on, by experience quartile (Low, MedLow, MedHigh, High) and modal presenting gender (females, males, others).]
Q: Addiction to LambdaMOO?
Mean=1.13 hrs/day; ~ 8 hrs/wk
101
D.J.SchianoCHI2004 Tutorial
• Convergent methods were very effective; we felt we gained a good understanding of the community.
• Our company decided NOT to invest in MUDs or other online communities… but by interpreting the findings narrowly, it missed a major opportunity: IM & chat!
• Schiano (1998). “ Lessons from LambdaMOO”. Presence.
LambdaMOO Project Epilog
102
D.J.SchianoCHI2004 Tutorial
User Observations (From UsabilityNet)
Usefulness, Use
103
D.J.SchianoCHI2004 Tutorial
User Observations II
104
D.J.SchianoCHI2004 Tutorial
Usability Tests as Controlled Observations
• Task performance is observed behavior
• Exploratory, assessment, validation & comparison tests (Rubin)
• Highly controlled task, lab context
• Aids direct comparisons, ease of analysis, reliability
• May hinder validity, generalizability
105
D.J.SchianoCHI2004 Tutorial
Classic Example: Human Factors Research
• Schiano, Ehrlich, Raharja & Sheridan (2000). “Face to Interface: Facial Affect in (Hu)Man and Machine”. CHI2000.
106
D.J.SchianoCHI2004 Tutorial
Brief Example: Online Web Usability Tests
Data: Speed, accuracy…& clickstreams!
107
D.J.SchianoCHI2004 Tutorial
Performance Testing I (From UsabilityNet)
Usability
108
D.J.SchianoCHI2004 Tutorial
Performance Testing II
Observational Methods: Issues & Guidelines
• Naturalistic v Controlled Context Issues
• Major Data Coding & Analysis Issues
  • Responsible data reduction required
• Validity, Generalizability Issues
  • Inferring users’ attention, intentions
  • Remember, correlation ≠ causation!
• Never Underestimate:
  • Privacy/Permissions Issues
  • Preparedness & Pragmatics
110
D.J.SchianoCHI2004 Tutorial
Observation Special Topic: Clickstreams
111
D.J.SchianoCHI2004 Tutorial
• Who is Visiting Your Site
  • How Many, How Long, How Often
• Paths Taken Through Your Pages
• Page Analyses
  • Frequency of Use, Time Spent on Each Page
• Entries & Exits
  • Where Are Users Coming From? Leaving?
• Success?
  • Transactions, Downloads, Info Viewed
Clickstream Information
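Most of the information above can be derived from a raw click log. A sketch, assuming a simplified (visitor, timestamp, page) event format that a real analytics tool would extract from server logs; the visitors, times, and pages are toy data:

```python
# Sketch (hypothetical log format): deriving basic clickstream metrics --
# per-page dwell time and the path each visitor took -- from a list of
# (visitor_id, timestamp_seconds, page) click events.
from collections import defaultdict

clicks = [
    ("v1", 0,   "/home"),
    ("v1", 30,  "/search"),
    ("v1", 90,  "/product"),
    ("v2", 10,  "/home"),
    ("v2", 400, "/product"),
]

by_visitor = defaultdict(list)
for vid, t, page in clicks:
    by_visitor[vid].append((t, page))

paths = {}                  # ordered path through the pages, per visitor
dwell = defaultdict(float)  # seconds spent per page (final page unknowable)
for vid, events in by_visitor.items():
    events.sort()
    paths[vid] = [page for _, page in events]
    for (t0, page), (t1, _) in zip(events, events[1:]):
        dwell[page] += t1 - t0   # time until the visitor's next click
```

Note the final page of each visit contributes no dwell time, one small instance of the missing-data caveats discussed two slides on.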
112
D.J.SchianoCHI2004 Tutorial
Example: Netraker Clickstream Data
113
D.J.SchianoCHI2004 Tutorial
Example: Netraker Clickstream Data
[Annotated clickstream map, with two columns of callouts:]

Features:
• Start page; Target page
• Page color: Red = long time, Orange = medium, Yellow = short
• Link color: Black = link, Red = back-button

Example Observations:
• Pogo-sticking with back-button
• Backtracking from target page - everyone resorted to search
• Majority of participants traveled this path
• Dead-end
• Participants used many navigation strategies and search terms
• Multiple paths to the same page
• Preferred strategy: browsing
114
D.J.SchianoCHI2004 Tutorial
• Who is Visiting Your Site
  • How Many, How Long, How Often
• Paths Taken Through Your Pages
• Page Analyses
  • Frequency of Use, Time Spent on Page
• Entries & Exits
  • Where Are Users Coming From? Leaving?
• Success?
  • Transactions, Downloads, Info Viewed
Web “Clickstream” Data: Useful Information
Rosenbloom, 2000?
115
D.J.SchianoCHI2004 Tutorial
Caveats Re. Clickstream Observations
• Much missing or potentially misleading data…
  • Same cookie, diff user? Diff address, same user?
  • Cached content (& back button) issues
  • Different behaviors of diff browsers, portals, ISPs
• Much is inferred…
  • Especially users’ intention, attention
• Validity & generalizability issues
  • Incredibly rich, readily available data
  • Best in convergence with other methods
• Netraker, Vividence, Enviz, etc.
116
D.J.SchianoCHI2004 Tutorial
Measures & More…
• Quantitative & “Quantified” Data
117
D.J.SchianoCHI2004 Tutorial
Quantitative Summary Statistics
• Data --> “How often/fast/much”
• Measures of Central Tendency
  • Mean (Average value)
  • Median (Middle value)
  • Mode (Most common value)
• Measures of Variability
  • Range (Interval btwn lowest & highest values)
  • Variance (Mean of the squared deviations from the mean)
  • Standard Deviation (Square root of variance)
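All of these measures are one-liners in Python's standard library; a sketch on a toy sample of task-completion times (population formulas used for variance and standard deviation):

```python
# Sketch: central-tendency and variability measures on toy task times (sec).
import statistics

times = [12, 15, 15, 18, 40]

mean   = statistics.mean(times)      # average value
median = statistics.median(times)    # middle value
mode   = statistics.mode(times)      # most common value

rng      = max(times) - min(times)       # interval btwn lowest & highest
variance = statistics.pvariance(times)   # mean squared deviation from mean
std_dev  = statistics.pstdev(times)      # square root of variance
```

Note how the one outlier (40) pulls the mean well above the median, which is why reporting more than one central-tendency measure is often worthwhile for skewed usage data.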
118
D.J.SchianoCHI2004 Tutorial
“Quantifying” Qualitative Data
• Recoding “why”, “how” data to assess “How often/fast/much”
• Often compelling, but can be difficult to do, and easily subject to bias.
• Use with caution!
119
D.J.SchianoCHI2004 Tutorial
Quantitative Analyses
Match measures & analyses to your questions… and what you want to communicate.
• Collapsing, transforming, summarizing data
• Graphical representations
• Significance tests
  • T-tests, ANOVAs
  • Correlations, etc.
• Etc.
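As one concrete instance of the list above, a Pearson correlation can be computed from scratch. This sketch uses toy numbers and no third-party packages; in practice a statistics package would also report the significance level:

```python
# Sketch: Pearson correlation between two toy measures (e.g.,
# self-reported vs. logged hours), computed from first principles.
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sdx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sdy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sdx * sdy)

reported = [10, 20, 30, 40]
logged   = [2, 4, 6, 8]       # perfectly proportional toy data
r = pearson_r(reported, logged)   # 1.0 for a perfect linear relation
```

And as the caveat on the next slides stresses: even r = 1.0 says nothing about causation.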
120
D.J.SchianoCHI2004 Tutorial
Brief Example: IM Logfile Study
[2000-1208-13:23:30] IDLE: Donna
[2000-1208-14:52:59] ACTIVE: Wally
[2000-1208-14:53:45] CLE: [Wally @ [Home, 650 123 4567]] moved focus into TIM with Donna (uid: 2)
[2000-1208-14:54:05] CLE: [Donna @ [972 999-1234]] moved focus into TIM with Wally (uid: 1)
[2000-1208-14:54:05] TEXT: [Wally @ [Home, 650 123 4567] => Donna] [Wow, Edith left me a message saying that the ABX thing went well and that they want "to go to the next step".]
[2000-1208-14:54:06] CLE: [Donna @ [972 999-1234]] moved focus out of TIM with Wally (uid: 1)
[2000-1208-14:54:13] TEXT: [Wally @ [Home, 650 123 4567] => Donna] [Interesting...]
[2000-1208-14:54:15] CLE: [Wally @ [Home, 650 123 4567]] moved focus out of TIM with Donna (uid: 2)
[2000-1208-14:59:44] IDLE: Wally
[2000-1208-15:17:48] CLE: [Donna @ [972 999-1234]] moved focus into TIM with Wally (uid: 1)
[2000-1208-15:17:55] CLE: [Donna @ [972 999-1234]] moved focus out of TIM with Wally (uid: 1)
[2000-1208-15:18:00] ACTIVE: Donna
[2000-1208-15:19:13] CLE: [Donna @ [972 999-1234]] moved focus into TIM with Wally (uid: 1)
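Log lines in this bracketed timestamp/event shape parse cleanly with a regular expression. A sketch; the field names in the returned dict are mine, not the study's:

```python
# Sketch: parsing one IM-log line of the form
#   [YYYY-MMDD-HH:MM:SS] EVENT: details...
# into its timestamp, event type, and remainder.
import re

LINE = re.compile(r"^\[(\d{4}-\d{4}-\d{2}:\d{2}:\d{2})\]\s+(\w+):\s*(.*)$")

def parse(line):
    """Return a dict of the line's fields, or None if it does not match."""
    m = LINE.match(line)
    if not m:
        return None
    stamp, event, rest = m.groups()
    return {"stamp": stamp, "event": event, "rest": rest}

rec = parse("[2000-1208-13:23:30] IDLE: Donna")
```

From here, tallying events per user or per event type is a straightforward grouping pass over the parsed records.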
121
D.J.SchianoCHI2004 Tutorial
IM Data Coding & Analysis Issues
• Units of Analysis?
  • “Chunk”: Chat activity separated by > 5 min idle time
  • Pairs: Hi v Lo IM Familiarity; Pair Interaction
• Measures
  • “Objective” Data: Who, Where, When, How Often/Fast/Much?
  • Interpretive Coding: How? Why?
  • Inter-Coder Reliability Issues
• Analyses
  • Primarily Quantitative
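The "chunk" unit described above (activity separated by more than five minutes of idle time) amounts to a one-pass grouping over sorted timestamps. A sketch with toy timestamps expressed in minutes since session start:

```python
# Sketch: segmenting a message stream into "chunks" -- runs of activity
# separated by more than IDLE_GAP_MIN minutes of idle time.
IDLE_GAP_MIN = 5

def chunk(timestamps, gap=IDLE_GAP_MIN):
    """Group sorted timestamps into chunks split at gaps > `gap` minutes."""
    chunks = []
    for t in timestamps:
        if chunks and t - chunks[-1][-1] <= gap:
            chunks[-1].append(t)   # continues the current chunk
        else:
            chunks.append([t])     # idle gap exceeded: start a new chunk
    return chunks

msgs = [0, 1, 3, 12, 13, 30]   # toy message times; gaps of 9 and 17 split
```

The five-minute threshold is itself a coding decision; it is worth checking that the resulting chunk counts are not overly sensitive to the exact cutoff chosen.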
122
D.J.SchianoCHI2004 Tutorial
CHUNKSTATS: Donna,Wally,2000-1208-2,22,1208-15:19:24,2,1,1,1.00,N/A,22.00,0,idle,84,N/A,1180
******
Opening Negotiation: N / Y
Trigger: I / O / S / R
Reference to Previous Conversation in Day: N / Y
Explicit Reference To Prior Communication In Other Medium: N / Y
Switch Media: A / E / N
Interaction Through Location Field: N / Y
Third Party Involvement: N / Y
Misunderstanding: N / Y
Explicit Interrupt: N / Y
Closing: N / Y
Content: W, S, R
Threads: N / Y
Topics: (list them)

Interpretive Coding: Human Judgment
“Objective” Measures
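Inter-coder reliability for yes/no judgments like those above is commonly reported as Cohen's kappa, agreement corrected for chance. A sketch with invented coder labels (the real study's coding data is not reproduced here):

```python
# Sketch: Cohen's kappa for two coders' categorical judgments
# on the same set of chunks (toy labels, pure Python).
from collections import Counter

def cohens_kappa(a, b):
    """Chance-corrected agreement between two equal-length label lists."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n   # raw agreement rate
    ca, cb = Counter(a), Counter(b)
    # Agreement expected by chance, from each coder's label frequencies
    expected = sum((ca[l] / n) * (cb[l] / n) for l in set(ca) | set(cb))
    return (observed - expected) / (1 - expected)

coder1 = ["Y", "Y", "N", "Y", "N", "N", "Y", "N"]
coder2 = ["Y", "Y", "N", "Y", "N", "Y", "Y", "N"]
kappa = cohens_kappa(coder1, coder2)   # 1.0 would mean perfect agreement
```

Values above roughly 0.7 are conventionally treated as acceptable agreement for interpretive codes, though the threshold is a judgment call.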
123
D.J.SchianoCHI2004 Tutorial
Initial Results
124
D.J.SchianoCHI2004 Tutorial
Initial Results -->Further Coding, Analyses
125
D.J.SchianoCHI2004 Tutorial
For Further Information…
• Schiano, Ehrlich, Raharja & Sheridan (2000). “Face to Interface: Facial Affect in (Hu)Man and Machine”. CHI2000.
126
D.J.SchianoCHI2004 Tutorial
Match Your Measures to Your Questions
• Qualitative (“Why?” “How?”)
• Quantitative (“How often/fast/much?”)
• Convergent (Bits of both)
Trying to translate general hypotheses into specific quantitative questions can be very useful. Not all issues lend themselves to quantification, but for those that do, the precision and direct comparability that quantification provides can prove highly valuable.
127
D.J.SchianoCHI2004 Tutorial
But Let’s be Clear…
• Quantification is not a “magic bullet”…even with so-called “objective” measures. Human judgment—your judgment--is always required in conducting research, analyzing & interpreting data…and communicating your findings!
• Numbers can be impressive…or intimidating, depending on your audience, and they are easily misused…even unintentionally. Think carefully about what you’re trying to find out in your research, why, and for whom. Remember that the apparent precision of numerical results may be misleading. In short, be responsible!
128
D.J.SchianoCHI2004 Tutorial
Measures & More…
• Qualitative Data
129
D.J.SchianoCHI2004 Tutorial
Qualitative Measures & More…
• Notes & Quotes
• Photos, Video clips
• Narratives, Summaries
• Structured Products to Inform Design
  • Personas/User Profiles
  • Use Cases/Scenarios of Use
• User Advocacy in Design
130
D.J.SchianoCHI2004 Tutorial
Personas/ User Profiles
131
D.J.SchianoCHI2004 Tutorial
Example: Personas/User Profiles I
http://ccm.redhat.com/user-centered/personas.html
132
D.J.SchianoCHI2004 Tutorial
Personas/User Profiles II
133
D.J.SchianoCHI2004 Tutorial
Personas/User Profiles III
134
D.J.SchianoCHI2004 Tutorial
Use Cases/ Scenarios of Use
135
D.J.SchianoCHI2004 Tutorial
Example: Use Cases/ Scenarios of Use
136
D.J.SchianoCHI2004 Tutorial
More On Communicating Results
137
D.J.SchianoCHI2004 Tutorial
More On Communicating Results
‘There are lies, damn lies and statistics.’
Mark Twain
138
D.J.SchianoCHI2004 Tutorial
Medium as Message…
Your products are communicative, persuasive.
• Presentations
• Papers/reports
• Structured products for design (e.g., personas, use cases)
• User advocacy in design meetings

You are making a case with every selection, summary or presentation of results…
• Notes & Quotes
• Photos, Video clips
• Narratives, summaries
• Statistical summaries, graphs & tables
139
D.J.SchianoCHI2004 Tutorial
Bottom Line: Be Responsible, Credible…Useful!
• In Designing & Conducting Research
  • As we’ve discussed at length
• In Reporting Results
• Refer to research principles in summarizing and presenting results
• Provide access to methods & raw data
• In Making Recommendations
  • Be judicious and pragmatic
  • Prioritize by robustness and impact.
140
D.J.SchianoCHI2004 Tutorial
User-Centered Product Research & Design
Usefulness --> Use <----------------------> Usability
From UsabilityNet
141
D.J.SchianoCHI2004 Tutorial
Special Discussion Topics
142
D.J.SchianoCHI2004 Tutorial
User Research Design Clinic
143
D.J.SchianoCHI2004 Tutorial
Appendix