Delivering Results: How Do You Report User Research Findings?
DESCRIPTION
The long, textual written report is dead, isn't it? So how do you deliver your findings to your clients? Is it PowerPoint? An email? A spreadsheet? Post-it notes? And what do you include? Positive findings? Screenshots with callouts? Just issues? Or recommendations as well? Are they prioritized? If you ask our panelists, some of us have developed templates that we use and modify for each research activity, and others change the deliverable based on the activity and client.
TRANSCRIPT
May 7, 2012
Delivering Results: How Do You Report User Research Findings?
Jen McGinn, Eva Kaniasty, Dharmesh Mistry, Kyle Soucy, Carolyn Snyder, Steve Krug, Bob Thomas
Panelists
If you ask our panelists, some of us have developed templates that we use and modify for each research activity, and others change the deliverable based on the activity and client. Each panelist will spend 3-5 minutes showing you their typical deliverables, and then we'll open the floor for audience Q&A.
Jen McGinn, Principal Usability Engineer, Oracle
Eva Kaniasty, Founding Principal, RedPill UX
Dharmesh Mistry, Usability Specialist, Acquia
Kyle Soucy, Founding Principal, Usable Interface
Carolyn Snyder, Founding Principal, Snyder Consulting
Steve Krug, Founding Principal, Advanced Common Sense
Jen McGinn Principal Usability Engineer, Oracle
Overview
I've worked at hardware and software companies, conducting research on phones, Macs, and PCs.
I present my research results in one of two ways, neither of which is a long, written report in Word:
RITE-Krug study: bullet points at the bottom of a wiki page
Traditional study: slides in a 60-minute meeting (generally remote, via web conference)
I'm going to spend 10 seconds showing you a wiki page, and 2 minutes walking you through the structure of one of my PowerPoint presentations.
Then I'll summarize the take-aways.
What I call RITE-Krug
Testing for Agile
3 or 4 participants
Prototype will likely change between participants
Stakeholders attend every session and a debrief meeting in a single day
After the debrief meeting, a list of items that the designers will change is posted on the wiki page
Executive Summary
In [when?], the [what product?] was tested by [number and type of participants] in [method type] to evaluate the ease of use of several features including [features or use cases].
High-level findings included [usually a total of 3 to 4 bullets]:
• [1-2 biggest positive findings]
• [1-2 biggest positive findings]
• [2 or 3 biggest usability issues]
• [2 or 3 biggest usability issues]
This presentation covers all of the findings and subsequent recommendations.
Agenda
Goals, Tasks, Participants, Findings, Recommendations, Next Steps
Goals
Evaluate the usability of the following features of the U-Haul.com website:
Are users confused about how to price a rental? A storage unit?
How do users react to the insurance options? Do they understand the coverage?
How do users feel about the presentation of items for purchase or for rent?
How effective is the shopping cart content? Are users confused by when they need to pay for items?
Do users value the star ratings? U-Haul brand?
How do users feel about the targeted FAQ and search result pages?
Does our online documentation help prevent calls to the service center? Can they determine how to reach out to the U-Haul vendor nearest them?
Tasks
1. Get the price of a 1-way move across country
2. Find a specific piece of information in the FAQ
3. Determine the size and cost of a storage unit needed to hold specific items
4. Find the phone number of a U-Haul location
5. Book the truck (and insurance), adding rental items and purchased items
6. Determine insurance coverage
7. Find the U-Haul location nearest you
Participants

Participant ID | Gender | Age | Occupation | Web-savvy
U1 | Male | 24 | Missionary | Average
U2 | Male | 52 | Small business manager | Average
U3 | Female | 62 | Retired; formerly television news producer, then licensed paralegal | Average
U4 | Female | 36 | Housewife | Average
U5 | Male | 31 | Sales and marketing | Average
Findings
Choosing a Truck
Another issue
One participant suggested this fix
2 participants had this issue and did 'x' to work around it
Goals and Questions Revisited
[All the same as before]
Are users confused about how to price a rental?
A storage unit?
How do users react to the insurance options? Do they understand the coverage?
How do users feel about the presentation of items for purchase or for rent?
How effective is the shopping cart content? Are users confused by when they need to pay for items?
Do users value the star ratings? U-Haul brand?
How do users feel about the targeted FAQ and search result pages?
Does our online documentation help prevent calls to the service center? Can they determine how to reach out to the U-Haul vendor nearest them?
Positive Findings
[These always come first]
All participants easily found the links to the FAQs and had no trouble finding the answer to the license question under FAQs.
All participants made use of the maps when comparing options.
All participants did scroll down to compare prices, locations, and reviews.
4 participants valued the presence of the [higher] star ratings.
2 participants valued the U-Haul locations more than the off-brand vendors.
2 participants were pleased that the truck rental page "retained her information": the addresses and dates.
2 participants appreciated the visuals of the items inside the storage units and the graphic of the person shown in the small unit icon.
2 participants easily added the dolly, blankets, and boxes during the truck rental task flow.
Recommendations

Priority | Description | Recommendation | Location
High | Participants don't understand what the purchased insurance actually covers | Reformat coverage and exclusions into bulleted lists; don't use legal jargon | Damage coverage
High | Participants have a very hard time estimating the storage unit size that would meet their needs | Provide more user assistance | Self Storage location details page
Medium | Up-sell process for items to rent or purchase is confusing | Put the purchased items into another page in the flow, and make it clearer that users can opt out | Additional rental items; Shopping cart
Medium | Participants are concerned that the site is incorrectly calculating the mileage and therefore overcharging | Add a link to display the map, so they can check it in place | Select your preferred pickup location
Low | Participants were not sure what location the giant thumbtack/pin was (address or zip code) or how far away the locations were | Display the distance "from" the specified location, like the Self-storage results page | Select your preferred pickup location; Location
Next Steps
Work with [which stakeholders or teams] to prioritize changes
Work with [stakeholders or teams] to design alternatives
Validate that the new designs address the issues with users
Summary
Tell them what you're going to tell them:
Executive summary, Agenda, Goals/Questions
Tell them:
Tasks & participants (sometimes methodology), Animated slides for progressive disclosure, Screen shots annotated with findings
Tell them what you told them:
Review goals of the research and the questions they were intended to answer, Positive findings (go slowly here), Prioritized opportunities for improvement
Eva Kaniasty Founding Principal, RedPill UX
Report Formats
PPT: visually engaging but real-estate constrained (and will force you to be brief). Formatting can be time-consuming.
MS Word/Narrative: more room for context; quick, but can appear dry and boring.
3rd option: No report.
Deciding Factors
Time/Budget
(Mode of) Presentation of Results
Company Culture / Industry
Stakeholder Involvement
Deliverable Shelf Life
Reporting Findings (1)
Reporting Findings (2)
Dharmesh Mistry Usability Specialist, Acquia
Content Management System Open Source Software Community
Products built on Drupal Open Source/ Proprietary Start-up
Deciding Factors
Stakeholders: Thousands of stakeholders (new and existing)
Development cycle: ?
Turnaround time: Weeks-Months
Credibility: Mix / Reputation
Tracking issues: Low-Medium
Presenting: Twitter, conferences, blog posts, Drupal.org
Provide recommendations: No, never!
Blog Post Drupal.org Conferences/ Videos
Supporting information
Detailed Information
Main Report
http://drupal.org/node/1399056 http://drupal.org/node/1399258 http://drupal.org/node/1289476
Tracking
http://www.drupalusability.org/
Deciding Factors
Stakeholders: 3-5
Development cycle: Agile (3-week sprints)
Turnaround time: Hours / Days / Weeks
Credibility: Good
Tracking issues: High-Very High
Presenting: Conference calls
Provide recommendations: Sometimes
Email Reports Google Doc Reports
Spreadsheet Reports
Spreadsheet Reports
Kyle Soucy Founding Principal, Usable Interface
@kylesoucy www.usableinterface.com
Formal Usability Testing/Research Report
Findings categorized by screens or pages
3-4 Positive Findings
3-4 Negative Findings
When, What, Who, Where, and Why Statement
Findings: Severity Ratings
Findings
Major Usability Problem
Findings: Recommendations
Highlight Video
Observer Debrief Notes
Carolyn Snyder Founding Principal, Snyder Consulting • There is no one “best” format • Do what works for the client, culture, circumstances • Steal good ideas, drop losers
Formal Text Report: "I'm not dead yet!"
Finding
Severity rating
Explanation of issue
Supporting observations from notes
Recommendations
PowerPoint, Screen Shots with Callouts
Most people read this text; everyone drilled into [noun]
People understood the stacked bar graphs.
Amount isn’t explicit. The user must do the math.
Can’t explore [action]. People knew it was important.
People wanted concrete, prioritized advice.
Not clear why it showed 2 variations of the graph
People understood the purpose
Interest in these links
Important sentence buried in paragraph
Ambiguous
(Imagine a screen shot here)
PowerPoint with “report” in Notes Field
Sometimes the best report is… …no report
Can you do something more useful instead?
Steve Krug Usability Consultant, Advanced Common Sense
Expert Reviews – What I do
No report, no slides. Live remote walkthrough.
Gave up writing the Big Honking Report years ago:
I hate writing
I'm inherently lazy
Only real purpose seemed to be to justify cost
Mostly: I could get away with it (I have a book)
I tell clients up front: I'll report my observations in a GoToMeeting session
Encourage them to have all interested parties attend, question, argue
Option: Written report, for double the price
Expert Reviews – What I do
90-120 minute session
Strive for best audio (VOIP)
I walk through the site/app, doing a narrative of observed issues (cf. Carol Barnum's session on storytelling)
Limited to only the most serious problems (n < 10)
My recommendations for fixing them
Encourage them to get objections out of their system while I'm there to answer
Major weakness of written report: no dialogue
Record the session for their use later
Expert Reviews – What I do
I don't accentuate the positive
Feels artificial, patronizing to me
We're all grownups on this bus
I tend to be very encouraging anyway
"Getting it all right is very hard." "Everybody has these kinds of issues." "You can fix them."
Usability Tests – What I recommend
I don't do them anymore; I teach other people to do them
Usability Tests – What I recommend
Forget the report: GET THEM TO COME TO THE TESTS!
Most crucial success factor
Seeing is believing: watching makes converts
Many other good effects flow from watching as a group
Do whatever it takes to get them to come
Keep it brief (3 participants)
Keep it convenient (on-site)
Regular schedule ("A morning a month")
THE BEST SNACKS MONEY CAN BUY!
Usability Tests – What I recommend
"But I can't get them to come…"
Please stop your incessant whining
Try harder
OK, yes, you can create a report
Two-page (max) bullet-list email; 30 minutes to write
What we tested (site, prototype, etc.) with link to it
Tasks they did
Top three problems observed
Solutions to these problems, which will be implemented before next month's tests
(Optional) Link to recordings
Extra Credit
Read Recommendations on Recommendations
Rolf Molich, Kasper Hornbæk, Steve Krug, Josephine Scott and Jeff Johnson
http://www.dialogdesign.dk/tekster/Recommendations_on_Recommendations.pdf
Get Jen McGinn to share her report from CUE 9
Best in show, out of 19 seasoned UX pros
Try to figure out her secret sauce and imitate it
Narrative on top of screenshots
"N participants ________."
Participant quotes
Excellent, terse writing
Key observations only
Questions
1. Do you change your delivery of usability results depending on your role as an internal/external consultant or as a company employee?
2. How important are positive vs. negative findings?
3. How have your reports changed over the years? Is there anything you do differently than when you first started writing them?
4. How do you categorize the findings in your reports? For example, do you categorize them by the page/screen, by the step in a certain process (e.g., the checkout process), or by the task?
5. Lean UX is a trending topic. Have you had experience with Lean UX or Agile methods, and had to change the way you conduct research and deliver results?
6. What guidelines do you follow when writing recommendations or proposed solutions to problems?
7. Do you decide ahead of time how long a report should be and make an effort to keep it that length? If so, what dictates the length?
8. If you think a report is too long and needs to be trimmed down, how do you decide what to cut out?
9. What part of a report is the hardest for you to write?