TRANSCRIPT

Kyle Banerjee and Laura Zeigen
PORTALS Workshop
May 30, 2008
Overview
- What makes a catalog “next gen”?
- How the same interface can be achieved different ways
- How do we combine different results from different resources?
- Overview of next gen catalog options
- What constitutes good user interface in catalogs?
- Needs assessment for the catalog
- Usability testing uses
- How to conduct usability testing in the catalog
What makes a catalog “next gen”?
- Improved search and browse experience
- Designed to potentially access more resources than before
- Allows management of diverse collections as a single unified collection
- Seamless integration of products maintained by many vendors and data providers
- More than just the public front of the ILS (inventory control for physical materials)
- Should be more of a discovery tool, able to tell you more about the resources you are gathering together than current ILS systems do
What next gen catalogs have in common
- Are in a stage of rapid development
- Are more user friendly for casual users than the traditional catalog
- Allow more flexible exploration of information using facets, collocation of related works, and other methods
Encore
Ann Arbor (Drupal over III)
Aquabrowser
Scriblio (WordPress over III)
WorldCat Local
MAJAX
Endeca
Evergreen
VuFind
Primo
Behind the magic
- Significant lag time
- Missing facets
- Excessive server load / only useful in low demand environments
- No access to local data
- Very limited advanced search capabilities
- Disappointing ability to deal with distributed collections
So…
- If next gen catalogs are so great, why isn’t everyone doing them yet?
- Problem is that they are closed systems
- Time, cost, and staff expertise are needed
- Still very new in development
How does data get in and out?
- Old data = bad. Realtime availability of data is critical, but is not always possible given the system.
- Best method depends on the system:
  - Standards-based interface
  - API
  - URL
  - Direct database calls
  - Screen scraping – automating interfaces intended for humans
- Sometimes, manual processes are needed
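The screen-scraping option above can be sketched in plain Python: parse the HTML an OPAC returns and pull out fields the system exposes no API for. The markup below is invented for illustration; a real catalog’s HTML (and its terms of service) will differ.

```python
from html.parser import HTMLParser

# Invented sample of the kind of HTML an OPAC results page might
# return; a real system's markup will differ.
SAMPLE_PAGE = """
<table>
  <tr><td class="title">History by Hollywood</td>
      <td class="status">AVAILABLE</td></tr>
  <tr><td class="title">Kansas Law Review</td>
      <td class="status">CHECKED OUT</td></tr>
</table>
"""

class HoldingsParser(HTMLParser):
    """Collect [title, status] pairs from cells tagged with the
    'title' and 'status' classes."""
    def __init__(self):
        super().__init__()
        self.records = []
        self._field = None

    def handle_starttag(self, tag, attrs):
        css_class = dict(attrs).get("class", "")
        if tag == "td" and css_class in ("title", "status"):
            self._field = css_class

    def handle_data(self, data):
        text = data.strip()
        if self._field and text:
            if self._field == "title":
                self.records.append([text, None])
            else:
                self.records[-1][1] = text
            self._field = None

parser = HoldingsParser()
parser.feed(SAMPLE_PAGE)
parser.close()
print(parser.records)
```

Because scraped markup changes without notice whenever the vendor updates the interface, this approach is the most fragile of the methods listed, which is why it sits last.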
Product | Data source | Staff time required
Aquabrowser | Extraction from Summit | For data extraction and initial setup. Significant configuration and setup. Modest systems administration afterwards.
Encore | Summit | Minimal.
Endeca | Extraction from Summit | Extensive programming needed. Significant configuration and setup. Modest systems administration afterwards.
Evergreen | Replacement for III | Extensive setup and systems administration.
Primo | Extraction from Summit | No left-hand title match. Significant configuration and setup. Modest systems and Oracle administration afterwards.
WorldCat Local | WorldCat holdings | Minimal.
http://orbiscascade.org/index/next-generation-catalog-comparisons-public
Implications of choosing one path over another
- Lots of time, effort, money, and staff expertise involved no matter which option is chosen
- What types of conditions (personnel, systems) would make it easier or harder to implement?
- Currently too many variables – we are hacking around the edges.
Alliance project
- Can grow or shrink the number of libraries
- Bringing together resources (articles, etc.) that go beyond a library’s holdings
- Users are usually searching the catalog for something either specific or general; where other types of resources would serve the search better, those should be offered.
What is the role of wikis?
- Why use a wiki
- What would be involved in integrating one
- Scriblio
- Drupal
Interface design (part 1)
- What constitutes good user interface design in catalogs? How can you tell?
Discuss!
Interface design (part 2)
- Feedback to entire group
- Do we have consensus?
- What are we using as our baseline for measurement and comparison?
- Are we even talking about the same kind of site (academic, public, etc.)?
Interface design (part 3)
- What constitutes good interface design completely depends on who your users are and what their needs are!
- So, how do you know what your users’ needs are?
  - Usability testing can be one way to help gather this information
  - Analytics is another
Why conduct usability testing?
- It can take a lot of staff time
- It is almost always a challenge to recruit subjects*
- Sometimes the results might not be what you want to hear, or might be things you actually cannot change at this time

So… why do it?

* Although, according to Jakob Nielsen (http://www.useit.com/alertbox/20000319.html), you only need 5-8 people to test adequately.
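The Nielsen article cited above models the share of usability problems found with n test users as 1 - (1 - L)^n, where L is the share a single user uncovers (about 31% in Nielsen’s data; your own value may differ). A quick sketch of why 5-8 subjects go a long way:

```python
# Nielsen & Landauer's model: the proportion of usability problems
# found with n test users is 1 - (1 - L)**n, where L is the
# proportion a single user uncovers (~0.31 in Nielsen's data).
def problems_found(n, L=0.31):
    return 1 - (1 - L) ** n

for n in (1, 3, 5, 8, 15):
    print(f"{n:2d} users -> {problems_found(n):.0%} of problems found")
```

With five users the model predicts roughly 84-85% of problems found, and the curve flattens quickly after that, which is the basis for the 5-8 figure.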
Why conduct usability testing?
- Allows you to see first-hand how people are searching and uncovers searching behaviors (what people do vs. what they say)
- Gives you more data so you can design your interfaces and systems to better meet user searching needs
- Allows you to understand what users need vs. what is just your opinion (or that of others in your group)
- Marketing opportunity!
What usability testing does not measure
- Does not show how the OPAC fits into the overall search strategy
- Does not measure every situation for every individual, or users’ changing needs over time
  - Hence the need for not just doing usability testing once!
- May not measure the tasks users more commonly do
  - Contextual interviews are best for this
- Does not provide statistical data
  - This is where analytics software (e.g., Google Analytics code in the toplogo file for III systems) comes in!
- The “squishiness” of this kind of data can be offset with video, or by having a manager come observe
What usability testing does measure
- Does allow you to observe people’s actual searching behavior (vs. what people say about your site / their opinions about what they do)
- Does allow you to assess users’ level of satisfaction with the site
- Does allow you to develop rapport with your users and discover unforeseen issues
How usability testing can help with needs assessment and marketing
- The other side of the equation with analytics
- Gives you the “squishy”, unforeseen information that cannot show up in surveys or through analytics software
- You are showing your patrons you care about their opinions and your level of service to them
- Creates more opportunities for 1-1 “just in time” learning sessions
- Raises awareness of tools/services you can provide
- Raises awareness of the library at the institution
What kinds of needs assessment could/should you be doing?
- Depends on what you are trying to find out
- Analytics (quantitative)
- Usability testing (qualitative)
  - Formal
  - Informal
    - “Café test” – sit in a coffee shop with a laptop and offer to buy coffee for anyone willing to tell you what they think / go through some tasks.
Gore, P. and Sandra G. Hirsh. “Planning Your Way to a More Usable Web Site.” Online (May/June 2003) p. 20-27.
Genuis, S. “Web Site Usability Testing: A Critical Tool for Libraries.” Feliciter (Canadian Library Association) no. 4 (2004) p. 161-165.
Assessment tests (part 1)

Technique | What it measures | What you can learn | Limitations
Focus groups | User perceptions | How users view and experience your site | One person can dominate
Survey | User perceptions | How satisfied and useful users say your site is | Not tied to actual user behavior
Usability heuristics | Site analysis | How your site measures up against accepted guidelines | Not tied to actual user behavior
Prototypes (incl. paper mockups) | User reactions to prototypes | How early designs will be received by users | Doesn’t show how people interact with the existing site
Usability walk-throughs | User reactions to prototypes | How typical tasks in a more developed prototype will work | May be hard to predict actual user behavior on the completed site
Assessment tests (part 2)

Technique | What it measures | What you can learn | Limitations
Card sorting | User conceptualizations | How people think labeling and grouping should be done | Challenge to map users’ mental model to the resource
Formal/informal usability testing (contextual interviews) | Actual user behavior | How real users react while performing tasks; provides insights into design | Limited in the number of people that can participate
Server log analysis | Actual user behavior | How many people are coming to your web site; which pages are requested most | Unable to determine what motivated user actions
• All these techniques are discussed in depth at www.usability.gov:
  • http://www.usability.gov/methods/
  • http://www.usability.gov/methods/contextual.html
• Information is also on Wikipedia:
  • http://en.wikipedia.org/wiki/Comparison_of_usability_evaluation_methods
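Server log analysis, the last technique in the table above, is easy to start on with nothing but the web server’s access log. A minimal sketch in Python, counting which pages are requested most; the log lines and paths below are invented, and a real OPAC or Apache log will differ in detail:

```python
import re
from collections import Counter

# Invented Apache common-log-format lines standing in for a real
# access log read from disk.
LOG_LINES = [
    '10.0.0.1 - - [30/May/2008:10:00:01 -0700] "GET /search HTTP/1.1" 200 5120',
    '10.0.0.2 - - [30/May/2008:10:00:05 -0700] "GET /record/b1234 HTTP/1.1" 200 2048',
    '10.0.0.1 - - [30/May/2008:10:01:12 -0700] "GET /search HTTP/1.1" 200 4096',
]

# Pull the request path out of the quoted "METHOD path HTTP/x.y" field.
REQUEST_RE = re.compile(r'"[A-Z]+ (\S+) HTTP/')

def top_pages(lines):
    """Count how many times each path was requested."""
    counts = Counter()
    for line in lines:
        match = REQUEST_RE.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

print(top_pages(LOG_LINES).most_common())
# -> [('/search', 2), ('/record/b1234', 1)]
```

As the table notes, counts like these show *what* was requested but never *why*, which is exactly the gap usability testing fills.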
The challenge of the catalog
- There is not nearly as much information about how to usability test OPACs as there is for web sites
- Don’t stress out! The process is similar
- What is your goal / what do you want to find out with testing?
- Testing where users place the OPAC in how they do searches would be a different set of tests than specifically testing the interface of the OPAC pages/bib record display, etc.
Which product you choose
- Does it allow as much flexibility as possible in the future?
User understanding of catalogs
- Catalogs are one small part of the user experience.
- Usability testing the catalog will give you information about things specific to that interface.
- Usability testing the catalog (or even the web site) will not give you nearly complete information about your users’ total search strategies.
- Finding out more about their overall search strategies would involve “contextual interviews” and other feedback done on an ongoing basis.
Mental models and cognition
- Be emotionally prepared to accept the feedback your users will give.
- Google, Google Scholar, and Amazon.com have set user expectations about how to search for information.
- Catalogs are powerful, but their interfaces are very different from these resources (and often uncontrollable by us).
- Set expectations in usability testing: get and welcome feedback, but don’t promise things you can’t deliver!
- Users think they are searching well, but they are confusing things and in the process missing a lot.
How to do usability testing
1. Determine GOALS
2. Determine METHODS
3. Design the TASKS/QUESTIONS
4. Test the test!
5. Subject recruitment
6. Actual testing (give more time!)
7. Make any necessary follow-ups with users
8. Write-up of findings
9. Make recommendations
10. Implement recommendations
11. Follow-up testing
Determine goals
- Do our patrons understand how to do basic functions using our main page and OPAC?
- Do the layout and information architecture of our main page and OPAC make sense to our patrons?
What would be your goals?
Discuss!
Determine methods
- How long has it been since your last usability testing?
- Which type of testing did you do?
- What are you trying to find out this time?
- Which usability testing methodologies will best help you get the information you need in this round?
For the purposes of class today, assume we have already decided on 1-1 usability testing as our method based on the goals we have determined.
Determining tasks/questions
- May morph over time
- Be clear about where to start (i.e., the library’s catalog main page – http://libcatexample.edu)
- Once you have the tasks together, test them out on library staff or students. Rework as needed to clarify.
- It is helpful to have the rationale next to each question (see Novotny article) for internal review (i.e., the user does not see it).
Cobus, L. et al. “How Twenty-Eight Users Helped Redesign an Academic Library Web Site: A Usability Study.” Reference & User Services Quarterly v. 44 no. 3 (Spring 2005) p. 232-246.
Novotny, E. “I Don’t Think I Click: A Protocol Analysis Study of Use of a Library Online Catalog in the Internet Age.” College & Research Libraries v. 65 no. 6 (Nov. 2004) p. 525-537.
Example questions
1. A friend recommends that you read a book called History by Hollywood. Does the Institution Library have this book?
Question rationale: To examine how users search for known items and whether they could locate a book when they knew the title. The title was chosen so that the broadest keyword search resulted in fewer than fifty matches.
2. Please use the library catalog to ask for a copy of the book History by Hollywood.
Question rationale: This task was designed to see if users could determine how to use the request button to request materials.
Example questions
3. The article you need is in the journal Kansas Law Review, volume 49, issue 5, June 2001. Is this issue of the journal in the Institution Library? If so, where is it?
Question rationale: A unique title was chosen so that users retrieved only one match regardless of which search type was selected (keyword or browse). The research team was interested in determining how users interpreted a serials record.
4. Another article you need is in a journal called Civil Engineering. You need volume 70, December 2000. Is a copy for this year available on the main campus? If so, which library can it be found in?
Question rationale: To see how users navigated a potentially complex search. The default keyword search results in over 2,000 matches. The record display includes multiple holdings, with the main campus copy at the bottom of a very long record.
Figuring out whom to test
- Concerns about whether you are testing the “right” subjects
- Per Steve Krug, grabbing someone in the hall is better than no feedback at all!
- 5-8 subjects will give you information about 80% of the pain points/things to fix in your interface
- Consider anonymizing results – subjects will be more willing to participate. Tell subjects you are doing this.
- If you need “proof” of users’ experiences and opinions, consider recording
Screen capture software
- Pros and cons to using it
- Decide what your goals are first, then use screen capture software as appropriate
- Camtasia (TechSmith)
  - http://www.techsmith.com/camtasia.asp
  - $179 each (educational pricing: http://www.techsmith.com/purchase/education.asp)
- Captivate (Adobe)
  - http://www.adobe.com/products/captivate/
  - $199 each (educational pricing: http://www.adobe.com/products/captivate/productinfo/faq/)

Goodwin, S. “Using screen capture software for web site usability and redesign buy-in.” Library Hi Tech v. 23 no. 4 (2005) p. 610-621.
Subject recruitment and testing
- Online surveys can be helpful for recruitment
  - But cannot give you all the information you need
  - www.surveymonkey.com
- Other recruitment efforts
  - Campus newsletter (email and web site)
  - Flyers / table-tent flyers / bookmarks at the circ desk
  - New employee/student orientations / campus contacts
- Give participants an incentive/thanks for their time
- Be clear about what you are asking them to do / the time commitment / that they can stop at any time
- Possible human subjects/IRB paperwork
Actual testing!
- Be willing to go to where your users are
  - Note: This is easier if you are not carrying video equipment or screen capture software!
- Bring/provide any necessary paperwork
  - Form explaining the testing – you are testing the site, not them! People get nervous
  - Form with the tasks/questions (if they want a copy)
  - Your “script” with preparatory comments
  - Your form for filling out their responses
Exercise! (part 1)
1. Get into groups of 3 and designate a “1”, a “2”, and a “3”
2. You will have the following roles:
   - Facilitator – the person conducting the test, briefing the user, and asking the questions
   - User/subject – the person going through the tasks and thinking out loud about their process while they do it
   - Notetaker – the person taking notes on observations of user comments and behavior as they go through the tasks
3. Go through the questions: the facilitator asks the questions, the user works through the tasks while thinking out loud, and the notetaker takes notes.
Exercise! (part 2)
Start from http://consort.library.denison.edu/search
1. Does the Wooster Main Library have any copies of Harry Potter and the Half-Blood Prince?
2. Is the Journal of Academic Librarianship (JAL) available electronically for the year 2007? Is it available in print for the year 2007?
3. If you wanted the JAL in print for 2007 and none of the CONSORT libraries had it, where would you go?
Exercise debrief
- What did you find most challenging as a tester? Facilitator? Notetaker?
- What would have helped with your challenges in that role?
- What does that tell you about how to do (or not do) testing in your environment?
Discuss!
Recommendations for write-up
- Have an “executive summary” and a more in-depth write-up with all the data/video captures, etc.
- Use graphics (pie charts, etc.) to emphasize results (e.g., “what percentage of users were not able to find this function or link?”)
- Use screenshots to emphasize trouble spots that testing identified
- Include short- and long-term recommendations
  - Some things will be quick fixes
  - Some things you cannot control in the ILS
Recommendations for testing
- Make usability testing (aka “feedback from users”) part of what you do on an ongoing basis for your OPAC and web site.
- Utilize orientations, classes, and liaison programs to promote a continual feedback loop.
- Go out to users whenever needed. They won’t always tell you what is wrong through a “contact us” link on the site.
- Make sure patrons understand they are a vital piece of the process in constructing the catalog and site.
Contact information

Kyle Banerjee, MLIS
Digital Services Program Manager
Orbis Cascade Alliance
[email protected] / 541.359.9599

Laura Zeigen, MA, MLIS
Systems & Web Development Librarian
Oregon Health & Science University
[email protected] / 503-494-0505
http://www.ohsu.edu/library/staff/zeigenl