Group project for INFO 608 – Spring 2011 Notes

This is a three-week assignment in which the students work on an evaluation of the IPL website. The class materials are delivered through a wiki in Blackboard and reinforced by weekly emails. I also respond to individual student emails and questions about the assignment, and send out digests of these questions and my answers. In this document, the email instructions are presented first, and the wiki pages are presented second.



Week 8 Digital libraries and the IPL

Hello everyone, I'm posting the week 8 instructions slightly ahead of time, so that you can get started on the final project work. For the final project we will be using the Internet Public Library (IPL) as a testbed for investigating and applying some of the main concepts of the course. The site has been tested in previous HCI classes, but the IPL project team always welcomes new evaluations, and will use your reports to adjust the current design and to help inform the next round of design decisions for the site. (I'm also planning to put all the evaluation results into a spreadsheet at some point ...)

The IPL is a digital library hosted at the iSchool at Drexel. It includes an online repository of approximately 40,000 records and a large web site of approximately 12,500 pages. In 2010 it received 5.75 million visits and 16.46 million page views (approximately 500,000 visits and 1.5 million page views per month).

The materials on the Week 8 wiki page will introduce the IPL and digital libraries. Please make yourself familiar with them, as they will help you with your final report. The IPL Statement of Principles, which outlines some of the goals of the IPL, is available at http://ipl.org/div/about/statement.html

The first assignment for Week 8 is therefore to review the IPL site and the supporting materials on the wiki. Look around the IPL site and see what it does. When looking at different parts of the site, think about how you can get to them (from the front page, through the search engine, by browsing, etc.). How would a user go about looking for specific resources?

EVALUATING THE IPL – INTERVIEW, PERSONA, AND SCENARIO

You will be using heuristic evaluation to evaluate the IPL website (see Week 9). In order to do that you need to create suitable personas and scenarios (which you did a test run of a couple of weeks ago). In order to create the personas, you need to carry out some interviews. The first step involves interviewing someone who you think might be a typical user of the IPL. According to the IPL managers, many of the resources are aimed at K-12 educational levels, and can be used both inside and outside the classroom. We will therefore assume, for the sake of the final project, that a typical IPL user is someone who is involved in K-12 education, as a student, teacher, parent, activity/curriculum developer (e.g. museum outreach worker), etc.

You will each therefore carry out a semi-structured interview. There is a short reading on interviews on the Week 8 wiki page. After reading this background, you will work as a group to develop a common interview instrument for your group. Please keep a record of this instrument; you will need to add it as an appendix to the final report. When you carry out the actual interview, please keep detailed interview notes. There is no individual grade for these interview notes; however, I will ask each student to append their interview notes to the end of your group report.

DEVELOPING YOUR PERSONA AND SCENARIO FOR THE IPL

Once you have each carried out an interview, the next step involves the creation of a persona and scenario/task for testing the IPL web site. This is basically the same process that you followed for creating the persona and scenario last week, but this time based on the interview information. Remember, the task that underpins the scenario should be clearly stated at the top of the scenario.

SHARING YOUR PERSONA AND TASK

Once you have developed your persona and task, please share these with your group, and then select a new persona and task from another group member for the evaluation part of the exercise. When you are creating your persona and scenario, you should therefore think of them as being useful for other members of your group. If you think another group member's persona and scenario is unusable, please let me know! There is no individual grade for these personas and scenarios; however, I will ask each student to append their interview notes, persona, and scenario to the end of your group report.

In the next set of instructions, I will introduce the heuristic evaluation techniques. This is scheduled for Week 9, but I will send it out early in case folks want to make an early start on the final project. The instructions for this week are a little more complex, so please let me know if you have any questions about them. Mick.

Week 8 Information email – Peer evaluation

As the final project is a group assignment, I will be including a peer assessment element with this assignment. The point of the peer assessment is to provide a communication channel between you and myself on how well you think your team is working. It can be frustrating working with a bad team, and if you think this is the case, you can let me know.

The assessment will be factored into the assignment grade. Usually it has little influence on the grade. However, someone who receives a lot of praise from teammates for putting in extra effort will get some bonus marks, while someone who receives a lot of criticism from teammates (late input, not answering email, etc.) will have some marks deducted.

I am attaching a draft of the peer assessment form, filled in with some examples. I am also attaching a blank form. The forms are also available on the front page of the wiki. The form is adapted from a previous online version of 608, taught by a different professor. Please familiarize yourself with it. You will submit your peer evaluation form with the final project report. Please let me know if you have any questions about the form or the policy. Thank you! Mick.

Week 9 Information email – Heuristic Evaluations

Hello everyone, I hope that your group projects are progressing well. In week 9, you will use the personas and scenarios developed during Week 8 to support a heuristic evaluation of the IPL website. So-called lightweight inspection techniques can, claims Jakob Nielsen, identify a wide range of common usability problems using only a small number of expert evaluators (and this is the significance of the chart posted at the top of the wiki page). For this week's work, please review all the pages on Nielsen's website that are linked, as well as the two readings by Danino, and by Sharp, Rogers and Preece. Further optional readings are also available on the wiki page for Week 9.

HEURISTIC EVALUATION OF THE IPL

Information and instructions for carrying out heuristic evaluations are provided on the Week 9 wiki page. Carry out your heuristic evaluation of the IPL using a persona and scenario developed by a teammate. When you have finished, share your final heuristic evaluation worksheets and results with your team, and then work as a team to prioritize the design issues that you encountered. When doing this, please think of the design issues in broader terms than the specific terms of your individual evaluation. What issues emerge across all of your evaluations?

Once again, this part of the assignment is not 'due' in the traditional sense, although it will have to be completed in order for your team to proceed to the next stage, which is writing the final report. This is quite a short email, so please let me know if you have any questions or concerns about this work. Thank you! Mick.

Week 10

Hello everyone. This email covers the final deliverable, the project report. It pretty much repeats the information on the wiki page.

REPORT WRITING

There are many 'how to' resources on usability and evaluation report writing, and I have included some of these on the wiki page for Week 10. While the reports on the wiki page differ in their particulars, they often also emphasize some similar elements of successful report writing. These include an executive summary, clear structure and writing, the use of graphics, and appendices for detailed information. The examples include a student report for an HCI class at the University of Michigan School of Information. Although it's written for a different type of website and uses a different method - usability testing versus heuristic evaluation - it's a good example of a well-written and structured HCI report. While this example report is not an exact template for your final report, it is one potential model.

For your final report, you should include at least the following sections (note that this is not an exhaustive list):

- Title page
- Contents page
- Executive summary
- Introductory section, for instance: background on digital libraries; background on children, adults, and HCI; background on the IPL
- HCI introduction, for instance: methods introduction; summary of personas, tasks, scenarios
- Summary of test results (see last assignment as well)
- Recommendations for design
- Limitations of the study
- Conclusion
- References cited
- Appendix: Team interview instrument
- Appendix: Individual interview notes
- Appendix: Individual personas and scenarios
- Appendix: Individual heuristic evaluations
- Appendix: Individual academic honesty statements

Please note that this is a non-definitive list, and teams can adapt and add to this.

PRESENTATION

In addition to writing a team report, you should also produce a team presentation. Unfortunately we won't be able to meet face-to-face to view each other's presentations, but the exercise is still worth it. When preparing your presentation, please avoid common mistakes (such as using too much small text, too many bullets, or cutting and pasting directly from your report). Please make the presentation readable and also informative for your audience! There is a limit of 30 slides for the presentation.

SUMMARY SCHEDULE

1. Individual: Carry out heuristic evaluation, write up individual report.
2. Team: Analyze findings, and select and prioritize significant design concerns.
3. Team: Create a report prioritizing and describing these significant design concerns.
4. Team: Create required appendices.
5. Create a PowerPoint presentation to support your report.

This assignment is worth 130 points. Please upload to the Blackboard Digital Dropbox by the end of Sunday, June 5, 2011. This is a bit of a complicated process, so please contact me if you have any questions about this assignment, or require further information and clarification. Mick.


Week 08 Digital Libraries
Last edited by Michael Khoo on Friday, 05/13/2011, 11:00 AM

Digital Libraries

Here are some readings on digital libraries, and some of the issues involved in digital library analysis and design. If you are not familiar with the basic concepts behind digital libraries, the following articles should bring you up to speed.

Jeng, J. (2005). What is usability in the context of the digital library and how can it be measured? Information Technology and Libraries, 24(2), 47-56.
Jeng's article provides a summary of some of the issues relevant to this course, including background on digital libraries, and proposes a general usability evaluation model.

Jeng - What Is Usability in the Context of the Digital Library and How Can It Be Measured.pdf

Reeves, T., X. Apedoe, and Y. H. Woo. (2003). Evaluating digital libraries: A user friendly guide. Boulder, CO: National Science Digital Library/University Corporation for Atmospheric Research.
This is not so much an article as a 'how to' guide for digital library evaluators. Please read chapters 1 and 2, which talk about planning an overall digital library evaluation, and chapter 4, which talks about usability.

http://www.dpc.ucar.edu/projects/evalbook/EvaluatingDigitalLibraries.pdf

The Internet Public Library

We will be using the Internet Public Library (IPL) as a testbed for investigating some of the main concepts of the course. Please read the following materials and make yourself familiar with them. The Mission Statement and Statement of Principles outline some of the goals of the IPL.

Main Site: http://ipl.org/
Statement of Principles: http://ipl.org/div/about/statement.html

The IPL was created in 1995 in a library and information science class taught by Professor Joe Janes at the University of Michigan-Ann Arbor. The IPL was introduced as both a service organization for the general public and a teaching/learning environment for library and information science (LIS) faculty and students. As the ipl2, it now supports library services through the provision of reviewed collections, an online question-answering service, and information instruction for the public (incorporated into the question-answering service). Its subject-categorized collections, created by students and volunteers (mainly librarians in training), contain more than 40,000 online resources, and are available for library and information science (LIS) programs to use as a training tool for tasks such as the creation and editing of metadata. In 2007, the IPL servers were moved to Drexel University. In 2008, the IPL began work on the next version of the library, 'ipl2,' which was intended to merge the IPL and the LII and transition the library to a Web 2.0 architecture capable of supporting services such as user accounts, personal collections, resource tagging and annotation, and the sharing of resources, collections, annotations, etc. Since then, the site has undergone a complete redesign, both with the underlying databases and catalogs, and also with the interface.

Interviews

A basic introduction to different interview techniques for gathering user requirements and data:

interviews.pdf

Optional Resources

D-Lib Magazine is a good source of research articles on digital libraries.

http://dlib.org/

Judy Jeng (above) also has some useful digital library usability and evaluation bibliographies available:

http://web.njcu.edu/sites/faculty/jjeng/Content/bibliographies.asp
http://www.scils.rutgers.edu/~miceval/research/usability.html

Chris Neuhaus has a bibliography of digital library evaluation resources on his website:

http://www.uni.edu/neuhaus/digitalbibeval.html

Saracevic, T. (2005). How were digital libraries evaluated? Presentation at the course and conference Libraries in the Digital Age (LIDA 2005), 30 May - 3 June, Dubrovnik, Croatia.
A short but wide-ranging review of literature on digital library evaluation, with an interesting conclusion.

http://www.scils.rutgers.edu/~tefko/DL_evaluation_LIDA.pdf

ASSIGNMENT: EVALUATING THE IPL – INTERVIEW, PERSONA, AND SCENARIO

You will be using heuristic evaluation to evaluate the IPL website (see Week 9). In order to do that you need to create suitable personas and scenarios (which you did a test run of a couple of weeks ago). In order to create the personas, you need to carry out some interviews.

The first step involves interviewing someone who you think might be a typical user of the IPL. According to the IPL managers, many of the resources are aimed at K-12 educational levels, and can be used both inside and outside the classroom. We will therefore assume, for the sake of the final project, that a typical IPL user is someone who is involved in K-12 education, as a student, teacher, parent, activity/curriculum developer (e.g. museum outreach worker), etc.

You will each therefore carry out a semi-structured interview. There is a short reading on interviews on the Week 8 wiki page. After reading this background, you will work as a group to develop a common interview instrument for your group. Please keep a record of this instrument; you will need to add it as an appendix to the final report. When you carry out the actual interview, please keep detailed interview notes.

There is no individual grade for these interview notes; however, I will ask each student to append their interview notes to the end of your group report.

Developing your persona and scenario for the IPL

Once you have each carried out an interview, the next step involves the creation of a persona and scenario/task for testing the IPL web site. This is basically the same process that you followed for creating the persona and scenario last week, but this time based on the interview information. Remember, the task that underpins the scenario should be clearly stated at the top of the scenario.

Sharing your persona and scenario

Once you have developed your persona and task, please share these with your group, and then select a new persona and task from another group member for the evaluation part of the exercise. When you are creating your persona and scenario, you should therefore think of them as being useful for other members of your group. If you think another group member's persona and scenario is unusable, please let me know!

There is no individual grade for these personas and scenarios; however, I will ask each student to append their interview notes, persona, and scenario to the end of your group report.

WHILE THERE ARE NO FORMAL DEADLINES FOR THIS PART OF THE FINAL PROJECT, IT IS IN YOUR AND YOUR TEAM'S INTEREST TO COMPLETE THESE AS EFFICIENTLY AS POSSIBLE


Week 09 Heuristic evaluation
Last edited by Michael Khoo on Sunday, 05/22/2011, 8:35 PM

WEB READINGS: HEURISTIC EVALUATION

Image: Jakob Nielsen (source: http://www.pixelsurgeon.com/interviews/interview.php?id=45)

There is a wide range of available HCI techniques. These involve a similarly wide range of resource commitments, in terms of time, expertise, tools and facilities, access to users, etc. Each of these and other factors can add considerably to the cost of carrying out HCI work. HCI is not necessarily a 'cheap' technique.

Of course, not everyone - and particularly not those in an entry-level HCI class - will have access to the users, usability labs, and other resources required to carry out comprehensive evaluations. At the same time, there are a number of techniques that can be used with relative success when there are limited resources available for carrying out HCI work. These are sometimes called 'lightweight' or 'discounted' techniques, and the developers of these techniques claim that, if applied correctly, they can be used, with a relatively small number of tests, to successfully identify many of the major flaws in a design. One group of such techniques includes heuristic evaluations.

Figure: Many significant usability findings can be identified with a small group of evaluators

Heuristic evaluation techniques are HCI techniques that can be used when users are not available, and when rapid turnaround in usability testing (for instance, to support iterative usability testing) is required. The process generally involves HCI experts using structured evaluation instruments and processes to assess an interface. The first wiki page for Week 6 contains some guides to carrying out heuristic evaluations. The first of these guides is by an influential evaluator called Jakob Nielsen, who has probably been as influential in terms of methods as Norman has been in terms of theory. So-called lightweight inspection techniques can, claims Jakob Nielsen, identify a wide range of common usability problems using only a small number of expert evaluators (and this is the significance of the chart posted at the top of the wiki page).
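The chart referred to above reflects a simple diminishing-returns model associated with Nielsen's work: each additional evaluator finds some fixed proportion of the problems in an interface, so the curve rises steeply and then flattens. The following is a rough, illustrative sketch only; the 0.31 figure is a commonly quoted average from Nielsen's writing, not a property of the IPL or of your evaluations.

```python
# Illustrative sketch of the diminishing-returns curve behind charts like the one above.
# Assumption: each evaluator independently finds a fixed proportion (lam) of the
# problems in an interface; lam = 0.31 is a commonly quoted average, but it varies
# by study and interface, so treat these numbers as illustrative only.

def proportion_found(n_evaluators: int, lam: float = 0.31) -> float:
    """Expected proportion of usability problems found by n independent evaluators."""
    return 1 - (1 - lam) ** n_evaluators

if __name__ == "__main__":
    for n in range(1, 11):
        print(f"{n:2d} evaluators -> ~{proportion_found(n):.0%} of problems found")
```

On this model, a handful of evaluators already uncovers the large majority of problems, which is why a small team can produce a useful heuristic evaluation.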

For this week's reading, therefore, please review all the pages that are linked, as well as the two readings by Danino, and by Sharp, Rogers and Preece. Further optional readings are also available below.

Nielsen, J. Heuristic Evaluation.

Heuristic evaluation introduction: http://www.useit.com/papers/heuristic/
How to conduct a heuristic evaluation: http://www.useit.com/papers/heuristic/heuristic_evaluation.html
Ten usability heuristics: http://www.useit.com/papers/heuristic/heuristic_list.html
Characteristics of usability problems found by heuristic evaluation: http://www.useit.com/papers/heuristic/usability_problems.html
Severity ratings for usability problems: http://www.useit.com/papers/heuristic/severityrating.html
Technology transfer of heuristic evaluation and usability inspection: http://www.useit.com/papers/heuristic/learning_inspection.html

Danino, N. (2001). Heuristic evaluation: A step-by-step guide.

http://www.sitepoint.com/print/heuristic-evaluation-guide

Sharp/Rogers/Preece provide an overview of a range of uses of heuristic evaluation in Chapter 15, pages 684-702.

srpchap15.pdf

FURTHER READINGS (IF YOU ARE INTERESTED)

Cockton, G., Lavery, D., & Woolrych, A. (2003). Inspection-based evaluations. In The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications, J. A. Jacko and A. Sears, Eds. Lawrence Erlbaum Associates, Mahwah, NJ, 1118-1138.
An introductory survey of various kinds of inspection methods, including theoretical and technical overviews. Heuristic evaluation methods are covered on pages 1122-1123. While there are a range of heuristic inspection methods available, a number of these have been summarized into a set of 10 heuristic principles by Jakob Nielsen, and we will look at these in more detail below.

cockton_lavery_woolich.pdf

Ling, C. and Salvendy, G. (2005). Extension of heuristic evaluation method: a review and reappraisal. Ergonomia IJE&HF, 27(3), 179-197.
A review article of some of the current trends and possible future directions in heuristic evaluation.

E2005-3-Ling-heuristiceval.pdf

Nielsen, J. and Molich, R. (1990). Heuristic evaluation of user interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: Empowering People (Seattle, Washington, United States, April 01 - 05, 1990). J. C. Chew and J. Whiteside, Eds. CHI '90. ACM Press, New York, NY, 249-256.
An early paper by Nielsen and Molich in which they describe how even a relatively small group of heuristic evaluators can together identify many of the main usability problems of an interface. Of historical interest - but interesting! You will see versions of the charts at the end of the paper reproduced in a number of discussions of heuristic evaluation.

Nielsen_Molich.pdf

Nielsen, J. (1994). Enhancing the explanatory power of usability heuristics. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: Celebrating Interdependence (Boston, Massachusetts, United States, April 24 - 28, 1994). B. Adelson, S. Dumais, and J. Olson, Eds. CHI '94. ACM Press, New York, NY, 152-158.
Another early paper of Nielsen's, in which he describes the method and results of his and Molich's analysis of existing heuristic methods. Again, of historical interest (and again, interesting!).

Nielsen_1994.pdf

ASSIGNMENT: HEURISTIC EVALUATION OF THE IPL

heuristic_worksheet.doc

Carry out your heuristic evaluation of the IPL using one of the personas and scenarios developed by a teammate, and the worksheet for this assignment.

When you have finished, share your final heuristic evaluation worksheets and results with your team, and then work as a team to prioritize the design issues that you encountered. What problems seemed to you to be common across each evaluation? Which ones are the most important, and why?

When doing this, please think of the design issues in broader terms than the specific terms of your individual evaluation. What issues emerge across all of your evaluations?
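If it helps to structure this step, here is a minimal, purely illustrative sketch of one way a team might merge its individual worksheets: record each finding with the heuristic it violates and a Nielsen-style severity rating (0-4, as described in the severity ratings page linked above), then rank issues by how many evaluators reported them and how severe they were on average. The findings and scores below are hypothetical examples, not part of the worksheet file.

```python
# Hypothetical sketch of merging individual heuristic evaluations.
# Findings, heuristic names, and severity scores (Nielsen-style,
# 0 = not a problem ... 4 = usability catastrophe) are made-up examples.
from collections import defaultdict

team_findings = [
    # (evaluator, finding, heuristic violated, severity 0-4)
    ("A", "Search box hard to find on front page", "Visibility of system status", 3),
    ("B", "Search box hard to find on front page", "Visibility of system status", 2),
    ("A", "Browse categories use library jargon", "Match between system and real world", 3),
    ("C", "No feedback after submitting a question", "Visibility of system status", 4),
]

grouped = defaultdict(list)
for evaluator, finding, heuristic, severity in team_findings:
    grouped[(finding, heuristic)].append(severity)

# Rank issues: reported by more evaluators, and with higher average severity, first.
ranked = sorted(grouped.items(),
                key=lambda kv: (len(kv[1]), sum(kv[1]) / len(kv[1])),
                reverse=True)

for (finding, heuristic), severities in ranked:
    avg = sum(severities) / len(severities)
    print(f"{finding} [{heuristic}] - reported {len(severities)}x, avg severity {avg:.1f}")
```

However your team chooses to do it, the goal is the same: a short, agreed list of the most important design issues, which then becomes the core of your report.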

Once again, this part of the assignment is not 'due' in the traditional sense, although it will have to be completed in order for your team to proceed to the next stage, which is writing the final report.


Week 10 Writing reports
Last edited by Michael Khoo on Sunday, 05/29/2011, 10:42 PM

READINGS: WRITING REPORTS

Usability and evaluation are carried out on a wide range of systems of all kinds, for all kinds of reasons, and for a variety of different audiences, and so reporting requirements can differ considerably. There are many 'how to' resources on usability and evaluation report writing, and I have included some of these below. When you read them, you will notice that while they differ in their particulars, they often also emphasize some similar elements of successful report writing. These include an executive summary, clear structure and writing, the use of graphics, and appendices for detailed information.

Frechtling, J. (2002). The User-Friendly Handbook for Project Evaluation. The National Science Foundation. See pages 35-41.

Frechtling 2002 The 2002 User-Friendly Handbook for Project Evaluation nsf02057.pdf

Chapman, S., T. Hughes, O. Khroustaleva, and C. Korintus. Usability Evaluation of breastcancer.org. 2006.

This is a student report for an HCI class at the University of Michigan School of Information. Although it's a different type of website and a different method - usability testing versus heuristic evaluation - it's a good example of a well-written and structured report.

breastcancerdotorg.pdf

Tognazzini, B. (2001) How to Deliver a Report Without Getting Lynched. AskTog.

http://www.asktog.com/columns/047HowToWriteAReport.html

FURTHER READINGS (IF YOU ARE INTERESTED)

National Science Digital Library. Evaluation Reports.
Just for reference, here are some of the reports that I and colleagues at the National Science Digital Library have worked on. I have archived these on my faculty Web page.

http://ischool.drexel.edu/faculty/mkhoo/research.html

VIDEO: HOW NOT TO DO POWERPOINT

By Don McMillan.

http://www.youtube.com/watch?v=lpvgfmEU2Ck

ASSIGNMENT: FINAL REPORT INSTRUCTIONS

For your final report, you should include at least the following sections (note that this is not an exhaustive list):

- Title page
- Contents page
- Executive summary - list the main findings and implications of the report
- Introductory section, including: background on digital libraries; background on the IPL
- HCI introduction, for instance: methods introduction; summary of personas, tasks, scenarios
- Summary and prioritization of test results - screen shots are useful!
- Recommendations for design
- Limitations of the study
- Conclusion
- References cited
- Appendix: Team interview instrument
- Appendix: Individual interview notes
- Appendix: Individual personas and scenarios
- Appendix: Individual heuristic evaluations
- Appendix: Individual academic honesty statements

Please note that this is a minimum but non-definitive list, and teams can adapt and add to this.

Presentation

In addition to writing a team report, you should also produce a team presentation. Unfortunately we won't be able to meet face-to-face to view each other's presentations, but the exercise is still worth it. When preparing your presentation, please avoid common mistakes (such as using too much small text, or too many bullets, or cut-and-pasting direct from your report). Please make the presentation readable and also informative for your audience!

There is a limit of 30 slides for the presentation.

Peer Evaluation

Peer evaluation of your own and your teammates' performance is required for this assignment. You must return the following document to the digital dropbox in order to receive your final grade.

peer_evaluation_blank.docx

Academic Honesty Statement

You are required to submit an academic honesty statement in order to receive a grade for this course. Please append your statement to the end of your team report.

This assignment is worth 130 points. Please upload to the Blackboard Digital Dropbox by the end of Sunday (YLT), June 5, 2011.