
Evaluating quality: the MILE method applied to museum Web sites

Franca Garzotto - HOC, Hypermedia Open Center, Politecnico di Milano

Maria Pia Guermandi - Istituto Beni Culturali, Regione Emilia-Romagna

Outline

• The need for measuring

• What is MILE

• MILE applied to museum web sites

The need for measuring quality

• Once quality criteria are defined (see the Minerva Quality Framework), we need to provide a measurement method to evaluate quality:
– an evaluation procedure or process
– a metric

What is MILE?

• Developed at Politecnico di Milano (Hypermedia Open Centre) and University of Lugano, Communication Sciences (TEC-lab)
• A systematic method to evaluate the quality in use of web applications, i.e., USABILITY
– Usability is “the effectiveness, efficiency and satisfaction with which specified users can achieve specified goals in particular environments” (ISO 9241-11)
• Combines structured inspection with empirical testing
• Scenario driven
• Up to now, applied to museum web sites (see next), educational and e-commerce web applications

MILE principles

The usability of an application can be analysed at two levels:

1: general (for any web application)

2: domain specific (e.g. museum web sites)

MILE principles (cont.)

Separating different levels of analysis:

• CONTENT
• COGNITIVE ASPECTS
• NAVIGATION and INTERACTION
• GRAPHICAL DESIGN
• TECHNOLOGY (PERFORMANCE)

MiLE Key concepts_1: ABSTRACT TASK

• An abstract task (AT) describes a pattern of inspection activity:
– captures usability experience
– identifies the application features on which it is important to focus inspection
– describes some actions the inspector should perform during the evaluation

MiLE Key concepts_1: ABSTRACT TASK (cont.)

Abstract Tasks:
• enforce standardisation and uniformity
• reduce time and cost needed for inspection
• So far, ATs have been defined for:
– level 1 (general): navigation/interaction/multiple media dynamics
– level 2 (domain specific): e.g., museum web sites (see next)

MiLE Key concepts_2: SCENARIOs_1

• For evaluating domain specific aspects, MILE provides SCENARIOS
• Scenario:
– a story about use (Carroll, 2002)
– the description of a concrete episode of use of the application (Cato, 2001)

In MILE, a scenario is composed of (see the sketch below):
• Abstract Task
• User profile:
– description of a stereotyped user, shaping the relevant features of typical “stakeholders”
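As a purely illustrative aid, this composition can be pictured as a small data structure pairing an abstract task with a user profile. It is a sketch, not part of the MiLE specification; the field names, including relevant_attributes (which anticipates the scoring attributes), are this sketch's own assumptions.

```python
from dataclasses import dataclass

# Hypothetical data model for a MiLE scenario; names are illustrative only.

@dataclass
class AbstractTask:
    description: str                # e.g. "find all the works of a given artist"
    relevant_attributes: list[str]  # usability attributes the inspector will score

@dataclass
class UserProfile:
    description: str                # stereotyped user, e.g. "a family with two sons aged 5-10"

@dataclass
class Scenario:
    task: AbstractTask
    profile: UserProfile
```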

MiLE Key concepts_2: SCENARIOs_2

• The definition of scenarios is based on User Requirements investigation and analysis carried out with domain experts and user experts

• Scenarios are needed to weight the relevance of usability violations that are detected during inspection (see process)

MiLE: a 5-phase process

1. Modeling the application under inspection: identify the critical areas of the application relevant for a usability evaluation
2. Performing abstract tasks: executing the actions described by the ATs
3. Scoring: measuring the accomplishment of abstract tasks through usability attributes
4. Weighting: balancing the scores with relevance weights according to user profiles and communication goals (sketched in code below)
5. Empirical testing: user testing
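A minimal sketch of how scoring and weighting might be combined, assuming (the slides do not prescribe this) that inspection yields a numeric score per usability attribute and that the scenario supplies relevance weights for those attributes:

```python
# Sketch of "Scoring" followed by "Weighting": a relevance-weighted average of
# per-attribute inspection scores. The scale and the formula are assumptions
# of this sketch, not a prescription of the MiLE method.

def weighted_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Combine per-attribute scores into one value for a given scenario.

    scores  -- attribute -> score assigned during inspection (e.g. on a 1-5 scale)
    weights -- attribute -> relevance weight derived from the user profile and
               communication goals; attributes missing from `weights` are ignored
    """
    total = sum(weights.values())
    if total == 0:
        raise ValueError("at least one attribute needs a non-zero weight")
    return sum(scores.get(attr, 0.0) * w for attr, w in weights.items()) / total
```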

Bologna Group
Contents Survey Schema for museum Web-sites (see the transcription below)

1. site's presentation: general information about the Web-site structure

2. the real museum: contents and functions referring to a “physical museum”
– museum's presentation
– how to reach
– how to visit
– information about museum services
– information about the museum educational department
– promotions and fidelization policies
– information about the museum activities and events
– information about museum publishing

3. the virtual museum: contents and functions exploiting the communicative strength of the medium
– collections on-line
– single item's description
– educational web activities on-line
– promotion and fidelization policies on-line
– events on-line
– publishing on-line
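For convenience, the schema above can be written down as a simple nested structure; the dictionary below is only a transcription of the slide's categories, not an official data format of the method.

```python
# The Bologna Group contents survey schema, transcribed as a plain dictionary.
CONTENTS_SURVEY_SCHEMA = {
    "site's presentation": [
        "general information about the Web-site structure",
    ],
    "the real museum": [
        "museum's presentation",
        "how to reach",
        "how to visit",
        "information about museum services",
        "information about the museum educational department",
        "promotions and fidelization policies",
        "information about the museum activities and events",
        "information about museum publishing",
    ],
    "the virtual museum": [
        "collections on-line",
        "single item's description",
        "educational web activities on-line",
        "promotion and fidelization policies on-line",
        "events on-line",
        "publishing on-line",
    ],
}
```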

Abstract task attributes (dimensions)

• Efficiency: the action can be performed successfully and quickly
• Authority: the author is competent in relation to the subject
• Currency: the time scope of the content's validity is clearly stated and the information is kept up to date
• Consistency: similar pieces of information are dealt with in similar fashions
• Structure effectiveness: the organization of the content pieces is not disorienting
• Accessibility: the information is easily and intuitively accessible
• Completeness: the user can find all the information required
• Richness: the information required is rich (many examples, data, ...)
• Clarity: the information is easy to understand
• Multilevel: the information is given at different levels of understanding
• Multimediality: different media are used to convey the information
• Multilinguisticity: the information is given in more than one language

User scenarios

USER PROFILE: “a family with two sons aged 5-10, living in the town where the real museum is actually located”

ABSTRACT TASK: “find the educational activities occurring on a range of dates in a real museum”

RELEVANT ATTRIBUTES: (A1) currency of the information; (A2) quality of the organization of the information; (A3) completeness of the information; (A4) richness of the information provided.

Sites evaluated:
• Musée du Louvre: www.louvre.fr
• National Gallery: www.nationalgallery.org.uk

SCORING / WEIGHTING (illustrated below with placeholder numbers)
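As a worked illustration of scoring and weighting for this scenario, the snippet below plugs in placeholder numbers; the weights and scores are hypothetical and are not results from the actual Louvre or National Gallery evaluation.

```python
# Placeholder numbers only: not results from the evaluation in the presentation.
family_weights = {      # relevance weights for the "family" profile (A1-A4)
    "currency": 0.4, "organization": 0.2, "completeness": 0.2, "richness": 0.2,
}
site_scores = {         # hypothetical per-attribute inspection scores for one site
    "currency": 3.0, "organization": 4.0, "completeness": 2.0, "richness": 3.0,
}
# Weighting = relevance-weighted average of the scoring results (cf. the sketch above).
weighted = sum(site_scores[a] * w for a, w in family_weights.items()) / sum(family_weights.values())
print(weighted)  # -> 3.0 (placeholder)
```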

USER PROFILE: “an art historian looking for information about a topic on which he/she is currently carrying out research”

ABSTRACT TASK: “find all the works of a given artist”

RELEVANT ATTRIBUTES: (A1) effectiveness of the information; (A2) completeness of the information; (A3) richness of the information; (A4) navigation organization.

Sites evaluated:
• Hermitage Museum: www.hermitagemuseum.org
• Metropolitan Museum: www.metmuseum.org

SCORING / WEIGHTING
