
User Centered Design

Intro to UCD

Explain its motivation

Discuss key stages in the process

Present basic methods and techniques

UCD is about designing interactive technologies to meet users’ needs.

Different stages:
• understanding user needs
• establishing requirements
• prototyping alternative designs
• evaluating designs

Key characteristics of any UCD process:

• Focus on users early in the design and evaluation of the artefact

• Identify, document and agree specific usability and user experience goals

• Iteration is inevitable. Designers never get it right the first time

Why involve Users

Around 63% of software projects exceed their cost estimates

The top four reasons are:

- Frequent requests for changes from users

- Overlooked tasks

- Users’ lack of understanding of their own requirements

- Insufficient user analysis, communication, and understanding

If you involve end users in the design process,

• More likely to design/build something useful!

• Improve productivity
• Reduce human error
• Reduce maintenance
• Reduce employee turnover
• Manage expectations
• Encourage ownership of the solution
• Improve understanding of shortcomings/tradeoffs
• Increase satisfaction

Principles of the UCD approach

• Users’ behaviour and context of use are studied and the product is designed to support them

• Understanding user needs and pain points as opportunities for design

• User characteristics are captured and designed for.

• Users are consulted from early concept phases, throughout design, to the final product
• Responses to concepts, prototypes, etc. are taken seriously
• All design decisions are taken within the context of the user, their work and their environment

• All design iterations can be traced back to user goals

- not just what users say, but what they do

- how action and interaction are achieved

- interest in the ‘mundane’, taken-for-granted, moment-by-moment interactions of people

Outputs: rich descriptions – need interpreting through use of conceptual frameworks, models etc.

• A range of user research methods

– observation

– interview

– questionnaire

– focus groups

– participant analysis

Interviews

- Forum for talking to people

- Structured, unstructured or semi-structured

- Props, e.g. sample scenarios of use, prototypes, can be used in interviews

- Good for exploring specific issues

- But they are time-consuming and it may be unfeasible to visit everyone

Questionnaires

- A series of questions designed to elicit specific information

- Questions may require different kinds of answers:

simple YES/NO

choice of pre-supplied answers

comment

- Often used in conjunction with other techniques

- Can give quantitative or qualitative data

- Good for answering specific questions from a large, dispersed group of people
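As a purely illustrative sketch of how the three kinds of answers map onto quantitative and qualitative data, the following Python fragment uses invented question wording and answer options (they are not from the lecture):

```python
# Hypothetical questionnaire items illustrating the three kinds of answers noted
# above; the wording and options are invented for illustration only.
questions = [
    {"text": "Have you used the system before?", "kind": "yes/no"},
    {"text": "How often do you use it?", "kind": "choice",
     "options": ["daily", "weekly", "monthly", "rarely"]},
    {"text": "Any other comments?", "kind": "comment"},
]

def summarise(responses):
    """Yes/no and pre-supplied choices can be tallied (quantitative data);
    free comments are kept verbatim (qualitative data)."""
    counts, comments = {}, []
    for question, answer in responses:
        if question["kind"] in ("yes/no", "choice"):
            key = (question["text"], answer)
            counts[key] = counts.get(key, 0) + 1
        else:
            comments.append(answer)
    return counts, comments

# Example with two invented responses
print(summarise([(questions[0], "yes"), (questions[2], "a bit slow to load")]))
```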

Focus Groups

• Workshops or focus groups:

- Group interviews

- Good at gaining a consensus view and/or highlighting areas of conflict

• Why ‘establish’ requirements?

Requirements arise from understanding users’ needs

Requirements should be justified & related to data

Establishing requirements

• What do users want?

• What do users ‘need’?

• Requirements need clarification, refinement and completion over several iterations of the design process.

A focused problem definition, established by analysing user data, will lead to a stable list of requirements.

Types of Requirements

• Users: who are they?
• Usability and user experience qualities
• Environment or context of use
• Functional
• Data

Users
Characteristics: abilities, physical, background, attitude to computers etc

System use
• Novice: step-by-step, constrained, clear information
• Expert: flexibility, access/power
• Frequent: short cuts
• Casual/infrequent: clear instructions, e.g. menu paths

Usability and User Experience Requirements

Effectiveness, efficiency, safety, privacy, utility, learnability, memorability (and fun, helpfulness) etc.

Environment or Context of Use:

Physical: dusty? noisy? vibration? light? heat? humidity? on the move? layout of workspace?

Social: sharing of files, of displays, on paper, across great distances; working individually; privacy for clients; and informal information distribution among users.

Organisational: hierarchy, IT department’s attitude and remit, user support, communications structure and infrastructure, availability of training

Functional

Historically the main focus of requirements activities:

What should the system do?

Example: train a new employee in how to carry out a task.

And also, memory size, response time, platform constraints...

Data

What data or input is required to make the system function for the user, and how is it accessed?

Self-service cafeteria system at UL

Functional:

The system will calculate the cost of purchases without the help of cafeteria staff.

Data:

The system must have access to the prices of products.

• Environmental: most users will be carrying a tray, in a rush, in a noisy setting, talking/distracted, queuing, etc.

• User: most users are comfortable with technology

• Usability: simple for new users, memorable for frequent users, quick to use, no waiting around for processing
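To make the functional and data requirements concrete, here is a minimal Python sketch of the cafeteria example; the product names and prices are invented placeholders, not part of the actual UL system.

```python
# Minimal sketch of the cafeteria example's functional and data requirements.
# Product names and prices below are hypothetical placeholders.

PRICES = {          # data requirement: the system must have access to product prices
    "coffee": 1.50,
    "sandwich": 3.20,
    "fruit": 0.80,
}

def cost_of_purchases(items):
    """Functional requirement: calculate the cost of a tray of purchases
    without help from cafeteria staff."""
    return sum(PRICES[item] for item in items)

# Example: a user in a hurry scans three items and sees the total immediately.
print(cost_of_purchases(["coffee", "sandwich", "fruit"]))  # 5.50
```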

Evaluation

A – an existing system, or B – a new design.

• A continuous iterative process examining:

Early prototypes of the new system

Later, more complete prototypes & product

• Looking for:

Extent of functionality, effect of interface on user, specific problems/issues

Usability, user experience, other objectives e.g., performance

• Designers need to check that they understand users’ requirements and are meeting key objectives

When to evaluate

• At all stages throughout design

From the first descriptions, sketches etc. of users’ needs through to the final product

Iterative cycles of ‘design - EVALUATE - redesign’

• Two main types of evaluation reflecting different stages and goals:

Formative

Summative

• (Involving users)

‘quick’

usability testing

field studies

• (Involving experts)

predictive evaluation

Quick

• What is it?

informally asking users/consultants for feedback

• Advantages

Can be done any time, anywhere

Emphasis on fast input to the design process

Sense-checking: do early design ideas make sense?

early concept testing

Quick

• Disadvantages/issues

Not necessarily accurate or extensive

No careful documentation of findings

Usability Testing

Recording typical users’ performance on typical tasks

In a controlled setting, can be in the lab or in the field

Data: write-ups, video, log of key presses, time to complete tasks etc
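As an illustration of what ‘computer logging’ might capture in such a test, here is a minimal Python sketch; the class, field names and events are hypothetical, not a prescribed logging format.

```python
# Sketch of a usability-test log: timestamped user events plus time-to-complete
# per task. The field names and events are invented illustrations.
import time

class SessionLog:
    def __init__(self, participant_id):
        self.participant_id = participant_id
        self.events = []          # (timestamp, task, event) tuples
        self.task_start = {}

    def start_task(self, task):
        self.task_start[task] = time.time()

    def record(self, task, event):
        self.events.append((time.time(), task, event))

    def time_to_complete(self, task):
        return time.time() - self.task_start[task]

# Example use during a session
log = SessionLog(participant_id="P01")
log.start_task("buy ticket")
log.record("buy ticket", "key press: Enter")
log.record("buy ticket", "error: selected wrong date")
print(f"Task took {log.time_to_complete('buy ticket'):.1f} s")
```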

Usability Testing

• Advantages

– Uninterrupted; can assess performance, identify errors and help explain why users did what they did

– Can be used with satisfaction questionnaires and interviews to elicit user opinions

Usability Testing

Disadvantages/issues

Lack of context; it takes skill to determine typical users and typical tasks

Time to set up tests, recruit participants, and run tests

Need access to resources/equipment

Field Studies

What is it?

Observations and interviews in natural settings

Advantages

Helps understand what users do naturally and how technology impacts them in context

For product design

Identify opportunities; determine design requirements; decide how best to introduce tech; evaluate in use

Field Studies

Disadvantages/issues

– Access to settings

– Lack of control; noise, distractions, time-tabling etc

Think-Aloud and Cooperative Evaluation

Think-Aloud Evaluation

"think-aloud", is an evaluation technique in which the user performs a number of tasks and is asked to think aloud to explain what they are doing at each stage, and why.

The evaluator records the user's actions using:

• tape recordings
• video
• computer logging
• user notes

Advantages

Think-aloud has the advantage of simplicity: it requires little expertise to perform and can provide useful insight into any problems with an interface.

Disadvantages

The information is necessarily subjective, and can be selective depending on the tasks chosen. Being observed and having to describe what you are doing can also affect the way in which you do something: ask a juggler to describe how she juggles.....

Cooperative evaluation

"Cooperative evaluation" is a variant of think aloud, in which the user is encouraged to see himself as a collaborator in the evaluation rather than just a subject.

As well as getting the user to think aloud, the evaluator can ask such questions as "Why?" and "What if.....?"; likewise, the user can ask the evaluator for clarification if problems arise.

Advantages

It is less constrained and therefore easier for the evaluator, who is not forced to sit in solemn silence;

the user is encouraged to actively criticise the system rather than simply suffer it;

the evaluator can clarify points of confusion so maximising the effectiveness of the approach.

Note that it is often not the designer who is the evaluator, but an independent person.

One of the problems with both these techniques is that they generate a large volume of information which has to be painstakingly and time-consumingly analysed.

Such a record of an evaluation session is known as a protocol, and there are a number of forms to use: pen and paper, audio and video recording, computer logging and user diaries.

e.g. a ‘pen and paper’ protocol, as in the illustrative sketch below…
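As a purely hypothetical illustration (the times, actions and notes below are invented, not taken from a real session), a pen-and-paper protocol sheet might look something like this:

Time  | User action                 | Evaluator notes
10:02 | Opens main menu, hesitates  | Unsure which option starts the task
10:04 | Presses ‘Back’ twice        | Expected this to undo the last step
10:06 | Completes Task 1            | ~4 minutes; two wrong turns, one error message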

How to run a session

As an evaluator, spend a few minutes thinking of some scenarios and tasks for the user to perform. Include some complex tasks as well as some simple ones. Decide whether you are going to use think-aloud or cooperative evaluation. Then run the evaluation, keeping full notes of the user's actions and behaviour and any problems. The fuller the detail in the protocol, the better.

Recruit Users

Define the Target Group

- Recruit users who are similar to the target group.

- Describe the users: background, age, sex, familiarity with computers, etc.

Prepare tasks

Are the tasks specific?

Will the task focus the user on the part of the system you are most interested in?

How do you know that the task you have chosen represents the work the product is designed to support?

Have you written the task down in a way that a novice user can understand?

Before the user arrives….

Have you got the evaluation space prepared: system/device, tasks, debriefing questions, etc.?

Have you worked through the tasks yourself?

When the user arrives…

Put users at ease.

Explain the co-operative evaluation process

Explain the user's role as collaborator CLEARLY.

While the user is using the system….

Keep Them Talking!!!

Make sure you know what they are doing and why they are doing it.

Do not contradict them.

Make notes.

People tend to say less when they are unsure what to do, but this is the time that the evaluator needs to know most.

You must encourage the user to go through explaining which buttons they would press and when, showing you what, if anything, would happen to the display and what they expect should happen.

(If you are using cooperative evaluation, you can discuss it with them, but for think-aloud you have to just accept what is presented.)

Debriefing the Users…..

Ask the users what the best and worst features are.

Engage the user in a discussion on the system, remember that they are your collaborator in evaluating this system.

Ask the users what they thought of the evaluation session.

Summarise your observations

1. Collate your notes; the evaluation team should do this together.

2. Focus on unexpected behaviour and comments on the usability of the system.

3. Think of users' problems as symptoms of bad design.

4. Which problems should be fixed?

5. Make recommendations.

Real World Situation

Sometimes it is necessary to set up evaluations in a context similar to that in which the system is designed to be used. E.g. mobile technology?

Where appropriate, involve two or three users in one session, if this reflects a real-world situation.

E.g. watching a DVD?

Project

Four persons per group

Due date WK 9

(Monday 22nd Nov. 5pm)

An Evaluation Report

5000-6000 words

40% of overall module mark.

Submit Group Names and Topic to me by email (once per group)

Subject Heading –

‘Co-Operative Evaluation’

Marilyn.lennon@ul.ie

By Friday 5th November.

Anyone without a group please attend the tutorial on Thursday 4th Nov at 3pm. SG20

Co-Operative Evaluation of an Interactive System or Object

Write a report outlining:

3 evaluation sessions, evaluating one system.

Describe:
• The System or Object to be evaluated
• The evaluation technique
• The Users
• The Session Set-up
• The Tasks
• The Observations
• The Recommendations.

Included in the Appendix…

• I want a brief outline from each student on their role in the evaluation sessions and write-up.

• An example of the material from an evaluation

Try to evaluate something relevant to your FYP!!!!!

(just a suggestion!)

PLEASE READ THE EXAMPLE SUPPLIED BEFORE STARTING YOUR PROJECT.

Nokia “N-Gage” Game Deck posted at

http://richie.idc.ul.ie/~hci2004/
