CS5032 Lecture 5: Human Error 1

Page 1: CS5032 Lecture 5: Human Error 1

HUMAN ERROR

DR JOHN ROOKSBY

Page 2: CS5032 Lecture 5: Human Error 1

IN THIS LECTURE …

This lecture focuses on human error in systems operation

Human error is implicated in many accidents and disasters.

• But this does not mean human error causes these accidents

• The media are often too quick to blame individuals … and computer scientists too quick to blame the user

We will look at issues in studying human error, and will outline three approaches to modeling human error.

We will then discuss design-for-error.

In the next lecture I will suggest it is more important to look at human reliability than human error.

Page 3: CS5032 Lecture 5: Human Error 1
Page 4: CS5032 Lecture 5: Human Error 1

HUMAN ERROR

Given that all complex systems involve people in their production, maintenance and operation, it should not be surprising that human issues often play a part in failures and accidents.

According to one report, human error accounts for…

Page 5: CS5032 Lecture 5: Human Error 1

50-70% OF AVIATION DISASTERS

http://en.wikipedia.org/wiki/1994_Fairchild_Air_Force_Base_B-52_crash

Page 6: CS5032 Lecture 5: Human Error 1

70% OF SHIPPING ACCIDENTS

Image: Roberto Vongher
http://en.wikipedia.org/wiki/Costa_Concordia_disaster

Page 7: CS5032 Lecture 5: Human Error 1

60-85% OF SHUTTLE INCIDENTS AT NASA

http://en.wikipedia.org/wiki/Space_Shuttle_Challenger_Disaster

Page 8: CS5032 Lecture 5: Human Error 1

44,000 – 98,000 DEATHS A YEAR IN THE USA IN HEALTHCARE (MAINLY THROUGH MEDICATION ERRORS)

http://en.wikipedia.org/wiki/Anaesthetic_machine

Page 9: CS5032 Lecture 5: Human Error 1

CONTRIBUTION OR CAUSE?

Human error often features in accidents and disasters. But this does not mean it necessarily causes them.

• Accidents can have multiple causes, so why single out human error?

• Other factors can underlie human errors, such as poor design, lack of training, or overwork.

• There is not always a correct way to work. Some actions are only errors in hindsight.

It is important to understand and reduce errors that happen at “the sharp end” of system operation.

• But this may be to treat the symptoms of deeper troubles.

Page 10: CS5032 Lecture 5: Human Error 1

THE “SHARP END” OF FAILURE

[Figure: the “sharp end” to “blunt end” spectrum of failure. Technology and Users sit at the sharp end; Groups, Organisations and Regulations sit towards the blunt end.]

Page 11: CS5032 Lecture 5: Human Error 1
Page 12: CS5032 Lecture 5: Human Error 1

http://gizmodo.com/5844628/a-passenger-airplane-nearly-flew-upside-down-because-of-a-dumb-pilot

Page 13: CS5032 Lecture 5: Human Error 1

STUDYING HUMAN ERROR

Human activity and human error are not simple topics. They touch on areas that have been debated in psychology and the humanities for decades, if not centuries.

Errors are predominantly studied by:

• Laboratory simulations
• Field observation
• Archive data

It is difficult to study human error. What constitutes an error can be controversial:

• What is the ‘correct’ action in any given situation?

Page 14: CS5032 Lecture 5: Human Error 1

MODELING HUMAN ERROR

There have been several attempts to build taxonomies of human error. There is still no definitive model, and many argue there never can be.

In this lecture I will briefly cover three:

• THERP (Technique for Human Error Rate Prediction)

• GEMS (Generic Error Modeling System)

• CREAM (Cognitive Reliability and Error Analysis Method)

Page 15: CS5032 Lecture 5: Human Error 1

THERP

THERP states human actions can be:

• Correct: Actions are done as specified

• Errors of omission: An action is omitted

• Errors of commission: An action is inadequate: out of sequence, mistimed, or of poor quality (too much, too little, or done the wrong way)

• Extraneous actions: An action is not expected at that time

THERP enables you to build probabilities of errors occurring in a given context.

The key problem with THERP is that it assumes a correct specification: humans either follow the specification or commit an error. Reality is not like this.
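To make the calculation concrete, here is a minimal sketch of a THERP-style estimate in Python. The step names and per-step error probabilities are invented for illustration; they are not taken from THERP’s data tables.

```python
# Minimal THERP-style sketch: a task is a sequence of steps, each with a
# nominal human error probability (HEP). Assuming the steps are
# independent, the task fails if any one step does. The step names and
# probabilities below are invented for illustration only.

steps = {
    "read checklist item": 0.003,    # error of omission
    "select correct switch": 0.001,  # error of commission
    "confirm indicator state": 0.010,
}

def task_failure_probability(step_heps: dict[str, float]) -> float:
    """P(at least one step fails) = 1 - P(every step succeeds)."""
    p_all_correct = 1.0
    for hep in step_heps.values():
        p_all_correct *= 1.0 - hep
    return 1.0 - p_all_correct

print(f"Estimated task HEP: {task_failure_probability(steps):.4f}")
# -> roughly 0.014 for the numbers above
```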

Page 16: CS5032 Lecture 5: Human Error 1

GEMS

The GEMS model has also had a wide impact. It draws on cognitive psychology.

GEMS is founded upon an “Activity Space” model, which represents human performance on three levels:

• Skill-based activity: activities where we act more or less automatically

• Rule-based activity: activities where we apply rules

• Knowledge-based activity: activities where we fall back on our (sometimes patchy) knowledge

Page 17: CS5032 Lecture 5: Human Error 1

GEMS

• Errors in skill-based activity are execution errors:
  • Slips: an action is executed incorrectly
  • Lapses: steps are missed when executing an activity

• Errors in rule-based activity are planning errors:
  • Rule-based mistakes: a rule is misapplied

• Errors in knowledge-based activity are also planning errors:
  • Knowledge-based mistakes: knowledge is wrong or misapplied
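As an illustration only (my own encoding, not part of GEMS itself), the error types above can be written down as data, for instance to coarsely tag incident reports:

```python
# A sketch of the GEMS error types as data, following the three
# performance levels above. The encoding is illustrative, not a
# standard GEMS tool.
from enum import Enum

class Level(Enum):
    SKILL = "skill-based"
    RULE = "rule-based"
    KNOWLEDGE = "knowledge-based"

GEMS_ERROR_TYPES = {
    Level.SKILL: ["slip (action executed incorrectly)",
                  "lapse (step missed)"],
    Level.RULE: ["rule-based mistake (rule misapplied)"],
    Level.KNOWLEDGE: ["knowledge-based mistake (knowledge wrong or misapplied)"],
}

for level, errors in GEMS_ERROR_TYPES.items():
    print(f"{level.value}: {', '.join(errors)}")
```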

Page 18: CS5032 Lecture 5: Human Error 1

GEMS: CHARACTERISTICS OF ERROR TYPES

Dimension       | Skill-based errors  | Rule- and knowledge-based errors
----------------|---------------------|----------------------------------
Main error type | Slips & lapses      | RB mistakes / KB mistakes
Activity type   | Routine actions     | Problem-solving activities
Attention       | Often elsewhere     | Directed at problem-related issues
Control mode    | Mainly automatic    | More conscious
Predictability  | Largely predictable | Variable
Frequency       | Common              | Uncommon
Opportunity     | Very high           | Very low
Detection       | Usually easy        | Difficult, often through intervention

Page 19: CS5032 Lecture 5: Human Error 1

GEMS

GEMS is best thought of as a way of characterising rather than defining error. There are a number of problems with it:

• It assumes human activities are goal or plan-driven (this is controversial in psychology and the humanities)

• The plan or goal is assumed to be correct, but how do you judge this?

• It can be very difficult to definitively categorise any human action in terms of error

• Many different versions of the model exist

Page 20: CS5032 Lecture 5: Human Error 1

CREAM

CREAM (Cognitive Reliability and Error Analysis Method) seeks to avoid the idea that humans simply introduce errors into ‘perfectly specified systems’, and enables you to model correct as well as incorrect actions.

CREAM sees action taking place across three levels

• Person Related

• Organisation Related

• Technology Related

Page 21: CS5032 Lecture 5: Human Error 1

CREAM

CREAM details the genotypes (possible causes) of error as:

• Person related
  • Observation / Interpretation / Planning

• Organisation related
  • Communication / Training / Ambient Conditions / Working Conditions

• Technology related
  • Equipment Failure / Procedures / Temporary Interface Problems / Permanent Interface Problems

CREAM does not offer a specific error taxonomy, but offers generic ‘phenotypes’ with which to build one.
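As a sketch of how these categories might be used (my own encoding, with an invented incident for illustration), an observed error (a phenotype) can be annotated with hypothesised genotypes:

```python
# Sketch: CREAM's genotype categories as data, used to annotate an
# observed error. The incident and its tags are invented examples.
CREAM_GENOTYPES = {
    "person": ["observation", "interpretation", "planning"],
    "organisation": ["communication", "training",
                     "ambient conditions", "working conditions"],
    "technology": ["equipment failure", "procedures",
                   "temporary interface problems",
                   "permanent interface problems"],
}

incident = {
    "phenotype": "wrong dose entered into device",   # what was observed
    "genotypes": [("person", "interpretation"),      # hypothesised causes
                  ("organisation", "training"),
                  ("technology", "temporary interface problems")],
}

# Validate that each hypothesised cause is a recognised genotype.
for level, cause in incident["genotypes"]:
    assert cause in CREAM_GENOTYPES[level], f"unknown genotype: {cause}"
print("Incident annotated with", len(incident["genotypes"]), "genotypes")
```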

Page 22: CS5032 Lecture 5: Human Error 1

VIOLATIONS

We have assumed so far that errors are unintentional aberrations of an intended and correct activity.

• But sometimes rules and procedures can be deliberately violated

Sabotage is one reason, but many violations are well-intentioned. These well-intentioned violations fall into three major categories:

• Routine violations: taking the path of least effort.

• Optimizing violations: doing something (too) quickly or cheaply.

• Necessary violations: these are provoked by organisational or contextual failings.

Page 23: CS5032 Lecture 5: Human Error 1

DESIGN FOR ERROR

If we design systems appropriately we can minimise error.

• This is called Design for Error.

When designing any system we need to be aware of how human errors can and will be made.

• This is not to say that human error is a problem that can be solved through design, but that good design can play a role in minimising error.

Design for Error is similar to, but not the same as, Design-for-Failure and Design-for-Recovery.

A helpful introduction to the concepts of Design for Error can be found in Don Norman’s book “The Design of Everyday Things”. He recommends…

Page 24: CS5032 Lecture 5: Human Error 1

DESIGN FOR ERROR

Put the required knowledge into the world.

• Don’t require people to remember everything they need to know in order to operate a system

• This knowledge must be available in an appropriate form (manuals are often left on shelves unread)

• But understand that different people require different forms of guidance. An expert in some procedure will not want to be forced to follow the same sequence of steps that an amateur may need to go through.

Page 25: CS5032 Lecture 5: Human Error 1

DESIGN FOR ERROR

Design “forcing functions”: physical or logical constraints.

• Interlocks: these force certain sequences of events. For example, opening a microwave door turns it off, and to set off a fire extinguisher you must first remove the pin.

• Lockins: these stop you from carrying out a certain action in a particular context. For example, most computers now cannot be shut down while there is unsaved work.

• Lockouts: these stop you from doing something. For example, the stairs to the basement of a tall building are usually designed differently, or have a gate, to stop people continuing down to the basement when evacuating.
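A minimal software sketch of these three kinds of constraint (my own illustration; the classes and scenarios are invented to mirror the examples above):

```python
# A sketch of forcing functions in software. The classes and scenarios
# are invented to mirror the examples above.

class Microwave:
    """Interlock: opening the door forces cooking to stop."""
    def __init__(self):
        self.door_open = False
        self.running = False

    def open_door(self):
        self.door_open = True
        self.running = False  # interlock: cooking stops when the door opens

    def start(self):
        if self.door_open:
            raise RuntimeError("interlock: close the door before starting")
        self.running = True


class Editor:
    """Lockin: shutdown is refused while there is unsaved work."""
    def __init__(self):
        self.unsaved_changes = True

    def shutdown(self):
        if self.unsaved_changes:
            raise RuntimeError("lockin: save or discard your changes first")
        print("shutting down")


class Stairwell:
    """Lockout: a gate stops people descending past the ground floor."""
    def __init__(self):
        self.gate_closed = True

    def descend_to_basement(self):
        if self.gate_closed:
            raise RuntimeError("lockout: the gate blocks the way down")


microwave = Microwave()
microwave.open_door()
try:
    microwave.start()
except RuntimeError as err:
    print(err)  # the interlock refuses to start with the door open
```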

Page 26: CS5032 Lecture 5: Human Error 1

DESIGN FOR ERROR

Narrow the gulf of execution and evaluation

• Make things visible to the user and to others; make the results of each action apparent.

• Enable people to correct their own errors if they see them. Support double-checking. Be aware that correcting another person’s error can create social difficulties (especially if that person is a superior).

• Provide support for evaluation in ways that are situationally appropriate. For example, people may stop reading common error messages, so if an uncommon error occurs, consider not making it look like a run-of-the-mill event.
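For the last point, a small sketch (the counting rule and formatting are invented for illustration) of making a rare error look different from a routine one:

```python
# Sketch: give rare errors a visually distinct presentation so they are
# not skimmed past like routine ones. The rule (first occurrence = rare)
# and the formatting are invented for illustration.
import sys
from collections import Counter

seen = Counter()

def report_error(code: str, message: str) -> None:
    seen[code] += 1
    if seen[code] == 1:
        # First occurrence of this error: make it stand out.
        print(f"*** UNUSUAL ERROR [{code}] ***\n    {message}", file=sys.stderr)
    else:
        # Familiar error: report it in the ordinary style.
        print(f"error [{code}]: {message}", file=sys.stderr)

report_error("E42", "device out of range")  # rendered as unusual
report_error("E42", "device out of range")  # rendered as routine
```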

Page 27: CS5032 Lecture 5: Human Error 1

DESIGN FOR ERROR

Don’t assume that error can only be reduced through technical approaches.

• Human centred approaches can be effective, particularly training people.

• Organisational approaches such as planning workflows and shifts can be effective.

• The design of the working environment can also have a huge impact on error proneness.

Page 28: CS5032 Lecture 5: Human Error 1

KEY POINTS

Human error is often implicated in accidents and disasters

It is often wrong to say human error is the cause of an accident or disaster, as there will be other underlying causes.

It can be difficult and controversial to label any particular action as an error. Just because it varies from a procedure does not mean it was the wrong thing to do.

There are several ways to model error. Often these are derived from cognitive psychology, and concentrate on where a correct action was intended but an erroneous action performed.

We can design for error. Think of the ways people can experience errors and provide resources to reduce these.

Page 29: CS5032 Lecture 5: Human Error 1

SOURCES / READING

Books by James Reason:
• (1990) Human Error. Cambridge University Press.
• (1997) Managing the Risks of Organisational Accidents. Ashgate.
• (2008) The Human Contribution. Ashgate.

Donald Norman (1988) The Design of Everyday Things. Basic Books.
• See the chapter “To Err is Human”

Sidney Dekker (2006) The Field Guide to Understanding Human Error. Ashgate.

L. Kohn & M. Donaldson (editors) (2000) To Err is Human: Building a Safer Health System. (http://www.nap.edu/openbook.php?isbn=0309068371)

Books by Erik Hollnagel:
• (1998) Cognitive Reliability and Error Analysis Method (CREAM). Elsevier.
• (2006) Resilience Engineering. Ashgate.

Page 30: CS5032 Lecture 5: Human Error 1

EXERCISE 1.

Page 31: CS5032 Lecture 5: Human Error 1

INFUSION DEVICE

For the next 5 minutes:

• Read the “infusion devices” example.

Useful definitions:

• An infusion device is a mechanical device that administers intravenous solutions containing drugs to patients.

• Hypertensive means the patient has high blood pressure.

• Cubic centimetres can be written as cc’s or cm³.

• An anaesthesiologist or anaesthetist is a medical doctor who administers the anaesthetic before, during and after surgery.

• Intravenous (IV) fluid is supplied in plastic bags and administered using IV tubing.

Page 32: CS5032 Lecture 5: Human Error 1

INFUSION DEVICE

The “system” in this case was:

• The digital technology
• The equipment (tubing etc.)
• The people, practices and procedures
• The physical design of the surgical suite

The “failure” in this case was:

• The breakdown in delivery of IV medications during surgery: the free flow of the medication from the infusion device.

Page 33: CS5032 Lecture 5: Human Error 1

INFUSION DEVICE

Systemic Failures:

• Multiple infusion devices, each requiring set-up, and each requiring a slightly different set-up.

• Each of three different medications had to be programmed into the infusion device with the correct dose for the patient

• Possible scheduling problems in the operating suites may have contributed to the anaesthesiologist having insufficient time to check the devices before surgery

• A new nurse on the team means assumptions within the team about responsibilities and ways of working might be false.

• The nurse found herself assembling a device she was unfamiliar with. Was she trained properly? Why didn’t she ask for help?

Page 34: CS5032 Lecture 5: Human Error 1

INFUSION DEVICE

Where was the error?

• There is no single error here.

• As in any safety-critical industry, there are numerous faults and latent conditions that need to be addressed.

• Appropriate mechanisms need to be in place to trap errors.

• Blaming the nurse is a common but inappropriate reaction in this case. Hospitals often have a “blame culture”.

See “Making Information Technology a Team Player in Safety: The Case of Infusion Devices” (further reading section) for more on infusion devices

The example is based upon To Err is Human (see further reading section).

Page 35: CS5032 Lecture 5: Human Error 1

EXERCISE 2.

Page 36: CS5032 Lecture 5: Human Error 1

COMMON SLIPS AND LAPSES

Slips often occur in routine activities. We intend to do one thing, but do another. There are many kinds of slip:

Capture Errors:

An activity you are doing is “captured” by another one. Often a non-routine activity can be captured by a more routine one.

For example, sometimes when I am driving to St Andrews town centre I pull into the work car park as if I was driving to work.

Page 37: CS5032 Lecture 5: Human Error 1

COMMON SLIPS AND LAPSES

Description Errors

Sometimes when we do a routine activity, we do it to something that is similar to, but not the same as, the thing intended. (It is not correct, but it “fits the description”.)

For example, sometimes if I leave my mobile next to my mouse, I grab the mobile by mistake.

For example, I once dried my hands on my flatmate’s coat, which was hanging on the back of a chair where a tea-towel would normally be.

Page 38: CS5032 Lecture 5: Human Error 1

COMMON SLIPS AND LAPSES

Data-Driven Errors

Many human actions are responses to something. These responses can enter into a process as an additional step or as a mis-step.

For example, when I was typing a document, someone asked me the meaning of a word. I then realised I had typed that word instead of the word I meant to.

Page 39: CS5032 Lecture 5: Human Error 1

COMMON SLIPS AND LAPSES

Associative Action Errors

Sometimes our own internal associations can trigger a slip.

For example, picking up the telephone and saying “come in”.

For example, I once went to a job interview and instead of saying “Hi, I’m John”, I said “Hi, I’m scared”. (These kinds of associative errors are called Freudian slips.)

Page 40: CS5032 Lecture 5: Human Error 1

COMMON SLIPS AND LAPSES

Loss of Activation Errors

Sometimes we set out to do something, but along the way forget what we set out to do.

For example, I once went to my bedroom, but once I was there wondered what it was I went to do. Once I was back downstairs I remembered I wanted to charge my phone.

Page 41: CS5032 Lecture 5: Human Error 1

COMMON SLIPS AND LAPSES

Mode Errors

Sometimes we operate a technology correctly, except that it is in the wrong mode.

For example, when turning my car around, I reversed it but forgot to put it in a forward gear before setting off forwards.

For example, I typed the body of a text message into the ‘to’ field on my phone.

Source: Donald Norman (1988) The Design of Everyday Things. Basic Books.