Boards, Dashboards, and Data From the Top: Getting the Board on Board
1-3 p.m., June 11, 2007, Boston, Massachusetts
James L. Reinertsen, M.D.


Page 1:

Boards, Dashboards, and Data From the Top: Getting the Board on Board
1-3 p.m., June 11, 2007
Boston, Massachusetts
James L. Reinertsen, M.D.

Page 2:

Boards ask two types of questions about quality and safety:

1. How good is our care?
   • How do we compare to others like us?
2. Is our care getting better?
   • Are we on track to achieve our key quality and safety objectives?
   • If not, why not? Is the strategy wrong, or is it not being executed effectively?

Page 3:

For all of these questions…

In God we trust.

All others bring data.

Yes, but what data?

Page 4:

Purpose of Measurement

| | Research | Comparison or Accountability | Improvement |
|---|---|---|---|
| Key question | “What is the truth?” | “Are we better or worse than…?” | “Are we getting better?” |
| Penalty for being wrong | Misdirection for the profession | Misdirected reward or punishment | Misdirection for an initiative |
| Measurement requirements and characteristics | Complete, accurate, controlled, glacial pace, expensive | Risk adjusted, with denominators, attributable to individuals or orgs, validity | Real time, raw counts, consistent operational definitions, utility |
| Typical displays | Comparison of control and experimental populations | Performance relative to benchmarks and standards… | Run charts, control charts, time between events… |

Adapted from Solberg, Mosser, McDonald. Jt Comm J Qual Improv. 1997 Mar;23(3):135-47.
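The “typical displays” row is the operative one for board oversight. As one illustration of the improvement column, here is a minimal Python sketch of the limits behind an XmR (individuals) control chart, a common choice; the deck names control charts but does not prescribe a method, so the 2.66 factor and the monthly counts below are illustrative assumptions, not anything from the slides.

```python
# Minimal sketch: limits for an XmR (individuals) control chart, one
# common "improvement" display. The 2.66 factor is the standard XmR
# constant; the counts are invented for illustration.

def xmr_limits(values):
    """Return (center line, lower limit, upper limit) for an XmR chart."""
    mean = sum(values) / len(values)
    # Average moving range: mean absolute difference between consecutive points.
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    # Standard XmR limits: center +/- 2.66 * average moving range,
    # clamped at zero because event counts cannot be negative.
    return mean, max(0.0, mean - 2.66 * mr_bar), mean + 2.66 * mr_bar

monthly_adverse_events = [52, 47, 55, 60, 49, 51, 58, 45, 50, 53, 48, 56]
center, lcl, ucl = xmr_limits(monthly_adverse_events)
print(f"center={center:.1f}, LCL={lcl:.1f}, UCL={ucl:.1f}")
```

Points falling outside these limits signal special-cause variation, which is exactly the kind of change (or lack of it) an improvement-oriented display is meant to surface.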

Page 5:

Example of an answer to “How good is our care?”, compared to others.

[Figure: comparative scorecard, report dated October 24, 2006. Key caveat from the slide: a hospital could be “green” yet still be worse than the median of its comparison group.]

Page 6:

Another example of “How do we compare?”: Hospital Adverse Events per 1,000 Patient Days

Adverse events include (but are not limited to):
• Allergic rash
• Excessive bleeding, unintentional trauma of a blood vessel
• Respiratory depression requiring intubation due to pain medications
• Hyperkalemia as the result of overdose of potassium
• Lethargy/shakiness associated with low serum glucose
• Drug-induced renal failure
• Surgical site infection, sepsis, infected lines, other hospital-acquired infections
• Internal bleeding following the first surgery and requiring a second surgery to stop the bleeding
• Atelectasis, skin breakdown, pressure sores
• DVT or pulmonary embolism during a hospital stay

[Chart: number of adverse events per 1,000 patient days, measured with the IHI Global Trigger Tool, on a 0-150 scale, with markers at 5 (current IHI best) and 40 (IHI average) and “Our Hospital, May 2007” plotted for comparison.]

Source: Roger Resar, John Whittington, IHI Collaborative
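For readers who want the arithmetic behind that y-axis, the rate is a simple normalization of trigger-tool findings. A minimal sketch with invented counts, not the hospital’s data:

```python
# Minimal sketch: adverse events per 1,000 patient days, the unit on the
# chart above. Counts here are illustrative, not real hospital data.

def events_per_1000_patient_days(adverse_events: int, patient_days: int) -> float:
    """Normalize an adverse-event count to a rate per 1,000 patient days."""
    return 1000 * adverse_events / patient_days

# e.g. a trigger-tool review finding 90 events across 2,000 patient days:
print(events_per_1000_patient_days(90, 2000))  # -> 45.0
```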

Page 7:

What Boards should know about data on “How good are we, and how do we compare to others?”

Upside:
• Often risk adjusted
• Apples to apples
• Source of pride
• Source of energy for improvement

Downside:
• Time lag (months)
• Static (no data over time)
• If you look bad, energy is wasted on “the data must be wrong”
• If you look good, you become complacent
• How you look depends on how others perform
• Standards and benchmarks are full of defects (“the cream of the crap”)

Page 8:

Recommendations for Board use of “How do we compare to others?”

1. Ask this question to help you set aims, and perhaps annually thereafter, but don’t use these sorts of reports to oversee and guide improvement at each meeting.
2. Compare to the best, not the 50th percentile (e.g., Toyota specs).
3. Always make sure you know how “green” is determined.

Page 9:

Boards ask two types of questions about quality and safety:

1. How good is our care?
   • How do we compare to others like us?
2. Is our care getting better?
   • Are we on track to achieve our key quality and safety objectives?
   • If not, why not? Is the strategy wrong, or is it not being executed effectively?

The second question is where dashboards and scorecards can be helpful to boards.

Page 10:

Example: Immanuel St. Joseph’s-Mayo Health System Board’s answer to the question “Is our mortality rate getting better?”

[Run chart: inpatient mortality, deaths per 1,000 discharges, plotted monthly with a 12-month rolling rate and a benchmark line, shown quarterly from March 2003 through December 2006, under the goal heading “1.1 Satisfy Our Patients.” 8/7/2006; prepared by the Immanuel St. Joseph’s-Mayo Health System Quality Resources Department.]

Available in January 2007!
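A hedged sketch of how the 12-month rolling rate on such a chart can be computed: pool deaths and discharges across the trailing window rather than averaging the monthly rates, so low-volume months do not distort the trend. The figures below are invented for illustration, not Immanuel St. Joseph’s data.

```python
# Minimal sketch: trailing 12-month death rate per 1,000 discharges.
# Numerators and denominators are pooled over the window, which is the
# usual way to stabilize a rate; all figures below are invented.

def rolling_rate(deaths, discharges, window=12):
    """Return one pooled rate per month, starting when a full window exists."""
    rates = []
    for i in range(window - 1, len(deaths)):
        d = sum(deaths[i - window + 1 : i + 1])
        n = sum(discharges[i - window + 1 : i + 1])
        rates.append(1000 * d / n)
    return rates

deaths     = [14, 11, 16, 12, 13, 15, 10, 12, 14, 11, 13, 12, 10, 11]
discharges = [620, 580, 640, 600, 610, 630, 590, 600, 620, 580, 610, 600, 590, 600]
print(rolling_rate(deaths, discharges))  # three values: months 12, 13, 14
```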

Page 11:

Is our quality and safety getting better? Are we going to achieve our aims?

To answer these questions for Boards (a sketch of such a display follows this list):
• The aims should be clearly displayed and understood.
• A few system-level measures should be graphically displayed over time.
• The measures should be displayed monthly, at worst, and should be close to “real time.”
• Measures do not necessarily need to be risk adjusted.
• Measures of critical initiatives (projects that must be executed to achieve the aim) should be available if needed to answer the Board’s questions.
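As promised above, a minimal example of that kind of display: one system-level measure plotted monthly as a run chart with its median, using matplotlib. The measure name and values are illustrative assumptions, not data from the deck.

```python
# Minimal sketch: a run chart of one system-level measure with its median,
# the display style the bullets above call for. Values are illustrative.
from statistics import median

import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
harm_rate = [48.0, 45.5, 47.0, 43.0, 41.5, 39.0]  # harm per 1,000 patient days

plt.plot(months, harm_rate, marker="o", label="Monthly rate")
plt.axhline(median(harm_rate), linestyle="--", label="Median")
plt.ylabel("Harm per 1,000 patient days")
plt.title("System-level measure over time")
plt.legend()
plt.show()
```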

Page 12:

The Board question “Are we going to achieve our aims?” requires management to have a strategic theory:

• Big Dots (Pillars, BSC…): What are your key strategic aims? How good must we be, by when? What are the system-level measures of those aims?
• Drivers (Core Theory of Strategy): Down deep, what really has to be changed, or put in place, in order to achieve each of these goals? What are you tracking to know whether these drivers are changing?
• Projects (Ops Plan): What set of projects will move the Drivers far enough, fast enough, to achieve your aims? How will we know if the projects are being executed?

Page 13:

The ideal dashboard will display a cascaded set of measures that reflect the “theory of the strategy.”

Page 14:

Example Dashboard for Harm (for the 5 Million Lives Campaign)

[Nine run charts, each plotted January through May:
• System-level measure: Global Harm Trigger Tool
• Drivers: Handwashing; Culture of discipline on safety rules; Teamwork
• Projects: Harm from high-alert meds; Surgical complications; Pressure ulcers; MRSA; CHF readmissions
The system-level chart is marked as the Board’s view.]
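One way to make this cascade concrete is to encode it as a small data structure that a dashboard generator could walk. The measure names below come from the slide; the dict shape itself is an illustrative assumption, not anything the deck specifies.

```python
# Minimal sketch: the slide's cascade as a data structure. Names are from
# the slide; the structure itself is an illustrative assumption.
harm_cascade = {
    "system_measure": "Global Harm Trigger Tool",   # reviewed by the full Board
    "drivers": [                                    # reviewed by the Quality Committee
        "Handwashing",
        "Culture of discipline on safety rules",
        "Teamwork",
    ],
    "projects": [                                   # overseen by management
        "Harm from high-alert meds",
        "Surgical complications",
        "Pressure ulcers",
        "MRSA",
        "CHF readmissions",
    ],
}

# A dashboard built from this structure would show one run chart per entry,
# with the system measure on the Board's page and the rest cascaded below.
```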

Page 15:

The full Board should review the system-level measures (Big Dots). The Board Quality Committee should review both the system-level measures and the key drivers of those measures. Occasionally, but not often, the Board will need to see measures of key projects, but these are generally the responsibility of management to oversee and execute.

Page 16:

Common Flaws in Dashboards

• No system-level measures or aims (so it’s possible for quality and safety to get worse, and yet to achieve “green” on all the measures the Board sees!)
• Hodge-podge of system, driver, and project measures (so the Board doesn’t know what’s important)
• Static measures (so the Board has to take management’s word that “we’re on track to achieve our aims”)
• Too many measures (so the Board doesn’t understand any of them)
• Mixture of “How do we compare to others?” and “Are we getting better?” measures (so the Board doesn’t know what questions to ask)
• Low, unclear standards for “green” (so the Board becomes complacent despite significant opportunities for improvement!)

Page 17:

Can you identify the flaws in the following “dashboard”?

Page 18:

| Measure | Current Performance | Goal for 2007 |
|---|---|---|
| Acute MI Core Measures | 6th decile national, 4th decile state | 2nd state decile or above |
| Congestive Heart Failure Core Measures | 4th decile national, 2nd decile state | 2nd state decile or above |
| Pneumonia Core Measures | 3rd decile national, 1st decile state | 2nd state decile or above |
| Press-Ganey Patient Satisfaction | 57% rate us “Excellent” | Statistically significant improvement, i.e., 62% “Excellent” rating |
| OR Turnover Time | 22 minutes | 15 minutes |
| Falls | 7 per 1,000 patient days | Less than 5 per 1,000 patient days |
| Medication Errors | 5.1 per 1,000 patient days (from nurse variance reports) | Less than 7 per 1,000 patient days |
| Total Knee and Hip Infection Rates | 1.2% | Less than 4.1%, i.e., better (lower) than 50th percentile for NNIS |
| Surgical Site Infection Rates for Cardiac Surgery | 4.2% | Less than 10.4%, i.e., better (lower) than 50th percentile for NNIS |
| Time to answer nurse call lights on all Med/Surg units | We are developing a standard measure and will report to the Board in future meetings | We are aiming to achieve significant improvement in timeliness of response to patient concerns |

Page 19:

The same dashboard as on Page 18, annotated with its flaws:

• No display over time
• Low standards for “green”
• Mix of system and project measures
• Mostly comparison measures

Page 20:

Summary of Best Practices for Quality and Safety Dashboards for Boards

• Separate the two types of oversight questions: “How good is our quality? How do we compare to others?” and “Are we getting better? Are we on track to achieve our aims?”
• Ask the comparison question annually, when setting quality and safety aims. Avoid using comparative data to track improvement.
• Frame your aims with reference to the theoretical ideal and to the “best in the world,” not to benchmarks.
• Ask the improvement question at every meeting, and track it with a dashboard that shows near-real-time data on system-level and driver measures, displayed on run charts.
• Demand that management develop a “theory of the strategy” to achieve the annual quality and safety aims.
• Do not put project-level measures (often about one unit, disease, or department) on the Board’s dashboard.