#HASummit14
Session #20: How to Drive Clinical Improvement That Gets Results
Tom Burton and the Catalyst Academy Education Team
What is a Clinical Program?
• Organized around care delivery processes
• Permanent integrated team of clinical and analytics staff
• Creates an iterative, continuous learning environment
• Focus is on sustained clinical outcome improvement (not revenue growth)
• Not a Clinical Service Line (although you can leverage Service Lines as a good starting point)
Organizational AGILE Teams
Team roles: Subject Matter Expert · Data Capture · Data Provisioning & Visualization · Data Analysis

Example: Women & Children's Clinical Program Guidance Team
• Guidance Team MD Lead; RN, Clin Ops Director; Knowledge Manager; Data Architect; Application Administrator
• Care process teams: Pregnancy (MD Lead, RN SME), Normal Newborn (MD Lead, RN SME), Gynecology (MD Lead, RN SME)
• Permanent teams that meet weekly
• Integrated clinical and technical members
• Supports multiple care process families
Incorporating the most effective learning methods
• Teach Others - 90%
• Practice by Doing - 75%
• Discussion Group - 50%
• Demonstration - 30%
• Audiovisual - 20%
• Reading - 10%
• Lecture - 5%
(Percentages represent average information retained through each learning method.)
‒ Duke University
Session Objective: 4 Learning Experiences
Clinical Programs that Get Results - Principles:
Choose the right initiative
Understand variation
Improve data quality
Choose the right influencers
Choose the right initiative
Deal or No Deal Exercise
DEAL or NO DEAL
First Principle
• Picking an improvement opportunity randomly is like playing traditional DEAL or NO DEAL
• You might get lucky
• Choosing the loudest physician, or choosing for non-data-driven reasons, can disengage other MDs and spend scarce analytical resources on projects that may not be the best investment
• It takes about as much effort to work on a large process as it does on a small process
Pareto Example: Resources Consumed

Chart: cumulative % of total in-patient resources consumed vs. number of Care Process Families (e.g., ischemic heart disease, pregnancy, bowel disorders, spine, heart failure).

Key Findings:
• 50% of all in-patient resources are represented by 7 Care Process Families
• 80% of all in-patient resources are represented by 21 Care Process Families
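The Pareto calculation behind these findings can be sketched as follows. The per-CPF resource shares below are invented for illustration; only the "7 CPFs reach 50%, 21 CPFs reach 80%" results mirror the slide.

```python
# Hypothetical resource shares per care process family (CPF), sorted
# largest-first. These numbers are illustrative, not the session's data.
shares = [0.12, 0.09, 0.08, 0.07, 0.06, 0.05, 0.04,
          0.035, 0.03, 0.025] + [0.0185] * 20

def cpfs_to_reach(threshold, shares):
    """Count how many top CPFs it takes to cover `threshold` of total resources."""
    cumulative = 0.0
    for i, share in enumerate(shares, start=1):
        cumulative += share
        if cumulative >= threshold:
            return i
    return len(shares)

print(cpfs_to_reach(0.5, shares))  # 7 CPFs cover 50% of resources
print(cpfs_to_reach(0.8, shares))  # 21 CPFs cover 80% of resources
```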
Cost Per Case, Vascular Procedures

Mean Cost per Case = $20,000
• Dr. J.: 15 cases at $60,000 average cost per case → $40,000 excess x 15 cases = $600,000 opportunity
• Cumulative opportunity as more high-cost physicians are included: $600,000 → $1,475,000 (the next physician adds $35,000 x 25 cases = $875,000) → $2,360,000 → $3,960,000
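The opportunity math on this slide reduces to (physician's average cost minus the overall mean cost) times case count. A minimal sketch; the slide gives the dollar figures but does not name the second physician:

```python
MEAN_COST = 20_000  # mean cost per case across all physicians (from the slide)

def opportunity(avg_cost_per_case, cases, mean=MEAN_COST):
    """Potential savings if this physician's average cost dropped to the mean."""
    return max(avg_cost_per_case - mean, 0) * cases

dr_j = opportunity(60_000, 15)     # $40,000 x 15 cases = $600,000
second = opportunity(55_000, 25)   # $35,000 x 25 cases = $875,000
running_total = dr_j + second      # $1,475,000, matching the slide's second total
```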
Improvement Approach - Prioritization

Figure: a 2x2 matrix of Variability (low/high, y-axis) vs. Resource Consumption (low/high, x-axis). Each quadrant (numbered 1-4) shows a distribution of cases from poor to excellent outcomes.
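One way to read the matrix as a prioritization rule is sketched below. The quadrant ordering (high variability and high consumption first) is my assumption based on the session's logic, and the cutoffs are arbitrary placeholders:

```python
def priority(variability, consumption, var_cut=0.5, cost_cut=0.5):
    """Rank a care process 1 (work on first) to 4 (work on last).

    Assumption: high-variability, high-consumption processes are the best
    improvement targets; small, consistent processes are the worst.
    """
    high_var = variability >= var_cut
    high_cost = consumption >= cost_cut
    if high_var and high_cost:
        return 1  # large process with large variation: biggest opportunity
    if high_var:
        return 2  # variation to reduce, but a smaller process
    if high_cost:
        return 3  # large process already running consistently
    return 4      # small, consistent process: little to gain

print(priority(0.9, 0.8))  # a high-variability, high-consumption process
```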
Internal Variation versus Resource Consumption

Figure: bubble chart with Y-axis = internal variation in resources consumed, X-axis = resources consumed, bubble size = resources consumed, bubble color = clinical domain; quadrants labeled 1-4.
DEAL or BETTER DEAL
Understand Variation
The Popsicle Bomb Exercise
When you’re finished note your time and enter it in the HAS app – Poll Question 1
Variation in Results
• Corp Analytics – shows results
Less Effective Approach to Improvement: "Punish the Outliers"

Current Condition (1 box = 100 cases in a year):
• Significant volume
• Significant variation

Option 1: "Punish the Outliers" or "Cut Off the Tail"
Strategy:
• Set a minimum standard of quality
• Focus improvement effort on those not meeting the minimum standard
(Chart: distribution of cases from poor to excellent outcomes, with the mean marked and the focus on the minimum standard.)
Effective Approach to Improvement: Focus on "Better Care"

Current Condition (1 box = 100 cases in a year):
• Significant volume
• Significant variation

Option 2: Identify Best Practice ("Narrow the curve and shift it to the right")
Strategy:
• Identify an evidence-based "Shared Baseline"
• Focus improvement effort on reducing variation by following the "Shared Baseline"
• Often those performing the best make the greatest improvements
(Chart: distribution of cases from poor to excellent outcomes, centered on the best-practice care process model.)
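"Narrow the curve and shift it to the right" is directly checkable with two summary statistics: the spread (standard deviation) should fall and the mean outcome should rise. A minimal sketch with invented outcome scores (higher = better):

```python
import statistics

# Hypothetical outcome scores before and after adopting a shared baseline.
before_scores = [55, 60, 70, 75, 80, 85, 90, 40, 95, 50]
after_scores = [75, 78, 80, 82, 85, 88, 90, 72, 92, 76]

# Narrower curve: less variation across cases.
narrowed = statistics.stdev(after_scores) < statistics.stdev(before_scores)
# Shifted right: better average outcome.
shifted = statistics.mean(after_scores) > statistics.mean(before_scores)
print(narrowed, shifted)
```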
Round 2
When you’re finished note your time and enter it in the HAS app – Poll Question 2
Reduced Variation in Results
• Corp Analytics – shows results
Improve Data Quality
The Water Stopper Exercise
Information Management
DATA CAPTURE (Application Administrators: optimization of source systems)
• Acquire key data elements
• Assure data quality
• Integrate data capture into operational workflow

DATA PROVISIONING (Data Architects: infrastructure, visualization, analysis, reporting)
• Move data from transactional systems into the Data Warehouse
• Build visualizations for use by clinicians
• Generate external reports (e.g., CMS)

DATA ANALYSIS (Knowledge Managers: data quality, data stewardship, and data interpretation)
• Interpret data
• Discover new information in the data (data mining)
• Evaluate data quality

Fix data quality problems at data capture ("Fix it Here"), not downstream in provisioning or analysis ("Not Here").
Data Capture Quality Principles
• Accuracy
Does the data match reality?
Example: Operating Room Time Stamps
• Timeliness
What is the latency of the data capture?
Example: Billing data delay; end of shift catch-up
• Completeness
How often is critical data missing?
Example: HF Ejection Fraction
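The three principles above translate into simple checks on captured records. A sketch with hypothetical heart-failure records and an assumed event date; the field names are invented for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical heart-failure records; None marks a missing ejection fraction.
records = [
    {"ejection_fraction": 55, "captured": datetime(2014, 1, 2)},
    {"ejection_fraction": None, "captured": datetime(2014, 1, 9)},
    {"ejection_fraction": 40, "captured": datetime(2014, 1, 3)},
]

# Completeness: how often is the critical field missing?
missing = sum(1 for r in records if r["ejection_fraction"] is None)
completeness = 1 - missing / len(records)

# Timeliness: worst-case latency between the care event and data capture.
event_date = datetime(2014, 1, 1)  # assumed date of the clinical event
max_latency = max(r["captured"] - event_date for r in records)

print(f"completeness: {completeness:.0%}, max latency: {max_latency.days} days")
```

Accuracy (does the data match reality, as in the operating-room time-stamp example) generally needs an external reference to check against, so it is not computable from the records alone.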
Challenges with Data “Scrubbing”
• Analyst time spent on re-working scrubbing routines
• Root cause never identified
• Early binding vs. late binding – what you consider dirty data may actually be useful for others analyzing process failures.
• Using data to punish vs. data to learn – punish strategy promotes hiding the problem so clinicians don’t look bad
Choose the right influencers
Paul Revere's Ride Exercise
Revere vs. Dawes
Paul Revere
"Revere knew exactly which doors to pound on during his ride on Brown Beauty that April night. As a result, he awakened key individuals, who then rallied their neighbors to take up arms against the British.”
William Dawes
"In comparison, Dawes did not know the territory as well as Revere. As he rode through rural Massachusetts on the night of April 18, he simply knocked on random doors. The occupants in most cases simply turned over and went back to sleep."
Diffusion of Innovations (Free Press, 2003) by Everett M. Rogers
Figure: the diffusion-of-innovations adoption curve* - innovators, early adopters, early majority, late majority, and laggards (never adopters), with "The Chasm" between early adopters and early majority. N = number of individuals in each group; n = number needed to influence the group (but they must be the right individuals).

• Innovators: recruit innovators to re-design care delivery processes (like Revere)
• Early adopters: recruit early adopters to chair improvement and to lead implementation at each site (key individuals who can rally support)

* Adapted from Rogers, E. Diffusion of Innovations. New York, NY: 1995.
Guidance Team (Prioritizes Innovations)
• Meets quarterly to prioritize allocation of technical staff
• Approves improvement AIMs
• Reviews progress and removes roadblocks

Small Teams (Design Innovation) - led by innovators
• Meet weekly in an iteration planning meeting
• Build DRAFT processes, metrics, and interventions
• Present DRAFT work to Broad Teams

Broad Teams (Implement Innovation) - innovators and early adopters, with broad RN and MD representation across the system (OB, Newborn, GYN, W&N)
• Meet monthly to review, adjust, and approve DRAFTs
• Lead rollout of new processes and measurement
Organizational AGILE Teams
Team roles: Subject Matter Expert · Data Capture · Data Provisioning & Visualization · Data Analysis

Example: Women & Children's Clinical Program Guidance Team
• Guidance Team MD Lead; RN, Clin Ops Director; Knowledge Manager; Data Architect; Application Administrator
• Care process teams: Pregnancy (MD Lead, RN SME), Normal Newborn (MD Lead, RN SME), Gynecology (MD Lead, RN SME)
• Permanent teams
• Integrated clinical and technical members
• Supports multiple care process families
• Choose innovators and early adopters to lead
How to Identify Innovators and Early Adopters
• Ask
  Innovators (inventors):
  - Who are the top three MDs in our group who are likely to invent a better way to deliver care?
  Early Adopters (thought leaders):
  - When you have a tough case, who are the top three MDs you trust and would go to for a consult?
• Fingerprinting selection process
  Invite innovators to identify their top three MD choices from among the early adopters to lead the Clinical Program
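The "ask" step amounts to a vote tally: the MDs most often named as trusted consult choices are the likely early adopters. A minimal sketch; the ballots and names are invented for illustration:

```python
from collections import Counter

# Each ballot lists the top three MDs a colleague would trust for a tough
# consult. All names and votes here are hypothetical.
ballots = [
    ["Dr. A", "Dr. B", "Dr. C"],
    ["Dr. B", "Dr. D", "Dr. A"],
    ["Dr. B", "Dr. C", "Dr. E"],
]

votes = Counter(name for ballot in ballots for name in ballot)
early_adopters = [name for name, _ in votes.most_common(3)]
print(early_adopters)  # the three most-named MDs, most trusted first
```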
Conclusion – TEACH OTHERS
Teach Others Exercise

Deal or No Deal
- Choose the right initiative
- Prioritize based on process size and variation

Popsicle Bomb
- Understand variation
- Measure variation and standardize processes

Water Stopper
- Improve data quality
- Fix the problem at the source

Paul Revere's Ride
- Choose the right influencers
- Identify innovators and early adopters to accelerate diffusion of innovation

Take 1 minute and describe the purpose of each exercise to your neighbor, then swap and let them teach you.
Exercise Effectiveness Q1
Overall, how effective were the exercises in explaining the principles?
1) Not effective
2) Somewhat effective
3) Moderately effective
4) Very effective
5) Extremely effective
Exercise Effectiveness Q2
How effective was the Deal or No Deal Exercise at teaching the principle of prioritizing based on process size and variation?
1) Not effective
2) Somewhat effective
3) Moderately effective
4) Very effective
5) Extremely effective
Exercise Effectiveness Q3
How effective was the Popsicle Bomb Exercise at teaching the principle of understanding variation and standardizing processes?
1) Not effective
2) Somewhat effective
3) Moderately effective
4) Very effective
5) Extremely effective
Exercise Effectiveness Q4
How effective was the Water Stopper Exercise at teaching the principle of fixing data quality issues at the source?
1) Not effective
2) Somewhat effective
3) Moderately effective
4) Very effective
5) Extremely effective
Exercise Effectiveness Q5
How effective was the “Paul Revere Ride” exercise at teaching the principle of choosing the right influencers based on their capabilities as innovators and early adopters?
1) Not effective
2) Somewhat effective
3) Moderately effective
4) Very effective
5) Extremely effective
Exercise Effectiveness Q6
Are you interested in running these same exercises in your organizations?
a) Yes
b) No
Questions & Answers
Session Feedback Survey
1. On a scale of 1-5, how satisfied were you overall with this session?
   1) Not at all satisfied
   2) Somewhat satisfied
   3) Moderately satisfied
   4) Very satisfied
   5) Extremely satisfied

2. What feedback or suggestions do you have?

3. On a scale of 1-5, what level of interest would you have for additional, continued learning on this topic (articles, webinars, collaboration, training)?
   1) No interest
   2) Some interest
   3) Moderate interest
   4) Very interested
   5) Extremely interested