Implementation Strategies & Outcomes: Advancing the Science
TRANSCRIPT
Enola Proctor, Johns Hopkins
February 24, 2014
Session overview:
1. Implementation science: What is it?
2. Implementation outcomes & strategies: conceptual & methodological challenges
3. Where are we going?
I. What is it?
NIH Definitions*
Dissemination Research: the study of how and when research evidence spreads throughout agencies, organizations, and front-line workers.
Implementation Research: the scientific study of how to move evidence-based interventions into healthcare practice and policy.
*PAR-13-055
What is implementation research?
“Research to inform how to make the right thing to do the easy thing to do.”
– Carolyn Clancy, Agency for Healthcare Research and Quality
Implementation research: What does it take?
• Quality gaps to address
• Evidence-based interventions
• The “how”: implementation strategies
• The “where”: context
• Theory
• Partnerships
• Research methods & tools
Implementation is about improving care
The care that “could be” vs. the care that “is”
What quality gaps are of concern?
Quality of mental health care
• U.S. mental health care: “D” grade (NAMI)
• AHRQ: physical healthcare is improving, but there has been no improvement in depression care (AHRQ’s 2010 Health Care Quality Report)
• Household data: <10% of the U.S. population with a serious mental disorder receives adequate care (Kessler et al., 2005)
• Racial disparities in care
Conceptual Model for Implementation Research
[Figure: Proctor et al., 2009, Administration & Policy in Mental Health Services. The “what” (QIs, ESTs) moves through the “how” (implementation strategies, studied with implementation research methods) to three linked sets of outcomes, all embedded in CONTEXT:
• Implementation outcomes: feasibility, fidelity, penetration, acceptability, sustainability, uptake, costs
• Service outcomes (*IOM Standards of Care): efficiency, safety, effectiveness, equity, patient-centeredness, timeliness
• Patient outcomes: clinical/health status, symptoms, function, satisfaction
Service and patient outcomes are “the usual” focus of research; implementation strategies and implementation outcomes are the core of implementation science.]
Implementation research studies…
Key variables:
• behavior of healthcare professionals and support staff
• healthcare organizations (culture/context)
• healthcare consumers and family members
• policymakers in context
Key outcomes:
• sustainable adoption, implementation, and uptake of evidence-based interventions
Theories
Now: many models! 109 identified models. How to choose?
Tabak, Khoong, Chambers, & Brownson (2012). Bridging research and practice: models for dissemination and implementation research. Am J Prev Med 43(3):337–350.
Implementation Strategies: Definition
A systematic intervention process to adopt and integrate evidence-based healthcare innovations into usual care*
The active ingredient in processes for moving ESTs and QIs into usual care
*Powell, McMillen, Proctor et al., Medical Care Research and Review, 2012
Implementation Strategies
…the “how to” component of changing healthcare practice.
Key: how to make the “right thing to do” the “easy thing to do” (Carolyn Clancy)
Implementation Strategies: Complexity*
Discrete
• involve one process or action, such as “meetings” or “reminders”
Multifaceted**
• use two or more discrete strategies, such as “training + technical assistance”
Blended
• several discrete strategies are interwoven and packaged as protocolized or branded strategies, such as “ARC” or the IHI “Framework for Spread”
*Powell, McMillen, Proctor et al., 2012
**Grimshaw et al., 2001; Grol & Grimshaw, 2003
A Compilation or “Menu”
68 strategies grouped by six key processes*
*Powell, McMillen, Proctor et al., Medical Care Research and Review, 2012
Plan Strategies
• Gather information
• Select strategies
• Build buy-in
• Initiate leadership
• Develop relationships

Finance Strategies
• Modify incentives for clinicians and consumers; reduce disincentives
• Facilitate financial support: place on formularies

Restructure Strategies
• Revise roles
• Create new teams
• Change sites
• Change record systems
• Structure communication protocols

Quality Management Strategies
• Audit and provide feedback
• Clinician reminders
• Develop technical assistance (T.A.) systems
• Conduct cyclical small tests of change
• Checklists
Strategies: What do we know?
• Passive dissemination is ineffective (e.g., publishing articles, issuing a memo or “edict”)
• Training is the most frequently used strategy
• Multi-component, multilevel strategies are more effective
Implementation Strategies: What do we know?
• Discrete: checklists, data feedback, reminders
• Bundled or complex: organizational change strategies (teamwork, culture, communication; e.g., ARC)
• Technological strategies?
• Training strategies: provider education, coaching
• Support strategies: supervision, site-level support and monitoring
Implementation Outcomes
Distinct from clinical outcomes:
• You could have an effective intervention, poorly implemented
• You could have an ineffective treatment, successfully implemented
Implementation Outcomes: Key Concepts
• Acceptability
• Adoption
• Appropriateness
• Feasibility
• Fidelity
• Implementation cost
• Penetration
• Sustainability
Implementation Outcomes
Multiple stakeholders & multiple perspectives
• Service consumers, families, providers, administrators, funders, legislators
• Which outcomes matter most to whom?
How are outcomes related?
• This informs modeling of change
• IO ↔ IO
• IO → service system and clinical outcomes
Measuring Constructs
A developing field: standard measures are lacking; constructs are common and overlapping.
Meta-analysis to enhance D&I measures is inhibited by:
• weaknesses in information about outcomes
• use of dichotomous measures
• unit-of-analysis problems
Measurement: Toward Standardization & Harmonization
• Seattle Implementation Research Conference Measures Project: http://www.seattleimplementation.org/sirc-projects/sirc-measures-project/
• Grid-Enabled Measures, developed by the National Cancer Institute: http://cancercontrol.cancer.gov/brp/gem.html
Implementation Outcomes
Question from the field: How much will this cost, and what kind of havoc will it wreak?
Priority outcomes:
• incremental cost
• scale-up & spread
• sustainability
Implementation outcomes: what do we know?
• Fidelity is the most frequently measured outcome
• Provider attitudes are frequently assessed
• Implementation outcomes are interactive:
 – effectiveness → greater acceptability
 – cost → feasibility
• We don’t know much about:
 – sustainability
 – scale-up and spread
Which implementation outcomes are most important to pursue?
Stakeholder assessment: Who are they?
• Payers, policymakers
• Administrators
• Researchers
• Clients/patients, families
• Providers (clinicians, counselors, M.D.s, nurses, OT, PT, SW)
• Support staff (units, labs, medical records)
• Supervisors, training teams
Where are they with respect to the implementation?
Which implementation outcomes are most important to pursue?
Push & pull: Is there a demand to implement? Is there a push? Is there a pull?
Which implementation outcomes are most important to pursue?
Contextual assessment / barrier assessment: practice change needs to be aligned with
• priorities and trends in the policy ecology*
• agency infrastructure and system antecedents**
• capacity
*Raghavan, 2009 **Emmons, 2013
Context: Consolidated Framework for Implementation Research (CFIR)
Composed of five major domains:
• Intervention characteristics
• Outer setting
• Inner setting
• Characteristics of the individuals involved
• Process of implementation
Damschroder L, Aron D, Keith R, Kirsh S, Alexander J, Lowery J: Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 2009, 4(1):50.
Implementation Context
Advancing measurement for contextual constructs:
• Measures exist for several of CFIR’s constructs
• More information on the wiki: http://wiki.cfirwiki.net/index.php?title=Main_Page
Understanding how to fit changing evidence-based interventions into changing context*
*Dynamic sustainability framework, Chambers et al., Implementation Science, 2013
Implementation Strategies: How to select?
Evidence of effectiveness and fit: under construction
Informed by context assessment:
• inner setting
• outer setting
• intervention features
• barriers assessment
Flottorp, S.A. (2013). A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implementation Science 8:35.
Priority area #1: Implementation Strategies
Build the evidence: empirical tests of strategies
• comparative effectiveness research (CER)
• cost-effectiveness
Understanding which strategies work, for which ESTs, in which settings
Developing more parsimonious strategies: which components have which effects? Which strategies serve which implementation outcomes?
Implementation Strategies: How to select?
• Context assessment: barrier identification, system antecedents*, root cause analysis
• Target to context
• Stakeholder engagement
*Emmons, K. M., Weiner, B., Fernandez, M. E., & Tu, S. (2012). Systems antecedents for dissemination and implementation: a review and analysis of measures. Health Educ Behav 39:87.
**Flottorp, S. A., Oxman, A. D., Krause, J., et al. (2013). A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implementation Science 8:35.
Implementation Strategies: Specification & Reporting*
Implementation strategies carry the same demands as interventions:
• operational definitions
• protocols & manuals
• fidelity
Define strategies conceptually and operationally.
[Slide: first page of the referenced paper]
Proctor, E. K., Powell, B. J., & McMillen, J. C. Implementation strategies: recommendations for specifying and reporting. Implementation Science 2013, 8:139. http://www.implementationscience.com/content/8/1/139
Abstract: Implementation strategies have unparalleled importance in implementation science, as they constitute the ‘how to’ component of changing healthcare practice. Yet, implementation researchers and other stakeholders are not able to fully utilize the findings of studies focusing on implementation strategies because they are often inconsistently labelled and poorly described, are rarely justified theoretically, lack operational definitions or manuals to guide their use, and are part of ‘packaged’ approaches whose specific elements are poorly understood. We address the challenges of specifying and reporting implementation strategies encountered by researchers who design, conduct, and report research on implementation strategies. Specifically, we propose guidelines for naming, defining, and operationalizing implementation strategies in terms of seven dimensions: actor, the action, action targets, temporality, dose, implementation outcomes addressed, and theoretical justification. Ultimately, implementation strategies cannot be used in practice or tested in research without a full description of their components and how they should be used. As with all intervention research, their descriptions must be precise enough to enable measurement and ‘reproducibility.’ We propose these recommendations to improve the reporting of implementation strategies in research studies and to stimulate further identification of elements pertinent to implementation strategies that should be included in reporting guidelines for implementation strategies.
Keywords: Implementation strategies, Implementation research, Measurement, Methodology
Priority area II: Implementation Outcomes
Multiple stakeholders & multiple perspectives
• Which outcomes matter most to whom?
• Question from the field: How much will this cost, and what kind of havoc will it wreak?
Priority outcomes:
• incremental cost
• scale-up & spread
• sustainability
Priorities for Sustainability Research
• Sustainability of EBPs when contexts change
• Adaptability/evolution of EBPs over time
• Scaling up practices across health plans, systems, and networks
• Studying de-implementation
Priority area III: Capturing complex implementation
The reality of most service delivery:
• Co-occurring conditions → multiple EBIs
• Evidence evolves → continually adopt
• Limited capacity → must de-adopt
• Fit to local context → adaptation
• Staff turnover → continual training
Treatment Evidence Continues to Grow
What strategies can enable providers and organizations to implement evolving evidence?
Training: Implementation Research Institute (IRI)
• National faculty & scholars
• 2-year program for implementation research in mental health
• Funded by an NIMH R25 grant (NIMH R25 MH080916-03); VA & NIDA supplements
• Held at the Brown School, Washington University in St. Louis
Training: Mentored Training for Dissemination & Implementation Research in Cancer (MT-DIRC)
• National faculty & fellows
• 2-year program of summer institutes
• Based at Washington University
• NCI supported; 12 fellows per year
• Applications for summer 2014 due this winter
• Contact: [email protected]
Training: Training Institute for Dissemination & Implementation Research in Health (TIDIRH)
• NIH-wide
• Housed at UNC, UCSF, Washington University, and Harvard
• http://ctsi.ucsf.edu/calendar/training/2012-training-institute-dissemination-and-implementation-research-health-tidirh
• Applications for 2014 due this winter
Support: National Institute of Mental Health
P30 MH068579, R25 MH080916, P30 DK092950, U54 CA155496, UL1 RR024992 (Clinical and Translational Science Award, CTSA)
Washington University Institute for Public Health
Brown School of Social Work
Conflicts: none