
November 7, 2014

Health IT Implementation, Usability and Safety Workgroup

David Bates, Chair
Larry Wolf, Co-Chair

Workgroup Members

• David W. Bates, Brigham and Women’s Hospital (Chair)

• Larry Wolf, Kindred Healthcare (Co-Chair)

• Joan Ash, Oregon Health & Science University

• Janey Barnes, User-View Inc.

• John Berneike, St. Mark's Family Medicine

• Bernadette Capili, New York University

• Michelle Dougherty, American Health Information Management Association

• Paul Egerman, Software Entrepreneur

• Terry Fairbanks, Emergency Physician

• Tejal Gandhi, National Patient Safety Foundation

• George Hernandez, ICLOPS

• Robert Jarrin, Qualcomm Incorporated

• Mike Lardieri, North Shore-LIJ Health System

• Bennett Lauber, The Usability People LLC

• Alisa Ray, Certification Commission for Healthcare Information Technology

• Steven Stack, American Medical Association

Ex Officio Members
• Svetlana Lowry, National Institute of Standards and Technology
• Megan Sawchuck, Centers for Disease Control and Prevention
• Jeanie Scott, Department of Veterans Affairs
• Jon White, Agency for Healthcare Research and Quality, Health and Human Services

ONC Staff
• Ellen Makar (Lead WG Staff)

1

Meeting Schedule

Meetings and Tasks

Monday, September 22, 2014, 2:00 PM-4:00 PM Eastern Time
• Review charge
• Work to date: background/history
• Preliminary goals discussion of deliverable

Friday, October 10, 2014, 1:00 PM-3:00 PM Eastern Time
• Presentation of usability research: MedStar and NIST

Friday, October 24, 2014, 1:00 PM-3:00 PM Eastern Time
• ECRI and TJC results of adverse event database analysis
• Usability Testing
• Implementation Science (field reports)

Friday, November 7, 2014, 1:00 PM-3:00 PM Eastern Time
• Certification – Alicia Morton

Friday, December 12, 2014, 1:00 PM-3:00 PM Eastern Time
• Post-implementation Usability & Safety, Risk Management & Shared Responsibility
• Safety Center Report Out
• Realignment of timeline/goals for 2015

2

Agenda

Objective: ONC Health IT Certification Program

1:00 p.m. Call to Order/Roll Call
• Michelle Consolazio, Office of the National Coordinator

1:05 p.m. Context: Usability and Safety Criteria for the ONC Health IT Certification Program
• Larry Wolf, Co-Chair

1:20 p.m. ONC Health IT Certification
• Alicia Morton, Office of the National Coordinator

1:40 p.m. How Usability of EHRs and Workflow Impact Patient Safety
• Alicia Morton, Office of the National Coordinator
• Lana Lowry, National Institute of Standards and Technology

2:55 p.m. Public Comment

3:00 p.m. Adjourn

4

ONC Health IT Certification

• 2014 Edition EHR Certification Criteria on “safety-enhanced design” (using UCD processes)
– Identify what is being done
– Increased transparency based on information available through certification; see ONC’s CHPL site
• ONC-Authorized Certification Bodies (ACBs) can conduct surveillance in live environments
– ACBs are “health oversight agencies” under HIPAA
– See ONC FAQ #45
• CMS fact sheet on the “flexibility rule”: http://www.cms.gov/Regulations-and-Guidance/Legislation/EHRIncentivePrograms/Downloads/CEHRT2014_FinalRule_QuickGuide.pdf

5

Safety-Enhanced Design

6

New: Safety-enhanced design. User centered design processes must be applied to each capability an EHR technology includes that is specified in the following certification criteria:

§ 170.314(a)(1), (2), (6) through (8), (16) and (18) through (20) and (b)(3), (4), and (9).

http://www.gpo.gov/fdsys/pkg/FR-2014-09-11/pdf/2014-21633.pdf

Current: Safety-enhanced design. User-centered design processes must be applied to each capability an EHR technology includes that is specified in the following certification criteria:

§ 170.314(a)(1), (2), (6) through (8), and (16) and (b)(3) and (4).

UCD processes in 2014 Edition R2 Electronic Health Record (EHR) Certification Criteria

§170.314(g)(3) Safety-enhanced design. User-centered design processes must be applied to each capability an EHR technology includes that is specified in the following certification criteria:

§170.314(a)(1) Computerized provider order entry
§170.314(a)(2) Drug-drug, drug-allergy interaction checks
§170.314(a)(6) Medication list
§170.314(a)(7) Medication allergy list
§170.314(a)(8) Clinical decision support
§170.314(a)(16) Inpatient setting only – electronic medication administration record
§170.314(a)(18) Optional – computerized provider order entry – medications
§170.314(a)(19) Optional – computerized provider order entry – laboratory
§170.314(a)(20) Optional – computerized provider order entry – diagnostic imaging
§170.314(b)(3) Electronic prescribing
§170.314(b)(4) Clinical information reconciliation
§170.314(b)(9) Optional – clinical information reconciliation and incorporation

Certification test procedure: http://www.healthit.gov/sites/default/files/170.314g3safetyenhanceddesign_2014_tp_approved_v1.3_0.pdf 7

Quality Management System

§170.314(g)(4) Quality management system. For each capability that an EHR technology includes and for which that capability's certification is sought, the use of a Quality Management System (QMS) in the development, testing, implementation, and maintenance of that capability must be identified.
– (i) If a single QMS was used for applicable capabilities, it would only need to be identified once.
– (ii) If different QMS were applied to specific capabilities, each QMS applied would need to be identified. This would include the application of a QMS to some capabilities and none to others.
– (iii) If no QMS was applied to all applicable capabilities, such a response is acceptable to satisfy this certification criterion.

Test procedure: http://www.healthit.gov/sites/default/files/170.314g4qms_2014_tp_approvedv1.2.pdf

8

ONC Health IT Certification Program
CAPT Alicia Morton, DNP, RN-BC

9

ONC Health IT Certification Program

Workgroup Information Request / Items for today’s discussion:

• Brief overview of the ONC Health IT Certification Program
– Current 2014 Edition certification requirements and how they are tested
• ONC CHPL site overview and revision plans
• Surveillance program for certified products

10

Both cars meet baseline safety standards, functional conformance testing, and vehicle emissions testing. Yet those baseline standards are not the differentiator. The differentiators are whether the product is:

• Safe
• Useful
• Usable
• Satisfying

11

How Does the ONC Health IT Certification Program Work?

Regulation
• ONC issues a regulation that includes certification criteria (and associated standards) for health IT products and corresponding certification program requirements

Developers
• Create health IT products that, at a minimum, meet the standards and certification criteria adopted by HHS in regulation

ONC-ATLs
• Test health IT products based on the standards and certification criteria adopted by HHS

ONC-ACBs
• Issue certifications to tested health IT products; conduct surveillance
• Submit product information to ONC for posting on the Certified Health IT Product List (CHPL)

Providers & Hospitals
• Have assurances that products meet specific certification criteria and associated standards
• In the case of the EHR Incentive Program (aka MU), use certified health IT in a specified manner to attest/report to the program to receive an incentive and avoid a payment adjustment

12

ONC Health IT Certification Program Structure / Process

13


• ONC approves the ONC-AA (Approved Accreditor), which operates under ISO/IEC 17011 and accredits Authorized Certification Bodies to ISO/IEC 17065; ONC authorizes them as ONC-ACBs (ONC-Authorized Certification Bodies).
• NIST NVLAP (National Voluntary Laboratory Accreditation Program) accredits ATLs (Accredited Testing Laboratories) as Authorized Testing Bodies under ISO/IEC 17025, NIST Handbook 150, and NIST Handbook 150-31.
• The developer's product is tested against the criteria by an ATL; once the product successfully passes testing, an ONC-ACB certifies the tested product.
• ONC reviews and posts the certified product to the CHPL.

Seven Certification Criteria Categories in the 2014 Edition

Clinical: CPOE; drug-drug, drug-allergy checks; demographics; vital signs; problem list; medication list; medication allergy list; CDS; e-notes; drug-formulary; smoking status; image results; family health history; patient list creation; eMAR; advance directives

Care Coordination: transitions of care (1) and (2); e-prescribing; clinical information reconciliation; incorporate lab results; send labs to ambulatory providers; data portability

CQM: capture & export; import & calculate; electronic reporting

Privacy & Security: authentication, access control, authorization; auditable events; audit report; amendments (HIPAA privacy); auto-log off; emergency access; end-user device encryption; integrity; accounting of disclosures (HIPAA privacy)

Patient Engagement: VDT; clinical summary; secure messaging

Public Health: immunization info; immunization transmission; syndromic surveillance transmission; reportable labs transmission; cancer info; cancer transmission

Utilization: automated MU numerator calculation; automated MU measure calculation; safety-enhanced design; quality management system

14

Helpful Clarifying Terminology

15

2014 Edition FR: Redefining Certified EHR Technology and Related Terms

1. Certified EHR Technology (CEHRT)—SPECIFIC to the EHR Incentive Program (MU)

2. Base EHR Definition (this subsumes the term “qualified EHR”)

3. Complete EHR Definition (going away with next rule-making as noted in 2014 Edition Release 2 FR. No effect on certification to the 2014 Edition.)

ONC Health IT Certification and CMS Meaningful Use: Relationship

Certified Capability in EHR + MU Measure = Demonstration of MU

• Certified Capability in EHR: ONC adopts certification criteria that specify technical capabilities for EHR technology.
• MU Measure: CMS sets a specific provider performance metric related to the use of certified capabilities.
• Demonstration of MU: an eligible provider (EP) reports their performance on each MU metric to CMS in order to receive an incentive or avoid a penalty.

Examples:
• An EP must e-prescribe more than 40% of their Rx’s → EHR technology must be able to create a standardized e-Rx file → EP attests 73%.
• An EP must implement 5 CDS interventions related to 4 or more CQMs → EHR technology must enable multiple CDS-related functionalities → EP attests “yes”.
• An EP must record problems in standardized data for more than 80% of patients → EHR technology must be able to record problems in SNOMED CT → EP attests 93%.

16
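The measure-plus-capability pattern in the examples above can be sketched as a simple threshold check. The 40% and 80% thresholds come from the slide; the function names and counts are illustrative assumptions, not part of any CMS specification.

```python
# Illustrative sketch of the MU measure logic in the slide's examples.
# Thresholds (40%, 80%) are from the slide; names and data are hypothetical.

def meets_erx_measure(e_prescribed: int, total_rx: int) -> bool:
    """EP must e-prescribe more than 40% of prescriptions."""
    return e_prescribed / total_rx > 0.40

def meets_problem_list_measure(coded: int, total_patients: int) -> bool:
    """EP must record SNOMED CT-coded problems for more than 80% of patients."""
    return coded / total_patients > 0.80

# The slide's example EP attests 73% e-Rx and 93% coded problems:
print(meets_erx_measure(73, 100))           # True
print(meets_problem_list_measure(93, 100))  # True
```

Note the strict inequality: an EP at exactly 40% would not meet the e-prescribing measure as worded ("more than 40%").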

Certified HIT Product List

• All products are listed at www.healthit.gov/chpl
• Test result transparency: the final rule requires that ONC-ACBs submit a hyperlink to the test results used to issue a certification to a Complete EHR or EHR Module.
• Includes information on what was tested, write-ups on the “usability” assessments performed, and more.

17

Certified Health IT Surveillance

• ONC-AA performs surveillance & technical assessment of the ONC-ACBs
– According to ONC program requirements and standards governing certification bodies
• ONC-ACBs perform surveillance of certified health IT products – proactive and reactive
– Proactive: ONC priority areas of exchange, safety, security, and population management (quality measurement)
– Reactive: product complaints, large numbers of inherited certified status requests, etc.
– ONC surveillance guidance / priority areas:
• First, CY 2014: http://www.healthit.gov/sites/default/files/onc-acb_2013annualsurveillanceguidance_final_0.pdf (report for CY 2014 due to ONC late Feb 2015)
• Second, CY 2015: http://www.healthit.gov/sites/default/files/onc-acb_cy15annualsurveillanceguidance.pdf (plans for CY 2015 just received and under review by ONC)
• NIST NVLAP performs surveillance of the ONC-ATLs

18

Questions/Discussion /Contacts

ONC Health IT Certification Program: www.healthit.gov/certification

Certified Health IT Product List: www.healthit.gov/chpl

ONC Certification team mailbox: ONC.Certification@hhs.gov

19

Lana Lowry, NIST HIT Usability Project Lead

llowry@nist.gov

How Usability of EHRs and Workflow Impact Patient Safety

• Only safety-related usability must be evaluated

• Helps vendors, hospitals, and other stakeholders ensure that EHR use errors are minimized

• Provides technical guidance for summative usability evaluations prior to deployment or implementation of an EHR

• The summative usability testing evaluation is meant to be independent from factors that engender creativity, innovation, or competitive features of the system

• Examples of safety-related usability issues that have been reported by healthcare workers include poorly designed EHR screens that slow down the user and might sometimes endanger patients, warning and error messages that are confusing and often conflicting, and alert fatigue (both visual and audio) from too many messages, leading users to ignore potentially critical messages.

EHR Usability Protocol (EUP) – a Core Validation Tool

21

Focus of the EUP

• Clearly distinguishes between usability aspects that pertain to user satisfaction and usability features that impact clinical safety

• The limited set of critical usability aspects that pertain to clinical safety must be embedded into the system and required as core functionality, not a competitive feature; a "barrier to entry" to the marketplace on safety is an expected outcome.

• Typical measures for clinical safety are adverse events (wrong patient, wrong treatment, wrong medication, delay of treatment, unintended treatment).

• Accepted usability/safety standards should be considered industry-standard practices. Any company is free to go above and beyond the basic standard; however, the minimum standards for usability in safety-enhanced design should be established and articulated to address patient safety.

22

EUP is a model for understanding relationship between usability issues and patient safety outcomes through:

• Step I. Usability Application Analysis led by the development team, which identifies the characteristics of the system’s anticipated users, use environments, scenarios of use, and use related usability risks that may induce medical errors

• Step II. Expert Review/Analysis of EHR Application, an independent evaluation of the critical components of the user interface in the context of execution of various use case scenarios and usability principles

• Step III. Usability Testing, involving creation of a test plan and then conducting a test that will assess usability for the given EHR application including use efficiency and presence of features that may induce potential medical errors

EUP Key Areas

23

Three-step process for design evaluation and human user performance testing for the EHR

24

• EUP emphasis is on ensuring that necessary and sufficient usability validation and remediation has been conducted so that (a) use error is minimized and (b) use efficiency is maximized.

• The EUP focuses on identifying and minimizing critical usability issues.

• The intent of the EUP is to validate that the application’s user interface is free from critical usability issues and supports error-free user interaction.

Objective of the EUP

25

• Elimination of “never events”

• Identification and mitigation of critical use errors

• Identification of areas for potential UI improvement, and recording of user acceptance/satisfaction

• Report summative testing results in CIF (Common Industry Format) www.nist.gov/customcf/get_pdf.cfm?pub_id=907312

Goal of the EUP

Relationship between EUP and NIST UCD Guidelines

27

UCD → EUP

• User Needs, Workflows & Environments → Application Analysis
• Engage Users → Expert Review/Analysis of EHRs
• Set Performance Objectives → Application Analysis
• Design → Application Analysis
• Test and Evaluate → Usability Testing

EUP is a Key Component of UCD

28

• Performance is examined by collecting user performance data that are relevant indicators of the presence of safety risks

• These measures may include, but are not limited to, objective measures of successful task completion, number of errors and corrected errors, performance difficulties, and failures to complete the task successfully or in proper sequence

• Performance is also evaluated by conducting post-test interviews focused on what users identify as risks based on confusion or misunderstanding when carrying out directed scenarios of use

• The goal of the validation test is to make sure that critical interface design issues are not causing patient safety-related use error; in other words, that the application’s user interface supports error-free user interaction

How to Measure Performance?

29
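As a hedged illustration of the measures listed above (successful task completion, errors, corrected errors), a summative test session could be summarized like this. The data structure and field names are assumptions for the sketch; the EUP itself does not prescribe a format.

```python
# Hypothetical summary of user performance data from a summative usability test.
# Field names are illustrative; the EUP does not prescribe this structure.
from dataclasses import dataclass

@dataclass
class TaskResult:
    completed: bool          # task completed successfully, in proper sequence
    errors: int              # use errors observed during the task
    corrected_errors: int    # errors the user noticed and corrected

def summarize(results):
    """Aggregate the safety-relevant performance indicators over a session."""
    n = len(results)
    return {
        "success_rate": sum(r.completed for r in results) / n,
        "total_errors": sum(r.errors for r in results),
        "uncorrected_errors": sum(r.errors - r.corrected_errors for r in results),
    }

session = [TaskResult(True, 0, 0), TaskResult(False, 2, 1), TaskResult(True, 1, 1)]
print(summarize(session))
```

Uncorrected errors are separated out because an error the user never notices is the kind most likely to reach the patient.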

Theoretical Example of EUP Scenario

30

Theoretical Example of EUP Scenario

31

Theoretical Example of EUP Findings

N | Use Error in Usability Test | Severity Rating
6/18 | When reviewing a patient chart, the clinician does not detect that new critical lab results are available | 2
17/18 | User copies the wrong (older) data from one note to the current note being written | 3
1/18 | When mcg/kg/min is ordered, the user incorrectly selects units of mcg/hr and accepts the wrong dose | 4
15/18 | A medication schedule is changed; the user fails to select the “refresh” button in the menu and does not comply with the new medication schedule | 2

Severity scale is 1 (least) to 4 (most)

32
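One way to act on findings like these is to rank them by frequency × severity. That triage heuristic is a common human-factors practice, not something the EUP mandates; the sketch below just applies it to the table's numbers.

```python
# Sketch: rank the use errors above by (frequency) x (severity rating).
# The frequency x severity product is a common triage heuristic, not an EUP rule.

findings = [
    ("missed new critical lab results",          6, 18, 2),
    ("copied stale data into current note",     17, 18, 3),
    ("wrong dose units (mcg/hr vs mcg/kg/min)",  1, 18, 4),
    ("missed refresh after schedule change",    15, 18, 2),
]

def risk(finding):
    name, n, total, severity = finding
    return (n / total) * severity

for name, n, total, sev in sorted(findings, key=risk, reverse=True):
    print(f"{(n / total) * sev:4.2f}  {name}")
```

Note how the ranking differs from severity alone: the rare but severity-4 dosing-unit error scores below the very frequent stale-copy error, which is exactly the tension a remediation team has to weigh.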

Relating Usability and Patient Safety

33

Example: Wrong Patient Record Open

(Screens: EHR – Patient A; EHR – Patient B; Imaging – Patient A)

34

Example: Wrong Mode for Action

Direct Dose Mode (mcg/min)
Weight Dose Mode (mcg/kg/min)

35

Example: Incomplete Data Displayed

Lidocaine Hydrochlor

36

Example: Data not Readily Available

80 mg

37

Definitions

Usability: How useful, usable, and satisfying a system is for the intended users to accomplish goals in the work domain by performing certain sequences of tasks

Workflow: A set of tasks – grouped chronologically into processes – and the set of people or resources needed for those tasks that are necessary to accomplish a goal

Workaround: Actions that do not follow explicit rules, assumptions, workflow regulations, or intentions of system designers

38

Relating Workflow and Patient Safety

• Poorly supported work processes → suboptimal/nonstandard care, poor decision support, dropped tasks

• Missed information → delays in diagnosis, missed/redundant treatment, wrong patient

• Inefficient clinical documentation → copy/paste, “smart text”, templates, scribes

• Provider dissatisfaction → workarounds, slower adoption rates in specialty areas

• High rates of false alarms → ignored alarms, alerts, reminders

39

Methods: Modeling with SMEs

• Ambulatory care physicians; collegial discussions

• Interdisciplinary team meetings – human factors, informatics, physicians

• Process maps

• Goal-means decomposition diagram

• Insights for moving towards a “patient visit management system”

40

Workflow “Buckets” in Ambulatory Care

Before patient visit

During patient visit

Physician encounter

Discharge

Visit documentation

41

Before Patient Visit (flowchart)

• Balance workload
• Does the patient have significant complexity? (yes/no)
• Clinical overview and review new findings/labs
• Review prior history and physical

42


During Patient Visit / Physician Encounter / Discharge / Visit Documentation (flowchart; node order approximate)

During patient visit:
• Patient check-in, vital signs, and chief complaint
• Collect medication reconciliation data and review of systems data
• Verify medications and allergies; verify dosage for some medications
• Document medications reconciled

Physician encounter:
• Warm up and remember pertinent information
• Get history, signs and symptoms; review the review of systems; make a presumptive diagnosis
• Is more information needed? If yes, review the chart, research guidelines, sideline consult
• Examine patient (physical)
• Form initial treatment plan
• Initiate intent to order medications, procedures, labs, consults
• Does the patient need a clinical procedure? If yes, clinical procedure
• Physician and/or others tell/review with the patient the initial assessment, plan, and “to do” activities, and motivate following the plan
• Does the patient need education? If yes, patient education
• Is a referral needed? Explicit orders: medications, procedures, labs, imaging, consults/referral

Discharge:
• Does the patient need a summary? If yes, give patient summary
• Discharge

Visit documentation:
• Document relevant history, physical, assessment, plan
• Documentation to support billing: pick ICD/CPT codes, verify insurance, investigate requirements for public reporting
• Documentation for others (legal, research, compliance, MU)

46

Recommendations for EHR Developers

• Increase efficiency:
– Reviewing results with the patient
– Drafting pre-populated orders to be formally executed later
– Supporting drafting documentation with shorthand notations without a keyboard

• Design for empathetic body positioning/eye contact

• Support dropping tasks and delaying task completion

• Verification of alarms, alerts, and data entry without “hard stops”

47

Recommendations for Ambulatory Care

• Moderate organizational design flexibility

• Design room to support patient rapport & EHR access

• Minimize redundant data entry via interoperability

• Reduce clinic pace or increase flexibility of pace

• Ensure functionality that supports continuity in task performance in the case of interruption

• Relax requirements to enter detailed data for others during fast-paced patient visits

48

49

Next Meeting: Friday, December 12, 2014 1:00 PM-3:00 PM Eastern Time
