


Test Results Summary for 2014 Edition EHR Certification Version EHR-Test-144 Rev 07-Aug-2015

Page 1 of 12

ONC HIT Certification Program

Test Results Summary for 2014 Edition EHR Certification

Part 1: Product and Developer Information

1.1 Certified Product Information

Product Name: Practice Partner®
Product Version: 11.0ss
Domain: Ambulatory
Test Type: Complete EHR

1.2 Developer/Vendor Information

Developer/Vendor Name: McKesson
Address: 5995 Windward Parkway, Alpharetta, GA 30005
Website: http://www.mckesson.com/bps
Email: Tom.Reinecke@mckesson.com
Phone: 563-585-4773
Developer/Vendor Contact: Tom Reinecke


Part 2: ONC-Authorized Certification Body Information

2.1 ONC-Authorized Certification Body Information

ONC-ACB Name: Drummond Group

Address: 13359 North Hwy 183, Ste B-406-238, Austin, TX 78750

Website: www.drummondgroup.com

Email: [email protected]

Phone: 817-294-7339

ONC-ACB Contact: Bill Smith

This test results summary is approved for public release by the following ONC-Authorized Certification Body Representative:

Bill Smith
Function/Title: Certification Body Manager, ONC-ACB Authorized Representative
Signature and Date: 9/14/2015

2.2 Gap Certification

The following identifies the criterion or criteria certified via gap certification:

§170.314

(a)(1) (a)(19) (d)(6) (h)(1)

(a)(6) (a)(20) (d)(8) (h)(2)

(a)(7) (b)(5)* (d)(9) (h)(3)

(a)(17) (d)(1) (f)(1)

(a)(18) (d)(5) (f)(7)**

*Gap certification allowed for Inpatient setting only
**Gap certification allowed for Ambulatory setting only

x No gap certification


2.3 Inherited Certification

The following identifies the criterion or criteria certified via inherited certification:

§170.314

x (a)(1) (a)(16) Inpt. only x (c)(2) x (f)(2)

x (a)(2) (a)(17) Inpt. only x (c)(3) x (f)(3)

x (a)(3) (a)(18) x (d)(1) (f)(4) Inpt. only

x (a)(4) (a)(19) x (d)(2) (f)(5) Amb. only

x (a)(5) (a)(20) x (d)(3)

x (a)(6) (b)(1) x (d)(4) (f)(6) Amb. only

x (a)(7) (b)(2) x (d)(5)

x (a)(8) x (b)(3) x (d)(6) (f)(7)

x (a)(9) x (b)(4) x (d)(7) (g)(1)

x (a)(10) x (b)(5) x (d)(8) x (g)(2)

x (a)(11) (b)(6) Inpt. only (d)(9) Optional x (g)(3)

x (a)(12) x (b)(7) (e)(1) x (g)(4)

x (a)(13) (b)(8) x (e)(2) Amb. only (h)(1)

x (a)(14) (b)(9) x (e)(3) Amb. only (h)(2)

x (a)(15) x (c)(1) x (f)(1) (h)(3)

No inherited certification


Part 3: NVLAP-Accredited Testing Laboratory Information

Report Number: TEB-09092015-1988-1

Test Date(s): 10/9/2013, 10/16/2013, 11/1/2013, 11/15/2013, 3/4/2014, 9/9/2015

3.1 NVLAP-Accredited Testing Laboratory Information

ATL Name: Drummond Group EHR Test Lab

Accreditation Number: NVLAP Lab Code 200979-0

Address: 13359 North Hwy 183, Ste B-406-238, Austin, TX 78750

Website: www.drummondgroup.com

Email: [email protected]

Phone: 817-709-1627

ATL Contact: Kyle Meadors

For more information on the scope of accreditation, please reference the NVLAP site.

Part 3 of this test results summary is approved for public release by the following Accredited Testing Laboratory Representative:

Timothy Bennett
Function/Title: Test Proctor, ATL Authorized Representative
Signature and Date: 9/14/2015
Location Where Test Conducted: Nashville, TN

3.2 Test Information

3.2.1 Additional Software Relied Upon for Certification

Additional Software: Surescripts Network for Clinical Interoperability
Applicable Criteria: (b)(1), (b)(2), (e)(1)
Functionality Provided by Additional Software: Direct messaging

No additional software required


3.2.2 Test Tools

Test Tool and Version

x Cypress 2.4.1
x ePrescribing Validation Tool 1.03
HL7 CDA Cancer Registry Reporting Validation Tool 1.0.3
HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool 1.8.2
x HL7 v2 Immunization Information System (IIS) Reporting Validation Tool 1.7.1
x HL7 v2 Laboratory Results Interface (LRI) Validation Tool 1.7
x HL7 v2 Syndromic Surveillance Reporting Validation Tool 1.7
x Transport Testing Tool 181
x Direct Certificate Discovery Tool 3.0.4
Edge Testing Tool 0.0.5
No test tools required

3.2.3 Test Data

Alteration (customization) to the test data was necessary and is described in Appendix [insert appendix letter]

x No alteration (customization) to the test data was necessary

3.2.4 Standards

3.2.4.1 Multiple Standards Permitted

The following identifies the standard(s) successfully tested where more than one standard is permitted:

Criterion # Standard Successfully Tested

(a)(8)(ii)(A)(2)

x §170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain

§170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(13)

x §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

§170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree


Criterion # Standard Successfully Tested

(a)(15)(i)

x §170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain

§170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(16)(ii)

§170.210(g) Network Time Protocol Version 3 (RFC 1305)

§170.210(g) Network Time Protocol Version 4 (RFC 5905)

(b)(2)(i)(A)

x §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions

§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(b)(7)(i)

x §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions

§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(b)(8)(i)

§170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions

§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(e)(1)(i)

Annex A of FIPS Publication 140-2 (encryption and hashing algorithms: AES-256, SHA-1)

(e)(1)(ii)(A)(2)

§170.210(g) Network Time Protocol Version 3 (RFC 1305)

x §170.210(g) Network Time Protocol Version 4 (RFC 5905)

(e)(3)(ii)

Annex A of FIPS Publication 140-2 (encryption and hashing algorithms: AES-256, SHA-1)

Common MU Data Set (15)

§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

x §170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT-4)


None of the criteria and corresponding standards listed above are applicable

3.2.4.2 Newer Versions of Standards

The following identifies the newer version of a minimum standard(s) that has been successfully tested:

Newer Version Applicable Criteria

No newer version of a minimum standard was tested

3.2.5 Optional Functionality

Criterion # Optional Functionality Successfully Tested

(a)(4)(iii) Plot and display growth charts

(b)(1)(i)(B) Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)

(b)(1)(i)(C) Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols)

(b)(2)(ii)(B) Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)

(b)(2)(ii)(C) Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols)

(e)(1) View, download and transmit data to a third party utilizing the Edge Protocol IG version 1.1

(f)(3) Ambulatory setting only – Create syndrome-based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario)

(f)(7) Ambulatory setting only – transmission to public health agencies – syndromic surveillance - Create Data Elements

Common MU Data Set (15)

Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR162.1002(a)(4): Code on Dental Procedures and Nomenclature)

Common MU Data Set (15)

Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR162.1002(c)(3): ICD-10-PCS)

x No optional functionality tested


3.2.6 2014 Edition Certification Criteria* Successfully Tested

Criteria # TP** TD***

(a)(1) 1.3 1.5
(a)(2) 1.2
(a)(3) 1.2 1.4
(a)(4) 1.4 1.3
(a)(5) 1.4 1.3
(a)(6) 1.3 1.4
(a)(7) 1.3 1.3
(a)(8) 1.3
(a)(9) 1.3 1.3
(a)(10) 1.2 1.4
(a)(11) 1.3
(a)(12) 1.3
(a)(13) 1.2
(a)(14) 1.2
(a)(15) 1.5
(a)(16) Inpt. only 1.3 1.2
(a)(17) Inpt. only 1.2
(a)(18) 1.1 1.5
(a)(19) 1.1 1.5
(a)(20) 1.1 1.5
x (b)(1) 1.7 1.4
x (b)(2) 1.4 1.6
(b)(3) 1.4 1.4
(b)(4) 1.3 1.4
(b)(5) 1.4 1.2
(b)(6) Inpt. only 1.3 1.3
(b)(7) 1.4 1.7
(b)(8) 1.2 1.2
(b)(9) 1.1 1.1
(c)(1) 1.11 1.11
(c)(2) 1.11 1.11
(c)(3) 1.11 1.11
(d)(1) 1.2
(d)(2) 1.6
(d)(3) 1.3
(d)(4) 1.3
(d)(5) 1.2
(d)(6) 1.2
(d)(7) 1.2
(d)(8) 1.2
(d)(9) Optional 1.2
x (e)(1) 1.11 1.5
(e)(2) Amb. only 1.2 1.6
(e)(3) Amb. only 1.3
(f)(1) 1.2 1.2
(f)(2) 1.3 1.3
(f)(3) 1.3 1.3
(f)(4) Inpt. only 1.3 1.3
(f)(5) Amb. only 1.2 1.2
(f)(6) Amb. only 1.4 1.4
(f)(7) Amb. only 1.1
(g)(1) 2.0 2.0
(g)(2) 2.0 2.0
(g)(3) 1.4
(g)(4) 1.2
(h)(1) 1.1
(h)(2) 1.1
(h)(3) 1.1


No criteria tested

*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)


3.2.7 2014 Clinical Quality Measures*

Type of Clinical Quality Measures Successfully Tested:

x Ambulatory

Inpatient

No CQMs tested

*For a list of the 2014 Clinical Quality Measures, please reference the CMS eCQM Library (navigation: June 2014 and April 2014 updates)

Ambulatory CQMs CMS ID Version CMS ID Version CMS ID Version CMS ID Version

x 2 v4 x 90 v4 x 136 v4 x 155 v3

22 x 117 v3 x 137 v3 x 156 v3

x 50 v3 x 122 v3 x 138 v3 157

52 x 123 v3 x 139 v3 158

x 56 v3 x 124 v3 x 140 v3 159

61 x 125 v3 x 141 v4 160

62 x 126 v3 x 142 v3 161

64 x 127 v3 x 143 v3 x 163 v3

65 x 128 v3 x 144 v3 x 164 v3

x 66 v3 x 129 v4 x 145 v3 x 165 v3

x 68 v4 x 130 v3 x 146 v3 x 166 v4

x 69 v3 x 131 v3 x 147 v4 x 167 v3

74 132 148 169

x 75 v3 133 149 177

77 x 134 v3 x 153 v3 179

82 x 135 v3 x 154 v3 x 182 v4

Inpatient CQMs CMS ID Version CMS ID Version CMS ID Version CMS ID Version

9 71 107 172

26 72 108 178

30 73 109 185

31 91 110 188

32 100 111 190

53 102 113
55 104 114

60 105 171


3.2.8 Automated Numerator Recording and Measure Calculation

3.2.8.1 Automated Numerator Recording

Automated Numerator Recording Successfully Tested

(a)(1) (a)(11) (a)(18) (b)(6)

(a)(3) (a)(12) (a)(19) (b)(8)

(a)(4) (a)(13) (a)(20) (b)(9)

(a)(5) (a)(14) (b)(2) (e)(1)

(a)(6) (a)(15) (b)(3) (e)(2)

(a)(7) (a)(16) (b)(4) (e)(3)

(a)(9) (a)(17) (b)(5)

x Automated Numerator Recording was not tested

3.2.8.2 Automated Measure Calculation

Automated Measure Calculation Successfully Tested

x (a)(1) x (a)(11) (a)(18) (b)(6)

x (a)(3) x (a)(12) (a)(19) (b)(8)

x (a)(4) x (a)(13) (a)(20) (b)(9)

x (a)(5) x (a)(14) x (b)(2) x (e)(1)

x (a)(6) x (a)(15) x (b)(3) x (e)(2)

x (a)(7) (a)(16) x (b)(4) x (e)(3)

x (a)(9) (a)(17) x (b)(5)

Automated Measure Calculation was not tested

3.2.9 Attestation

Attestation Forms (as applicable) Appendix

x Safety-Enhanced Design* A

x Quality Management System** B

x Privacy and Security C

*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (a)(18), (a)(19), (a)(20), (b)(3), (b)(4), (b)(9).
**Required for every EHR product

3.3 Appendices

Attached below.


Test Results Summary Change History

Test Report ID Description of Change Date

2014 Edition Test Report Summary


October 4, 2013

Drummond Group Inc

13359 North Hwy 183, Ste B-406-238

Austin, TX 78750

Submission of Documented User Centered Design Process and Usability Test Results

This letter and attachments serve as a formal submission of the Documented User Centered Design Process and Usability Test Results on behalf of McKesson Corporation (“McKesson”) for Practice Partner® Ver. 11 for Test Criteria: 170.314.g.3 – Safety Enhanced Design.

We thank you for your consideration.

Regards,

James Reynolds

Executive Director, Practice Partner

McKesson


Practice Partner® User Centered Design Process Documentation

User Centered Design Standard Alignment

The Practice Partner® Ver. 11 User Centered Design process aligns with the McKesson User Centered Design (UCD) process.

The McKesson UCD process is based on information contained in NISTIR 7741 (Schumacher & Lowry, 2010) and the UCD visual map found on the Usability.gov website (http://www.usability.gov/methods/process.html). Each of the following criteria has been developed or iterated (in the case of legacy systems) according to this process:

§ 170.314(a)(1): Computerized Provider Order Entry (CPOE);

§ 170.314(a)(2): Drug-Drug and Drug-Allergy Interaction Checks;

§ 170.314(a)(6): Medication List;

§ 170.314(a)(7): Medication Allergy List;

§ 170.314(a)(8): Clinical Decision Support;

§ 170.314(b)(3): Electronic Prescribing; and

§ 170.314(b)(4): Clinical Information Reconciliation.

McKesson User Centered Design Process

The McKesson UCD process consists of various activities carried out in five phases: Planning, Research, Analysis, Design, and Evaluation. The individual activities are listed and described in the table below.


Phase Activity ID Activity Description

Planning 1

Develop plan for user-centered design activities

This activity occurs at the beginning of the project in conjunction with the overall project planning activities. The plan identifies the purpose, design strategy, resources, scope of the project, timelines, planned UCD activities, deliverables, and constraints and risks.

Research 2 Site visits/field research

Site visits provide empirical evidence for the analytical processes of user research and task analysis (see below). This type of field research is usually conducted at the beginning of a larger development effort to understand the context of use; that is, the environment in which the task is conducted, the users or user groups, and the existing workflows that the new product or feature will support. In the most common form of this research, one or a few users are shadowed while the researcher takes notes to document relevant aspects of the task environment. Other forms of observation may focus on a physical location at which users interact for handoffs or information exchange, or on documenting the flow of information from a patient's perspective. Interviews may also be conducted to understand workflow, tasks, and needs. These interviews may be structured or unstructured and conducted with direct users or indirect beneficiaries of the technology, such as patients. During any of these types of observational research, artifacts (paper forms, any written form of information exchange) are collected to document current practices and user thinking about the task or workflow under development. Transcripts and artifacts from field visits can be reused across multiple design projects, but teams should be aware of differences that might exist between the current design project and previous environments and observations. Over time, previous practices may change or be altered by the introduction of technology, so new field research might be necessary.

Analysis 3

Analyze information about users

This activity, once complete, can be reused across projects and tasks (but see same caution as above). Habits and knowledge of user groups may change. This activity requires the analysis and synthesis of information about users gathered through field visits or other forms of empirical research. The purpose is to understand the types of users interacting with the system, as well as characteristics of those users. The output from this activity is used during the design process. The activity results in user profiles and, optionally, personas. The data may contain information related to the physical and social contexts in which the user performs tasks.

Analysis 4

Analyze the task/activity that the product supports

This activity occurs in conjunction with user research and before design begins. Its purpose is to learn about the user's goals and identify tasks performed with the system. It can have several levels of detail. It often starts with an overall goal of the system (which includes the user) and decomposes it hierarchically into tasks. The analysis should include the context in which the tasks are performed, determine if the tasks differ based on different roles, or if multiple personas share the same task flow and end goals. Task analysis helps in the discovery of what tasks and workflows the system must support and the appropriate content and scope.

Analysis 5 Evaluate current application/feature

This activity is optional. It occurs in the context of existing system features. The purpose is to create a benchmark from which to compare a new system or enhancement. The evaluation includes, but is not limited to, gathering customer complaints, observing user interactions with the system, and interviewing existing customers. The objective is to identify unnecessary steps in the workflow, redundant or misleading information, or other items that negatively impact the user's ability to effectively interact with the system.


Analysis 6 Create Usability Objectives

Usability objectives are defined quality objectives for a system, including the user's interactions with it. Typically, usability objectives are based on the following measures, depending upon the workflow/tasks and the importance or prioritization of goals:

Error counts/types

Time to complete the task/workflow

Time required to learn the steps necessary to perform the task

Successful completion of the task

Satisfaction with the steps necessary to perform the task as well as the user interfaces and system performance.

The objectives are compared with the actual measures collected when a user is interacting with the system. When used in conjunction with usability tests, the measures help product management determine whether further design iterations are necessary.

Analysis 7 Write Scenarios

This activity results in scenarios created from the information gathered through user research and task analysis. The scenarios include fictional narratives of typical sets of steps that a user takes to accomplish a task or workflow. The scenarios should cover as many real world situations as possible, including the 'happy paths' and 'alternate paths'. The scenarios are fed into the design and can be used to identify alternate workflows and potential use errors. The scenarios can also be used to document the steps a participant needs to take to complete a test script related to formative or summative testing.

Analysis 8 Determine user requirements

User requirements are created from a user's point of view and describe the expected functionality necessary to solve the user's goal. User requirements are inputs to the UI designs and are particular to a feature. They should not include details about how the user performs the task but, rather, describe what the user is trying to accomplish. They are testable statements of expected functionality and should be traceable to original business requirements. They can be used to create test scripts for usability testing.

Design 9 Create UI design

The information gathered about users, requirements, and tasks is translated into a UI design. An Information Architecture (IA) process should be used to classify and organize the system content and navigation to improve user access to and comprehension of the information. The UI design is usually documented in a UI specification, which details all the interactions, characteristics, and visual aspects of the design. It is important that the UI design requirements be traced back to user requirements and the original design strategy defined in the UCD plan.

Design 10 Develop a prototype

This activity can occur in parallel with task analysis, user requirements definition, and IA activities. Prototypes are models of the new feature or enhancement. Prototypes can be created at different levels of fidelity. Which level is employed will depend on the goal of the prototype. A prototype can be used to evaluate the design or demonstrate to the project team how users will interact with the system. A prototype may not be reusable by the project.


Evaluation 11 Summative testing

This activity occurs when the product is ready for QA and before GA. A group of 10 - 20 users is recruited to perform a set of tasks on one or more interfaces. Objective measurement data is collected, statistically analyzed and compared against the usability quality objectives. The findings may be used to adjust the finished product prior to a GA release.

Evaluation 12 Heuristics evaluations

This activity is optional. It occurs before the development phase of the project. A group of 3 - 5 experts perform separate analyses based upon general design principles. The goal is to identify usability issues early in the design phase. The evaluation can be performed on paper mockups or screen shots. The analysis can focus on performing high priority tasks, or meeting user goals. The output of the evaluators is collated and a list of usability issues is created that can be fed back into the next design iteration.

Evaluation 13 Formative testing

This activity occurs prior to the start of developing the system. It can continue iteratively throughout the development process. A set of 3 - 7 users are given a set of tasks and a prototype. They are asked to perform the tasks to ensure the overall design and navigation meets their needs. This type of testing results in qualitative data that is turned into a list of recommendations to improve the design.
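The comparison of observed measures against usability objectives described in Activities 6 and 11 can be sketched as a simple threshold check. All data and thresholds below are hypothetical, invented for illustration; the report does not state McKesson's actual objectives or results here.

```python
from statistics import mean

# Hypothetical per-participant results for one task (invented data).
results = [
    {"completed": True,  "seconds": 48, "errors": 0},
    {"completed": True,  "seconds": 65, "errors": 1},
    {"completed": False, "seconds": 90, "errors": 3},
    {"completed": True,  "seconds": 52, "errors": 0},
]

# Hypothetical usability objectives for the task (invented thresholds).
objectives = {"success_rate": 0.75, "mean_seconds": 70.0, "mean_errors": 1.0}

# Observed measures, mirroring the success/time/error measures listed above.
success_rate = sum(r["completed"] for r in results) / len(results)
mean_seconds = mean(r["seconds"] for r in results)
mean_errors = mean(r["errors"] for r in results)

# A task meets its objectives when every measure is within its threshold.
met = (success_rate >= objectives["success_rate"]
       and mean_seconds <= objectives["mean_seconds"]
       and mean_errors <= objectives["mean_errors"])

print(f"success={success_rate:.2f} time={mean_seconds:.1f}s "
      f"errors={mean_errors:.2f} objectives_met={met}")
# prints: success=0.75 time=63.8s errors=1.00 objectives_met=True
```

In a real summative test the per-measure results would also be analyzed statistically (e.g., with confidence intervals) rather than compared as raw means, as the Activity 11 description notes.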


References

Schumacher, R.M. & Lowry, S.Z. (2010). NIST guide to the processes approach for improving the usability of electronic health records (NISTIR 7741). Retrieved from http://www.nist.gov/manuscript-publication-search.cfm?pub_id=907313

U.S. Department of Health & Human Services. Step-by-Step Usability Guide. Retrieved from http://www.usability.gov/methods/process.html


EHR Usability Test Report Part I

EHR: Practice Partner® Version 11

User Role: Physician

Setting: Ambulatory

Applications Tested

Practice Partner® VER. 11

Dates

Usability Test: September 9-12, 2013

Report: September 25, 2013

Prepared by

Marita Franzke, Ph.D., Senior HFE, McKesson Provider Technologies

Gergana Pavlova-Grozeva, Business Analyst, McKesson Provider Technologies

Contact: Marita Franzke

McKesson Provider Technologies 11000 Westmoor Cr., Suite 125

Westminster, CO 80021 (720) 356-7711

[email protected]

Report based on ISO/IEC 25062:2006

Common Industry Format for Usability Test Reports


EHR: Practice Partner VER. 11, Part I – Physician – Ambulatory September 25, 2013

Contents

Tables ............................................................................................................................................................... 4

Executive Summary ........................................................................................................................................... 6

Major Findings and Areas for Improvement .................................................................................................. 12

Introduction ...................................................................................................................................................... 16

Intended Users............................................................................................................................................. 16

Method ............................................................................................................................................................ 16

Participants .................................................................................................................................................. 16

Study Design ............................................................................................................................................... 17

Procedures .................................................................................................................................................. 19

Test Locations.............................................................................................................................................. 20

Test Environment ......................................................................................................................................... 20

Test Forms and Tools .................................................................................................................................. 21

Usability Metrics ........................................................................................................................................... 21

Data Scoring ................................................................................................................................................ 22

Results ............................................................................................................................................................ 24

Data Analysis and Reporting ........................................................................................................................ 24

Discussion of the Findings ............................................................................................................................... 29

170.314(a)(1) – CPOE ................................................................................................................................. 29

Test contexts ............................................................................................................................................ 29

Task 3 – Place a new lab order ................................................................................................................ 29

Task 4 – Look up and change existing lab order ....................................................................................... 32

Task 5 – Look up and change existing image order .................................................................................. 34

Task 6 – Place new image order .............................................................................................................. 37

170.314(a)(2) – Drug-drug and Drug-Allergy Interaction Checks .................................................................. 39

Test contexts ............................................................................................................................................ 39

Task 8 and 9 – Order new prescription, receive interaction warning, and cancel ...................................... 39

170.314(a)(6) – Medication List, also 170.314(a)(1) – CPOE - Medication Orders ................................... 42

Test contexts ............................................................................................................................................ 42

Task 7 – Access current prescription/order/medication list item and change entry.................................... 42

Page 3 of 77

EHR: Practice Partner VER. 11, Part I – Physician – Ambulatory September 25, 2013

Task 10 and 11 – Access historical prescription/order/medication list item and change entry ................... 44

Task 8 and 9 – Enter new medications (receive interaction alert and cancel) ........................................... 46

170.314(b)(3) – Electronic Prescribing ......................................................................................................... 49

Test context .............................................................................................................................................. 49

Task 11 – Change med list and e-prescribe.............................................................................................. 49

170.314(a)(8) – Clinical Decision Support .................................................................................................... 51

Test context .............................................................................................................................................. 51

Task 1 – Select patient and review health maintenance alerts .................................................................. 51

Task 2 – Use the InfoButton to review treatment options for patient’s problem. ........................................ 53

List of Appendices ........................................................................................................................................... 57

Appendix A: Demographic Questionnaire ..................................................................................................... 58

Appendix B: Administrator Script and General Instructions .......................................................................... 60

Appendix C: Informed Consent Form ........................................................................................................... 62

Appendix D: Usability Test Instructions ........................................................................................................ 63

Appendix E: Experimenters’ Short Biographies ............................................................................................ 66

Appendix F: Participant Instructions for Each Task ...................................................................................... 67

Appendix G: End-of-Task Questionnaires .................................................................................................... 70

Appendix H: System Usability Scale (SUS) Questionnaire ........................................................................... 75

Appendix I: System and Patient Scenario for Physicians’ Tasks .................................................................. 76


Tables

Table 1. Physicians’ Tasks: Meaningful Use Criteria, Risk, and Application Tested ........................................... 6

Table 2. Result Summary by Task ..................................................................................................................... 8

Table 3. Result Summary by Meaningful Use Criterion .................................................................................... 11

Table 4. Participant Demographics .................................................................................................................. 17

Table 5. Physicians’ Tasks – same as Table 1, repeated here for easier reference ......................................... 18

Table 6. Test Environment, Technical Data ..................................................................................................... 20

Table 7. Data Scoring: Measures & Rationale ................................................................................................. 22

Table 8. Usability Data by Task and Subtask - same as Table 2, repeated here for easier reference .............. 24

Table 9. Usability Data by Meaningful Use Criteria – same as Table 3, repeated here for easier reference ..... 27

Table 10. Usability Data for Task 3 – Order new lab ....................................................................................... 30

Table 11. Participant Errors for Task 3 – Order new lab ................................................................................. 30

Table 12. Participant Comments for Task 3 – Order new lab .......................................................................... 31

Table 13. Usability Data for Task 4 – Look up and change existing lab order ................................................. 32

Table 14. Participant Errors for Task 4 – Look up and change existing lab order ............................................ 33

Table 15. Participant Comments for Task 4 – Look up and change existing lab order .................................... 33

Table 16. Usability Data for Task 5 – Look up and change existing image order ............................................ 34

Table 17. Participant Errors for Task 5 – Look up and change existing image order ....................................... 35

Table 18. Participant Comments for Task 5 – Look up and change existing image order ............................... 36

Table 19. Usability Data for Task 6 – Place new image order ......................................................................... 37

Table 20. Participant Errors for Task 6 – Place new image order ................................................................... 38

Table 21. Participant Comments for Task 6 – Place new image order ............................................................ 38

Table 22. Usability Data for Tasks 8 and 9 – Interaction Warnings ................................................................. 40

Table 23. Participant Errors for Tasks 8 and 9 – Interaction Warnings ............................................................ 40

Table 24. Participant Comments for Tasks 8 and 9 – Interaction Warnings .................................................... 41

Table 25. Usability Data for Task 7 – Lookup medication and change ............................................................ 42

Table 26. Participant Errors for Task 7 - Lookup medication and change ....................................................... 43

Table 27. Participant Comments for Task 7 - Lookup medication and change ................................................ 44

Table 28. Usability Data for Task 10 and 11 – Lookup historical medication and change ............................... 45

Table 29. Participant Errors for Task 10 and 11 – Lookup historical medication and change .......................... 45


Table 30. Participant Comments for Tasks 10 and 11 – Lookup historical medication and change ................. 46

Table 31. Usability Data for Tasks 8 and 9 – Add new medication order ........................................................ 47

Table 32. Participant Errors for Task 8 and 9 – Add new medication order ..................................................... 47

Table 33. Participant Comments for Task 8 and 9 – Add new medication order ............................................. 48

Table 34. Usability Data for Task 11 – Fill prescription and e-prescribe .......................................................... 50

Table 35. Participant Errors for Task 11 – Fill prescription and e-prescribe .................................................... 50

Table 36. Participant Comments for Task 11 – Fill prescription and e-prescribe ............................................. 51

Table 37. Usability Data for Task 1 – Select patient and review health maintenance alerts ............................ 52

Table 38. Participant Errors for Task 1 – Select patient and review health maintenance alerts ....................... 52

Table 39. Participant Comments for Task 1 – Select patient and review health maintenance alerts ............... 53

Table 40. Usability Data for Task 2 – Use the InfoButton ................................................................................ 54

Table 41. Participant Errors for Task 2 – Use the InfoButton .......................................................................... 54

Table 42. Participant Comments for Task 2 – Use the InfoButton ................................................................... 55


Executive Summary

A usability test of Practice Partner® VER. 11 was conducted with 22 physician end users of Practice Partner during the week of September 9-12, 2013, by Marita Franzke and Gergana Pavlova of McKesson Provider Technologies. The testing was conducted at two customer sites, in York, ME, and Pensacola, FL. The purpose of this test was to validate the usability of five Meaningful Use Criteria as implemented in the functions of Practice Partner® VER. 11. The Office of the National Coordinator for Health Information Technology (ONC) requires vendors to test 7 Meaningful Use Criteria outlined in the test procedures for §170.314(g)(3) Safety-enhanced design1 for ambulatory products. The criteria focused on the performance of nursing and administrator tasks are discussed in two other reports, which detail the results of usability studies involving nurses/medical assistants and administrators, as those clinical tasks are typically conducted by those user roles.

During the usability test, 22 physicians served as participants; all were familiar with previous releases of the Practice Partner EHR. In this study, they used the software in 11 simulated but representative tasks. More specifics on the sample demographics and participant experience with the EHR are detailed below in the Method section.

Below is a list of all tasks employed, chosen to evaluate the usability of 5 of the 7 Meaningful Use Criteria outlined in the test procedures for §170.314(g)(3) Safety-enhanced design for ambulatory use. Table 1 below lists the physicians' tasks in order of performance, the linked Meaningful Use criteria, and a risk rating for performing each task.

Table 1. Physicians’ Tasks: Meaningful Use Criteria, Risk, and Application Tested

Task | Meaningful Use Criterion Tested | Application | Risk
1. Select Patient and Review Health Maintenance Alerts | 170.314(a)(8) – Clinical decision support | Practice Partner | Medium
2. Use the InfoButton Search to research problem/treatment information | 170.314(a)(8) – Clinical decision support | Practice Partner / Third Party | Medium
3. Place a new lab order | 170.314(a)(1) – CPOE | Practice Partner | Medium

1 Test Procedure for §170.314(g)(3) Safety-enhanced design. Approved Test Procedure Version 1.2. December 14, 2012; and Health Information Technology: Standards, Implementation Specifications, and Certification Criteria for Electronic Health Record Technology, 2014 Edition; Revisions to the Permanent Certification Program for Health Information Technology, Final Rule.


4. Look up and change existing lab order | 170.314(a)(1) – CPOE | Practice Partner | Medium
5. Look up and change existing image order | 170.314(a)(1) – CPOE | Practice Partner | Medium
6. Place a new image order | 170.314(a)(1) – CPOE | Practice Partner | Medium
7. Look up medication order/prescription and change (make inactive) | 170.314(a)(1) – CPOE; 170.314(a)(6) – Medication list | Practice Partner | High
8. Order/prescribe new medication, receive drug-allergy interaction alert, cancel | 170.314(a)(1) – CPOE; 170.314(a)(6) – Medication list; 170.314(a)(2) – Drug-Drug and Drug-Allergy interaction checks | Practice Partner | High
9. Order/prescribe new medication, receive drug-drug interaction alert, cancel | 170.314(a)(1) – CPOE; 170.314(a)(6) – Medication list; 170.314(a)(2) – Drug-Drug and Drug-Allergy interaction checks | Practice Partner | High
10. Look up medication history | 170.314(a)(6) – Medication list | Practice Partner | High
11. Change med list (activate inactive medication) and e-prescribe | 170.314(a)(1) – CPOE; 170.314(a)(6) – Medication list; 170.314(b)(3) – Electronic prescribing | Practice Partner | High

During the 45-minute, one-on-one usability test, each participant was greeted by an administrator upon entering the conference room. Each participant had prior experience with a previous version of the Practice Partner® EHR. Participants were asked to review a self-guided training module that covered the new InfoButton feature and presented instructions for the usability study and for using the testing software. The administrator introduced the test and instructed participants to complete a series of tasks (given one at a time) using the EHR Under Test (EHRUT). During the testing, data logging software timed the test and recorded user screen captures and audio. These recordings were later used to score action steps, errors, and task completion rates. The test administrator did not assist the participants in completing the tasks.

The following types of data were collected for each participant:

- Number of tasks successfully completed without assistance
- Time to complete each task
- Number and types of errors
- Path deviations
- Participant's verbalizations and written comments
- Participant's satisfaction ratings of the system
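As a rough illustration of how logged measures like these are typically reduced to the per-task figures reported later in Tables 2 and 3, the sketch below computes task success rate, mean task time with standard deviation, and a path-deviation ratio (observed steps divided by optimal steps) from hypothetical session logs. The data structure and field names are illustrative assumptions, not the study's actual scoring tooling.

```python
from statistics import mean, stdev

# Hypothetical session logs: one record per participant attempt at one task.
# Field names are made up for illustration; the study's logging format differs.
sessions = [
    {"success": True,  "seconds": 61.0, "steps": 9},
    {"success": True,  "seconds": 45.5, "steps": 7},
    {"success": False, "seconds": 98.2, "steps": 14},
]
OPTIMAL_STEPS = 7  # step count of the shortest correct path for this task

def summarize(task_sessions, optimal_steps):
    """Reduce raw session logs to the metrics reported per task."""
    times = [s["seconds"] for s in task_sessions]
    return {
        "n": len(task_sessions),
        "success_pct": 100.0 * sum(s["success"] for s in task_sessions) / len(task_sessions),
        "time_mean": mean(times),
        "time_sd": stdev(times) if len(times) > 1 else float("nan"),
        # Path deviation: mean observed step count divided by the optimal count.
        "path_deviation": mean(s["steps"] for s in task_sessions) / optimal_steps,
    }

print(summarize(sessions, OPTIMAL_STEPS))
```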

All participant data were de-identified so that no correspondence can be made between a participant's identity and the data collected. At the end of the test session, each participant completed a post-test questionnaire.

The System Usability Scale (SUS) score for subjective satisfaction with the system, based on performance of these tasks, was 71.76.

According to Tullis and Albert (2008), raw scores under 60 represent systems with poor usability; scores over 80 would be considered above average.2 The details of our results are summarized in Tables 2 and 3 below. Usability judgments were also collected at the end of each major task. The averages obtained from those task-based usability scores were fairly high, varying between 4.12 and 4.94 even for the ordering tasks with which the participants were not familiar. The NISTIR 7742 states that "common convention is that average ratings for systems judged easy to use should be 3.3 or above." One task presented a notable difficulty to the participants and scored only 2. We give special attention to the analysis of this task in the Findings and Recommendations section below.
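For context, the SUS value of 71.76 comes from the standard System Usability Scale scoring rule (Brooke, 1996): each odd-numbered item contributes its response minus 1, each even-numbered item contributes 5 minus its response, and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal sketch of that standard formula, using made-up questionnaire responses rather than this study's data:

```python
def sus_score(responses):
    """Score one 10-item SUS questionnaire (responses on a 1-5 scale).

    Standard SUS scoring: odd-numbered items contribute r - 1,
    even-numbered items contribute 5 - r; the sum is scaled by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses in the range 1-5")
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

# Made-up example response set (not from this study):
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # prints 85.0
```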

Various additional recommended metrics, in accordance with the examples set forth in the NIST Guide

to the Processes Approach for Improving the Usability of Electronic Health Records, were used to

evaluate the usability of the EHRUT. Following is a summary of the performance and rating data

collected on the EHRUT.

Table 2. Result Summary by Task

Task / Subtask | N | Task Success | Path Deviation (Observed/Optimal) | Task Time: Mean Observed in seconds (SD) | Task Time: Optimal | Task Time Deviation (Observed/Optimal) | Errors: Mean (SD) | Task Rating: Mean (SD), 5 = Easy

1. Select Patient and Review Health Maintenance Alerts | - | 100% | 1.38 | 50.70 (15.52) | 33.01 | 1.54 | 0.71 (0.59) | 4.65 (0.61)
  Select Patient | 17 | 100% | 1.68 | 34.65 (15.71) | 20.01 | 1.73 | 0.71 (0.59) | 4.82 (0.39)
  Review Alerts | 17 | 100% | 0.62 | 16.05 (6.68) | 13.00 | 1.24 | 0 | 4.44 (.73)
2. Use the InfoButton Search to research problem/treatment information | 17 | 82% | 1.25 | 79 (47.97) | 26.93 | 3.98 | .43 (.76) | 4.36 (.84)
  Navigation | 17 | 82% | 1.45 | 42.15 (43.49) | 10.58 | 2.25 | .36 (.74) | 4.43 (.94)
  InfoButton Lookup | 17 | 82% | .95 | 36.85 (23.2) | 16.35 | 2.93 | .07 (.27) | 4.43 (.84)
3. Place a new lab order | 17 | 100% | 1.40 | 61.64 (27.43) | 45.03 | 1.37 | 1.17 (1.18) | 4.12 (.86)
  Navigation | 17 | 100% | 2.29 | 25.02 (19.39) | 32.57 | 2.01 | .35 (.7) | 4.12 (.93)
  Place the order | 17 | 100% | 1.10 | 36.61 (13.02) | 12.46 | 1.12 | .82 (.88) | 4.12 (.86)
4. Look up and change existing lab order | 17 | 88.24% | 1.4 | 28.32 (30.53) | 9.91 | 2.86 | .13 (.35) | 4.67 (.62)
  Find order | 17 | 88.24% | 1.43 | 13.35 (19.59) | 3.85 | 3.47 | .13 (.35) | 4.67 (.82)
  Change Order | 17 | 88.24% | 1.36 | 11.28 (9.9) | 6.06 | 1.86 | 0 | 4.73 (.62)
5. Look up and cancel existing image order | 17 | 5.88% | 1.87 | 102.46 (NA)* | NA | NA | 2 | 2
  Find existing image order | 17 | 94.12% | 2 | 4.79 (1.228) | 3.33 | 1.44 | 0 | 4.31 (1.14)
  Cancel Order | 17 | 5.88% | 1.86 | 95.22 (NA) | NA | NA | 2 | 2
6. Place new image order | 17 | 100% | 1.61 | 34.23 (14.60) | 19.7 | 1.73 | .41 (.71) | 4.41 (.51)
  Navigate to Order Screen | 17 | 100% | 1.82 | 7.66 (6.57) | 4.61 | 1.66 | 0 | 4.61 (.49)
  Order Image | 17 | 100% | 1.56 | 26.58 (15.82) | 15.17 | 1.75 | .41 (.71) | 4.41 (.51)
7. Look up medication order/prescription and change (make inactive) | 17 | 94.12% | 1.29 | 19.01 (6.02) | 13.36 | 1.42 | .31 (.47) | 4.94 (.25)
  Find prescription | 17 | 94.12% | 1.5 | 6.84 (3.31) | 4.23 | 1.62 | .06 (.25) | 4.88 (.33)
  Inactivate prescription/order | 17 | 94.12% | 1.19 | 12.18 (4.97) | 9.12 | 1.33 | .25 (.45) | 4.94 (.25)
8. Order new prescription, review and cancel | 17 | 100% | 1.38 | 72.91 (34.14) | 35.1 | 2.08 | .411 (.71) | 4.41 (.87)
  Write prescription | 17 | 100% | 1.57 | 58.78 (30.38) | 27.70 | 2.12 | .06 (.24) | 4.41 (.94)
  Acknowledge drug-allergy interactions and cancel order | 17 | 100% | 1 | 14.13 (7.09) | 7.39 | 1.91 | .35 (.7) | 4.47 (1.01)
9. Order new prescription, review and cancel | 17 | 94.12% | 1.60 | 56.64 (22.35) | 30.62 | 1.85 | .62 (.72) | 4.31 (.7)
  Write prescription | 17 | 94.12% | 1.7 | 38.33 (23.4) | 17.23 | 2.23 | .13 (.34) | 4.44 (.63)
  Acknowledge drug-drug interactions and cancel order | 17 | 94.12% | 1.44 | 18.31 (8.03) | 13.40 | 1.37 | .5 (.73) | 4.25 (.86)
10. Look up medication history | 17 | 100% | 1.64 | 28.28 (16.13) | 13.25 | 2.13 | .87 (1.15) | 4.47 (.74)
  Find medication | 17 | 100% | 1.33 | 20.42 (5.25) | 3.25 | 2.14 | 0 | 4.47 (.64)
  Find details | 17 | 100% | 1.8 | 21.34 (13.6) | 10.00 | 2.13 | .87 (1.13) | 4.53 (.64)
11. Fill prescription and e-prescribe | 16 | 80% | 1.65 | 40.28 (23.84) | 28.41 | 1.41 | .23 (.43) | 4.5 (.76)
  Change med list (make active) | 16 | 80% | 1.32 | 10.38 (8.48) | 4.99 | 2.08 | .18 (.4) | 4.5 (.76)
  e-prescribe | 16 | 80% | 2.17 | 34.13 (19.63) | 23.42 | 1.46 | .08 (.29) | 4.5 (.76)

2 See Tullis, T. & Albert, W. (2008). Measuring the User Experience. Burlington, MA: Morgan Kaufmann (p. 149).

The usability data from Table 2 are grouped by Meaningful Use Criteria in Table 3 below, to give a better overview of the usability scores for each criterion.
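The regrouping from a by-task view (Table 2) to a by-criterion view (Table 3) is a simple grouping step, complicated only by the fact that one task can exercise several criteria. A minimal sketch with hypothetical row records (the records below are made up; only the task names and criterion tags follow Table 1):

```python
from collections import defaultdict

# Hypothetical per-task rows, each tagged with the criteria it exercises.
rows = [
    {"task": "Place a new lab order", "criteria": ["170.314(a)(1)"]},
    {"task": "Order new prescription",
     "criteria": ["170.314(a)(1)", "170.314(a)(2)", "170.314(a)(6)"]},
    {"task": "Look up medication history", "criteria": ["170.314(a)(6)"]},
]

by_criterion = defaultdict(list)
for row in rows:
    for criterion in row["criteria"]:  # one task can serve several criteria
        by_criterion[criterion].append(row["task"])

print(dict(by_criterion))
```

Tasks that map to multiple criteria (such as the prescription tasks) simply appear under each of their criteria, which is why some rows repeat in Table 3.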


Table 3. Result Summary by Meaningful Use Criterion

Meaningful Use Criterion / Task | N | Task Success | Path Deviation (Observed/Optimal) | Task Time: Mean Observed in seconds (SD) | Task Time Deviation (Observed/Optimal) | Errors: Mean (SD) | Task Rating: Mean (SD), 5 = Easy

170.314(a)(1) – CPOE

Ordering …
  labs | 17 | 100% | 1.10 | 36.61 (13.02) | 1.12 | .82 (.88) | 4.12 (.86)
  images | 17 | 100% | 1.56 | 26.58 (15.82) | 1.75 | .41 (.71) | 4.41 (.51)
  medications | 17 | 100% | 1.57 | 58.78 (30.38) | 2.12 | .06 (.24) | 4.41 (.94)
  medications | 17 | 94.12% | 1.7 | 38.33 (23.4) | 2.23 | .13 (.34) | 4.44 (.63)

Finding existing …
  Lab orders | 17 | 88.24% | 1.43 | 13.35 (19.59) | 3.47 | .13 (.35) | 4.67 (.82)
  Image orders | 17 | 94.12% | 2 | 4.79 (1.228) | 1.44 | 0 | 4.31 (1.14)
  Medications | 17 | 94.12% | 1.5 | 6.84 (3.31) | 1.62 | .06 (.25) | 4.88 (.33)
  Medications (historical) | 17 | 100% | 1.33 | 20.42 (5.25) | 2.14 | 0 | 4.47 (.64)

Changing existing …
  Lab orders | 17 | 88.24% | 1.36 | 11.28 (9.9) | 3.47 | .13 (.35) | 4.67 (.82)
  Image orders (cancel) | 17 | 5.88% | 1.86 | 95.22 (NA) | NA | 2 | 2
  Medications (discontinue) | 17 | 94.12% | 1.19 | 12.18 (4.97) | 1.33 | .25 (.45) | 4.94 (.25)
  Medications (activate/renew) | 16 | 80% | 1.32 | 10.38 (8.48) | 2.08 | .18 (.4) | 4.5 (.76)

170.314(a)(2) – Drug-Drug and Drug-Allergy Interaction Checks
  Acknowledging drug-allergy interaction and cancel | 17 | 100% | 1 | 14.13 (7.09) | 1.91 | .35 (.7) | 4.47 (1.01)
  Acknowledging drug-drug interaction and cancel | 17 | 94.12% | 1.44 | 18.31 (8.03) | 1.37 | .5 (.73) | 4.25 (.86)

170.314(a)(6) – Medication list
  Find current medication | 17 | 94.12% | 1.5 | 6.84 (3.31) | 1.62 | .06 (.25) | 4.88 (.33)
  Find medication history | 17 | 100% | 1.33 | 20.42 (5.25) | 2.14 | 0 | 4.47 (.64)
  Change medication (activate) | 16 | 80% | 1.32 | 10.38 (8.48) | 2.08 | .18 (.4) | 4.5 (.76)
  Change medication (discontinue) | 17 | 94.12% | 1.19 | 12.18 (4.97) | 1.33 | .25 (.45) | 4.94 (.25)
  Record new medication | 17 | 100% | 1.57 | 58.78 (30.38) | 2.12 | .06 (.24) | 4.41 (.94)
  Record new medication | 17 | 94.12% | 1.7 | 38.33 (23.4) | 1.37 | .5 (.73) | 4.25 (.86)

170.314(a)(8) – Clinical decision support
  Acknowledge Health Maintenance alert | 17 | 100% | 0.62 | 16.05 (6.68) | 1.24 | 0 | 4.44 (.73)
  Use InfoButton | 17 | 82% | 1.25 | 79 (47.97) | 2.93 | .43 (.76) | 4.36 (.84)

170.314(b)(3) – Electronic prescribing
  e-prescribe, transmit electronically to pharmacy | 16 | 80% | 2.17 | 34.13 (19.63) | 1.46 | .08 (.29) | 4.5 (.76)

In addition to the performance data, qualitative observations were made. A more detailed discussion of the results, tasks, and findings is provided later in the document (in Discussion of the Findings).

Major Findings and Areas for Improvement

This section summarizes the findings and areas for improvement at a high level. For additional detail, please see the section “Discussion of the Findings”.

CPOE, Labs and Images

Findings: The order interface of Practice Partner generally proved to be easy to learn and use, even though none of the participants had used order entry before. There were a few smaller exceptions and one more notable one:

- If users entered the order function through the Order shortcut button, they could have a difficult time finding the list of existing orders or verifying that an order had been submitted.
- The ‘cancel’ function is hidden so well that users simply could not find it; it was also confused with the more accessible ‘delete’ function.
- It is possible that an associated hospital instructed its end users not to use right-click functions (this was mentioned by one of our participants).
- Some users struggled with the search function; they expected the search string to match any word, not just the first word in the target set.

Recommendations:

1. Training: When introducing CPOE, have users enter it through the order folder, which leads to the order list first. Teach the Order button only as a shortcut after users are well aware of the function of the order list.
2. Design: Consider always flowing to the order list after the completion of an order, even if the order was started from the Order button. Users want confirmation that the order has indeed been placed, and this saves them an extra step.
3. Training: When introducing CPOE, pay special attention to the right-click context menu functions and practice them; they are difficult to discover, and difficult to remember even when they have been included in training.
4. Design: Consider making the ‘cancel’ function easier to find; possibly exchange the ‘delete’ button for a ‘cancel’ button. Canceling orders is possibly a more frequently used function than deleting.
5. Training: During training, familiarize users with the search algorithm so users know exactly what to expect and how to use this search function.
6. Design: Consider using a search algorithm that matches the search string in any word, not just the first one.
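The difference behind the search recommendation can be illustrated with two toy matchers: one that, like the behavior participants struggled with, matches the query only against the first word of each entry, and one that matches any word. Function names and the sample order list are hypothetical, not Practice Partner's implementation.

```python
def match_first_word(query, entries):
    # Matches only when an entry's *first* word starts with the query.
    q = query.lower()
    return [e for e in entries if e.lower().split()[0].startswith(q)]

def match_any_word(query, entries):
    # Matches when *any* word in the entry starts with the query.
    q = query.lower()
    return [e for e in entries
            if any(word.startswith(q) for word in e.lower().split())]

orders = ["Chest X-Ray", "X-Ray Left Wrist", "CT Abdomen"]
print(match_first_word("x-ray", orders))  # finds only "X-Ray Left Wrist"
print(match_any_word("x-ray", orders))    # finds both X-Ray entries
```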

Drug-Drug and Drug-Allergy Interaction Checks

Findings: The warnings were well written and easy to understand.

- Users had some difficulty selecting the correct function keys, which resulted from an apparent confusion between ‘accepting’ the warning and ‘canceling’ the prescription.

Recommendations:

1. Design: Make the button labels more direct. Consider using ‘Override’ instead of ‘OK’ and ‘Cancel Order’ instead of ‘Cancel’. If ‘Cancel Order’ is selected, eliminate the extra step of having to cancel the order in the order window.

Medication List (also CPOE for Medication Orders/Prescriptions)

Findings: In general, the med list functions were well practiced and appeared easy to use. Testing uncovered a few smaller issues:

- When filling prescriptions, some users shied away from using templates (which can save many steps) because they were not sure that changes to a template would carry forward.
- A small glitch in the search dialog makes it possible for a user to select a medication instead of initiating a search when using the ‘enter’ key.

Recommendations:

1. Training: Emphasize that templates can save many clicks and that any field in a template can be changed for each particular prescription once it has been applied.
2. Design: Consider indicating that a template has been modified through some visual cue (italicizing, adding an asterisk, etc.; some marker that can be used consistently for this purpose).
3. Design: In the search dialog, when the user places focus in the search field with the cursor, immediately make the ‘search’ button the default function when the ‘enter’ key is pressed.

Clinical Decision Support

Findings: Health maintenance alerts exhibited no usability issues. Access to clinical information via the InfoButton was appreciated as a new function by a number of participants.

- Access using the right-click context menu presented a small stumbling block; many participants did not remember this method despite the training they had received.

Recommendations:

1. Design: If the right-click method is going to be used more frequently, make its function consistent across all Practice Partner windows and functional areas. Alternatively, consider a small icon or link to indicate that clinical content is available.
2. Training: The right-click function needs to receive some emphasis and practice during training so that a habit can be formed; otherwise it will be easily forgotten.

E-prescribing

Findings: In general this was easy to do once the transmission mode indicator had been set.

- A few participants had a difficult time finding the transmission mode indicator.

Recommendations:

1. Training: During training, introduce the list of transmission choices, found under the ‘Print’ label.
2. Design: Consider relabeling this function to ‘transmission’, ‘transmission mode’, or similar.


General Observation:

While the participants seemed to be well

oriented within Practice Partner and seemed to

know to navigate it effortlessly, there was one

small inconsistency in the use of its controls that

led to some struggle:

In some contexts, double-clicking on a list item

leads to more choices, in other similar contexts

it doesn’t. In other contexts the right-click

context menu is used to introduce more choices.

1. Review Practice Partner with respect to this control method, and try to apply it consistently, independent of immediate context. Double-click was a method tried many times by many participants; they expected to find details and additional functions. Right-click was never tried. If right-click is introduced as a new and quick way to access functions, it needs to be introduced across all lists in a similar manner, and training should accompany this introduction, especially training of super users to promote adoption.


Introduction

The McKesson Practice Partner® EHR tested for this study was accessible to the participants in an integrated environment, as would be typical in a provider implementation. The entire environment was software version VER. 11. The usability testing environment and tasks attempted to simulate realistic implementations, workflows, and conditions. The purpose of the study was to test and validate the usability of the current user interface and to provide evidence of usability in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency, and user satisfaction were captured during the usability testing.

Intended Users

Intended users of the EHRUT are physicians of any specialty and mid-level providers in ambulatory practices or clinics, such as nurse practitioners and physician assistants. Users may have varying levels of experience in the clinical setting and varying years of experience performing clinical functions with the EHRUT. They can be expected to have received training on the EHRUT and to be frequent users (weekly or daily use). Depending on site or specialty, they may have varying levels of experience with the specific MU functions and tasks.

Method

Participants

A total of 22 participants were tested on the EHRUT for the physician role in the ambulatory setting. Participants were associated with clinics in one of two customer sites (nine with site A and the remaining 13 with site B). The participants had been recruited by IT staff in their organizations. None of the participants had a direct connection to the development of the product. Eighteen participants were MDs or ODs, and four were PAs or NPs.

All participants had experience using the EHRUT, ranging from 1 to 20 years. All participants use the system for prescribing medications and writing physicians’ notes on a daily basis. However, most participants were not familiar, or not very familiar, with the ordering functions (CPOE), as both sites had only recently started implementing this feature. None of the participants had used the InfoButton feature, which was an enhancement in VER. 11. A demographics questionnaire was filled out before the session began; the questionnaire is included as Appendix A: Demographic Questionnaire.

Six of the participants were excluded from the analysis because of technical difficulties encountered

during the first day of testing.

Table 4 lists all 17 participants included in the analysis by demographic characteristics, education, and EHR experience. Participant names were replaced with Participant IDs so that an individual’s data cannot be tied back to their identity.


Table 4. Participant Demographics

ID | Gender | Age | Education | Specialty | Years in practice | Years using Practice Partner | Frequency of use (filling prescriptions)

1 | f | 31-40 | MD | Pediatrics | 2 | 2 | Many times a day
2 | m | > 61 | MD | Internal Medicine | 33 | 5 | Many times a day
3 | f | 41-50 | MD | Internal Medicine | 15 | 1 | Many times a day
4 | f | 31-40 | PA | Dermatology | 10 | 1 | Many times a day
5 | f | 21-30 | PA | GI | 1 | 1 | Many times a day
6 | f | 31-40 | MD | Ophthalmology | 9 | 7 | Many times a day
7 | m | 41-50 | MD | Geriatrics | 21 | 15 | Many times a day
8 | m | 41-50 | MD | Gastroenterology | 14 | 3 | Many times a day
9 | m | 51-60 | MD | Internal Medicine | 23 | 19 | Many times a day
10 | m | 41-50 | MD | Internal Medicine | 18 | 6 | Many times a day
11 | m | 51-60 | MD | OB-Gyn | 29 | 1.5 | Many times a week
12 | m | 31-40 | MD | Internal Medicine | 7 | 2 | Many times a day
13 | m | 31-40 | MD | Otolaryngology/ENT | 1 | 1 | Many times a day
14 | m | 31-40 | MD | PM&R | 5 | 5 | Less than once a month
15 | f | 31-40 | ARNP | Dermatology | 7 | 1 | Many times a day
16 | m | 51-60 | MD | Plastic Surgery | 24 | 20 | Many times a day
17 | f | 51-60 | MD | Neurology/Sleep Disorders | 20 | 15 | Many times a day

A short, self-paced training PowerPoint presentation was provided to participants before the study to review the new InfoButton functionality. This training module was similar to what end users would usually receive when introduced to new functionality. No training or review of the CPOE ordering functionality was included, because the expectation was that participants would already be familiar with this function. The training also included some explanation of the software used to capture the session data.

Participants were scheduled for 45-minute sessions, which provided time between sessions for a debrief by the administrator and for getting the system ready for the next participant. A spreadsheet was used to keep track of the participant schedule and the participant demographics.

Study Design

Overall, the objective of this test was to uncover areas where the application performed well — that is, effectively, efficiently, and with satisfaction — and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or for comparison with other EHRs, provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a way to identify areas where improvements must be made. Specifically, the purpose of this test was to test the usability of 5 of the 7 Meaningful Use criteria defined in 45 CFR Part 170, RIN 0991-AB82 [3] for ambulatory use. The current study focused on the 5 MU criteria typically used by physicians. Two separate reports detail the findings from tasks associated with the MU criteria typically conducted by nurses and administrators. Table 5 below cross-tabulates which task in the study of physician tasks is associated with which MU criterion and lists the risk associated with that task.

Table 5. Physicians’ Tasks – same as Table 1, repeated here for easier reference

Task | Meaningful Use Criterion Tested | Application | Risk

1 Select Patient and Review Health Maintenance Alerts | 170.314(a)(8) – Clinical decision support | Practice Partner | Medium
2 Use the InfoButton Search to research problem/treatment information | 170.314(a)(8) – Clinical decision support | Practice Partner / Third Party | Medium
3 Place a new lab order | 170.314(a)(1) – CPOE | Practice Partner | Medium
4 Look up and change existing lab order | 170.314(a)(1) – CPOE | Practice Partner | Medium
5 Look up and change existing image order | 170.314(a)(1) – CPOE | Practice Partner | Medium
6 Place a new image order | 170.314(a)(1) – CPOE | Practice Partner | Medium
7 Look up medication order/prescription and change (make inactive) | 170.314(a)(1) – CPOE; 170.314(a)(6) – Medication list | Practice Partner | High
8 Order/prescribe new medication / receive drug-allergy interaction alert / cancel | 170.314(a)(1) – CPOE; 170.314(a)(6) – Medication list; 170.314(a)(2) – Drug-Drug and Drug-Allergy interaction checks | Practice Partner | High
9 Order/prescribe new medication / receive drug-drug interaction alert / cancel | 170.314(a)(1) – CPOE; 170.314(a)(6) – Medication list; 170.314(a)(2) – Drug-Drug and Drug-Allergy interaction checks | Practice Partner | High
10 Look up medication history | 170.314(a)(6) – Medication list | Practice Partner | High
11 Change med list (activate inactive medication) and e-prescribe | 170.314(a)(1) – CPOE; 170.314(a)(6) – Medication list; 170.314(b)(3) – Electronic prescribing | Practice Partner | High

[3] 45 CFR Part 170, RIN 0991-AB82. Health Information Technology: Standards, Implementation Specifications, and Certification Criteria for Electronic Health Record Technology, 2014 Edition; Revisions to the Permanent Certification Program for Health Information Technology.

During the usability test, participants interacted with the above-listed functions within McKesson PracticePartner® VER. 11. Each participant used the system in the exact same configuration and was provided with the same instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:

o Number of tasks successfully completed without assistance

o Time to complete tasks

o Number of errors

o Path deviations

o Participant verbalizations (comments)

o Participant’s satisfaction ratings of the system

Procedures

Each test participant was greeted by the facilitator upon entering the conference room. Once the participant was seated, he or she was verbally instructed and oriented to the test situation (see Appendix B: Administrator Script and General Instructions). Each participant was asked to sign an informed consent form (Appendix C: Informed Consent Form) and then filled out the demographics questionnaire (Appendix A: Demographic Questionnaire). The participant was then asked to step through a set of self-paced training slides. Participants spent between 5 and 10 minutes reviewing this presentation, which introduced the InfoButton functionality. The presentation also included usability test instructions and an introduction to the Morae recording software (Appendix D: Usability Test Instructions). During this presentation participants had a chance to ask questions and discuss the new feature and any issues concerning the study itself.

One test administrator per participant ran the test. Typically, two sessions took place in the same room, but enough space was provided that the two sessions did not interrupt or distract each other. In some cases, additional persons (hospital officials and a product director) were present to observe, but they did not interact with the participants. Data logging was performed automatically by the Morae recording software. Detailed data analysis and scoring of the recorded protocols was done after the test procedure by coding the recordings with more fine-grained task, action, and error markers.

Two facilitators conducted the test: a senior usability professional (Ph.D. in Cognitive Psychology with a Human Factors emphasis) with over 20 years of experience in the field of User Experience Design and Research, and a Business Analyst (see Appendix E: Experimenters’ Short Biographies).

For each task, the testing software provided an on-screen description of the task to be performed. See

"Appendix F: Participant Instructions for Each Task" for the complete text of each on-screen task

prompt.

Task timing began when the participant clicked the Start Task button. Task time was stopped when the

participant clicked the End Task button. Participants were instructed to click End Task when they had

successfully completed the task or when they could proceed no further.

After each task the participant was given a short satisfaction questionnaire related to that task

(Appendix G: End-of-Task Questionnaires). After all tasks were completed, the participant was given a

System Usability Scale (SUS) Questionnaire (Appendix H: System Usability Scale (SUS)

Questionnaire).

Following the test, the administrators provided time for the participants to informally communicate

impressions about the usability of the EHRUT and about the testing process.

Test Locations

Participants were tested in their own facilities. In facility A (York, ME), an IT training room was used on one day and a conference room on another day. In facility B (Pensacola, FL), the boardroom of the hospital building was used for all sessions. While both sites tried to create a quiet space for the sake of testing, their busy environments did not always allow for this. On occasion there was background conversation and other distractions associated with busy hospital environments. Given the usually hectic context of clinical practices and the use of EMRs, this represented fair and realistic conditions under which to test the software.

Test Environment

The participants performed the test tasks on the facilitators’ laptops, which were both running independent copies of the EHR server, client, and interaction engine, all Practice Partner VER. 11. The specifications of the two facilitator machines are as follows: (1) an HP EliteBook Folio 9470m laptop, DPI setting normal (96 DPI), highest color quality (32-bit), 1366 x 768 pixel screen resolution, and a 60 Hz screen refresh rate; and (2) a Dell Latitude E6410 laptop, DPI setting normal (96 DPI), highest color quality (32-bit), 1366 x 768 pixel screen resolution, and a 60 Hz screen refresh rate. Audio was recorded through each system’s internal microphone. Participants interacted with the remote software through their keyboard and mouse or touchpad.

Table 6. Test Environment, Technical Data

Connection to the InfoButton Content | Site A: WAN; dial-in to the McKesson VPN, connecting to the vendor’s website for content. Site B: LAN; dial-in to the McKesson VPN, connecting to the vendor’s website for content.

PracticePartner® VER. 11 Usability Test Environment | Two special usability test environments were created using the generic data configuration used for the PracticePartner environment. The environments simulate the complete EHR, including all functions and backend support software that a real implementation would provide: data integration with external systems allowing for data flow from pharmacy solutions, integration with electronic data transfer to pharmacies, and integration with the InfoButton reference vendor. No other user groups had access to this environment, ensuring a pristine test environment with full control over the configuration of the test data. It is important to note that while all efforts were made to create a fully functioning test environment, the configurations of menus and menu choices, applications, alerts, etc. were similar to, but not the same as, what any particular provider/customer would use. Customized functions differ from site to site, depending on hospital policies and implementation. Our test environment was configured to show what the system is capable of, and therefore presented more alerts, confirmation screens, list items, etc. than many facilities would choose to implement. Generally, system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Data from tasks or trials where system response time slowed perceivably were excluded from the analysis.

EHR software and version | PracticePartner® Version 11

Testing software | Morae Recorder Version 3.3.2 (participant); Morae Manager Version 3.3.2 (data analysis)

Test Forms and Tools

During the usability test, various documents and instruments were used, including:

o Informed Consent

o Demographic Questionnaire

o Experimenter’s Verbal Instructions

o Training slides

o Observer’s Task Sheets

o Post-task surveys (presented through Morae)

o Post-test survey (System Usability Scale, SUS, presented through Morae)

Most of these materials are included in the appendices.

Usability Metrics

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the test were to assess:

o Effectiveness of the EHRUT by measuring participant success rates and errors.

o Efficiency of the EHRUT by measuring the average task time and path deviations.

o Satisfaction with the EHRUT by measuring ease of use ratings.

Data Scoring

The following table details how tasks were scored, how errors were evaluated, and how the time data were analyzed.

Table 7. Data Scoring: Measures & Rationale

Measures | Rationale and Scoring

Effectiveness: Task Success | A task is counted as a “Success” if the participant was able to achieve the correct outcome without user-interface-directed assistance. Direction was occasionally provided to help with understanding task instructions, but never in how a task was to be accomplished. A task is counted as a “Failure” if the participant was never able to find the correct action, or ended the task believing that they had accomplished all tasks without having done so. The total number of successes is calculated for each task and then divided by the total number of times that task was attempted; the results are reported as a percentage. Task times are recorded for successes. Observed task time divided by the optimal time for each task is a measure of optimal efficiency. Optimal task performance time, as benchmarked by the 3 users who finished the task without errors and deviations in the shortest amount of time, was calculated by averaging the times of these 3 users. In cases where more than 2 participants failed a task, the optimal task performance time was calculated using the 2 users who finished the task without errors and deviations in the shortest amount of time. No adjustments were made to the optimal time. Average group times were divided by the optimal performance time for a measure of how close to optimal efficiency each task was performed. It was decided not to define time allotments, because different adjustments would have to be made for new tasks and for existing task flows, and we could not find a generally accepted empirical or principled way to determine meaningful adjustments. All participants were allowed to finish the tasks, and the time ratios provide a measure of how much longer an average user would take to perform these tasks. We felt this was a cleaner and less biased measure than defining task allotment times based on arbitrary time adjustments. Using the same adjustment (or none) across a number of studies also allows better comparison across studies, rather than trying to compare rates or ratios that had each been derived with different adjustments.

Effectiveness: Task Failures and Errors | If the participant abandons the task, does not reach the correct answer, or performs it incorrectly, the task is counted as a “Failure.” For each task, the percentage of participants who successfully completed it is reported; the failure rate is 100 minus this percentage. The total number of errors is calculated for each task and then divided by the total number of times that task was attempted. Not all deviations are counted as errors. Errors are selections of wrong menu items, selections of wrong list items, or entries in wrong data formats. If an error results in a number of corrective steps, only the first action resulting in the error is counted as an error; all corrective steps are counted as deviations. On a qualitative level, an enumeration of errors and error types is collected.

Efficiency: Task Deviations | The participant’s path (i.e., steps) through the application is recorded. Deviations occur if the participant, for example, chooses a path to task completion that requires more steps than the optimal, fills out additional fields, or performs steps to correct a previous error. The observed path is compared to the optimal path; the number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation.

Efficiency: Task Time | Each task is timed from when the participant clicks the Start Task button until the participant clicks the End Task button. Only times for tasks that are successfully completed are included in the average task time analysis. Average time per task is calculated for each task. Variance measures (standard deviation) are also calculated.

Satisfaction: Task Rating | Each participant’s subjective impression of the ease of use of the application is measured by administering both a simple post-task questionnaire and a post-session questionnaire. After each task, the participant is asked to rate “Overall, this task was:” on a scale of 1 (Very Difficult) to 5 (Very Easy). If the task included a number of subtasks, a question addressing the ease of each subtask was included (see Appendix G: End-of-Task Questionnaires). These data are averaged across participants. Common convention is that average ratings for systems judged easy to use should be 3.3 or above. To measure participants’ confidence in and likeability of the EHRUT overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included “I think I would like to use this system frequently,” “I thought the system was easy to use,” and “I would imagine that most people would learn to use this system very quickly.” See the full questionnaire in “Appendix H: System Usability Scale (SUS) Questionnaire”.


Results

Data Analysis and Reporting

The results of the usability test were calculated according to the methods specified in the Usability

Metrics section above.

Table 8. Usability Data by Task and Subtask - same as Table 2, repeated here for easier reference

Tasks and subtasks | N | Task Success (%) | Path Deviation (Observed/Optimal) | Task Time: Mean Observed (SD), in seconds | Task Time: Optimal | Task Time: Deviation (Observed/Optimal) | Errors: Mean (SD) | Task Rating: Mean (SD), 5 = Easy

1 Select Patient and Review Health Maintenance Alerts | – | 100% | 1.38 | 50.70 (15.52) | 33.01 | 1.54 | 0.71 (0.59) | 4.65 (0.61)
  Select Patient | 17 | 100% | 1.68 | 34.65 (15.71) | 20.01 | 1.73 | 0.71 (0.59) | 4.82 (0.39)
  Review Alerts | 17 | 100% | 0.62 | 16.05 (6.68) | 13.00 | 1.24 | 0 | 4.44 (.73)
2 Use the InfoButton Search to research problem/treatment information | 17 | 82% | 1.25 | 79 (47.97) | 26.93 | 2.93 | .43 (.76) | 4.36 (.84)
  Navigation | 17 | 82% | 1.45 | 42.15 (43.49) | 10.58 | 3.98 | .36 (.74) | 4.43 (.94)
  InfoButton Lookup | 17 | 82% | .95 | 36.85 (23.2) | 16.35 | 2.25 | .07 (.27) | 4.43 (.84)
3 Place a new lab order | 17 | 100% | 1.40 | 61.64 (27.43) | 45.03 | 1.37 | 1.17 (1.18) | 4.12 (.86)
  Navigation | 17 | 100% | 2.29 | 25.02 (19.39) | 32.57 | 2.01 | .35 (.7) | 4.12 (.93)
  Place the order | 17 | 100% | 1.10 | 36.61 (13.02) | 12.46 | 1.12 | .82 (.88) | 4.12 (.86)
4 Look up and change existing lab order | 17 | 88.24% | 1.4 | 28.32 (30.53) | 9.91 | 2.86 | .13 (.35) | 4.67 (.62)
  Find order | 17 | 88.24% | 1.43 | 13.35 (19.59) | 3.85 | 3.47 | .13 (.35) | 4.67 (.82)
  Change Order | 17 | 88.24% | 1.36 | 11.28 (9.9) | 6.06 | 1.86 | 0 | 4.73 (.62)
5 Look up and Cancel existing image order | 17 | 5.88% | 1.87 | 102.46 (NA)* | NA | NA | 2 | 2
  Find existing image order | 17 | 94.12% | 2 | 4.79 (1.228) | 3.33 | 1.44 | 0 | 4.31 (1.14)
  Cancel Order | 17 | 5.88% | 1.86 | 95.22 (NA) | NA | NA | 2 | 2
6 Place new image order | 17 | 100% | 1.61 | 34.23 (14.60) | 19.7 | 1.73 | .41 (.71) | 4.41 (.51)
  Navigate to Order Screen | 17 | 100% | 1.82 | 7.66 (6.57) | 4.61 | 1.66 | 0 | 4.61 (.49)
  Order Image | 17 | 100% | 1.56 | 26.58 (15.82) | 15.17 | 1.75 | .41 (.71) | 4.41 (.51)
7 Look up medication order/prescription and change (make inactive) | 17 | 94.12% | 1.29 | 19.01 (6.02) | 13.36 | 1.42 | .31 (.47) | 4.94 (.25)
  Find prescription | 17 | 94.12% | 1.5 | 6.84 (3.31) | 4.23 | 1.62 | .06 (.25) | 4.88 (.33)
  Inactivate prescription/order | 17 | 94.12% | 1.19 | 12.18 (4.97) | 9.12 | 1.33 | .25 (.45) | 4.94 (.25)
8 Order new prescription / review and cancel | 17 | 100% | 1.38 | 72.91 (34.14) | 35.1 | 2.08 | .411 (.71) | 4.41 (.87)
  Write prescription | 17 | 100% | 1.57 | 58.78 (30.38) | 27.70 | 2.12 | .06 (.24) | 4.41 (.94)
  Acknowledge drug-allergy interactions and cancel order | 17 | 100% | 1 | 14.13 (7.09) | 7.39 | 1.91 | .35 (.7) | 4.47 (1.01)
9 Order new prescription / review and cancel | 17 | 94.12% | 1.60 | 56.64 (22.35) | 30.62 | 1.85 | .62 (.72) | 4.31 (.7)
  Write prescription | 17 | 94.12% | 1.7 | 38.33 (23.4) | 17.23 | 2.23 | .13 (.34) | 4.44 (.63)
  Acknowledge drug-drug interactions and cancel order | 17 | 94.12% | 1.44 | 18.31 (8.03) | 13.40 | 1.37 | .5 (.73) | 4.25 (.86)
10 Look up medication history | 17 | 100% | 1.64 | 28.28 (16.13) | 13.25 | 2.13 | .87 (1.15) | 4.47 (.74)
  Find medication | 17 | 100% | 1.33 | 20.42 (5.25) | 3.25 | 2.14 | 0 | 4.47 (.64)
  Find details | 17 | 100% | 1.8 | 21.34 (13.6) | 10.00 | 2.13 | .87 (1.13) | 4.53 (.64)
11 Fill prescription and e-prescribe | 16 | 80% | 1.65 | 40.28 (23.84) | 28.41 | 1.41 | .23 (.43) | 4.5 (.76)
  Change med list (make active) | 16 | 80% | 1.32 | 10.38 (8.48) | 4.99 | 2.08 | .18 (.4) | 4.5 (.76)
  e-prescribe | 16 | 80% | 2.17 | 34.13 (19.63) | 23.42 | 1.46 | .08 (.29) | 4.5 (.76)
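The time-deviation column is the mean observed time divided by the optimal time. A quick consistency check against a few rows transcribed from the table (some other rows differ in the final digit, presumably because the source ratios were computed from unrounded times):

```python
# Spot-check the Deviations (Observed / Optimal) time column of Table 8:
# (mean observed time, optimal time, printed ratio), values from the report.
rows = {
    "Select Patient and Review Health Maintenance Alerts": (50.70, 33.01, 1.54),
    "Place a new lab order": (61.64, 45.03, 1.37),
    "Use the InfoButton Search": (79.0, 26.93, 2.93),
}
for task, (observed, optimal, printed) in rows.items():
    # Ratio rounded to two decimals matches the printed value for these rows.
    assert round(observed / optimal, 2) == printed, task
```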

The results from the System Usability Scale scored the subjective satisfaction with the system, based on performance with these tasks, at 71.76.

According to Tullis and Albert (2008), raw scores under 60 represent systems with poor usability, while scores over 80 would be considered above average [4]. The details of our results are summarized in Tables 2 and 3 below. Usability judgments were also collected at the end of each major task. The averages obtained from those task-based usability scores were fairly high and varied between 4.12 and 4.94, even for the ordering tasks that the participants were not familiar with. NISTIR 7742 states that “common convention is that average ratings for systems judged easy to use should be 3.3 or above.” One single task presented a notable difficulty to the participants and scored only 2. We give special attention to the analysis of this task in the Findings and Recommendations section below.

Various additional recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.

In the following table, Table 9, the usability metrics are reported grouped by MU criterion. Please note that the MU criteria related to clinician tasks have also been tested with two other user groups (nurses and administrators); those results are summarized in two separate reports.

[4] Tullis, T., & Albert, W. (2008). Measuring the User Experience. Burlington, MA: Morgan Kaufmann (p. 149).
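The report does not spell out how the SUS total of 71.76 was computed, but the conventional scoring rule (Brooke’s original method) maps the ten 1-5 item ratings onto a 0-100 scale. A minimal sketch, assuming that standard convention:

```python
def sus_score(responses: list[int]) -> float:
    """Standard SUS scoring: ten items rated 1-5.
    Odd-numbered items (positively worded) contribute (rating - 1);
    even-numbered items (negatively worded) contribute (5 - rating);
    the summed contributions are scaled by 2.5 onto a 0-100 range."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    contributions = [(r - 1) if i % 2 == 0 else (5 - r)
                     for i, r in enumerate(responses)]
    return sum(contributions) * 2.5
```

On this scale, a respondent who strongly agrees with every positive item and strongly disagrees with every negative one scores 100; all-neutral answers score 50.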


Table 9. Usability Data by Meaningful Use Criteria – same as Table 3, repeated here for easier reference

Meaningful Use Criteria and Tasks | N | Task Success (%) | Path Deviation (Observed/Optimal) | Task Time: Mean Observed (SD), in seconds | Task Time: Deviation (Observed/Optimal) | Errors: Mean (SD) | Task Rating: Mean (SD), 5 = Easy

170.314(a)(1) – CPOE

Lab orders
Place a new lab order | 17 | 100% | 1.40 | 61.64 (27.43) | 1.37 | 1.17 (1.18) | 4.12 (.86)
Navigation | 17 | 100% | 2.29 | 25.02 (19.39) | 2.01 | .35 (.7) | 4.12 (.93)
Place the order | 17 | 100% | 1.10 | 36.61 (13.02) | 1.12 | .82 (.88) | 4.12 (.86)
Look up and change existing lab order | 17 | 88.24% | 1.4 | 28.32 (30.53) | 2.86 | .13 (.35) | 4.67 (.62)
Access order | 17 | 88.24% | 1.43 | 13.35 (19.59) | 3.47 | .13 (.35) | 4.67 (.82)
Change Order | 17 | 88.24% | 1.36 | 11.28 (9.9) | 1.86 | 0 | 4.73 (.62)

Image Orders
Place new image order | 17 | 100% | 1.61 | 34.23 (14.60) | 1.73 | .41 (.71) | 4.41 (.51)
Navigate to Order Screen | 17 | 100% | 1.82 | 7.66 (6.57) | 1.66 | 0 | 4.61 (.49)
Order Image | 17 | 100% | 1.56 | 26.58 (15.82) | 1.75 | .41 (.71) | 4.41 (.51)
Look up and Cancel existing image order | 17 | 5.88% | 1.87 | 102.46 (NA)* | NA | NA | 2
Find existing image order | 17 | 94.12% | 2 | 4.79 (1.228) | 1.44 | 0 | –
Cancel Order | 17 | 5.88% | 1.86 | 95.22 (NA) | NA | NA | 2

Medication Orders (prescribing)
Look up medication order/prescription and change (make inactive) | 17 | 94.12% | 1.29 | 19.01 (6.02) | 1.42 | .31 (.47) | –
Order new prescription / review and cancel | 17 | 100% | 1.38 | 72.91 (34.14) | 2.08 | .411 (.71) | 4.41 (.87)
Order new prescription / review and cancel | 17 | 94.12% | 1.60 | 56.64 (22.35) | 1.85 | .62 (.72) | 4.31 (.7)

170.314(a)(2) – Drug-Drug and Drug-Allergy Interaction Checks
Acknowledging drug-allergy interaction and cancel | 17 | 100% | 1 | 14.13 (7.09) | 1.91 | .35 (.7) | 4.47 (1.01)
Acknowledging drug-drug interaction and cancel | 17 | 94.12% | 1.44 | 18.31 (8.03) | 1.37 | .5 (.73) | 4.25 (.86)

170.314(a)(6) – Medication list
Find current medication | 17 | 94.12% | 1.5 | 6.84 (3.31) | 1.62 | .06 (.25) | 4.88 (.33)
Find medication history | 17 | 100% | 1.33 | 20.42 (5.25) | 2.14 | 0 | 4.47 (.64)
Change medication (activate) | 16 | 80% | 1.32 | 10.38 (8.48) | 2.08 | .18 (.4) | 4.5 (.76)
Change medication (discontinue) | 17 | 94.12% | 1.19 | 12.18 (4.97) | 1.33 | .25 (.45) | 4.94 (.25)
Record new medication | 17 | 100% | 1.57 | 58.78 (30.38) | 2.12 | .06 (.24) | 4.41 (.94)
Record new medication | 17 | 94.12% | 1.7 | 38.33 (23.4) | 1.37 | .5 (.73) | 4.25 (.86)

170.314(a)(8) – Clinical decision support
Acknowledge Health Maintenance alert | 17 | 100% | 0.62 | 16.05 (6.68) | 1.24 | 0 | 4.44 (.73)
Use InfoButton | 17 | 82% | 1.25 | 79 (47.97) | 2.93 | .43 (.76) | 4.36 (.84)

170.314(b)(3) – Electronic prescribing
e-prescribe, transmit electronically to pharmacy | 16 | 80% | 2.17 | 34.13 (19.63) | 1.46 | .08 (.29) | 4.5 (.76)


Discussion of the Findings

This section presents the study findings, organized by Meaningful Use Criteria.

170.314(a)(1) – CPOE

Test contexts

CPOE tasks were tested in two different contexts:

Labs and images are frequently ordered to be performed by different entities, and a classic ordering function exists in Practice Partner. Ordering through this function was new to the participants at both sites, and no training was provided on it. The results from these tasks (tasks 3-6) should therefore be seen as an indication of how easily a novice could learn ordering through exploration in Practice Partner.

Writing a medication order in an ambulatory context is equivalent to writing a prescription. A number of tasks tested the interaction with the medication list, from which prescriptions can be written; they are reviewed in the Medication list and e-prescribing sections below. Because of the way this functionality is implemented in Practice Partner, tasks 7-11 are associated with CPOE, Medication list, Drug-drug and drug-allergy interactions, and e-prescribing. These tasks were all well practiced for the participants in our study.

Task 3 – Place a new lab order

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(1) – CPOE

Task Description

This task asked participants to fill out a STAT order for a urine analysis. This was the first time the participants had to fill out an order. They had not received instructions, so we can expect a good deal of exploration and small navigational errors in this task.

Risk Analysis

Medium. While getting timely and accurate labs drawn for patients is crucial for medical decision making, this task does not run the risk of causing immediate harm.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.


Exclusions

None

Quantitative Summary

Table 10. Usability Data for Task 3 – Order new lab

Columns: Task | N | Task Success (%) | Path Deviation (Observed/Optimal) | Task Time: Mean in seconds (SD) | Optimal Time | Time Deviation (Observed/Optimal) | Errors: Mean (SD) | Task Rating: Mean (SD), 5 = Easy

3 Place a new lab order | 17 | 100% | 1.40 | 61.64 (27.43) | 45.03 | 1.37 | 1.17 (1.18) | 4.12 (.86)
– Navigation | 17 | 100% | 2.29 | 25.02 (19.39) | 32.57 | 2.01 | .35 (.7) | 4.12 (.93)
– Place the order | 17 | 100% | 1.10 | 36.61 (13.02) | 12.46 | 1.12 | .82 (.88) | 4.12 (.86)

Participant Errors

Table 11. Participant Errors for Task 3 – Order new lab

Participant(s) | Severity | Error Description
4, 6, 20, 15, 15 | Minor | Navigation exploration
4 | Minor | Wrong order type selected; immediately corrected by selecting the correct one
4, 21, 21, 12, 13, 14, 15, 18, 7, 7, 8, 9 | Minor | Control exploration

Effectiveness

All participants performed this task successfully. Several participants commented that they were not familiar with this task and appeared a little hesitant to try it. As the list of errors shows, errors fall mostly into the category of navigation errors (accessing the wrong tab or menu to find orders) and control exploration errors (understanding which parts of the order entry screen can be clicked or double-clicked to achieve the desired result). But even though navigation mistakes were made, 14/17 participants found the order entry screen immediately, and even though many minor mistakes were made in filling the order, the order entry screen was forgiving and self-explanatory enough that all participants eventually filled and submitted their first order successfully.

Efficiency

Given that this was the participants' first interaction with the order functionality, the step deviation ratios (2.29 for navigation and 1.10 for placing the order) were very small, and the time deviation ratios varied accordingly (2.01 for navigation and 1.12 for placing the order). Overall, it took the participants a little over a minute to place their first order. It is important to note that they were able to cut this time in half by the second new order (task 6, place image order), which also demonstrates that the interface supported learning very well.
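The deviation ratios used throughout these tables are plain observed-to-optimal quotients, in the style of Tullis & Albert. A minimal sketch of the arithmetic, using the Task 3 values from Table 10 (the function name is ours, for illustration only):

```python
# Deviation ratios in this report: the observed value divided by the
# optimal (error-free, expert-path) value for the same task.

def deviation_ratio(observed: float, optimal: float) -> float:
    """Observed/optimal quotient; 1.0 means the participant matched the optimal path or time."""
    return observed / optimal

# Illustrative values from Table 10 (Task 3 - place a new lab order):
mean_observed_time = 61.64   # seconds, mean across 17 participants
optimal_time = 45.03         # seconds, for the optimal path

print(round(deviation_ratio(mean_observed_time, optimal_time), 2))  # 1.37
```

The same quotient applies to path deviations, with step counts in place of times.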

Comments

Comments are listed only if a participant provided a written comment.

Table 12. Participant Comments for Task 3 – Order new lab

Participant Do you have any additional comments?

20 for no instructions at all not too bad

21 A keyboard shortcut and/or better tabs for switching between windows in PracticePartner would be very useful in general.

13 fairly easy and user friendly

15 First time I used the order menu button- Again, I think should be easy after using it more

9 When the culture returns, I would like to be notified in practice partner. This way, I can review results quicker.

Satisfaction

The participants' comments show that the users were surprised at how easy ordering seemed given that they had never used it and had not received instructions; the usability score (4.12) mirrors this observation.

Major Findings

Most users found one of the two ways of entering orders immediately. We noticed later that the shortcut to the order entry screen, even though it was especially efficient, produced the unwanted side effect that users did not see their orders in the order list after the order sequence and were not sure whether the order had actually been placed. The conceptually more meaningful way to enter orders appears to be through the list of orders, because it presents the context of existing orders both before starting the order and after it has been placed. It might be good to point this out during training. One could also consider always displaying the list of orders at the end of the order sequence, even if the user has started the order by clicking the shortcut button. Other observations: (1) some users struggled with the search algorithm (they expected their entered search term to match any part of the string they were looking for instead of only the first word); (2) some users thought they could click the order status in the table on the order form to change the status (from routine to stat).

Task 4 – Look up and change existing lab order

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(1) – CPOE

Task Description

This task asked participants to change an existing blood glucose order from routine to stat. This is the second order task, but the first time participants had to find and access an existing order.

Risk Analysis

Medium. While getting timely and accurate labs drawn for patients is crucial for medical decision making, this task does not run the risk of causing immediate harm or impacting patient safety directly.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.

Exclusions

None

Quantitative Summary

Table 13. Usability Data for Task 4 – Look up and change existing lab order

Columns: Task | N | Task Success (%) | Path Deviation (Observed/Optimal) | Task Time: Mean in seconds (SD) | Optimal Time | Time Deviation (Observed/Optimal) | Errors: Mean (SD) | Task Rating: Mean (SD), 5 = Easy

4 Look up and change existing lab order | 17 | 88.24% | 1.4 | 28.32 (30.53) | 9.91 | 2.86 | .13 (.35) | 4.67 (.62)
– Find order | 17 | 88.24% | 1.43 | 13.35 (19.59) | 3.85 | 3.47 | .13 (.35) | 4.67 (.82)
– Change order | 17 | 88.24% | 1.36 | 11.28 (9.9) | 6.06 | 1.86 | 0 | 4.73 (.62)


Errors

Table 14. Participant Errors for Task 4 – Look up and change existing lab order

Participant(s) | Severity | Error Description
4, 4, 5, 7, 20 | Minor | Navigation

Effectiveness

All but 2 participants finished this task successfully. The error table shows that the earlier exploration of navigation and of the controls in the order window allowed them to perform this task more effectively, with far fewer mistakes. All of the navigation mistakes came from participants who clicked on the order shortcut button first. To find the list of orders, the order tab in the patient record has to be selected. One participant failed this task because she could never find that order tab – she kept trying to access the existing orders through the order window.
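The success rates reported throughout are simple fractions of the participant pool ("all but 2" of 17 participants is 15/17 = 88.24%). A quick sketch of the arithmetic (the helper name is ours, for illustration only):

```python
# Success percentages in these tables are fractions of the participant
# pool, rounded to two decimal places.

def success_pct(successes: int, n: int = 17) -> float:
    return round(100 * successes / n, 2)

print(success_pct(15))  # 88.24  ("all but 2" participants)
print(success_pct(16))  # 94.12
print(success_pct(1))   # 5.88
```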

Efficiency

Most participants found the existing order quickly and with few extra steps (the step deviation score was 1.43). However, a few participants looked at the order screen first and took a little longer to find the order tab that links to the list of orders, leading to a larger time deviation score (3.47). Once the order screen was found, most participants quickly made their change; however, one participant could not work out how to change the order status. Step and time deviations for making the change are fairly small (1.36 and 1.86, respectively).

Comments

Comments are listed only if a participant provided a written comment.

Table 15. Participant Comments for Task 4 – Look up and change existing lab order

Participant Do you have any additional comments?

4 no prior training

5 Never done before but easy to do.

20 again no idea where existing order would be

9 Next to the order, I would like one of the columns to mention routine vs stat. Also, will a PIN be required to order these like it would be required for a medication.


Satisfaction

The satisfaction ratings were high, 4.67 across the whole task, reflecting that participants started to feel confident that they would know how to use the order functionality.

Major Findings

The observations from Task 3 apply here as well. In this task, where existing orders need to be accessed, the order shortcut button presents a bit of a trap: if participants see it as the main way to access the order functionality, they might not notice that access to the list of existing orders is through a tab in the patient record. This should be emphasized in training when introducing the order functionality. Changing the status indicator is generally easy once it has been found.

Task 5 – Look up and change existing image order

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(1) – CPOE

Task Description

This task asked participants to find an existing CT order and cancel it. This is the third order task, but the first image order task and the first and only time participants had to cancel an order.

Risk Analysis

Medium. While getting timely and accurate labs drawn for patients is crucial for medical decision making, this task does not run the risk of causing immediate harm or impacting patient safety directly.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.

Exclusions

None

Quantitative Summary

Table 16. Usability Data for Task 5 – Look up and change existing image order

Columns: Task | N | Task Success (%) | Path Deviation (Observed/Optimal) | Task Time: Mean in seconds (SD) | Optimal Time | Time Deviation (Observed/Optimal) | Errors: Mean (SD) | Task Rating: Mean (SD), 5 = Easy

5 Look up and cancel existing image order | 17 | 5.88% | 1.87 | 102.46 (NA)* | NA | NA | 2 | 2
– Find existing image order | 17 | 94.12% | 2 | 4.79 (1.228) | 3.33 | 1.44 | 0 | 4.31 (1.14)
– Cancel order | 17 | 5.88% | 1.86 | 95.22 (NA) | NA | NA | 2 | 2

Errors

Table 17. Participant Errors for Task 5 – Look up and change existing image order

Participant(s) | Severity | Error Description
18, 9, 7, 6, 5, 17, 16, 15, 14, 13, 12, 21, 20 | Severe | Used 'delete' instead of the cancel function because they could not find a cancel option.
9, 7, 7, 6, 6, 5, 5, 15, 15, 14, 14, 14, 13, 12, 21, 20, 20, 20, 20, 20, 6, 4, 4 | Minor | Navigation; the participant is looking for a 'cancel' option but cannot find it. Each count in the leftmost column represents the start of another unsuccessful navigational exploration, so some participants started several before selecting the 'delete' button.

Effectiveness

Only 1 participant finished this task successfully. Most other participants searched the order window and the order list window for a 'cancel' function, finally gave up, and used the readily available 'delete' function, even though many were quite certain that this was not what they had been asked to do. Clearly, the problem was not with finding the existing order – that was done in less than 5 seconds, and by nearly 95% of the participants. Image orders are just as easy to find as lab orders.

Efficiency

The one successful participant searched for 95 seconds before finding the 'cancel' function, but many participants searched for at least that long before giving up or selecting the wrong function. With only one successful participant, deviation scores and mean error scores could not be meaningfully derived.


Comments

Comments are listed only if a participant provided a written comment.

Table 18. Participant Comments for Task 5 – Look up and change existing image order

Participant Do you have any additional comments?

5 As long as deleting an order cancels it that is a fast way to do it.

6 Not clear if deleting an order cancels the order.

20 had to be told where existing orders were

13 slightly confusing with "cancel" button vs. deleting item from list

15 Just not familiar with the order menue.

7 not clear about deleting vs cancelling orders

9 Is there any record of this order existing. Maybe it should state beside the order that it was cancelled be a certain provider.

Satisfaction

Finding the order was perceived as easy (the average satisfaction rating was 4.31). Only one participant found the 'cancel' function, and he found this subtask very difficult (with a rating of 2). The comments above demonstrate that participants were confused about the difference between canceling and deleting, and simply could not find the right-click menu functions.

Major Findings

When evaluating this task, one has to keep in mind that none of the participants had received any instruction, and none of them had ever performed this task before. However, the function for canceling an order (hidden behind a right-click context menu) all but escaped discovery – a startling contrast to all other order actions, which had generally been easy to discover. Right-click context menus are not implemented consistently throughout Practice Partner (although they do exist in other parts of the application), and the tested users (except for one) clearly never thought of using one. [What may have added to this issue is a policy at an associated hospital 'to not use any of the right-click menus', which was brought to our attention by one of our participants. We do not know for sure, but this may have reduced our participants' willingness to try right-clicking as an exploratory action.] However, many, if not all, tried double-clicking the order to see if they could access more functions in this manner. To make matters worse, a 'delete' button was clearly visible at the bottom of the order list. Although many participants rightly suspected that this was not the correct action (as they wanted to preserve a note about who canceled the order and why), many ended up choosing it as the only alternative they could find. Suggestions are to promote right-click context menus through training and super-user education, to review whether right-click vs. double-click methods might need to be implemented more consistently throughout Practice Partner, and to consider turning the clearly visible 'delete' button into the 'cancel' action, which is clinically more relevant and probably a more frequent user action. The 'delete' action could be moved into the context menu. As our 'info-button' task shows (see details below), even if instructions on using right-click context menus are provided, many users forget – it is not an instruction that is easily remembered from a single training session, because it relies on pure recall rather than recognition of a cue in the display.

Task 6 – Place new image order

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(1) – CPOE

Task Description

This task asked participants to place an order for a stat renal ultrasound. This is the second time participants had to enter a new order; except for the particulars of the order, they should have started to feel familiar with this task at this point.

Risk Analysis

Medium. While getting timely and accurate images taken for patients is crucial for medical decision making, this task does not run the risk of causing immediate harm or impacting patient safety directly.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.

Exclusions

None

Quantitative Summary

Table 19. Usability Data for Task 6 – Place new image order

Columns: Task | N | Task Success (%) | Path Deviation (Observed/Optimal) | Task Time: Mean in seconds (SD) | Optimal Time | Time Deviation (Observed/Optimal) | Errors: Mean (SD) | Task Rating: Mean (SD), 5 = Easy

6 Place new image order | 17 | 100% | 1.61 | 34.23 (14.60) | 19.7 | 1.73 | .41 (.71) | 4.41 (.51)
– Navigate to order screen | 17 | 100% | 1.82 | 7.66 (6.57) | 4.61 | 1.66 | 0 | 4.61 (.49)
– Order image | 17 | 100% | 1.56 | 26.58 (15.82) | 15.17 | 1.75 | .41 (.71) | 4.41 (.51)


Errors

Table 20. Participant Errors for Task 6 – Place new image order

Participant(s) | Severity | Error Description
4, 5, 15, 16, 16 | Minor | Exploring control
5, 5 | Minor | Misleading search term used

Effectiveness

All participants finished this task successfully. The error and comments tables show that ordering had become easier, although some smaller struggles remained. Two participants tried to search for the order type and ran into difficulties because the search engine only looks for matches in the first word. This led the two participants to conclude that the order type did not exist (they entered 'ultrasound', and since the search only matches the first word, it showed 'ultrasound therapy' but not 'renal ultrasound'). Had the facilitator not asked them to look in other ways, they would have given up their search under normal circumstances. Participants also still struggled with expanding the search tree, but those issues were easily overcome.

Efficiency

Overall, participants found the order screen quickly at this point and were also able to place their order quickly. They cut the time needed to fill the order form in half compared with their first order task (task 3). On average, the order was placed in close to 30 seconds.

Comments

Comments are listed only if a participant provided a written comment.

Table 21. Participant Comments for Task 6 – Place new image order

Participant Do you have any additional comments?

5 Ultrasound of the kidney was not in the lookup group, but there in the double click of the radiology group. I would very rarely order that test, so would not have found it without coaching.

20 now that i know !

7 practice makes things easier


Satisfaction

The greater familiarity was evidenced in two of the comments (see participants 20 and 7 above) and in relatively high satisfaction scores (4.41 across the whole task).

Major Findings

The main finding from this task was that the search algorithm can be misleading for this kind of search. Participants expect a partial match regardless of its position in the item name (first or second word), and may wrongly conclude that an item does not exist when the desired order item does not appear in the list.
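The mismatch between the tested search behavior and participants' expectations can be sketched as follows. The order names come from the example in the text; the matching functions are our own illustration, not Practice Partner code:

```python
# First-word (prefix) matching, as the tested search behaves: the term
# is only compared against the first word of each order name, so
# 'ultrasound' matches 'ultrasound therapy' but misses 'renal ultrasound'.
def first_word_match(term: str, items: list[str]) -> list[str]:
    return [i for i in items if i.lower().split()[0].startswith(term.lower())]

# Substring matching, which participants expected: the term may match
# any position within the order name.
def substring_match(term: str, items: list[str]) -> list[str]:
    return [i for i in items if term.lower() in i.lower()]

orders = ["ultrasound therapy", "renal ultrasound"]
print(first_word_match("ultrasound", orders))  # ['ultrasound therapy']
print(substring_match("ultrasound", orders))   # ['ultrasound therapy', 'renal ultrasound']
```

With prefix matching, the second result never appears, which is exactly the "item does not exist" conclusion the two participants drew.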

170.314(a)(2) – Drug-drug and Drug-Allergy Interaction Checks

Test contexts

Tasks 8 and 9 were included to test whether participants would be able to understand drug-allergy (task 8) and drug-drug (task 9) warnings and would be able to cancel orders successfully after receiving them. These tasks were part of a longer sequence of tasks dealing with finding a good treatment plan for the patient: it started with discontinuing an existing prescription (or medication 'order', task 7), continued with trying to place two orders or prescriptions that both resulted in interaction alerts (tasks 8 and 9), and ended with looking up a historical medication and activating that prescription (tasks 10 and 11).

Task 8 and 9 – Order new prescription, receive interaction warning, and cancel

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(8) – Clinical decision support

Task Description

In task 8 participants were asked to order/prescribe amoxicillin, which results in a drug-allergy warning for the patient. In task 9 an order/prescription for Cipro results in a drug-drug interaction warning. Here we analyze the reactions to the interaction warnings.

Risk Analysis

High. Creating alerts can help clinicians recognize potential interaction issues for patients and allow them to address those issues. However, alerts must be used with care, as too many can lead to alert fatigue, with clinicians missing important alerts because of the possibility of false alarms.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.


Exclusions

None

Quantitative Summary

Table 22. Usability Data for Tasks 8 and 9 – Interaction Warnings

Columns: Task | N | Task Success (%) | Path Deviation (Observed/Optimal) | Task Time: Mean in seconds (SD) | Optimal Time | Time Deviation (Observed/Optimal) | Errors: Mean (SD) | Task Rating: Mean (SD), 5 = Easy

8 Acknowledge drug-allergy interactions and cancel order | 17 | 100% | 1 | 14.13 (7.09) | 7.39 | 1.91 | .35 (.7) | 4.47 (1.01)
9 Acknowledge drug-drug interactions and cancel order | 17 | 94.12% | 1.44 | 18.31 (8.03) | 13.40 | 1.37 | .5 (.73) | 4.25 (.86)

Errors

Table 23. Participant Errors for Tasks 8 and 9 – Interaction Warnings

Task | Participant(s) | Severity | Error Description
8 | 6, 6, 5, 5, 06 | Medium | Participant misinterprets the logic of the controls in the warning window. "OK" in the window means 'okay to order anyway = override', but clearly many participants thought it simply meant acknowledging that they had seen the alert window. After making the initial mistake, they correctly said 'no' to a secondary window asking whether it was okay to place the order despite the interaction warning. These participants appeared stunned by this order of events, and one mistakenly said 'yes' to the second window. Some made the same mistake several times in a row, and again in the next task, which is structurally similar.
9 | 15, 15, 20, 9, 13, 06, 06, 6, 7, 15 | Medium | Same misinterpretation of the warning-window controls as in task 8.
9 | 14 | Severe | Overrides warning by mistake


Effectiveness

All participants performed task 8 successfully, during task 9 one participant gets tricked into overriding

the interaction alert. As noted above in the error analysis, many participants are seen stumbling across

the logic of the button controls in the alert window. To click “OK” in this window means “Override”, and

“Cancel” means “Cancel Order/Prescription”. However, many participants interpret these to be

standard “OK” and “Cancel” buttons as seen in many pop-up windows that simply allow to acknowledge

the warning. After clicking “OK” a second window asks whether ordering the drug is the intended

action, with “Yes” being the default choice. Many of the participants were stunned by this sequence

and repeatedly went through it without quite understanding what they needed to do to cancel (‘cancel’

the order on the order screen.) However, in all but one case, the orders eventually were cancelled

correctly. One participant got confused enough to override the warning. Furthermore, experience with

this task did not seem to help matters. In fact, more mistakes were made during the second pass

through the interaction alert than the first one (see error table above).

Efficiency

These tasks were performed quite efficiently. Step deviation ratios were 1 and 1.44, and time deviation ratios were 1.91 and 1.37.

Participant Comments

Comments are listed only if a participant provided a written comment.

Table 24. Participant Comments for Tasks 8 and 9 – Interaction Warnings

Participant Do you have any additional comments?

6 "Cancel" on warning should cancel entire Rx, instead of going back to Rx to cancel.

Satisfaction

Although participants stumbled over the glitch in the logic of the warning window, they found these two tasks easy to perform (task 8 = 4.47, task 9 = 4.25).

Major Findings

In general, participants had little difficulty with this task. However, given that the whole purpose of the alerts is to present a clear and simple warning, and a mistake in this context can carry a high patient safety risk, one might consider changing the flow of the alert windows, or at least renaming the buttons to communicate their meaning more clearly: OK = Override, Cancel = Cancel Order. If Override is selected, OK would be the default on the next warning message. If Cancel is selected, the order should be canceled without requiring the user to complete this step manually (as the current implementation does).
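The recommended relabeled flow can be sketched as simple dialog logic; all names here are illustrative, not the product's actual code:

```python
# Sketch of the recommended interaction-alert flow: the buttons say what
# they do ("Override" / "Cancel Order"), and canceling completes directly
# instead of sending the clinician back to the order screen.

def handle_interaction_alert(choice: str, confirm_override: bool = False) -> str:
    """choice is the button the clinician pressed on the alert window."""
    if choice == "Override":
        # Overriding still asks for confirmation, defaulting to proceed.
        return "order placed (override)" if confirm_override else "back to alert"
    if choice == "Cancel Order":
        # Cancellation is immediate; no extra step on the order screen.
        return "order canceled"
    raise ValueError("unknown button")

print(handle_interaction_alert("Cancel Order"))                      # order canceled
print(handle_interaction_alert("Override", confirm_override=True))   # order placed (override)
```

The key design change is that the safe action (cancel) is one step, while the risky action (override) keeps its confirmation step.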


170.314(a)(6) – Medication List, also 170.314(a)(1) – CPOE - Medication Orders

Test contexts

These tasks were part of a longer sequence of tasks dealing with finding a good treatment plan for the patient: it started with discontinuing an existing prescription (or medication 'order', task 7), continued with trying to place two orders or prescriptions that both resulted in interaction alerts (tasks 8 and 9), and ended with looking up a historical medication and activating that prescription (tasks 10 and 11). All of these tasks also support the maintenance of the patient's medication list by accessing, adding, and changing items on the list.

Task 7 – Access current prescription/order/medication list item and change entry

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(1) – CPOE

170.314(a)(6) – Medication List

Task Description

In task 7 participants were asked to find an existing prescription for Levaquin and discontinue it.

Risk Analysis

High. Maintaining an accurate medication list and prescriptions has a direct impact on the medications

taken by a patient. Inaccurately prescribed medications could have a high patient safety risk.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.

Exclusions

None

Quantitative Summary

Table 25. Usability Data for Task 7 – Lookup medication and change

| Task / Subtask | N | Success | Path Dev. (Obs/Opt) | Time Mean (SD), s | Time Optimal, s | Time Dev. (Obs/Opt) | Errors Mean (SD) | Rating Mean (SD), 5 = Easy |
|---|---|---|---|---|---|---|---|---|
| 7. Look up medication order/prescription and change (make inactive) | 17 | 94.12% | 1.29 | 19.01 (6.02) | 13.36 | 1.42 | .31 (.47) | 4.94 (.25) |
| Find prescription | 17 | 94.12% | 1.5 | 6.84 (3.31) | 4.23 | 1.62 | .06 (.25) | 4.88 (.33) |
| Inactivate prescription/order | 17 | 94.12% | 1.19 | 12.18 (4.97) | 9.12 | 1.33 | .25 (.45) | 4.94 (.25) |

Errors

Table 26. Participant Errors for Task 7 – Lookup medication and change

| Participant | Error Severity | Error Description |
|---|---|---|
| 20, 15, 17 | Minor | Navigation |
| 20 | Severe | Delete instead of inactivate |
| 12 | Medium | Mistakenly indicated that medication was effective (it should have been labeled as ineffective) |
| 13, 8 | Minor | Explore control |

Effectiveness

All but one participant performed this task successfully. The one participant who failed indicated in her

comments that she was not performing this general function anymore, but had her staff do it for her. All

the other participants were very familiar with this task. The only effectiveness issue concerned

filling out the DC reason. If the participant fills out the reason, they are asked again in another window

whether the medication was effective (this second response is what gets logged in the database).

Because of this, several of the participants had learned to disregard filling out the field in the first dialog,

and simply answer the question with ‘no’ in the second.

Efficiency

Since this is a well-practiced task, participants performed it with few extra steps (overall step deviation ratio was 1.29), and with little extra time (overall time deviation ratio was 1.42).
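For reference, the deviation ratios used throughout this report are simply observed values divided by the optimal (expert-path) values; a minimal sketch using the Task 7 figures from Table 25 (the function name is our own):

```python
def deviation_ratio(observed: float, optimal: float) -> float:
    """Observed / optimal; 1.0 means the participant matched the optimal path or time."""
    return round(observed / optimal, 2)

# Task 7 overall (Table 25): mean observed time 19.01 s vs. optimal 13.36 s
assert deviation_ratio(19.01, 13.36) == 1.42
# Subtask "Find prescription": 6.84 s observed vs. 4.23 s optimal
assert deviation_ratio(6.84, 4.23) == 1.62
```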


Participant Comments

Comments are listed only if a participant provided a written comment.

Table 27. Participant Comments for Task 7 – Lookup medication and change

| Participant | Do you have any additional comments? |
|---|---|
| 5 | Done plenty of these |
| 20 | Have not used this aspect of the program since e prescribing |
| 9 | I accidentally clicked end task before completing it. I will perform this task next. |

Satisfaction

The high satisfaction ratings reflect the familiarity with this task (4.94).

Major Findings

In general, participants had no difficulty with this task. They knew how to navigate to the medication list, look up an existing medication, and discontinue it. Additionally, some users had found a way to handle the redundant double prompt for indicating the ineffectiveness of the medication more efficiently.

Task 10 and 11 – Access historical prescription/order/medication list item and change entry

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(1) – CPOE

170.314(a)(6) – Medication List

Task Description

In tasks 10 and 11, participants were asked to find a previous prescription for Nitrofurantoin and re-activate it.

Risk Analysis

High. Maintaining an accurate medication list and prescriptions has a direct impact on the medications

taken by a patient. Inaccurately prescribed medications could have a high patient safety risk.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.


Exclusions

None

Quantitative Summary

Table 28. Usability Data for Task 10 and 11 – Lookup historical medication and change

| Task / Subtask | N | Success | Path Dev. (Obs/Opt) | Time Mean (SD), s | Time Optimal, s | Time Dev. (Obs/Opt) | Errors Mean (SD) | Rating Mean (SD), 5 = Easy |
|---|---|---|---|---|---|---|---|---|
| 10. Look up medication history | 17 | 100% | 1.64 | 28.28 (16.13) | 13.25 | 2.13 | .87 (1.15) | 4.47 (.74) |
| Find medication | 17 | 100% | 1.33 | 20.42 (5.25) | 3.25 | 2.14 | 0 | 4.47 (.64) |
| Find details | 17 | 100% | 1.8 | 21.34 (13.6) | 10.00 | 2.13 | .87 (1.13) | 4.53 (.64) |
| 11. Change med list (make active) | 16 | 80% | 1.32 | 10.38 (8.48) | 4.99 | 2.08 | .18 (.4) | 4.5 (.76) |

Errors

Table 29. Participant Errors for Task 10 and 11 – Lookup historical medication and change

| Participant | Error Severity | Error Description |
|---|---|---|
| Task 10 – Lookup historical med and details | | |
| 16, 14, 12, 8, 15, 17, 5, 21, 05 | Minor | Control exploration – these participants were looking for the 'details' of the prescription and tried double-clicking the prescription list entry. |

Effectiveness

All participants found the list of historical medications easily and without navigation errors. Finding the

details of the prescription was slightly more difficult. Most participants seemed to expect that double-

clicking the list entry would open up a window with more details and options (see error table and

comment 21 below). After some exploration all participants found a way to look at the details. All


participants easily figured out how to renew (or re-activate) the prescription, but some eventually failed

this task because they could not figure out how to electronically submit (see discussion of e-prescribing

below).

Efficiency

Finding the historical medication was done quickly (step deviation ratio 1.33, time deviation ratio 2.14); finding the details was slightly less efficient (step deviation ratio 1.8); but making the change was again done quickly and easily (step deviation ratio 1.32).

Participant Comments

Comments are listed only if a participant provided a written comment.

Table 30. Participant Comments for Tasks 10 and 11 – Lookup historical medication and change

| Participant | Do you have any additional comments? |
|---|---|
| 5 | This was my first time to look at a prior prescription and details. It wasn't really hard to do. With just a year or so of Practice Partner I haven't run into the need, but will in the future. |
| 21 | I would think double clicking the previous prescription would pull up the detail screen. |
| 12 | would be easy to have a botton with rapid access instead of having to have to edit the prescription |

Satisfaction

Satisfaction ratings were fairly high (4.47, 4.53, and 4.5), and several participants who had not used

this function before verbally expressed that they liked finding past medication history in this manner

(see comment 5 above).

Major Findings

Participants had little difficulty with this task. The only slight stumble was the expectation to double-

click for opening more details about the medication. As mentioned before, it might be a good idea to

review the use of double-clicking and right-clicking for consistency across all functions of Practice

Partner. Our participants often seemed confused about when either method would or would not work and what the result would be. Generally, double-clicking was tried many times, while right-clicking was difficult to discover.

Task 8 and 9 – Enter new medications (receive interaction alert and cancel)

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(1) – CPOE


170.314(a)(6) – Medication List

Task Description

In tasks 8 and 9, participants were asked to enter two new medications. After receiving interaction alerts, they were supposed to cancel these prescription orders. Here only the first step, entering the medications, is evaluated; the alerts are evaluated above, in the section dealing with drug-drug and drug-allergy interaction alerts.

Risk Analysis

High. Maintaining an accurate medication list and prescriptions has a direct impact on the medications

taken by a patient. Inaccurately prescribed medications could have a high patient safety risk.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.

Exclusions

None

Quantitative Summary

Table 31. Usability Data for Tasks 8 and 9 – Add new medication order

| Task / Subtask | N | Success | Path Dev. (Obs/Opt) | Time Mean (SD), s | Time Optimal, s | Time Dev. (Obs/Opt) | Errors Mean (SD) | Rating Mean (SD), 5 = Easy |
|---|---|---|---|---|---|---|---|---|
| 8. Write prescription | 17 | 100% | 1.57 | 58.78 (30.38) | 27.70 | 2.12 | .06 (.24) | 4.41 (.94) |
| 9. Write prescription | 17 | 94.12% | 1.7 | 38.33 (23.4) | 17.23 | 2.23 | .13 (.34) | 4.44 (.63) |

Errors

Table 32. Participant Errors for Task 8 and 9 – Add new medication order

| Participant | Error Severity | Error Description |
|---|---|---|
| 18 (Task 8) | Medium | Picks wrong medication by accident |
| 20 (Task 9) | Medium | Misspells name of drug |


Effectiveness

The medication order is filled by all participants successfully in task 8 and all but one participant in task

9. During task 8, four participants fill out the order by hand (rather than using a template); and during

task 9 two participants use the manual method. When asked about this later, the participants

commented that they were not sure that any changes to the template (Frequency, for example) were

saved, because the name of the template did not appear changed (see also user comment 9 below).

Using the template can save participants many steps and also helps with invoking the alert checks.

Triggering the alerts sometimes fails with manually entered drugs that don’t trigger an NDC number.

The one task failure came from the participant who does not use the medication list (she has assistants

helping her with that). Another issue appeared when users misspell the names of medications. When

they decide to change the search term and use the keyboard shortcut to start the search (hit enter), a

previous selection in the previous result list becomes selected. This could lead to mistaken selections

of medications. It happened once in our case.

Efficiency

Medication order entry is a well-practiced task and most participants were able to fill the prescriptions

without error and quickly. Step deviation ratios and time deviation ratios were relatively small.

Participant Comments

Comments are listed only if a participant provided a written comment.

Table 33. Participant Comments for Task 8 and 9 – Add new medication order

| Participant | Do you have any additional comments? |
|---|---|
| Task 8 | |
| 4 | can program remember trends for individual users |
| 5 | No problems. |
| 6 | Rx field busy. Layout does not flow well. NDC? |
| 20 | we are now e scribing |
| 21 | The NDC number did not purport in when I put in the medication. Therefore, I could not send in the medication and did not receive any allergy warning. |
| Task 9 | |
| 5 | When putting in a template for a drug and modifying the template to reflect a different dosage schedule, the top of the template remains unchanged. To me this is a bit confusing for the user, and I think it would be good to know the dosing has changed on the whole template. |
| 9 | This time I searched for the medication and the NDC field was filled out. |


Satisfaction

The relatively high satisfaction ratings reflect the familiarity with this task (4.41 and 4.91).

Major Findings

In general, participants had little difficulty with this task. There seems to be some uncertainty about the result of selecting prescription templates and modifying them. An indication in the template name, showing that this application of the template has been modified, might help address this issue. If it were addressed, users could fill prescriptions more efficiently (fewer clicks), and interaction alerts would be triggered more reliably. The other issue is with the search screen for templates and medication names. When the search field is put in focus a second time, the default 'Enter' action should be the 'Search' button, rather than the 'OK' button for the list item. Making this mistake could be a safety concern, because a user could inadvertently select the wrong medication for a patient (even though this is not very likely, since the medication is reviewed on the next screen). This problem should nonetheless be addressed.
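The suggested focus-dependent default action can be sketched as follows (a hypothetical model of our own, not Practice Partner code):

```python
# A model (ours, not Practice Partner code) of the recommended fix:
# the action triggered by Enter depends on which widget holds keyboard
# focus, so re-focusing the search field makes Enter run a new search
# rather than accepting a stale selection in the result list.

def default_enter_action(focused_widget: str) -> str:
    actions = {"search_field": "Search", "result_list": "OK"}
    return actions.get(focused_widget, "Search")  # default to the safer action

assert default_enter_action("search_field") == "Search"
assert default_enter_action("result_list") == "OK"
```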

170.314(b)(3) – Electronic Prescribing

Test context

Electronic Prescribing was tested as part of task 11. After renewing the Nitrofurantoin prescription,

participants were asked to electronically submit it to the patient’s default Pharmacy.

Task 11 – Change med list and e-prescribe

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(1) – CPOE

170.314(a)(6) – Medication List

170.314(b)(3) – Electronic Prescribing

Task Description

After renewing the Nitrofurantoin prescription, participants were asked to electronically submit it to the

patient’s default Pharmacy.


Risk Analysis

Medium. Being able to transmit prescriptions electronically rules out some misinterpretation and can speed up the process of filling prescriptions, but failures here do not directly lead to patient safety issues.

Scoring Notes

One participant had to be excluded from scoring because he could not finish the task due to long

system delays.

Exclusions

Participant 6 – long system delays

Quantitative Summary

Table 34. Usability Data for Task 11 – Fill prescription and e-prescribe

| Task / Subtask | N | Success | Path Dev. (Obs/Opt) | Time Mean (SD), s | Time Optimal, s | Time Dev. (Obs/Opt) | Errors Mean (SD) | Rating Mean (SD), 5 = Easy |
|---|---|---|---|---|---|---|---|---|
| 11. Fill prescription and e-prescribe | 16 | 80% | 1.65 | 40.28 (23.84) | 28.41 | 1.41 | .23 (.43) | 4.5 (.76) |
| Change med list (make active) | 16 | 80% | 1.32 | 10.38 (8.48) | 4.99 | 2.08 | .18 (.4) | 4.5 (.76) |
| e-prescribe | 16 | 80% | 2.17 | 34.13 (19.63) | 23.42 | 1.46 | .08 (.29) | 4.5 (.76) |

Errors

Table 35. Participant Errors for Task 11 – Fill prescription and e-prescribe

| Participant | Error Severity | Error Description |
|---|---|---|
| 21, 14, 17, 20 | Medium | Print instead of transmit |

Effectiveness

Four participants did not successfully complete this task. They overlooked the control for setting the

transmit method and kept it in print mode. While this does not preclude the prescription from being

executed, it ultimately did not meet the goal set in this task.


Efficiency

The participants who did set the transmit mode did so fairly efficiently (overall step deviation ratio was 2.17), and with little extra time (overall time deviation ratio was 1.46).

Participant Comments

Comments are listed only if a participant provided a written comment.

Table 36. Participant Comments for Task 11 – Fill prescription and e-prescribe

| Participant | Do you have any additional comments? |
|---|---|
| 5 | Not bad … |

Satisfaction

The participants who finished this task successfully were fairly satisfied with it (satisfaction rating for e-

prescribing was 4.5).

Major Findings

The label for the field that allows users to set the transmit method might be misleading, which may have led to the somewhat low success rate. Changing it to 'Transmit' might help point users in the right direction.

170.314(a)(8) – Clinical Decision Support

Test context

Clinical decision supports were tested in two different tasks. Task 1 looked at the usability of the health

maintenance alerts. Task 2 looked at the functionality of the new InfoButton.

Task 1 – Select patient and review health maintenance alerts

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(8) – Clinical Decision Support

Task Description

Participants were asked to select their patient and review any health maintenance alerts.


Risk Analysis

Medium. While health maintenance alerts are a great tool for preventative medicine, they do not

usually present information that is relevant to immediate patient safety.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.

Exclusions

None.

Table 37. Usability Data for Task 1 – Select patient and review health maintenance alerts

| Task / Subtask | N | Success | Path Dev. (Obs/Opt) | Time Mean (SD), s | Time Optimal, s | Time Dev. (Obs/Opt) | Errors Mean (SD) | Rating Mean (SD), 5 = Easy |
|---|---|---|---|---|---|---|---|---|
| 1. Select Patient and Review Health Maintenance Alerts | 17 | 100% | 1.38 | 50.70 (15.52) | 33.01 | 1.54 | 0.71 (0.59) | 4.65 (0.61) |
| Select Patient | 17 | 100% | 1.68 | 34.65 (15.71) | 20.01 | 1.73 | 0.71 (0.59) | 4.82 (0.39) |
| Review Alerts | 17 | 100% | 0.62 | 16.05 (6.68) | 13.00 | 1.24 | 0 | 4.44 (.73) |

Errors

Table 38. Participant Errors for Task 1 – Select patient and review health maintenance alerts

| Participant | Error Severity | Error Description |
|---|---|---|
| NA | NA | NA |

Effectiveness

All participants successfully completed this task, and no errors were recorded.

Efficiency

Participants did this task quickly (overall step deviation ratio was 1.38, and overall time deviation ratio

was 1.54). They took a little time to look at the alerts, but many specialists voiced that they would not

take the time to read these alerts, as they perceived them to be more helpful to primary care physicians.


Participant Comments

Comments are listed only if a participant provided a written comment.

Table 39. Participant Comments for Task 1 – Select patient and review health maintenance alerts

| Participant | Do you have any additional comments? |
|---|---|
| 5 | For an acute visit I wouldn't normally look at these, but it may be a good thing to see the reminder to work toward an annual visit if it is overdue. |
| 20 | primary care uses alerts Specialists may not |
| 12 | To much information in the alert box. Needs to be more selective |
| 7 | great reminder |

Satisfaction

The satisfaction rating for reviewing the alerts was 4.44. The task was perceived to be easy; however, some participants mentioned that the alerts contained too much information, or that they would not take the time to read alerts like this.

Major Findings

This was an easy task; however, some participants questioned the utility of such alerts, while others found them interesting and useful.

Task 2 – Use the InfoButton to review treatment options for patient’s problem.

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(8) – Clinical Decision Support

Task Description

Participants were asked to use the Infobutton to research treatment options for the patient’s main

problem.

Risk Analysis

Medium. While access to updated clinical information is a great tool for providers, not having access to it, or not being able to find it, does not usually affect immediate patient safety.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.


Exclusions

None.

Table 40. Usability Data for Task 2 – Use the InfoButton

| Task / Subtask | N | Success | Path Dev. (Obs/Opt) | Time Mean (SD), s | Time Optimal, s | Time Dev. (Obs/Opt) | Errors Mean (SD) | Rating Mean (SD), 5 = Easy |
|---|---|---|---|---|---|---|---|---|
| 2. Use the InfoButton Search to research problem/treatment information | 17 | 82% | 1.25 | 79 (47.97) | 26.93 | 2.93 | .43 (.76) | 4.36 (.84) |
| Navigation | 17 | 82% | 1.45 | 42.15 (43.49) | 10.58 | 3.98 | .36 (.74) | 4.43 (.94) |
| InfoButton Lookup | 17 | 82% | .95 | 36.85 (23.2) | 16.35 | 2.25 | .07 (.27) | 4.43 (.84) |

Errors

Table 41. Participant Errors for Task 2 – Use the InfoButton

| Participant | Error Severity | Error Description |
|---|---|---|
| 15, 8, 8 | Minor | Navigation error |
| 20, 20, 7, 5 | Minor | Double click or single click instead of right-click |

Effectiveness

Three participants failed to complete this task because they could not figure out on their own how to navigate to the InfoButton, and gave up. The training PowerPoint had introduced them to using the right-click to access the context menu, but many participants had forgotten this by the time they got to the task (some referred back to the training material to help them remember; this was scored as a success, since it represents typical user behavior when new tasks are being learned). As the error table above shows, discovering the right-click option made this task error prone. However, once it had been discovered, several participants found the option useful and easy to use.

Efficiency

As the time deviation ratio shows (3.98), the navigation path to the InfoButton content was difficult to discover, even after participants had received training on it. Navigating the vendor site also surfaced some issues (some participants had a hard time discovering how the shortcuts on the third-party vendor site worked), but analysis of third-party vendor content is beyond the scope of this report.

Participant Comments

Comments are listed only if a participant provided a written comment.

Table 42. Participant Comments for Task 2 – Use the InfoButton

| Participant | Do you have any additional comments? |
|---|---|
| 4 | Would you have access to this from the note? |
| 5 | Just initially remembering the right click to get there was the only issue. Once I've done it a time or two it will be easy. Good information readily found. |
| 20 | like UpToDate |
| 21 | UpToDate is a little less useful for my specialty (ENT) than it is for others. |
| 13 | very helpful and easy to use |
| 15 | just will take time to become habit |
| 18 | Intuitve |
| 5 | Once I knew how to find the infosearch button, it was easy. |
| 7 | great ooption |
| 9 | Only able to be used with up to date subscription. |

Satisfaction

The satisfaction rating for the overall task was 4.36. It was perceived to be easy, and a number of

participants welcomed this integration of information lookup. A few participants commented that the

content at the site did not correspond with their needs.

Major Findings

Right-click context menus can be an efficient way to navigate, but they are difficult to remember, and

even more difficult to discover. Consider embedding a link or an icon that can be clicked to navigate to

the InfoButton content, instead.
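The suggestion amounts to wiring a visible control and the existing context menu to the same handler; a minimal sketch with hypothetical names:

```python
# Sketch (hypothetical names) of the recommendation: expose the same
# InfoButton handler through a visible icon as well as the existing
# right-click menu, so the feature no longer depends on users
# discovering the right-click gesture.

def open_infobutton(problem: str) -> str:
    # Stand-in for launching the InfoButton content for a problem.
    return f"lookup:{problem}"

entry_points = {
    "toolbar_icon_click": open_infobutton,  # new, visible affordance
    "context_menu_item": open_infobutton,   # existing right-click path
}

# Both entry points dispatch to the identical handler.
assert entry_points["toolbar_icon_click"]("hypertension") == \
       entry_points["context_menu_item"]("hypertension")
```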


List of Appendices

The following appendices include supplemental data for this usability test report.

Appendix A: Demographic Questionnaire

Appendix B: Administrator Script and General Instructions

Appendix C: Informed Consent Form

Appendix D: Usability Test Instructions

Appendix E: Experimenters’ Short Biographies

Appendix F: Participant Instructions for Each Task

Appendix G: End-of-Task Questionnaires

Appendix H: System Usability Scale (SUS) Questionnaire


Appendix A: Demographic Questionnaire


Appendix B: Administrator Script and General Instructions

Before participant arrives:

o Start remote desktop application and log into test environment

o Wait to log into the applications until the participant is ready (the session would otherwise time out)

o Start Morae Recorder, but don’t click the red button yet

o Open the training presentation; leave in presentation mode

When participant arrives:

o Welcome Participant

o Make sure participant has filled out Informed Consent sheet, make sure address is legible.

o Make sure participant has filled out Demographics sheet

o Get Participant started on training presentation

Verbal Instructions:

- “Thank you for participating in this usability study today. During this study you will be doing

some tasks that are very familiar to you. But you will also perform some new tasks. To

familiarize yourself with these new functions, please walk through this short PowerPoint.

- You will notice that our system is similar to the one you are using, but it is not the same.

Some of the configurations will be slightly different from what you are used to, simply

because this is a somewhat limited test system.

- The PowerPoint will also tell you more about participating in a usability study, and show

you how to interact with the recording software.

- You can always ask me questions during these training slides.”

When participant is done with training presentation:

o Start Morae (press red button)

o Log into applications

Verbal Instructions:

- “Remember that this system may be configured slightly differently from what you are used to.

- You start by clicking this button here. You can also move this window around, so that

you can see the system better. If you like, you can use these printed instructions and

these Navigation Instructions to follow along.


- When you have started, I won’t be able to answer any more questions. Do you have any

questions before you start?”


Appendix C: Informed Consent Form


Appendix D: Usability Test Instructions

Introduction to the Study

• Thank you for participating in this usability study.

• Your participation is very valuable and critical for certification.

• Usability testing is required by ARRA 2; vendors cannot certify their EHRs unless they have been evaluated by our end users.

• Your feedback will help us improve the usability of Practice Partner.

• During the study, you will be performing a few routine tasks using Practice Partner, most of which will be

very familiar to you.

• This presentation provides a review of the routine tasks.

• The following screens present an overview of features in self-guided presentation mode.

• Please take 10 minutes to familiarize yourself with these new functions.

• You can use the page up and down functions to navigate through the presentation.

• If you want to see a part again, you can always back up to previous screens by using the Page Up key on your keyboard.

Functionality Review

Study Instructions

• Thank you for participating in this study. Your input is very important.

• Our session today will last about 30 minutes. During that time you will use Practice Partner Version 11 to perform tasks.

Objective of the Study

• The purpose of the study is to uncover:

o Areas where the application performs well; that is, effectively, efficiently, and with satisfaction

o Areas where the application does not fully meet your needs.

• Please note that we are not testing you; we are testing the application. If you have difficulty, this means only that something needs to be improved in the application.

• The study administrator will be here in case you need specific help with the procedures, but the administrator will not provide help with using the application.

Starting the Study

• The system will show an instruction window. The window displays over the EHR screens you will be using to complete the task.


• Click the Start button to start the study.

• Within the instruction window, the system provides instructions for completing the first task.

• After reading the on-screen instructions, click the Start Task button.

• The instruction window will collapse vertically so that it will not obscure the application.

• You can click the Show Instructions button in the instruction window at any time to expand the window and read the instructions again.

Performing the Task

• Using the EHR screens, begin performing the task. Please try to complete the tasks on your own, following the on-screen instructions very closely.

• You can also move and reposition the instruction window.

• If you place the mouse cursor focus in the instruction window, reposition the mouse cursor back in the application window before continuing the task.

Indicating Task Completion

• When you are satisfied that you have completed the task, or when you have completed as much as you can, click the End Task button.

• The system will open a short survey with questions about the task you have just completed.

• After answering the Participant Survey questions for the task, click the Done button.

• The system will present instructions for the next task.

• Between some tasks, the study administrator may adjust the application to start the next task at the right location.

Completing the Study

• After you complete the Participant Task Evaluation for the final task, the system will open a System Usability Scale Questionnaire to enable you to record your overall impressions of the application, including all the tasks you performed. Note that you will need to scroll vertically to view and answer all of the questions.

• After completing the questionnaire, click the Done button to complete the study.

• The instruction window shows a message indicating that the study is complete. Tell the study administrator that you have completed the study.

• The administrator will click the OK button and save your work.


Appendix E: Experimenters’ Short Biographies

Marita Franzke received her Ph.D. in Cognitive Psychology after completing a dissertation on Exploration-Based Learning of Display-Based User Interfaces at the University of Colorado. She has worked in the field of Human-Computer Interaction since 1991, conducting research on usability and adoption of new end-user technologies in the fields of telecommunication (U S WEST Advanced Technologies, Media One Labs, and AT&T Broadband Labs), educational software (Pearson Knowledge Technologies), and EMR technologies (McKesson Provider Technologies). She has 21 years of experience in the field of user research and one year of experience in the field of EMR usability.

Gergana Pavlova joined McKesson Practice Partner over 5 years ago as a technical writer. A year ago she joined the Practice Partner Product Management team to help with the functional design requirements and user interface designs of the Meaningful Use Stage 2 Incentive Program capabilities. She has an MS in English and certificates in User-Centered Design and Business Analysis from the University of Washington.


Appendix F: Participant Instructions for Each Task

Introduction

You will be treating a 43-year-old female patient today.

Some of her data are already in the system. She has been showing signs of a UTI since last week. Her current prescription does not seem to be effective.

You will be asked to perform 11 short tasks using the EHR provided to you.

For each task, we will provide you with specific instructions.

When you are ready, please click the START button.

Task 1

Task Instructions:

Your patient comes to the office with symptoms of a UTI. To get started ...

1) Find your patient in the patient list.

2) Select your patient.

3) Read the patient alerts.

4) Close the patient alert window.

Task 2

Task Instructions:

Your patient has been diagnosed with Urinary Tract Infection (UTI). You want to get a quick glimpse at the newest treatment options for the disease.

1) Find the information about treatment options for cystitis.

2) You are done when you have found this information.

Task 3

Task Instructions:

The patient also complains of mild mid-back pain. You need to check whether the patient has developed a kidney infection.

1) Place a STAT order for URIN CULTURE.


Task 4

Task Instructions:

You also notice that the patient already has a repeated order for a blood glucose check coming up. You decide to change that to STAT to make sure her infection has not adversely affected her blood glucose levels.

1) Find the order “Glucose, Fasting”.

2) Change the order to STAT.

Task 5

Task Instructions:

You want to order an image of her kidneys. You see there is an existing order for an image.

1) Find the existing order for CT of Abdomen.

2) Then cancel it.

Task 6

Task Instructions:

You think that an ultrasound is a better way to image her kidneys.

1) Order a STAT Renal Ultrasound.

Task 7

Task Instructions:

You get ready to come up with a treatment plan. Review the existing antibiotic prescription.

1) Find the prescription for Levaquin.

2) Cancel the prescription because it does not seem to be working.

Task 8

Task Instructions:

You will now order a different antibiotic to treat your patient’s UTI. The prescription will be sent electronically to the patient’s pharmacy (Walgreens).

1) Enter an order for Amoxicillin, 875 mg, BID 7 days.


2) If you get a warning, cancel the order.

Task 9

Task Instructions:

Now try to order a different antibiotic. The prescription will be sent electronically to the patient’s pharmacy (Walgreens).

1) Enter an order for Cipro, 250 mg, oral, once a day for 7 days.

2) If you get a warning, cancel the order.

Task 10

Task Instructions:

The patient notices that you can’t find an antibiotic that works, so she tells you that she took some kind of antibiotic that started with ‘nitro’ in the past, without any problems.

1) Find out which antibiotic was prescribed in the past without incident, then ...

2) Review the details about this prescription.

3) When you have reviewed the details you are done with this task.

Task 11

Task Instructions:

You decide to prescribe NITROFURANTOIN because it has worked for this patient in the past.

1) Activate the discontinued NITROFURANTOIN prescription, then ...

2) Send the prescription electronically to the patient’s pharmacy (Walgreens).


Appendix G: End-of-Task Questionnaires

1 - Select Patient and Alerts

1. Finding and selecting the patient was ...

very difficult 1 2 3 4 5 very easy

2. Reading the Health Maintenance alerts was ...

very difficult 1 2 3 4 5 very easy

3. Overall this task was ...

very difficult 1 2 3 4 5 very easy

4. Do you have any additional comments about Health Maintenance alerts?

Comments:

2 - InfoButton

1. Finding the InfoButton control was ...

very difficult 1 2 3 4 5 very easy

2. Finding the Treatment Options was ...

very difficult 1 2 3 4 5 very easy

3. Overall this task was ...

very difficult 1 2 3 4 5 very easy

4. Do you have any additional comments about the InfoButton?

Comments:

3 - Order Lab

1. Finding the correct screen to accomplish this task was ...

very difficult 1 2 3 4 5 very easy

2. Ordering the Lab was ...

very difficult 1 2 3 4 5 very easy


3. Overall this task was ...

very difficult 1 2 3 4 5 very easy

4. Do you have any additional comments?

Comments:

4 - Access and Modify Lab Order

1. Finding the correct screen to accomplish this task was ...

very difficult 1 2 3 4 5 very easy

2. Finding the existing lab order was ...

very difficult 1 2 3 4 5 very easy

3. Changing the order to STAT was ...

very difficult 1 2 3 4 5 very easy

4. Overall, this task was ...

very difficult 1 2 3 4 5 very easy

5. Do you have any additional comments about this task?

Comments:

5 - Access and Cancel Image Order

1. Finding the correct screen to accomplish this task was ...

very difficult 1 2 3 4 5 very easy

2. Finding the existing image order was ...

very difficult 1 2 3 4 5 very easy

3. Canceling the order was ...

very difficult 1 2 3 4 5 very easy

4. Overall, this task was ...

very difficult 1 2 3 4 5 very easy


5. Do you have any additional comments about this task?

Comments:

6 - Place Image Order

1. Finding the correct screen to accomplish this task was ...

very difficult 1 2 3 4 5 very easy

2. Placing a new image order was ...

very difficult 1 2 3 4 5 very easy

3. Overall, this task was ...

very difficult 1 2 3 4 5 very easy

4. Do you have any additional comments about this task?

Comments:

7 - Access and Inactivate Med Order

1. Finding the correct screen to accomplish this task was ...

very difficult 1 2 3 4 5 very easy

2. Finding the existing prescription was ...

very difficult 1 2 3 4 5 very easy

3. Inactivating the prescription was ...

very difficult 1 2 3 4 5 very easy

4. Overall, this task was ...

very difficult 1 2 3 4 5 very easy

5. Do you have any additional comments about this task?

Comments:

8 - Order Med (Drug-Allergy)

1. Finding the correct screen to accomplish this task was ...

very difficult 1 2 3 4 5 very easy


2. Entering the prescription was ...

very difficult 1 2 3 4 5 very easy

3. The interaction warning was easy to understand ...

disagree 1 2 3 4 5 agree

4. Overall, this task was ...

very difficult 1 2 3 4 5 very easy

5. Do you have any additional comments about this task?

Comments:

9 - Order Med (Drug-Drug)

1. Finding the correct screen to accomplish this task was ...

very difficult 1 2 3 4 5 very easy

2. Entering the prescription was ...

very difficult 1 2 3 4 5 very easy

3. The interaction warning was easy to understand ...

disagree 1 2 3 4 5 agree

4. Overall, this task was ...

very difficult 1 2 3 4 5 very easy

5. Do you have any additional comments about this task?

Comments:


10 - Medication History

1. Finding the screen to accomplish this task was ...

very difficult 1 2 3 4 5 very easy

2. Finding the prescription details was ...

very difficult 1 2 3 4 5 very easy

3. Overall, this task was ...

very difficult 1 2 3 4 5 very easy

4. Do you have any additional comments about this task?

Comments:

11 - Change Med List

1. Activating the prescription was ...

very difficult 1 2 3 4 5 very easy

2. Overall, this task was ...

very difficult 1 2 3 4 5 very easy

3. Do you have any additional comments about this task?

Comments:
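The end-of-task questionnaires above collect 1-5 ease-of-use ratings per task. The report's Data Scoring section summarizes such ratings per task; as an illustration only, a minimal sketch of that kind of summary (the task names, rating values, and `summarize` helper below are hypothetical, not data or code from this study):

```python
# Summarize 1-5 Likert ease-of-use ratings per task (mean and sample
# standard deviation). Illustrative values only.
from statistics import mean, stdev

ratings = {
    "1 - Select Patient and Alerts": [5, 4, 5, 3, 4],
    "3 - Order Lab": [4, 4, 5, 5, 3],
}

def summarize(task_ratings):
    """Return {task: (mean, stdev)} for lists of 1-5 ratings."""
    return {task: (round(mean(r), 2), round(stdev(r), 2))
            for task, r in task_ratings.items()}

for task, (m, s) in summarize(ratings).items():
    print(f"{task}: mean={m}, sd={s}")
```

A spread measure alongside the mean matters here because a task can have an acceptable average rating while still splitting participants into "very easy" and "very difficult" camps.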


Appendix H: System Usability Scale (SUS) Questionnaire

This questionnaire was presented to the participant after the completion of all tasks and task-based questionnaires.

Thank you for participating in this study. The following questions address your experience with all the tasks in this study.

1. I think that I would like to use this system frequently

Strongly Disagree 1 2 3 4 5 Strongly Agree

2. I found the system unnecessarily complex

Strongly Disagree 1 2 3 4 5 Strongly Agree

3. I thought that the system was easy to use

Strongly Disagree 1 2 3 4 5 Strongly Agree

4. I think that I would need the support of a technical person to be able to use this system

Strongly Disagree 1 2 3 4 5 Strongly Agree

5. I found the various functions in this system were well integrated

Strongly Disagree 1 2 3 4 5 Strongly Agree

6. I thought there was too much inconsistency in this system

Strongly Disagree 1 2 3 4 5 Strongly Agree

7. I would imagine that most people would learn to use this system very quickly

Strongly Disagree 1 2 3 4 5 Strongly Agree

8. I found the system very cumbersome to use

Strongly Disagree 1 2 3 4 5 Strongly Agree

9. I felt very confident using the system

Strongly Disagree 1 2 3 4 5 Strongly Agree

10. I needed to learn a lot of things before I could get going with this system

Strongly Disagree 1 2 3 4 5 Strongly Agree
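The ten items above are the standard System Usability Scale. The report does not reproduce the scoring rule, but the conventional SUS computation (a sketch of the standard method, not code from this report) subtracts 1 from odd-numbered items, subtracts even-numbered items from 5, and scales the sum by 2.5 to a 0-100 score:

```python
# Standard SUS scoring: odd items contribute (response - 1), even items
# contribute (5 - response); the sum of contributions is scaled by 2.5.
# The example responses are illustrative, not participant data.
def sus_score(responses):
    """responses: list of ten 1-5 answers to SUS items 1-10, in order."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

print(sus_score([4, 2, 4, 1, 4, 2, 5, 2, 4, 2]))  # prints 80.0
```

Note the alternating item polarity (items 2, 4, 6, 8, 10 are negatively worded), which is why the even items are reverse-scored before summing.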


Appendix I: System and Patient Scenario for Physicians’ Tasks

Setting: Ambulatory

Participant Group: Physician/Nurse Practitioner

PATIENT SETUP

Name: XXX Blue

Age: 43

DOB: 02/03/1970

Gender: female

Address: 123 XXX, Denver, CO 80238

Telephone Number: 303-329-6802

Usual provider: Physician

Preferred Pharmacy: Walgreens, NPI number 1003086380

In patient’s chart:

Allergy List:

Penicillin Potassium

o moderate

o skin rash

o onset date: 01/01/1995

Problem List:

Asthma:

o onset date: 1995

o entered on the Other Problems tab

o apply HM template per patient after creating the problem

Diabetes:

o onset date: 2000

o entered on the Other Problems tab

o apply HM template per patient after creating the problem

Acute UTI

o current date (predate to a date next week)

o listed as a Major Problem

Med List:

Levaquin 5007D

o Date 09/07/2013

Azmacort, MDI 2PUFFS Q6H X 30 D

o Date 01/01/2000


Inactive on the Med History:

Nitrofurantoin 50 MG

o date 01/01/2010,

o discontinue 02/02/2010,

o therapy completed, effective

Orders on the current electronic chart:

Glucose, Fasting,

o every 30 days,

o start date some time back

CT of Abdomen, STAT, predate (or do within 7 days)

SYSTEM SETUP

Order Names

Create a new order name (“Renal Ultrasound”) and add it to an Order Tree

Health Maintenance Template(s)

Apply a HM template for 40-49 Year Old Females, do I need to add a special procedure?

Apply a HM template for patients with Asthma, do I need to set a special procedure?

Select Enable Active Health Maintenance Reminders check box on Special Features Records 6 tab


EHR Usability Test Report Part II

EHR: Practice Partner® Ver. 11

User Role: Nurse

Setting: Ambulatory

Applications Tested

Practice Partner® Ver. 11

Dates

Usability Test: September 9-12, 2013

Report: October 2, 2013

Prepared by

Marita Franzke, Ph.D., Senior HFE, McKesson Provider Technologies

Gergana Pavlova, Business Systems Analyst, McKesson Provider Technologies

Contact Marita Franzke

McKesson Provider Technologies
11000 Westmoor Cr., Suite 125
Westminster, CO 80021
(720) 356-7711

[email protected]

Report based on ISO/IEC 25062:2006

Common Industry Format for Usability Test Reports


Page 2 of 54

EHR: Practice Partner Ver. 11, Part II – Nurse – Ambulatory October 2, 2013

Contents

Tables ............................................................................................................................................................... 4

Executive Summary ........................................................................................................................................... 5

Major Findings and Areas for Improvement .................................................................................................... 8

Introduction ...................................................................................................................................................... 11

Intended Users............................................................................................................................................. 11

Method ............................................................................................................................................................ 11

Participants .................................................................................................................................................. 11

Study Design ............................................................................................................................................... 13

Procedures .................................................................................................................................................. 14

Test Locations.............................................................................................................................................. 15

Test Environment ......................................................................................................................................... 15

Test Forms and Tools .................................................................................................................................. 16

Usability Metrics ........................................................................................................................................... 16

Data Scoring ................................................................................................................................................ 16

Results ............................................................................................................................................................ 19

Data Analysis and Reporting ........................................................................................................................ 19

Discussion of the Findings ............................................................................................................................... 21

170.314(b)(4) – Clinical information reconciliation ........................................................................................ 21

Task 1 – Open incoming CMA .................................................................................................................. 21

Task 2 – Perform Medication Reconciliation ............................................................................................. 24

Task 3 – Perform Allergy Reconciliation ................................................................................................... 26

Task 4 – Perform Problem Reconciliation ................................................................................................. 28

170.314(a)(7) – Medication allergy list ......................................................................................................... 30

Test contexts ............................................................................................................................................ 30

Task 5 – Look up and edit existing allergy list entry .................................................................................. 30

Task 6 – Create a new allergy list entry .................................................................................................... 33

Task 7 – Confirm review of allergy list ...................................................................................................... 34

List of Appendices ........................................................................................................................................... 37

Appendix A: Demographic Questionnaire ..................................................................................................... 38


Appendix B: Administrator Script and General Instructions .......................................................................... 40

Appendix C: Informed Consent Form ........................................................................................................... 42

Appendix D: Usability Test Instructions ........................................................................................................ 43

Appendix E: Experimenters’ Short Biographies ............................................................................................ 46

Appendix F: Participant Instructions for Each Task ...................................................................................... 47

Appendix G: End-of-Task Questionnaires .................................................................................................... 49

Appendix H: System Usability Scale (SUS) Questionnaire ........................................................................... 52

Appendix I: System and Patient Scenario for Nurse’s Tasks ........................................................................ 53


Tables

Table 1. Nurses' Tasks: Meaningful Use Criteria, Risk, and Application Tested ................................................. 5

Table 2. Result Summary by Meaningful Use Criterion and Task ...................................................................... 7

Table 3. Participant Demographics .................................................................................................................. 12

Table 4. Nurses’ Tasks – same as Table 1, repeated here for easier reference .............................................. 13

Table 5. Test Environment, Technical Data ..................................................................................................... 15

Table 6. Data Scoring: Measures & Rationale ................................................................................................. 17

Table 7. Result Summary by Meaningful Use Criterion and Task – same as Table 2, repeated here for easier reference ......................................................................................................................................................... 19

Table 8. Usability Data for Task 1 – Open incoming CMA .............................................................................. 22

Table 9. Participant Errors for Task 1 – Open incoming CMA ......................................................................... 22

Table 10. Participant Comments for Task 1 – Open incoming CMA ................................................................ 23

Table 11. Usability Data for Task 2 – Perform medication reconciliation ......................................................... 24

Table 12. Participant Errors for Task 2 – Perform medication reconciliation ................................................... 24

Table 13. Participant Comments for Task 2 – Perform medication reconciliation ............................................ 25

Table 14. Usability Data for Task 3 – Perform allergy reconciliation................................................................ 27

Table 15. Participant Errors for Task 3 – Perform allergy reconciliation .......................................................... 27

Table 16. Participant Comments for Task 3 – Perform allergy reconciliation ................................................... 27

Table 17. Usability Data for Task 4 – Perform problem reconciliation ............................................................. 29

Table 18. Participant Errors for Task 4 – Perform problem reconciliation ........................................................ 29

Table 19. Participant Comments for Task 4 – Perform problem reconciliation ................................................ 29

Table 20. Usability Data for Task 5 – Lookup and edit existing allergy list entry ............................................. 31

Table 21. Participant Errors for Task 5 – Lookup and edit existing allergy list entry ........................................ 31

Table 22. Participant Comments for Task 5 – Lookup and edit existing allergy list entry ................................. 32

Table 23. Usability Data for Task 6 – Add an allergy to the allergy list ............................................................ 33

Table 24. Participant Errors for Task 6 – Add an allergy to the allergy list....................................................... 33

Table 25. Participant Comments for Task 6 – Add an allergy to the allergy list ............................................... 34

Table 26. Usability Data for Task 7 – Confirm review of allergy list ................................................................. 35

Table 27. Participant Errors for Task 7 – Confirm review of allergy list ........................................................... 35

Table 28. Participant Comments for Task 7 – Confirm review of allergy list .................................................... 35


Executive Summary

A usability test of Practice Partner® VER. 11 was conducted with 24 nurse/medical assistant end users of Practice Partner during the week of September 9-12, 2013 by Marita Franzke and Gergana Pavlova of McKesson Provider Technologies. The testing was conducted at two customer sites, in York, ME and Pensacola, FL. The purpose of this test was to validate the usability of five Meaningful Use Criteria as implemented in the functions of Practice Partner® VER. 11. The Office of the National Coordinator for Health Information Technology (ONC) requires vendors to test 7 Meaningful Use Criteria outlined in the test procedures for §170.314(g)(3) Safety-enhanced design1 for ambulatory products. The criteria focused on physician and administrator tasks are discussed in two other reports, which detail the results of usability studies involving physicians and system administrators, as those clinical tasks are typically conducted by those user roles.

During the usability test, 24 nurses and clinical staff served as participants; they were all familiar with previous releases of the Practice Partner EHR. In this study, they used the software to perform 7 simulated but representative tasks. More specifics on the sample demographics and participant experience with the EHR are detailed below in the Method section.

Below is a list of all tasks employed, which were chosen to evaluate the usability of 2 of the 7 Meaningful Use Criteria outlined in the test procedures for §170.314(g)(3) Safety-enhanced design for ambulatory use. Table 1 below lists the nurses’ tasks in order of performance, the linked Meaningful Use criteria, and a risk rating for performing each task.

Table 1. Nurses' Tasks: Meaningful Use Criteria, Risk, and Application Tested

Task | Meaningful Use Criterion Tested | Application | Risk
1. Open incoming CMA | § 170.314(b)(4) Clinical information reconciliation | Practice Partner | High
2. Perform Medication Reconciliation | § 170.314(b)(4) Clinical information reconciliation | Practice Partner / Third Party | High
3. Perform Allergy Reconciliation | § 170.314(b)(4) Clinical information reconciliation | Practice Partner | High
4. Perform Problem Reconciliation and finish import of CMA | § 170.314(b)(4) Clinical information reconciliation | Practice Partner | High
5. Look up and change existing allergy in the allergy list | § 170.314(a)(7) Medication allergy list | Practice Partner | High
6. Add an allergy to the allergy list | § 170.314(a)(7) Medication allergy list | Practice Partner | High
7. Confirm review of allergy list | § 170.314(a)(7) Medication allergy list | Practice Partner | High

[1] Test Procedure for §170.314(g)(3) Safety-enhanced design. Approved Test Procedure Version 1.2, December 14, 2012; and Health Information Technology: Standards, Implementation Specifications, and Certification Criteria for Electronic Health Record Technology, 2014 Edition; Revisions to the Permanent Certification Program for Health Information Technology, Final Rule.

During the 45-minute, one-on-one usability test, each participant was greeted by an administrator upon entering the conference room. Each participant had prior experience with a previous version of the Practice Partner® EHR. Participants were asked to review a self-guided training module that covered the new Clinical Data Reconciliation feature and presented instructions for the usability study and for using the testing software. The administrator introduced the test and instructed participants to complete a series of tasks (given one at a time) using the EHR Under Test (EHRUT). During testing, data logging software timed the test and recorded user screen captures and audio. These recordings were later used to score action steps, errors, and task completion rates. The test administrator did not assist the participant in completing the tasks.

The following types of data were collected for each participant:

o Number of tasks successfully completed without assistance
o Time to complete each task
o Number and types of errors
o Path deviations
o Participant's verbalizations and written comments
o Participant's satisfaction ratings of the system

All participant data were de-identified so that no correspondence can be made from the identity of the participant to the data collected. At the end of the test session, each participant completed a post-test questionnaire.

The results from the System Usability Scale scored subjective satisfaction with the system, based on performance of these tasks, at 77.68. According to Tullis and Albert (2008), raw scores under 60 represent systems with poor usability; scores over 80 would be considered above average.[2] The details of our results are summarized in Tables 2 and 3 below. Usability judgments were also collected at the end of each major task. The averages obtained from those task-based usability scores varied between 4.0 and 4.93, well above the recommendation put forward in NISTIR 7742,[3] which states that "common convention is that average ratings for systems judged easy to use should be 3.3 or above."

[2] See Tullis, T. & Albert, W. (2008). Measuring the User Experience. Burlington, MA: Morgan Kaufmann (p. 149).
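For reference, an overall SUS value such as the 77.68 reported above is derived from each participant's 10-item questionnaire with the standard SUS scoring rule (odd items contribute rating minus 1, even items contribute 5 minus rating, and the raw sum is scaled by 2.5). A minimal sketch of that standard formula; the response values below are hypothetical, not this study's data:

```python
def sus_score(responses):
    """Score one 10-item SUS questionnaire (each response rated 1-5)."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        if i % 2 == 1:            # odd-numbered (positively worded) items
            total += r - 1
        else:                     # even-numbered (negatively worded) items
            total += 5 - r
    return total * 2.5            # scale the 0-40 raw sum to 0-100

# Hypothetical example: neutral answers across the board score 50.
print(sus_score([3] * 10))        # -> 50.0
```

A study-level SUS score is then typically the mean of the per-participant scores.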

Various additional recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.

Table 2. Result Summary by Meaningful Use Criterion and Task

Task (and subtasks) | N | Success % | Path Deviation (Obs/Opt) | Task Time Mean (SD), sec | Optimal Time, sec | Time Ratio (Obs/Opt) | Errors Mean (SD) | Rating Mean (SD), 5 = Easy

§ 170.314(b)(4) -- Clinical information reconciliation
1. Open incoming CMA | 15 | 92.86 | 2.0 | 119.75 (66.38) | 52.07 | 2.30 | 1.36 (.84) | 4.14 (.77)
   Navigate to function | 15 | 100 | 2.13 | 43.10 (43.68) | 23.13 | 1.86 | .05 (.76) | 4.0 (1.11)
   Locate and open CMA | 15 | 100 | 2.16 | 63.62 (42.83) | 19.63 | 3.24 | .86 (.36) | 4.5 (.85)
   Open existing patient chart | 15 | 92.86 | 1.27 | 13.02 (7.13) | 9.32 | 1.40 | 0 | 4.14 (.77)
2. Perform Medication Reconciliation | 14 | 85.71 | 1.75 | 52.6 (37.75) | 19.72 | 2.67 | 1.33 (.98) | 4 (1.28)
3. Perform Allergy Reconciliation | 10 | 100 | 1.15 | 23.47 (22.12) | 11.08 | 2.11 | 1.2 (.63) | 4.8 (.42)
4. Perform Problem Reconciliation and finish import of CMA | 9 | 88.89 | 1.25 | 36.96 (23.11) | 11.52 | 3.21 | .5 (.53) | 4.63 (.74)

§ 170.314(a)(7) -- Medication allergy list
5. Look up and edit existing allergy list entry | 14 | 92.86 | 1.46 | 72.72 (47.74) | 34.21 | 2.13 | 1.75 (1.87) | 4.71 (.47)
   Find existing allergy list item | 14 | 92.86 | 1.07 | 43.99 (25.32) | 24.66 | 1.78 | .33 (.65) | 4.57 (.51)
   Change allergy list item | 14 | 92.86 | 2.51 | 32.55 (42.46) | 9.55 | 3.41 | 1.69 (2.14) | 4.71 (.47)
6. Add an allergy to the allergy list | 15 | 100 | 1.46 | 58.26 (22.59) | 34.76 | 1.68 | .07 (.26) | 4.6 (.63)
7. Confirm review of allergy list | 15 | 100 | 2.4 | 17.22 (10.68) | 8.71 | 1.98 | .07 (.26) | 4.93 (.26)

[3] Schumacher, Robert M., and Lowry, Svetlana Z. (2010). Customized Common Industry Format Template for Electronic Health Record Usability Testing. National Institute of Standards and Technology, NISTIR 7742.

In addition to the performance data, qualitative observations were made. A more detailed discussion of the results, tasks, and findings is provided later in this document (in "Discussion of the Findings").

Major Findings and Areas for Improvement

This section summarizes the findings and areas for improvement at a high level. For additional detail, please see the section "Discussion of the Findings".

Findings and Recommendations

Clinical Reconciliation

Findings: Clinical reconciliation (the import of external medical summaries and their reconciliation with existing patient charts) is conceptually difficult, and complex in terms of the number of decisions involved. Two issues made this task especially difficult for our participants: (1) finding the XML document with the external summary was challenging, because participants believed they had to enter the file name rather than browsing to it; (2) the reconciliation process was often cut short because participants clicked the 'Import' button before they had reviewed all data, and this action could not be reversed. A few smaller issues having to do with reconciling each list are discussed in detail in the "Discussion of the Findings" section of the report.

Recommendations:

1. Training: When this functionality is introduced, it should be preceded by training that not only provides hands-on examples and allows users to understand the user interface, but also gives a conceptual explanation of data exchange between practices and EHRs. Users need to understand where external records may come from and how to obtain and store them; organizational policies and the resulting methods and procedures will have to be disseminated to end users. This training should also cover how to navigate the Windows file structure (to find files for import), the effects of merging external records with existing data, and how to use the Import functionality.

2. Design: When launching the import process, it may be possible to eliminate the first dialog and immediately launch the Windows browse dialog. Users do not need a function to enter the file name by hand; it will always be safer and easier to browse to existing files. If a password needs to be entered, insert a simple password dialog.

3. Design: Consider disabling the 'Import' button until all data categories have been reconciled.

4. Design: Launch a dialog after the 'Import' button is clicked that asks users to verify that they are ready to import and merge both data files, and gives them a chance to cancel in case they are not.

5. Design: Given the conceptual difficulties with this task, one might consider implementing this function as a wizard that steps users through each decision point one by one and presents the reconciled lists at the end.
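The recommendation to disable 'Import' until every category is reconciled amounts to a simple guard condition on the import action. A minimal sketch of the idea; the category names and function are illustrative assumptions, not Practice Partner code:

```python
# Hypothetical data categories a CMA import must reconcile before merging.
RECONCILIATION_CATEGORIES = ("medications", "allergies", "problems")

def import_enabled(reviewed):
    """Enable the Import button only once every category has been reconciled.

    reviewed: dict mapping category name -> True once the user has
    reviewed and reconciled that category.
    """
    return all(reviewed.get(c, False) for c in RECONCILIATION_CATEGORIES)

print(import_enabled({"medications": True}))                          # -> False
print(import_enabled({c: True for c in RECONCILIATION_CATEGORIES}))   # -> True
```

In a UI, this predicate would be re-evaluated whenever a category is marked reconciled, toggling the button's enabled state.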

Medication Allergy List

Findings: Generally, the participants were able to find the allergy list quickly and add items to it. We identified one issue concerning access to existing allergy items. The medication allergy list and the medication list are presented in the same window; however, the buttons at the bottom of this window mostly pertain to functions of the medication list. Participants were confused by this and took a while to discover that double-clicking opens allergies for editing (rather than any of the buttons at the bottom of the window).

Recommendations:

1. Training: Make sure that users are aware of hidden access methods, such as double-clicking and the right-click menu. Knowing that these two methods are used in Practice Partner will make it much more likely that users will discover hidden functionality. In particular, make sure that super users are aware of this, so that they can disseminate these hints to other users.

2. Design: Consider expanding the 'Allergy' button in the dialog into separate 'Edit' and 'New' options. Also consider grouping the buttons affecting the medication list and creating a divider between them and the allergy buttons, to clearly communicate which button/function pertains to which list.

General Issues

Findings: Several of the usability issues came about because of inconsistencies in how list items can be edited, deleted, or otherwise acted upon. Lists that appeared in the same dialog box had different interaction methods (click + button, double-click + form, right-click, or checkboxes). This created confusion for users, because it makes it difficult to distinguish when and where a certain method works and when it does not. Inconsistencies like this can lead to inefficient user behavior in the best case, and failure to discover important functions in the worst case.

Recommendations:

1. Review Practice Partner with respect to inconsistencies in interaction methods on list items and attempt to standardize them.


Introduction

The McKesson Practice Partner® EHR functions tested for this study were all accessible to the participants in an integrated environment, as would be typical in a provider implementation. The entire environment was software version Ver. 11. The usability testing environment and tasks attempted to simulate realistic implementations, workflows, and conditions. The purpose of the study was to test and validate the usability of the current user interface and provide evidence of usability in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency, and user satisfaction were captured during the usability testing.

Intended Users

Intended users of the EHRUT are mid-level clinical staff in ambulatory practices or clinics, such as registered nurses, licensed practical nurses, or medical assistants. Users may have varying levels of experience in the clinical setting and varying years of experience performing clinical functions with the EHRUT. They can be expected to have received training on the EHRUT and to be frequent users (weekly or daily use). Depending on site or specialty, they may have varying levels of experience with the tasks.

Method

Participants

A total of 24 participants were tested on the EHRUT for the nurse role in the ambulatory setting. Participants were associated with clinics at one of two customer sites (nine with site A and the remaining 15 with site B). The participants had been recruited by IT staff in their organizations. None of the participants had a direct connection to the development of the product. Seventeen were medical/ophthalmic assistants, four were licensed practical nurses (LPN), two were certified ophthalmic assistants (COA), and one was a registered nurse (RN).

All participants had experience using the EHRUT, ranging from 9 months to 19 years. All participants use the system for entering problem, medication, or allergy data on a daily basis. However, most participants were not familiar with the Clinical Data Reconciliation feature, which was an enhancement introduced with Ver. 11. A demographics questionnaire was filled out before the session began; the questionnaire is included as Appendix A: Demographic Questionnaire.

Nine of the participants (all from site A) were excluded from the analysis because the initial training materials introducing the clinical reconciliation function did not contain any conceptual explanation of this functionality. It became clear that users needed more information about data exchange between practices and methods for achieving it. It was also necessary to set expectations about how quickly these practices might come into effect. While testing at site A, the testing sessions were interrupted many times because the participants had too many questions about these topics. After some discussion, the facilitators made a special effort to address these issues in depth and to give participants time to explore all their questions during the training session, which led to much better testing sessions at site B.

Table 3 lists all 15 participants included in the analysis by demographic characteristics, such as gender, age, education, and EHR experience. Participant names were replaced with Participant IDs so that an individual's data cannot be tied back to individual identities.

Table 3. Participant Demographics

ID | Gender | Age | Education/Job Title | Years in practice | Years using Practice Partner | Frequency of use (entering medication data)
5 | f | 20-30 | MA | 3 years | 3 years | Many times a day
6 | f | 51-60 | MZ | 1.5 years | 1.5 years | Many times a day
7 | m | 31-40 | LPN | 7+ years | 3 years | Many times a day
8 | f | 41-50 | LPN | 20 years | 6 years | Many times a day
9 | f | 20-30 | MA | 7 years | 7 years | Many times a week
10 | f | 20-30 | MA | 5 years | 3 years | Many times a day
11 | f | 31-40 | MA | 11 years | 4 years | Many times a day
12 | f | 31-40 | Certified Ophthalmic Assistant | 7 years | 7 years | Many times a day
13 | f | 41-50 | OA | 7 years | 7 years | Many times a day
14 | f | 31-40 | RMA | 9 years | 13 years | Many times a day
15 | f | 31-40 | EMR Order Entry Specialist | 18 years | 15 years | A few times a month
16 | f | 31-40 | Ophthalmic surgical assistant, scribe | 13 years | 5 years | Many times a week
17 | f | 31-40 | RMA | 12 years | 4 years | Missing
18 | f | 41-50 | Medical Assistant/Team Leader | 20 years | 18 years | Many times a day
19 | f | 31-40 | MA | 2 years | 2 years | Missing

A short, self-paced training PowerPoint presentation was provided to participants before the study to review the new Clinical Data Reconciliation functionality. This training module was similar to what end users would usually receive when introduced to new functionality. The training also included some explanation of the software used to capture the session data.

Participants were scheduled for 45-minute sessions, which provided time between sessions for a debrief by the administrator and for getting the system ready for the next participant. A spreadsheet was used to keep track of the participant schedule and the participant demographics.


Study Design

Overall, the objective of this test was to uncover areas where the application performed well (that is, effectively, efficiently, and with satisfaction) and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or for comparison with other EHRs, provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a way to identify areas where improvements must be made. Specifically, the purpose of this test was to test the usability of 2 of the 7 Meaningful Use Criteria defined in 45 CFR Part 170, RIN 0991-AB82,[4] for ambulatory use. The current study focused on the 2 MU criteria typically used by nurses. Two separate reports detail the findings from tasks associated with the MU criteria typically conducted by physicians and administrators. Table 4 below cross-tabulates which task in the study of nurse/medical assistants' tasks is associated with which MU criterion and lists the risk associated with that task.

Table 4. Nurses' Tasks (same as Table 1, repeated here for easier reference)

Task | Meaningful Use Criterion Tested | Application | Risk
1. Open incoming CMA | § 170.314(b)(4) Clinical information reconciliation | Practice Partner | High
2. Perform Medication Reconciliation | § 170.314(b)(4) Clinical information reconciliation | Practice Partner / Third Party | High
3. Perform Allergy Reconciliation | § 170.314(b)(4) Clinical information reconciliation | Practice Partner | High
4. Perform Problem Reconciliation and finish import of CMA | § 170.314(b)(4) Clinical information reconciliation | Practice Partner | High
5. Look up and change existing allergy in the allergy list | § 170.314(a)(7) Medication allergy list | Practice Partner | High
6. Add an allergy to the allergy list | § 170.314(a)(7) Medication allergy list | Practice Partner | High
7. Confirm review of allergy list | § 170.314(a)(7) Medication allergy list | Practice Partner | High

[4] 45 CFR Part 170, RIN 0991-AB82. Health Information Technology: Standards, Implementation Specifications, and Certification Criteria for Electronic Health Record Technology, 2014 Edition; Revisions to the Permanent Certification Program for Health Information Technology.


During the usability test, participants interacted with the functions listed above within McKesson Practice Partner® Ver. 11. Each participant used the system in the exact same configuration and was provided with the same instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:

o Number of tasks successfully completed without assistance
o Time to complete tasks
o Number of errors
o Path deviations
o Participant verbalizations (comments)
o Participant's satisfaction ratings of the system

Procedures

The test participant was greeted by the facilitator upon entering the conference room. Once the participant was seated, he or she was verbally instructed and oriented to the test situation (see Appendix B: Administrator Script and General Instructions). Each participant was asked to sign an informed consent form (Appendix C: Informed Consent Form) and then filled out the demographics questionnaire (Appendix A: Demographic Questionnaire). The participant was then asked to step through a set of self-paced training slides. Participants spent between 5 and 10 minutes reviewing this presentation, which introduced the Clinical Data Reconciliation functionality; the facilitator explained the purpose of this functionality and solicited and answered any additional questions that participants had about exchanging data between practices and EHRs. The presentation also included usability test instructions and an introduction to the Morae recording software (Appendix D: Usability Test Instructions). During this presentation participants had a chance to ask questions and discuss the new feature and any issues concerning the study itself.

One test administrator per participant ran the test. Typically two sessions were happening in the same room, but enough space was provided so that the two sessions did not interrupt or distract each other. In some cases, additional persons (hospital officials and a product director) were present to observe, but they did not interact with the participants. Data logging was performed automatically by the Morae recording software. Detailed data analysis and scoring of the recorded protocols was done after the test procedure by coding the recordings with more fine-grained task, action, and error markers.

Two facilitators conducted the test: a senior usability professional (Ph.D. in Cognitive Psychology with a Human Factors emphasis) with over 20 years of experience in the field of User Experience Design and Research, and a business systems analyst with over 5 years of experience documenting and designing the user interfaces of Practice Partner (see Appendix E: Experimenters' Short Biographies).

For each task, the testing software provided an on-screen description of the task to be performed. See "Appendix F: Participant Instructions for Each Task" for the complete text of each on-screen task prompt.


Task timing began when the participant clicked the Start Task button and stopped when the participant clicked the End Task button. Participants were instructed to click End Task when they had successfully completed the task or when they could proceed no further.

After each task, the participant was given a short satisfaction questionnaire related to that task (Appendix G: End-of-Task Questionnaires). After all tasks were completed, the participant was given a System Usability Scale (SUS) Questionnaire (Appendix H: System Usability Scale (SUS) Questionnaire).

Following the test, the administrators provided time for the participants to informally communicate impressions about the usability of the EHRUT and about the testing process.

Test Locations

Participants were tested in their own facilities. In facility A, an IT training room was used on one day and a conference room on another. In facility B, the boardroom of the hospital building was used for all sessions. While both sites tried to create a quiet space for testing, their busy professions did not always allow for this; on occasion there was background conversation and other distractions associated with busy hospital environments. Given the usually hectic context of clinical practices and the use of EMRs, this represented fair and realistic conditions under which to test the software.

Test Environment

The participants performed the test tasks on the facilitators' laptops, which were both running independent copies of the EHR server and client, both Practice Partner Ver. 11. The specifications of the two facilitator machines are as follows: (1) an HP EliteBook Folio 9470m laptop, normal DPI setting (96 DPI), highest color quality (32-bit), 1366 x 768 pixel screen resolution, and a 60 Hz screen refresh rate; and (2) a Dell Latitude E6410 laptop, normal DPI setting (96 DPI), highest color quality (32-bit), 1366 x 768 pixel screen resolution, and a 60 Hz screen refresh rate. Audio was recorded through each system's internal microphone. Participants interacted with the software through the keyboard and mouse or touchpad.

Table 5. Test Environment, Technical Data

Practice Partner® Ver. 11 Usability Test Environment:
Two dedicated usability test environments were created using the generic data configuration used for the Practice Partner environment. Each environment simulates the complete EHR, including all functions and backend support software that a real implementation would provide, such as data integration with external systems allowing for data flow from pharmacy solutions and electronic data transfer to pharmacies. No other user groups had access to these environments, ensuring a pristine test environment with full control of the configuration of the test data.

It is important to note that, while all efforts were made to create a fully functioning test environment, the configurations of menus and menu choices, applications, alerts, etc., were similar but not identical to what any particular provider/customer would use. Customized functions differ from site to site, depending on hospital policies and implementation.

Our test environment is configured to show what the system is capable of, and it therefore presents more alerts, confirmation screens, and list items than many facilities would choose to implement.

Generally, system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Data from tasks or trials where system response time slowed perceptibly were excluded from the analysis.

EHR software and version: Practice Partner® Version 11

Testing software: Morae Recorder Version 3.3.2 (participant); Morae Manager Version 3.3.2 (data analysis)

Test Forms and Tools

During the usability test, various documents and instruments were used, including:

o Informed Consent

o Demographic Questionnaire

o Experimenter’s Verbal Instructions

o Training slides

o Observer’s Task Sheets

o Post-task surveys (presented through Morae)

o Post-test survey (System Usability Scale, SUS, presented through Morae)

Most of these materials are included in the appendices.

Usability Metrics

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the test were to assess:

o Effectiveness of the EHRUT by measuring participant success rates and errors.
o Efficiency of the EHRUT by measuring the average task time and path deviations.
o Satisfaction with the EHRUT by measuring ease of use ratings.
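The effectiveness and efficiency measures above reduce to simple ratios over the logged data: percentage of successful attempts, observed time against an optimal benchmark, and observed steps against the optimal path. A minimal sketch of how such measures can be computed; the function names and sample values are illustrative, not this study's actual data or tooling:

```python
from statistics import mean, stdev

def success_rate(outcomes):
    """Percentage of attempts scored as a Success (True) for one task."""
    return 100.0 * sum(outcomes) / len(outcomes)

def optimal_time(error_free_times, n_benchmark=3):
    """Benchmark time: mean of the fastest error-free completions."""
    return mean(sorted(error_free_times)[:n_benchmark])

def efficiency_ratio(observed_times, benchmark):
    """Observed/optimal time ratio; 1.0 would be optimal performance."""
    return mean(observed_times) / benchmark

def path_deviation(observed_steps, optimal_steps):
    """Ratio of steps taken to steps in the optimal path."""
    return observed_steps / optimal_steps

# Hypothetical task log: 4 of 5 attempts succeeded.
outcomes = [True, True, False, True, True]
times = [40.0, 55.0, 62.0, 48.0]                 # successful attempts only
print(success_rate(outcomes))                    # -> 80.0
print(round(mean(times), 2), round(stdev(times), 2))
print(path_deviation(observed_steps=12, optimal_steps=8))  # -> 1.5
```

Satisfaction is measured separately, through the post-task ratings and the SUS questionnaire described in the Data Scoring section.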

Data Scoring

The following table details how tasks were scored, how errors were evaluated, and how the time data were analyzed.


Table 6. Data Scoring: Measures & Rationale

Measures Rationale and Scoring

Effectiveness:

Task Success

A task is counted as a “Success” if the participant was able to achieve the correct outcome, without user interface directed assistance. Direction was occasionally provided with understanding task instructions, but never in how a task was to be accomplished.

A task is counted as a “Failure” if the participant was never able to find the correct action, or ended the task believing that they had accomplished all tasks without having done so.

The total number of successes are calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage.

Task times are recorded for successes. Observed task times divided by the optimal time for each task is a measure of optimal efficiency.

Optimal task performance time, as benchmarked by the 3 users who finished the task without errors and deviations and in the shortest amount of time, were calculated by averaging the times of these 3 users. In cases where more than 2 participants failed a task, the optimal task performance time was calculated using the 2 users who finished the task without errors and deviations in the shortest amount of time. No adjustments were made to the optimal time. Average group times were divided by the optimal performance time for a measure of how close to optimal efficiency each task was performed. It was decided not to define time allotments, because it was felt that different adjustments would have to be made for new tasks and for existing task flows, and we could not find a generally accepted empirical or principled way to determine meaningful adjustments. All participants were allowed to finish the tasks, and the time ratios provide a measure of how much longer an average user would take to perform these tasks.

We felt this was a cleaner, and less biased measure than to define task allotment times based on arbitrary time adjustments. Using the same or no adjustment for a number of studies would also allow better comparison across studies, rather than trying to compare rates or ratios that had all been derived by different adjustments.

Effectiveness:

Task Failures and

Errors

If the participant abandons the task, does not reach the correct answer or performs it incorrectly the task is counted as a “Failure.” For each task the percentage of participants who successfully completed it are reported. The failures can be calculated by subtracting the successful completions from 100.

The total number of errors is calculated for each task and then divided by the total number of times that task was attempted. Not all deviations are counted as errors.

Errors are selections of wrong menu items or wrong list items, or entries in the wrong data format. If an error results in a number of corrective steps, only the first action resulting in the error is counted as an error; all corrective steps are counted as deviations.

On a qualitative level, an enumeration of errors and error types is collected.
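A sketch of the error-rate calculation described above (the per-attempt counts are hypothetical):

```python
def error_rate(errors_per_attempt):
    """Total errors for a task divided by the number of attempts.
    Only the first erroneous action is counted per error; the corrective
    steps that follow are scored as deviations, not additional errors."""
    return sum(errors_per_attempt) / len(errors_per_attempt)

per_attempt = [2, 1, 0, 1, 3, 0, 1, 2, 1, 0, 2, 1, 3, 2]   # 14 hypothetical attempts
mean_errors = error_rate(per_attempt)
```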

Efficiency: Task Deviations

The participant’s path (i.e., steps) through the application is recorded. Deviations occur if the participant, for example, chooses a path to task completion which requires more steps than the optimal, fills out additional fields, or performs any steps to correct a previous error. This path is compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation.
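The path-deviation ratio reduces to a single division; as a hypothetical example (the step counts are invented):

```python
def path_deviation(observed_steps, optimal_steps):
    """Ratio of steps in the participant's observed path to steps in the
    optimal path; 1.0 means the optimal path was followed exactly."""
    return observed_steps / optimal_steps

ratio = path_deviation(observed_steps=19, optimal_steps=15)   # hypothetical step counts
```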

Efficiency: Task Time

Each task is timed from when the participant clicks the Start Task button until the participant clicks the End Task button. Only task times for tasks that are successfully completed are included in the average task time analysis. Average time per task is calculated for each task. Variance measures (standard deviation) are also calculated.
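A sketch of the task-time summary, filtering to successful attempts as the text specifies (the timings are hypothetical):

```python
from statistics import mean, stdev

def task_time_summary(times, successes):
    """Mean and standard deviation of task time, computed only over
    attempts that were completed successfully."""
    completed = [t for t, ok in zip(times, successes) if ok]
    return mean(completed), stdev(completed)

times = [40.2, 55.1, 38.7, 120.4, 47.9]        # seconds, Start Task to End Task
successes = [True, True, True, False, True]    # the 120.4 s attempt failed
avg, sd = task_time_summary(times, successes)  # the failed attempt is excluded
```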

Satisfaction: Task Rating

Each participant’s subjective impression of the ease of use of the application is measured by administering both a simple post-task questionnaire and a post-session questionnaire. After each task, the participant is asked to rate “Overall, this task was:” on a scale of 1 (Very Difficult) to 5 (Very Easy). If the task included a number of subtasks, a question addressing the ease of each subtask was included (see Appendix G: End-of-Task Questionnaires).

EHR: Practice Partner Ver. 11, Part II – Nurse – Ambulatory October 2, 2013

These data are averaged across participants. Common convention is that average ratings for systems judged easy to use should be 3.3 or above.

To measure participants’ confidence in and likeability of the [EHRUT] overall, the testing team administers the System Usability Scale (SUS) post-test questionnaire. Questions included, “I think I would like to use this system frequently,” “I thought the system was easy to use,” and “I would imagine that most people would learn to use this system very quickly.” See the full System Usability Scale questionnaire in "Appendix H: System Usability Scale (SUS) Questionnaire".
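The SUS questionnaire referenced here is conventionally scored as follows. This is a sketch of the standard SUS scoring rule, not code from the study, and the response values are hypothetical:

```python
def sus_score(responses):
    """Standard SUS scoring for 10 items rated 1-5: odd-numbered
    (positively worded) items contribute (rating - 1), even-numbered
    (negatively worded) items contribute (5 - rating); the sum of
    contributions is multiplied by 2.5 to give a 0-100 score."""
    assert len(responses) == 10
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)   # i is 0-based, so even i = odd-numbered item
        for i, r in enumerate(responses)
    )
    return total * 2.5

score = sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2])   # hypothetical responses
```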


Results

Data Analysis and Reporting

The results of the usability test were calculated according to the methods specified in the Usability Metrics section above.

Table 7. Result Summary by Meaningful Use Criterion and Task (same as Table 2, repeated here for easier reference)

| Tasks and subtasks | N | Task Success (%) | Path Deviation (Obs/Opt) | Task Time: Mean (SD), sec | Optimal Time (sec) | Time Deviation (Obs/Opt) | Errors: Mean (SD) | Task Rating: Mean (SD), 5 = Easy |
|---|---|---|---|---|---|---|---|---|
| § 170.314(b)(4) – Clinical information reconciliation | | | | | | | | |
| 1. Open incoming CMA | 15 | 92.86 | 2.0 | 119.75 (66.38) | 52.07 | 2.30 | 1.36 (.84) | 4.14 (.77) |
| – Navigate to function | 15 | 100 | 2.13 | 43.10 (43.68) | 23.13 | 1.86 | .05 (.76) | 4.0 (1.11) |
| – Locate and open CMA | 15 | 100 | 2.16 | 63.62 (42.83) | 19.63 | 3.24 | .86 (.36) | 4.5 (.85) |
| – Open existing patient chart | 15 | 92.86 | 1.27 | 13.02 (7.13) | 9.32 | 1.40 | 0 | 4.14 (.77) |
| 2. Perform Medication Reconciliation | 14 | 85.71 | 1.75 | 52.6 (37.75) | 19.72 | 2.67 | 1.33 (.98) | 4 (1.28) |
| 3. Perform Allergy Reconciliation | 10 | 100 | 1.15 | 23.47 (22.12) | 11.08 | 2.11 | 1.2 (.63) | 4.8 (.42) |
| 4. Perform Problem Reconciliation and finish import of CMA | 9 | 88.89 | 1.25 | 36.96 (23.11) | 11.52 | 3.21 | .5 (.53) | 4.63 (.74) |
| § 170.314(a)(7) – Medication allergy list | | | | | | | | |
| 5. Look up and edit existing allergy list entry | 14 | 92.86 | 1.46 | 72.72 (47.74) | 34.21 | 2.13 | 1.75 (1.87) | 4.71 (.47) |
| – Find existing allergy list item | 14 | 92.86 | 1.07 | 43.99 (25.32) | 24.66 | 1.78 | .33 (.65) | 4.57 (.51) |
| – Change allergy list item | 14 | 92.86 | 2.51 | 32.55 (42.46) | 9.55 | 3.41 | 1.69 (2.14) | 4.71 (.47) |
| 6. Add an allergy to the allergy list | 15 | 100 | 1.46 | 58.26 (22.59) | 34.76 | 1.68 | .07 (.26) | 4.6 (.63) |
| 7. Confirm review of allergy list | 15 | 100 | 2.4 | 17.22 (10.68) | 8.71 | 1.98 | .07 (.26) | 4.93 (.26) |
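As a consistency check, the time-deviation column reported for each task is simply the mean observed time divided by the optimal time (as defined in the Usability Metrics section). Reproducing it for the Task 1 rows:

```python
# (mean observed time, optimal time) in seconds, from the Task 1 rows of Table 7
rows = {
    "Open incoming CMA":           (119.75, 52.07),
    "Navigate to function":        (43.10, 23.13),
    "Locate and open CMA":         (63.62, 19.63),
    "Open existing patient chart": (13.02, 9.32),
}
ratios = {name: round(obs / opt, 2) for name, (obs, opt) in rows.items()}
# ratios reproduces the table's time-deviation column: 2.30, 1.86, 3.24, 1.40
```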

The results from the System Usability Scale scored the subjective satisfaction with the system, based on performance with these tasks, at 77.68.

According to Tullis and Albert (2008), raw scores under 60 represent systems with poor usability; scores over 80 would be considered above average [5]. The details of our results are summarized in Tables 2 and 3 below. Usability judgments were also collected at the end of each major task. The averages obtained from those task-based usability scores varied between 4.0 and 4.93, well above the recommendation put forward in NISTIR 7742 [6], which states that “common convention is that average ratings for systems judged easy to use should be 3.3 or above.”

Various additional recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.


[5] Tullis, T. & Albert, W. (2008). Measuring the User Experience. Burlington, MA: Morgan Kaufmann (p. 149).

[6] Schumacher, Robert M., and Lowry, Svetlana Z. (2010). Customized Common Industry Format Template for Electronic Health Record Usability Testing. National Institute of Standards and Technology, NISTIR 7742.


Discussion of the Findings

This section presents the study findings, organized by Meaningful Use Criteria.

170.314(b)(4) – Clinical information reconciliation

Clinical information reconciliation was new functionality developed for Practice Partner Ver. 11. It is a complex task for which training materials were developed. After training and testing at our first test site, it became clear that the training needed to include a broader conceptual introduction, because the idea of an electronic information exchange was not a familiar concept to our user base. Because of this, the data collected from the first 9 participants at site A were excluded from the analysis. The overall task of opening the data files, performing medication, allergy, and problem reconciliation, and finishing the import was broken into smaller tasks for this test, because specific instructions needed to be provided to a complete novice user.

Task 1 – Open incoming CMA

Analysis Notes

Linked Meaningful Use Criteria

170.314(b)(4) – Clinical information reconciliation

Task Description

In this task, participants were asked to open the CMA for a particular patient.

Risk Analysis

High. Finding the correct file and connecting it with the matching patient is crucial; any confusion could have devastating effects.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.

Exclusions

None

Quantitative Summary


Table 8. Usability Data for Task 1 – Open incoming CMA

| Tasks and subtasks | N | Task Success (%) | Path Deviation (Obs/Opt) | Task Time: Mean (SD), sec | Optimal Time (sec) | Time Deviation (Obs/Opt) | Errors: Mean (SD) | Task Rating: Mean (SD), 5 = Easy |
|---|---|---|---|---|---|---|---|---|
| 1. Open incoming CMA | 15 | 92.86 | 2.0 | 119.75 (66.38) | 52.07 | 2.30 | 1.36 (.84) | 4.14 (.77) |
| – Navigate to function | 15 | 100 | 2.13 | 43.10 (43.68) | 23.13 | 1.86 | .05 (.76) | 4.0 (1.11) |
| – Locate and open CMA | 15 | 100 | 2.16 | 63.62 (42.83) | 19.63 | 3.24 | .86 (.36) | 4.5 (.85) |
| – Open existing patient chart | 15 | 92.86 | 1.27 | 13.02 (7.13) | 9.32 | 1.40 | 0 | 4.14 (.77) |

Participant Errors

Table 9. Participant Errors for Task 1 – Open incoming CMA

| Participant(s) | Severity | Error Description |
|---|---|---|
| 5, 17, 17, 12, 7, 11 | Minor | During navigation subtask: participants expected to open the patient’s chart first, rather than opening the CMA first |
| 7 | Minor | Navigation error; participant explores the wrong tab |
| 5, 6, 7, 9, 10, 11, 13, 14, 15, 16, 17, 18, 19 | Minor | When opening the CMA, participants expected to type the file name instead of using the browse function. Not one participant included a path name. |
| 16, 16 | Medium | Participant used the ‘View’ button and never found a way to open the CMA for import. |

Effectiveness

All but one participant performed this task successfully. Five participants thought that they had to open the patient’s chart first, which was scored as a minor error. 13 out of 15 participants struggled with the browse dialog by attempting to type the file name by hand – and none asked for or tried to add a path name to the file name. While this is a fairly minor mistake, it shows that participants struggled conceptually with opening an external file, and most took many tries at spelling the file name until they noticed that they could use the browse function. Users had no difficulties finding and selecting the correct patient chart to connect the CMA with; no errors were made during this subtask, and the step deviation score was low (1.27).


Efficiency

The ‘locate and open CMA’ subtask stands out in terms of time deviation (3.24) – the cause of this was the participants’ struggle with finding the browse functionality. This was the first time that participants were using this functionality, so these times are indicative of how difficult it was to explore and learn the correct use of this control; they do not measure long-term efficiency. Once the CMA had been located, users connected it with the matching patient chart very quickly and without mistakes (time deviation score 1.40).

Comments

Comments are listed only if a participant provided a written comment.

Table 10. Participant Comments for Task 1 – Open incoming CMA

| Participant | Do you have any additional comments? |
|---|---|
| 6 | None |
| 9 | Na |
| 10 | No |
| 11 | I had trouble knowing whether to click the "browse" button or the "ok" button when typing in the file name. |
| 16 | I think just getting use to the system and it will be great |

Satisfaction

Relatively speaking, this task received fairly low satisfaction scores, between 4.0 and 4.5.

Major Findings

It appears that users think of this task as adding a CMA to an already open patient chart instead of the other way around – so letting them access the functionality from the patient chart (via a button) might be another option to start this workflow. The browse dialog presented a major hurdle: all but 2 users struggled with it. It might be possible to skip this step altogether by launching the Windows browse functionality immediately when this workflow is invoked. If a password protects this functionality, it might be possible to just ask for the password at this point and then launch into the Windows browse functionality. This would save the user an unnecessary step and completely avoid this confusion.


Task 2 – Perform Medication Reconciliation

Analysis Notes

Linked Meaningful Use Criteria

170.314(b)(4) – Clinical information reconciliation

Task Description

Participants were asked to reconcile the medications on the patient chart with the medications on the CMA. To do this, they had to un-select a medication from the import list and discontinue the same medication on the patient chart.

Risk Analysis

High. Rendering an accurate medication list is necessary to allow for creating a drug-treatment plan that takes all currently taken medications and possible drug-drug interactions into account.

Scoring Notes

One participant could not complete this task because they had mistakenly clicked the import button immediately after opening the CMA. This changed the patient’s chart, so that tasks 2–4 could not be attempted.

Exclusions

Participant 16 – Import executed too early.

Quantitative Summary

Table 11. Usability Data for Task 2 – Perform medication reconciliation

| Tasks and subtasks | N | Task Success (%) | Path Deviation (Obs/Opt) | Task Time: Mean (SD), sec | Optimal Time (sec) | Time Deviation (Obs/Opt) | Errors: Mean (SD) | Task Rating: Mean (SD), 5 = Easy |
|---|---|---|---|---|---|---|---|---|
| 2. Perform Medication Reconciliation | 14 | 85.71 | 1.75 | 52.6 (37.75) | 19.72 | 2.67 | 1.33 (.98) | 4 (1.28) |

Errors

Table 12. Participant Errors for Task 2 – Perform medication reconciliation

| Participant(s) | Severity | Error Description |
|---|---|---|
| 9, 9, 10, 10, 15, 15, 18 | Minor | Participants struggled with discovering the right-click control for existing medications or the un-check control for incoming medications. |
| 7, 11, 12, 13, 14, 15, 17, 19 | Medium | Participants did not discontinue the medication on the existing medication list. |
| 7, 12, 14, 17 | Severe | Participant selected the import button before finishing the clinical reconciliation process (allergies and problems). |

Effectiveness

Four participants did not complete this task successfully because they clicked the import button before finishing the reconciliation. Many more participants would have clicked this button had the experimenter not prevented them from doing so (we decided to intervene because it was impossible to conduct the next two tasks after importing the data into the patient chart). As the enumeration of errors shows, there were also issues with operating the list controls: participants either struggled with discovering the right-click menu, or simply didn’t discontinue the existing medication. After selecting ‘discontinue’, some participants expected the item to disappear from the list rather than being highlighted yellow.

Efficiency

Because of the struggles with the list item controls, the step deviation ratio (1.75) and time deviation ratio (2.67) were fairly large. This indicates problems with exploring the functionality of this new feature, not necessarily the long-term efficiency of this design.

Comments

Comments are listed only if a participant provided a written comment.

Table 13. Participant Comments for Task 2 – Perform medication reconciliation

| Participant | Do you have any additional comments? |
|---|---|
| 6 | None |
| 9 | Na |
| 10 | No |
| 13 | i would have expected to see the imported medication in the "new" clinical area immediately |

Satisfaction

The satisfaction ratings were relatively low, 4 across the whole task.

Major Findings

There were several issues with this task. While the two lists looked similar, their functions were either visible (incoming list: checkboxes) or hidden (existing medication list: context menu). This led, in the worst case, to the (wrong) conclusion that the discontinue function was not available in the existing list; in the best case, it took participants a while to figure out how to access this function. Participants also seemed a little confused about the notion of ‘un-selecting’ items from being imported. Several mentioned that a positive action, such as ‘selecting’ items for import, would have been more intuitive. Furthermore, participants seemed to want to see a result of their action of ‘selecting’ or importing. One participant mentioned in the comments, and others verbally, that they wanted to ‘move’ items from the list on the left to the list on the right. In their attempt to make something happen, almost every participant wanted to click the ‘Import’ button at the bottom of this window. This button instantly imports the data (even the categories that have not been reviewed) and makes it impossible for the participant to reverse their action. There are two simple methods for preventing this from happening: (a) disable the button until all categories have been reconciled, and/or (b) present a warning dialog that allows the user to cancel out of this action. One might also consider graphically separating the ‘Import’ button from the lists by a divider, to communicate that this function operates on the whole of the reconciliation process rather than the specific list in view.

Task 3 – Perform Allergy Reconciliation

Analysis Notes

Linked Meaningful Use Criteria

170.314(b)(4) – Clinical information reconciliation

Task Description

Participants were asked to reconcile the allergies on the patient chart with the allergies on the CMA – and to make sure all new allergies were being imported. To do this, they had to check the ‘reconciled’ check box at the bottom of the window (the new allergy was already selected for import by default).

Risk Analysis

High. Rendering an accurate allergy list is necessary to allow for creating a drug-treatment plan that takes all currently taken medications and possible drug-allergy interactions into account.

Scoring Notes

Five participants could not complete this task because they had already mistakenly clicked the import button. This changed the patient’s chart, so that tasks 3–4 could not be attempted.

Exclusions

Participants 7, 12, 14, 16, 17 – Import executed too early.

Quantitative Summary


Table 14. Usability Data for Task 3 – Perform allergy reconciliation

| Tasks and subtasks | N | Task Success (%) | Path Deviation (Obs/Opt) | Task Time: Mean (SD), sec | Optimal Time (sec) | Time Deviation (Obs/Opt) | Errors: Mean (SD) | Task Rating: Mean (SD), 5 = Easy |
|---|---|---|---|---|---|---|---|---|
| 3. Perform Allergy Reconciliation | 10 | 100 | 1.15 | 23.47 (22.12) | 11.08 | 2.11 | 1.2 (.63) | 4.8 (.42) |

Errors

Table 15. Participant Errors for Task 3 – Perform allergy reconciliation

| Participant(s) | Severity | Error Description |
|---|---|---|
| 6, 16 | Minor | Control exploration: participants tried to change the state of the incoming allergy list |
| 10, 11, 13, 9, 8, 15, 5, 18, 19 | Medium | Participants did not select the checkbox that indicates completed allergy reconciliation |
| 15 | Severe | Participant selected the Import button before finishing all reconciliation steps. |

Effectiveness

While all participants found and selected the ‘Allergy’ tab without any problems, only one actually selected the checkbox indicating that they had reconciled the two allergy lists. Currently, this checkbox does not have an actual function (other than serving as a mental checkmark for the user that this step has been performed). If the user does make changes to either list, that checkmark becomes checked automatically. While some users looked at and noticed the checkmarks, they did not seem to understand their function and wanted to click the ‘Import’ button instead.

Efficiency

There were few actions to perform in this subtask, and navigation to the allergy list was done without issues, so the step deviation ratio (1.15) and the time deviation ratio (2.11) were both fairly low.

Comments

Comments are listed only if a participant provided a written comment.

Table 16. Participant Comments for Task 3 – Perform allergy reconciliation

| Participant | Do you have any additional comments? |
|---|---|
| 6 | None |
| 9 | Na |
| 10 | No |
| 13 | would like to have an area to add new allergies. |

Satisfaction

The satisfaction ratings were relatively higher than in the preceding task, 4.8 – there were few steps involved, so this task was perceived as easy.

Major Findings

This subtask revealed that the participants did not really understand the function of the ‘reconciled’ checkboxes. Part of the problem seems to be that they perceived the checkboxes as being set by the system as a result of a different action (changes in list item status), rather than as user controls. A button that sets the control might be more effective, since users were clearly looking for a button to click.

Task 4 – Perform Problem Reconciliation

Analysis Notes

Linked Meaningful Use Criteria

170.314(b)(4) – Clinical information reconciliation

Task Description

Participants were asked to reconcile the problems on the patient chart with the problems on the CMA – and to make sure no existing problems were being imported. Then they were to finalize the import. Since all the problems were already on the existing chart, they had to unselect three problems, then select the import button.

Risk Analysis

High. Rendering an accurate problem list is necessary to allow for creating a treatment plan that takes all of the patient’s current problems into account.

Scoring Notes

Six participants could not complete this task because they had already mistakenly clicked the import button. This changed the patient’s chart, so that task 4 could not be attempted.

Exclusions

Participants 7, 12, 14, 15, 16, 17 – Import executed too early.


Quantitative Summary

Table 17. Usability Data for Task 4 – Perform problem reconciliation

| Tasks and subtasks | N | Task Success (%) | Path Deviation (Obs/Opt) | Task Time: Mean (SD), sec | Optimal Time (sec) | Time Deviation (Obs/Opt) | Errors: Mean (SD) | Task Rating: Mean (SD), 5 = Easy |
|---|---|---|---|---|---|---|---|---|
| 4. Perform Problem Reconciliation and finish import of CMA | 9 | 88.89 | 1.25 | 36.96 (23.11) | 11.52 | 3.21 | .5 (.53) | 4.63 (.74) |

Errors

Table 18. Participant Errors for Task 4 – Perform problem reconciliation

| Participant(s) | Severity | Error Description |
|---|---|---|
| 8 | Minor | Confused about which list to work on (right or left) |
| 5, 11, 18, 19 | Medium | Participants did not select the ‘Import’ button. |

Effectiveness

One participant did not make any selections (un-selections) on the problem list and was scored as unsuccessful. Other participants ended up not selecting the ‘Import’ button at the end of this task – but this is probably an artifact of having been told NOT to select this button in the previous 3 tasks. One participant still displayed uncertainty about which list to treat as the ‘import’ list vs. the ‘chart’ list.

Efficiency

If participants did this task successfully, they exhibited few extra steps (the step deviation ratio was 1.25) but some uncertainty, expressed in a high time deviation ratio (3.21). Even when performing the third step in the sequence of reconciliations, participants were still not sure about the correct actions.

Comments

Comments are listed only if a participant provided a written comment.

Table 19. Participant Comments for Task 4 – Perform problem reconciliation

| Participant | Do you have any additional comments? |
|---|---|
| 6 | Whenever recociling the patient's chart if it has not been completed allergies, problems, and medications it should stop you until it has been done. |
| 9 | Na |
| 10 | No |

Satisfaction

The satisfaction ratings were relatively high, 4.63 – but one participant noted that it would be good to be stopped from prematurely importing the new record.

Major Findings

No new problems were revealed during this last step in the reconciliation sequence. Uncertainty about the function of the ‘Import’ button, and when to use it, led several participants to avoid it during this last step.

170.314(a)(7) – Medication allergy list

Test contexts

Usability of the medication allergy list was tested in three tasks, tasks 5–7. Unlike the medication reconciliation tasks, this functionality was familiar to the participants in our study. During Task 5, participants were asked to find an existing allergy in the medication allergy list and modify its severity level. During Task 6, they were asked to add an allergy to the list, and during Task 7, they were asked to verify that they had reviewed all allergies.

Task 5 – Look up and edit existing allergy list entry

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(7) – Medication allergy list

Task Description

This task asked participants to find an existing allergy entry and change the severity from mild to severe.

Risk Analysis

High. An accurate and updated allergy list is important, as it alerts physicians to potentially harmful drug-allergy interactions when creating a treatment plan for patients.


Scoring Notes

One participant could not execute this task because she had mistakenly advanced to the next task’s instructions without performing the task.

Exclusions

Participant 17 – mistakenly clicked ‘end task’ in the Morae instruction window.

Quantitative Summary

Table 20. Usability Data for Task 5 – Lookup and edit existing allergy list entry

| Tasks and subtasks | N | Task Success (%) | Path Deviation (Obs/Opt) | Task Time: Mean (SD), sec | Optimal Time (sec) | Time Deviation (Obs/Opt) | Errors: Mean (SD) | Task Rating: Mean (SD), 5 = Easy |
|---|---|---|---|---|---|---|---|---|
| 5. Look up and edit existing allergy list entry | 14 | 92.86 | 1.46 | 72.72 (47.74) | 34.21 | 2.13 | 1.75 (1.87) | 4.71 (.47) |
| – Find existing allergy list item | 14 | 92.86 | 1.07 | 43.99 (25.32) | 24.66 | 1.78 | .33 (.65) | 4.57 (.51) |
| – Change allergy list item | 14 | 92.86 | 2.51 | 32.55 (42.46) | 9.55 | 3.41 | 1.69 (2.14) | 4.71 (.47) |

Errors

Table 21. Participant Errors for Task 5 – Lookup and edit existing allergy list entry

| Participant(s) | Severity | Error Description |
|---|---|---|
| 5, 6, 7, 7, 7, 7, 7, 9, 9, 9, 10, 12, 12, 13, 14, 16, 16, 16, 16, 16, 16, 19 | Minor | Participants tried to figure out how to enter ‘edit’ mode for an existing allergy entry. |
| 11, 14, 13 | Minor | Navigation mistakes |
| 10, 14 | Minor | Search form mistake when looking up patient |


Effectiveness

All but one participant performed this task successfully. All participants found the allergy entry without any problem, but 10/14 participants struggled with finding a way to edit it, leading to a number of errors in which they opened the wrong dialogs to achieve this goal. All but one participant eventually used double-click to open the edit window, and once they had found it, they finished the task correctly without further issues.

Efficiency

Navigation to the allergy list was performed efficiently; step and time deviation ratios were small for this subtask (1.07 and 1.78, respectively). The struggle with opening the edit window led to larger deviation ratios for the second subtask (2.51 for steps and 3.41 for time).

Participant Comments

Comments are listed only if a participant provided a written comment.

Table 22. Participant Comments for Task 5 – Lookup and edit existing allergy list entry

| Participant | Do you have any additional comments? |
|---|---|
| 6 | None … |
| 9 | Na |
| 10 | No |
| 16 | It was easy |

Satisfaction

Satisfaction ratings for this task were fairly high. The relatively lower score for ‘finding the allergy’ probably reflects the struggle with finding a way to open the edit window, not finding the allergy itself, which was quickly located.

Major Findings

The difficulties during this task were caused by inconsistencies in the access methods for opening existing medication entries and allergy entries, which are both presented in the same window. A number of buttons address functions for the medication list, but there is only one allergy button, and it allows creating a new allergy rather than editing an existing one. Furthermore, double-click is not used as an access method for items on the medication list, so it took participants a while to discover that they could use it on the allergy list. List functions should be reviewed for consistency, and consistent methods for common actions, such as editing, adding, and discontinuing/deleting items, should be established and applied across these common lists.


Task 6 – Create a new allergy list entry

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(7) – Medication Allergy List

Task Description

Participants were asked to enter a new allergy into the allergy list.

Risk Analysis

High. An accurate and updated allergy list is important as it alerts physicians to potentially harmful

drug-allergy interactions when creating a treatment plan for patients.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.

Exclusions

None

Quantitative Summary

Table 23. Usability Data for Task 6 – Add an allergy to the allergy list

Task 6 – Add an allergy to the allergy list (N = 15)

Task Success: 100%
Path Deviation (Observed/Optimal): 1.46
Task Time, Mean Observed (SD): 58.26 (22.59) seconds
Task Time, Optimal: 34.76 seconds
Task Time Deviation (Observed/Optimal): 1.68
Errors, Mean (SD): .07 (.26)
Task Rating (5 = Easy), Mean (SD): 4.6 (.63)

Error Analysis

Table 24. Participant Errors for Task 6 – Add an allergy to the allergy list

Participant Error Severity Error Description

16 minor Participant mistakenly selects the wrong list item, but immediately notices and changes the selection.


Participant Comments

Comments are listed only if a participant provided a written comment.

Table 25. Participant Comments for Task 6 – Add an allergy to the allergy list

Participant Do you have any additional comments?

6 none

9 NA

10 no

16 No …

Effectiveness

All of the participants successfully completed this task. One minor mistake in selecting a list item was

quickly corrected.

Efficiency

Participants were well accustomed to adding allergies to the allergy list and quickly stepped through the necessary actions. Step and time deviation ratios were small (1.46 and 1.68, respectively).
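
The deviation ratios reported throughout this section are plain observed-to-optimal quotients. A minimal sketch of that calculation, using the Task 6 timing values reported in Table 23 (the function name is illustrative, not from the report):

```python
def deviation_ratio(observed: float, optimal: float) -> float:
    """Observed-to-optimal quotient; 1.0 means the participant
    matched the optimal path (or time) exactly, larger values
    indicate extra steps or extra time relative to the optimum."""
    return observed / optimal

# Task 6 timing values from Table 23: mean observed 58.26 s, optimal 34.76 s
time_ratio = deviation_ratio(58.26, 34.76)
print(round(time_ratio, 2))  # 1.68, matching the reported time deviation ratio
```

The step deviation ratio (1.46) is computed the same way from observed versus optimal step counts, which the summary tables report only as the ratio.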

Satisfaction

Satisfaction scores were high, and there were no comments.

Major Findings

There were no additional findings from this task. Adding an allergy is done through a clearly visible

button and all participants knew how to find this button and filled out the necessary fields with no

problems.

Task 7 – Confirm review of allergy list

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(7) – Medication Allergy List

Task Description

Participants were asked to confirm that they had reviewed the allergy list.


Risk Analysis

High. An accurate and updated allergy list is important as it alerts physicians to potentially harmful

drug-allergy interactions when creating a treatment plan for patients.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.

Exclusions

None

Quantitative Summary

Table 26. Usability Data for Task 7 – Confirm review of allergy list

Task 7 – Confirm review of allergy list (N = 15)

Task Success: 100%
Path Deviation (Observed/Optimal): 2.4
Task Time, Mean Observed (SD): 17.22 (10.68) seconds
Task Time, Optimal: 8.71 seconds
Task Time Deviation (Observed/Optimal): 1.98
Errors, Mean (SD): .07 (.26)
Task Rating (5 = Easy), Mean (SD): 4.93 (.26)

Error Analysis

Table 27. Participant Errors for Task 7 – Confirm review of allergy list

Participant Error Severity Error Description

19 minor Reconcile instead of review (same functionality)

Participant Comments

Comments are listed only if a participant provided a written comment.

Table 28. Participant Comments for Task 7 – Confirm review of allergy list

Participant Do you have any additional comments?

6 none

9 NA


10 no

16 No …

Effectiveness

All of the participants successfully completed this task. One participant clicked the ‘reconcile’ button instead of the ‘review’ button, but both buttons have the same function.

Efficiency

Participants were used to reviewing allergies. They were slightly startled by the new ‘reconcile’ button, which duplicates the review function, and clicked both buttons; this explains the slightly elevated step deviation ratio (2.4) and time deviation ratio (1.98).

Satisfaction

Satisfaction scores were high (4.93), and there were no comments.

Major Findings

This task was completed without issues. It might make sense to remove the old verification button.


List of Appendices

The following appendices include supplemental data for this usability test report.

Appendix A: Demographic Questionnaire

Appendix B: Administrator Script and General Instructions

Appendix C: Informed Consent Form

Appendix D: Usability Test Instructions

Appendix E: Experimenters’ Short Biographies

Appendix F: Participant Instructions for Each Task

Appendix G: End-of-Task Questionnaires

Appendix H: System Usability Scale (SUS) Questionnaire

Appendix I: System and Patient Scenario for Nurse’s Tasks


Appendix A: Demographic Questionnaire


Appendix B: Administrator Script and General Instructions

Before participant arrives:

o Start remote desktop application and log into test environment

o Do not log into the applications until the participant is ready (the session would time out)

o Start Morae Recorder, but don’t click the red button yet

o Open the training presentation; leave in presentation mode

When participant arrives:

o Welcome Participant

o Make sure the participant has filled out the Informed Consent sheet and that the address is legible.

o Make sure participant has filled out Demographics sheet

o Get Participant started on training presentation

Verbal Instructions:

- “Thank you for participating in this usability study today. During this study you will be doing some tasks that are very familiar to you. But you will also perform some new tasks. To familiarize yourself with these new functions, please walk through this short PowerPoint.

- You will notice that our system is similar to the one you are using, but it is not the same.

Some of the configurations will be slightly different from what you are used to, simply

because this is a somewhat limited test system.

- The PowerPoint will also tell you more about participating in a usability study, and show

you how to interact with the recording software.

- You can always ask me questions during these training slides.”

When participant is done with training presentation:

o Start Morae (press red button)

o Log into applications

Verbal Instructions:

- Remember that this system may be configured slightly differently from what you are used to.

- You start by clicking this button here. You can also move this window around, so that

you can see the system better. If you like, you can use these printed instructions and

these Navigation Instructions to follow along.


- When you have started, I won’t be able to answer any more questions. Do you have any

questions before you start?”


Appendix C: Informed Consent Form


Appendix D: Usability Test Instructions

Introduction to the Study

• Thank you for participating in this usability study.

• Your participation is very valuable and critical for certification.

• Usability testing is required by ARRA 2; vendors cannot certify their EHRs unless they have been evaluated by our end users.

• Your feedback will help us improve the usability of Practice Partner.

• During the study, you will be performing a few routine tasks using Practice Partner, most of which will

be very familiar to you.

• This presentation provides a review of some of the new features of Practice Partner.

• The following screens present an overview of new features in self-guided presentation mode.

• Please take 10 minutes to familiarize yourself with these new functions.

• You can use the page up and down functions to navigate through the presentation.

• If you want to see a part again, you can always back up to previous screens by using the Page Up key on your keyboard.

Functionality Review

Study Instructions

• Thank you for participating in this study. Your input is very important.

• Our session today will last about 30 minutes. During that time you will use Practice Partner Version 11

to perform tasks.

Objective of the Study

• The purpose of the study is to uncover:

o Areas where the application performs well; that is, effectively, efficiently, and with satisfaction

o Areas where the application does not fully meet your needs.

• Please note that we are not testing you; we are testing the application. If you have difficulty, this means only that something needs to be improved in the application.

• The study administrator will be here in case you need specific help with the procedures, but the

administrator will not provide help in how to use the application.

Starting the Study

• The system will show an instruction window. The window displays over the EHR screens you will be

using to complete the task.


• Click the Start button to start the study.

• Within the instruction window, the system provides instructions for completing the first task.

• After reading the on-screen instructions, click the Start Task button.

• The instruction window will collapse vertically so that it will not obscure the application.

• You can click the Show Instructions button in the instruction window at any time to expand the

window and read the instructions again.

Performing the Task

• Using the EHR screens, begin performing the task. Please try to complete the tasks on your own

following the on-screen instructions very closely.

• You can also move and reposition the instruction window.

• If you place focus in the instruction window, move the mouse cursor back into the application window before continuing the task.

Indicating Task Completion

• When you are satisfied that you have completed the task, or when you have completed as much as you

can, click the End Task button.

• The system will open a short survey with questions about the task you have just completed.

• After answering the Participant Survey questions for the task, click the Done button.

• The system will present instructions for the next task.

• Between some tasks, the study administrator may adjust the application to start the next task at the

right location.

Completing the Study

• After you complete the Participant Task Evaluation for the final task, the system will open a System

Usability Scale Questionnaire to enable you to record your overall impressions of the application


including all the tasks you performed. Note that you will need to scroll vertically to view and answer all

of the questions.

• After completing the questionnaire, click the Done button to complete the study.

• The instruction window shows a message indicating that the study is complete. Tell the study

administrator that you have completed the study.

• The administrator will click the OK button and save your work.


Appendix E: Experimenters’ Short Biographies

Marita Franzke received her Ph.D. in Cognitive Psychology after completing a dissertation on Exploration-

based Learning of Display Based User Interfaces at the University of Colorado. She has worked in the field of

Human Computer Interaction since 1991, conducting research on usability and adoption of new end user

technologies in the fields of telecommunication (U S WEST Advanced Technologies, Media One Labs, and

AT&T Broadband Labs), Educational Software (Pearson Knowledge Technologies) and EMR Technologies

(McKesson Provider Technologies). She has 21 years of experience in the field of user research and one year of experience in the field of EMR usability.

Gergana Pavlova joined McKesson Practice Partner over 5 years ago as a technical writer. A year ago she

joined the Practice Partner Product Management team to help with the functional design requirements and

user interface designs of the Meaningful Use Stage 2 Incentive Program capabilities. She has an MS in

English and certificates in User-Centered Design in Business Analysis from the University of Washington.


Appendix F: Participant Instructions for Each Task

Introduction

You will be treating a 43-year-old female patient today.

She comes to the office with symptoms of a UTI.

Please conduct the following 7 tasks using the EHR provided to you.

When you are ready, please click the START button.

Task 1

Task Instructions:

Your patient has recently been seen in another clinic. You have received the Medical

Summary from that clinic and would like to reconcile the data. The import file is located in

the 'Medical Summaries' folder on your desktop.

1) Import the Medical Summary file into the patient chart. The file name and your

patient's name are provided to you on paper.

2) Stop when the Clinical Data Lists screen is displayed. You will reconcile the clinical

information in the next tasks.

Task 2

Task Instructions:

Perform Medication Reconciliation.

1) The patient tells you that she has stopped taking the Cipro, so discontinue it and don’t

select it for import.

2) Select all other new medications for import.

3) Stop when you have reconciled all medications. You will work on problems and

allergies in the upcoming tasks.

Task 3

Task Instructions:

Perform Allergy Reconciliation.

1) Make sure all new allergies will be imported.


Task 4

Task Instructions:

Perform Problem Reconciliation

1) Make sure all new major problems will be imported.

2) Don't import existing problems.

3) Finalize and import all selected data.

Task 5

Task Instructions:

The patient tells you that her Penicillin allergy has gotten much worse.

1) Change the severity from mild to severe.

Task 6

Task Instructions:

Document a Wheat Allergy with:

today’s date,

severity level “Moderate”, and

reaction “Nausea/Vomiting/Diarrhea” for the patient.

Task 7

Task Instructions:

1) Review all other allergies.

2) Indicate that you have reviewed the patient’s allergies today.


Appendix G: End-of-Task Questionnaires

1 - Import CMA

1. Finding the screen to do this task was ...

very difficult 1 2 3 4 5 very easy

2. Finding the Medical Summary file to import was ...

very difficult 1 2 3 4 5 very easy

3. Matching the file with the correct patient was ...

very difficult 1 2 3 4 5 very easy

4. Overall this task was ...

very difficult 1 2 3 4 5 very easy

5. Do you have any additional comments about this task?

Comments:

2 - Medication Reconciliation

1. Discontinuing Cipro was ...

very difficult 1 2 3 4 5 very easy

2. Selecting Medications for Import was ...

very difficult 1 2 3 4 5 very easy

3. Overall this task was ...

very difficult 1 2 3 4 5 very easy

4. Do you have any additional comments about this task?

Comments:


3 - Allergy Reconciliation

1. Finding the correct screen to accomplish this task was ...

very difficult 1 2 3 4 5 very easy

2. Selecting allergies to import was ...

very difficult 1 2 3 4 5 very easy

3. Overall this task was ...

very difficult 1 2 3 4 5 very easy

4. Do you have any additional comments about this task?

Comments:

4 - Problem Reconciliation

1. Finding the correct screen to accomplish this task was ...

very difficult 1 2 3 4 5 very easy

2. Selecting problems to import was ...

very difficult 1 2 3 4 5 very easy

3. Importing all data was ...

very difficult 1 2 3 4 5 very easy

4. Overall, this task was ...

very difficult 1 2 3 4 5 very easy

5. Do you have any additional comments about this task?

Comments:

5 - Change Allergy

1. Finding the correct screen to accomplish this task was ...

very difficult 1 2 3 4 5 very easy


2. Changing the allergy severity was ...

very difficult 1 2 3 4 5 very easy

3. Overall, this task was ...

very difficult 1 2 3 4 5 very easy

4. Do you have any additional comments about this task?

Comments:

6 - Add Allergy

1. Finding the correct screen to accomplish this task was ...

very difficult 1 2 3 4 5 very easy

2. Adding a new allergy was ...

very difficult 1 2 3 4 5 very easy

3. Overall, this task was ...

very difficult 1 2 3 4 5 very easy

4. Do you have any additional comments about this task?

Comments:

7 - Review Allergy List

1. Finding the correct screen to accomplish this task was ...

very difficult 1 2 3 4 5 very easy

2. Marking the allergies as reviewed was ...

very difficult 1 2 3 4 5 very easy

3. Overall, this task was ...

very difficult 1 2 3 4 5 very easy

4. Do you have any additional comments about this task?


Appendix H: System Usability Scale (SUS) Questionnaire

This questionnaire was presented to the participant after the completion of all tasks and task-based

questionnaires.

Thank you for participating in this study. The following questions address your experience with all the tasks in this study.

1. I think that I would like to use this system frequently

Strongly Disagree 1 2 3 4 5 Strongly Agree

2. I found the system unnecessarily complex

Strongly Disagree 1 2 3 4 5 Strongly Agree

3. I thought that the system was easy to use

Strongly Disagree 1 2 3 4 5 Strongly Agree

4. I think that I would need the support of a technical person to be able to use this system

Strongly Disagree 1 2 3 4 5 Strongly Agree

5. I found the various functions in this system were well integrated

Strongly Disagree 1 2 3 4 5 Strongly Agree

6. I thought there was too much inconsistency in this system

Strongly Disagree 1 2 3 4 5 Strongly Agree

7. I would imagine that most people would learn to use this system very quickly

Strongly Disagree 1 2 3 4 5 Strongly Agree

8. I found the system very cumbersome to use

Strongly Disagree 1 2 3 4 5 Strongly Agree

9. I felt very confident using the system

Strongly Disagree 1 2 3 4 5 Strongly Agree

10. I needed to learn a lot of things before I could get going with this system

Strongly Disagree 1 2 3 4 5 Strongly Agree
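
The report does not restate how these ten responses are turned into an overall score, but the SUS is conventionally scored by Brooke’s method: odd-numbered (positively worded) items contribute (response − 1), even-numbered (negatively worded) items contribute (5 − response), and the summed contributions (0–40) are multiplied by 2.5 to give a 0–100 score. A minimal sketch of that convention (function name illustrative):

```python
def sus_score(responses):
    """Compute the standard 0-100 SUS score from ten 1-5 responses.

    Odd-numbered items are positively worded (contribute response - 1);
    even-numbered items are negatively worded (contribute 5 - response).
    The summed contributions (0-40) are scaled by 2.5.
    """
    assert len(responses) == 10, "SUS has exactly ten items"
    total = 0
    for item_number, response in enumerate(responses, start=1):
        total += (response - 1) if item_number % 2 == 1 else (5 - response)
    return total * 2.5

# All-neutral answers score 50; maximally positive answers score 100.
print(sus_score([3] * 10))                        # 50.0
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```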


Appendix I: System and Patient Scenario for Nurse’s Tasks

Setting: Ambulatory

Participant Group: Nurse

PATIENT SETUP

Name: XXX Green

Age: 43

DOB: 02/03/1970

Gender: female

Address: 123 XXX, Denver, CO 80238

Telephone Number: 303-329-6802

Usual provider: Nurse

In patient’s chart:

Allergy List:

Penicillin Potassium

mild

skin rash

onset date: 01/01/1995

Problem List:

Asthma:

o onset date: 1995

o entered on the Major Problems tab

Diabetes:

o onset date: 2000

o entered on the Major problems tab

Med List:

Azmacort, MDI 2PUFFS Q6H X 30 D

o Date 01/01/2000

Metformin, 500 MG BID X 30 D

o Date 01/01/2000

Ciprofloxacin (Cipro oral)

o 250mg twice a day orally, for 7 days,

o start date: 09/05/2013


Information on CMA (xml)

Medications:

Will be discontinued and not imported: Ciprofloxacin (Cipro oral)

o 250mg twice a day orally, for 7 days,

o start date: 09/05/2013

Will be continued: Phenazopyridine HCl

o 200 mg TID X 3D

o Start date: 09/05/2013

Allergies:

Egg allergy,

o severe, shock

o start date: 01/01/2013

Problems

UTI - acute

o start date 09/05/2013

o listed as a Major Problem

Asthma:

o onset date: 1995

o entered on the Major Problems tab

Diabetes:

o onset date: 2000

o entered on the Major problems tab


EHR Usability Test Report Part III

EHR: Practice Partner® VER. 11

User Role: Administrator

Setting: Ambulatory

Applications Tested

Practice Partner® VER. 11

Dates

Usability Test: August 12-29, 2013

Report: September 2, 2013

Prepared by

Marita Franzke, Ph.D., Senior HFE, McKesson Provider Technologies

Gergana Pavlova-Grozeva, Senior BA, McKesson Provider Technologies

Contact: Marita Franzke

McKesson Provider Technologies

11000 Westmoor Cr., Suite 125

Westminster, CO 80021

(720) 356-7711

[email protected]

Report based on ISO/IEC 25062:2006

Common Industry Format for Usability Test Reports


Page 2 of 48

EHR: Practice Partner VER. 11, Part III – Administrator – Ambulatory September 2, 2013

Contents

Tables ............................................................................................................................................................... 4

Executive Summary ........................................................................................................................................... 5

Major Findings and Areas for Improvement .................................................................................................... 8

Introduction ...................................................................................................................................................... 11

Intended Users............................................................................................................................................. 11

Method ............................................................................................................................................................ 11

Participants .................................................................................................................................................. 11

Study Design ............................................................................................................................................... 13

Procedures .................................................................................................................................................. 14

Test Locations.............................................................................................................................................. 15

Test Environment ......................................................................................................................................... 15

Test Forms and Tools .................................................................................................................................. 16

Usability Metrics ........................................................................................................................................... 16

Data Scoring ................................................................................................................................................ 16

Results ............................................................................................................................................................ 19

Data Analysis and Reporting ........................................................................................................................ 19

Discussion of the Findings ............................................................................................................................... 22

170.314(a)(2) – Drug-drug, drug allergy interaction checks .......................................................................... 22

Test contexts ............................................................................................................................................ 22

Task 1 – Configure drug-drug interaction severity level ............................................................................ 22

Task 2 – Change drug-allergy warning interventions ................................................................................ 25

170.314(a)(8) – Clinical decision support ..................................................................................................... 26

Test contexts ............................................................................................................................................ 26

Task 3 – Create and activate demographics-based decision support (Health Maintenance alert) ............. 27

Task 4 – Create and activate complex decision support ........................................................................... 29

List of Appendices ........................................................................................................................................... 33

Appendix A: Demographic Questionnaire ..................................................................................................... 34

Appendix B: Administrator Script and General Instructions .......................................................................... 36

Appendix C: Informed Consent Form ........................................................................................................... 38


Appendix D: Usability Test Instructions ........................................................................................................ 39

Appendix E: Experimenter's Short Biography ............................................................................................... 42

Appendix F: Participant Instructions for Each Task ...................................................................................... 43

Appendix G: End-of-Task Questionnaires .................................................................................................... 45

Appendix H: System Usability Scale (SUS) Questionnaire ........................................................................... 48


Tables

Table 1. Administrators' Tasks: Meaningful Use Criteria, Risk, and Application Tested ..................................... 5

Table 2. Result Summary by Task ..................................................................................................................... 7

Table 3. Result Summary by Meaningful Use Criterion ...................................................................................... 8

Table 4. Participant Demographics .................................................................................................................. 12

Table 5. Administrators' Tasks – same as Table 1, repeated here for easier reference ................................... 13

Table 6. Test Environment, Technical Data ..................................................................................................... 15

Table 7. Data Scoring: Measures & Rationale ................................................................................................. 17

Table 8. Usability Data by Task and Subtask - same as Table 2, repeated here for easier reference .............. 19

Table 9. Usability Data by Meaningful Use Criteria – same as Table 3, repeated here for easier reference ..... 20

Table 10. Usability Data for Task 1 – Configure drug-drug interaction ............................................................ 23

Table 11. Participant Errors for Task 1 – Configure drug-drug interaction ....................................................... 23

Table 12. Participant Comments for Task 1 – Configure drug-drug interaction ............................................... 24

Table 13. Usability Data for Task 2 – Configure drug-allergy warning intervention .......................................... 25

Table 14. Participant Errors for Task 2 – Configure drug-allergy warning intervention .................................... 25

Table 15. Participant Comments for Task 2 – Configure drug-allergy warning intervention ............................. 26

Table 16. Usability Data for Task 3 – Create and activate alert ....................................................................... 27

Table 17. Participant Errors for Task 3 – Create and activate alert ................................................................. 28

Table 18. Participant Comments for Task 3 – Create and activate alert .......................................................... 28

Table 19. Usability Data for Task 4 – Create complex alert rule ...................................................................... 30

Table 20. Participant Errors for Task 4 – Causes for overall task failures ....................................................... 31

Table 21. Participant Comments for Task 4 – Create complex alert rule ......................................................... 31


Executive Summary

A usability test of Practice Partner® VER. 11 was conducted August 12-29, 2013 remotely by Marita Franzke and Gergana Pavlova-Grozeva of McKesson Provider Technologies. The testing was

conducted remotely due to the difficulty in finding more than a few administrators at any individual

customer site. The purpose of this test was to test and validate the usability of two Meaningful Use

Criteria as implemented in the functions of the Practice Partner® VER. 11. The Office of the National

Coordinator for Health Information Technology (ONC) requires vendors to test 7 Meaningful Use Criteria

outlined in the test procedures for §170.314(g)(3) Safety-enhanced design1 for ambulatory products.

The criteria focusing on the performance of clinical tasks are discussed in two other reports that detail the results of usability studies involving nurses/medical assistants and physicians, as those clinical tasks are typically conducted by those user roles.

During the usability test, 9 administrators served as participants; they were all familiar with previous

releases of the Practice Partner EHR. In this study, they used this software to perform 4 simulated but representative tasks. More specifics on the sample demographics and participant experience with the

EHR will be detailed below in the Method section.

This study collected performance data on 4 tasks typically conducted by administrators who are configuring an EHR or setting up user roles for interacting with an EHR. Below is a list of all tasks employed,

which were chosen to evaluate the usability of two of the seven Meaningful Use Criteria outlined in the test

procedures for §170.314(g)(3) Safety-enhanced design. Table 1 below lists the administrators' tasks in

order of performance, the linked Meaningful Use criteria, and a risk rating for performing each task.

Table 1. Administrators' Tasks: Meaningful Use Criteria, Risk, and Application Tested

| Task | Meaningful Use Criterion Tested | Application | Risk |
|------|---------------------------------|-------------|------|
| 1. Configure drug-drug interaction severity level | 170.314(a)(2) – Drug-Drug and Drug-Allergy Interaction Checks | Practice Partner | High |
| 2. Change drug-drug warning severity level | 170.314(a)(2) – Drug-Drug and Drug-Allergy Interaction Checks | Practice Partner | High |
| 3. Create and activate simple demographics-based decision support (Health Maintenance alert) | 170.314(a)(8) – Clinical decision support | Practice Partner | High |
| 4. Create and activate complex demographics and drug-based decision support (Health Maintenance alert) | 170.314(a)(8) – Clinical decision support | Practice Partner | High |

1 Test Procedure for §170.314(g)(3) Safety-enhanced design. Approved Test Procedure Version 1.2, December 14, 2012; and Health Information Technology: Standards, Implementation Specifications, and Certification Criteria for Electronic Health Record Technology, 2014 Edition; Revisions to the Permanent Certification Program for Health Information Technology, Final Rule.

During the one hour, one-on-one usability test, each participant was greeted by an administrator when

he/she called into the conference line. Each participant had prior experience with a previous version of

the Practice Partner® EHR. The participants were asked to review a self-guided training module that

reviewed creating complex Health Maintenance rules, as this task is not practiced frequently, and

introduced instructions for the usability study and using the testing software. The administrator

introduced the test, and instructed participants to complete a series of tasks (given one at a time) using

the EHR Under Test (EHRUT). During the testing, data logging software timed the test and recorded

user screen captures and audio. These recordings were later used to score action steps, errors and

task completion rates. The test administrator did not assist the participant in completing the task.

The following types of data were collected for each participant:

Number of tasks successfully completed without assistance

Time to complete each task

Number and types of errors

Path deviations

Participant’s verbalizations and written comments

Participant’s satisfaction ratings of the system

All participant data was de-identified so that no correspondence can be made between the identity of the participant and the data collected. At the end of the test session, each participant completed a post-test

questionnaire.

The results from the System Usability Scale scored the subjective satisfaction with the system based

on performance with these tasks to be: 68.
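For reference, the standard SUS scoring convention that yields such a 0-100 score can be sketched as follows. This is a minimal illustration, not the study's actual analysis code; it assumes ten 1-5 Likert responses per participant:

```python
def sus_score(responses):
    """Score one participant's ten 1-5 SUS responses on the 0-100 scale.

    Odd-numbered items are positively worded and contribute (response - 1);
    even-numbered items are negatively worded and contribute (5 - response).
    The sum of contributions (0-40) is multiplied by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses, each between 1 and 5")
    contributions = [
        (r - 1) if i % 2 == 1 else (5 - r)
        for i, r in enumerate(responses, start=1)
    ]
    return sum(contributions) * 2.5

# A uniformly neutral participant (all 3s) scores 50.0; a study-level SUS
# result such as the 68 reported here is the mean of per-participant scores.
```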

According to Tullis and Albert (2008), raw scores under 60 represent systems with poor usability;

scores over 80 would be considered above average.2 The details of our results are summarized in

Tables 2 and 3 below. Usability judgments were also collected at the end of each major task. The

averages obtained from those task-based usability scores varied between 3.11 and 4.89. The first three tasks all scored over 4.4, quite a bit above the recommendation put forward in NISTIR 7742.3 NISTIR 7742 states that “common convention is that average ratings for systems judged easy to use should be 3.3 or above.” The final task presented a real difficulty to all participants; in fact, only two participants completed it successfully. This task was rated 3.11.

We will give special attention to the analysis of this task, which asked the participants not only to implement a complex Health Maintenance rule but also to write a file with conditional logic, something that appeared to be outside what any of the participants saw as their normal job duties. The struggle

2 See Tullis, T. & Albert, W (2008). Measuring the User Experience. Burlington, MA: Morgan Kaufman (p. 149).

3 Schumacher, Robert M., and Lowry, Svetlana Z. (2010), Customized Common Industry Format Template for Electronic

Health Record Usability Testing. National Institute of Standards and Technology, NISTIR 7742.


with this final task noticeably influenced how the participants felt about the system, and may have had

an impact on their final assessment, as expressed in the SUS score.

Various additional recommended metrics, in accordance with the examples set forth in the NIST Guide

to the Processes Approach for Improving the Usability of Electronic Health Records, were used to

evaluate the usability of the EHRUT. Following is a summary of the performance and rating data

collected on the EHRUT.

Table 2. Result Summary by Task

| # | Task / Subtask | N | Task Success (%) | Path Deviation (Observed/Optimal) | Task Time: Mean Observed in Seconds (SD) | Task Time: Optimal | Task Time Deviation (Observed/Optimal) | Errors: Mean | Task Rating, 5 = Easy: Mean (SD) |
|---|----------------|---|------------------|-----------------------------------|------------------------------------------|--------------------|----------------------------------------|--------------|----------------------------------|
| 1 | Configure drug-drug interaction severity level | 9 | 100 | 1.93 | 85.26 (94.29) | 28.25 | 3.02 | .78 | 4.56 (1.01) |
|   | Navigation | 9 | 100 | 2.08 | 52.93 (80.54) | 14.67 | 3.60 | .56 | 4.44 (0.53) |
|   | Configuration | 9 | 100 | 1.61 | 32.32 (17.63) | 13.58 | 2.38 | .22 | 4.78 (0.44) |
| 2 | Change drug-drug interaction severity level | 9 | 100 | 1.07 | 30.53 (15.23) | 18.85 | 1.62 | 0 | 4.89 (0.33) |
|   | Navigation | 9 | 100 | 1 | 12.94 (7.48) | 9.32 | 1.39 | 0 | 4.89 (0.33) |
|   | Configuration | 9 | 100 | 1.22 | 17.6 (8.73) | 9.52 | 1.85 | 0 | 4.89 (0.33) |
| 3 | Create and activate demographics-based decision support (Health Maintenance alert) | 9 | 100 | 1.36 | 113.43 (43.66) | 75.12 | 1.5 | .55 | 4.44 (.73) |
|   | Navigation | 9 | 100 | 1.28 | 20.21 (13.23) | 15.38 | 1.31 | .11 | 4.67 (0.5) |
|   | Activation of rule | 9 | 100 | 1.67 | 28.24 (7.34) | 25.57 | 1.11 | 0 | 4.33 (0.71) |
|   | Create specific overdue dates | 9 | 100 | 1.24 | 64.99 (37.3) | 35.17 | 1.85 | .44 | 4.44 (.73) |
| 4 | Create and activate complex demographics and drug-based decision support (Health Maintenance alert) | 9 | 22 | NA* | 12 minutes, 38 seconds | NA* | NA* | NA* | 3.00 (1.41) |

* NA: see Discussion of the Findings.


The usability result data from Table 2 is grouped by Meaningful Use Criterion in Table 3 below, to allow a better overview of the usability scores for each criterion.

Table 3. Result Summary by Meaningful Use Criterion

| Meaningful Use Criterion and Task | N | Task Success (%) | Path Deviation (Observed/Optimal) | Task Time: Mean Observed in Seconds (SD) | Task Time Deviation (Observed/Optimal) | Errors: Mean | Task Rating, 5 = Easy: Mean (SD) |
|-----------------------------------|---|------------------|-----------------------------------|------------------------------------------|----------------------------------------|--------------|----------------------------------|
| 170.314(a)(2) – Drug-Drug and Drug-Allergy Interaction Checks | | | | | | | |
| Configure drug-drug interaction severity level | 9 | 100 | 1.93 | 85.26 (94.29) | 3.02 | .78 | 4.56 (1.01) |
| Change drug-drug interaction severity level | 9 | 100 | 1.07 | 30.53 (15.23) | 1.62 | 0 | 4.89 (0.33) |
| 170.314(a)(8) – Clinical decision support | | | | | | | |
| Create and activate simple demographics-based decision support (Health Maintenance alert) | 9 | 100 | 1.36 | 113.43 (43.66) | 1.5 | .55 | 4.44 (.73) |
| Create and activate complex demographics and drug-based decision support (Health Maintenance alert) | 9 | 22 | NA* | 12 minutes, 38 seconds | NA* | NA* | 3.00 (1.41) |

* NA: see Discussion of the Findings.

In addition to the performance data, qualitative observations were made. A more detailed discussion of the results, tasks, and findings is provided later in the document (in Discussion of the Findings).

Major Findings and Areas for Improvement

This section summarizes the findings and areas for improvement at a high level. For additional detail, please see the section “Discussion of the Findings”.

Finding: Finding the Drug-drug Interaction Screen

For users who were rusty or unfamiliar with this function, the menu labels provided little guidance. The help file eventually revealed the navigation path, but the participant had to search for it even there.

Areas for Improvement:

1. Design: Revise the help file to provide more direct pointers to the navigation path for someone looking to set ‘drug-drug interactions’, ‘drug drug’, ‘interaction alerts’, ‘drug alerts’, ‘drug-drug alerts’, or ‘drug-allergy alerts’. Optimize the search terms and provide direct links to instructions for the navigation path. ‘Prescription defaults’ was not a helpful keyword or menu label.

2. Menu labels: It may not be advisable to change the menu labels, since users quickly remembered the navigation path after discovering it the first time (and changing them might confuse existing users).

Finding: Drug Interaction Settings Tab

1. As expressed in the participant comments and the behavioral observations, participants were slightly confused about the severity levels expressed in the drop-down controls.

2. Participants did not understand a second control having to do with displaying the depth of the documentation. Misunderstanding of this control could lead to erroneous and possibly harmful settings.

Areas for Improvement:

1. Design: Consider a more direct way of labeling the drop-down and its instructions; use verbal labels next to each number. Apply this to both the severity and documentation controls.

2. Design: Provide more detailed help explaining the documentation control, and consider including this function in the training.

3. Design: Consider making this function visually more distinct from the severity control so that users are not misled into thinking that these two fields have to be set together.

Finding: Activating Decision Supports (Health Maintenance Alerts)

1. The user interface provided good guidance when creating simple alerts based on one type of trigger alone.

2. When a combined trigger had to be created, participants had to leave the user interface and create a rule file, then attach that rule file to an existing rule. While this allows for a great amount of flexibility in the types of alerts that can be built, this task proved to be extremely difficult for the participants.

Areas for Improvement:

1. Training: If the creation of complex decision rules is expected to be part of the administrators’ job, train on logic, syntax, and an efficient way of connecting rule files to existing rules. Provide plenty of opportunities for practice.

2. Design: At a minimum, remove the grey shading of the rule field in the procedure form, to make it appear more editable and easier to find.

3. Design: Better yet, change the field into a ‘browse’ functionality so that users can navigate to the rule file they want to upload. This eliminates the frustration of having to remember the file name; it will also solve the problem of entering the wrong path name.

4. Design: Consider providing more active supports around writing rule files. Suggestions:
   a. A syntax checker
   b. A form-based rule editor
   c. Find out the types of rules most commonly created. From the help files, consider linking to examples of rule elements (based on this analysis) that could be used to copy frequently used elements into files, where they can be edited.
   d. Consider providing a rule test facility.
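A syntax checker of the kind suggested above could catch malformed rule files before they are attached to a rule. The sketch below is purely illustrative: the grammar shown (IF <field> <op> <value> [AND ...] THEN ALERT <name>) is a hypothetical stand-in, not Practice Partner's actual rule-file syntax.

```python
import re

# Hypothetical rule grammar, for illustration only; Practice Partner's
# real rule-file syntax is not reproduced in this report.
_OP = r"(>=|<=|=|<|>)"
RULE = re.compile(
    r"^IF\s+\w+\s*" + _OP + r"\s*\S+"          # first condition
    r"(\s+AND\s+\w+\s*" + _OP + r"\s*\S+)*"    # optional further conditions
    r"\s+THEN\s+ALERT\s+\S+$"                  # resulting alert
)

def check_rule(line):
    """Return None if the line parses under the toy grammar, else an error."""
    if RULE.match(line.strip()):
        return None
    return "syntax error: expected IF <field> <op> <value> [AND ...] THEN ALERT <name>"
```

Even a checker this small would have flagged the malformed conditional logic that caused most of the overall task failures before participants attached the file to a rule.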


Introduction

The McKesson Practice Partner® EHR tested for this study was accessible to the participants in an

integrated environment, as would be typical in a provider implementation. The entire environment was

software version VER. 11. The usability testing environment and tasks attempted to simulate realistic

implementations, workflows and conditions. The purpose of the study was to test and validate the

usability of the current user interface, and provide evidence of usability in the EHR Under Test

(EHRUT). To this end, measures of effectiveness, efficiency and user satisfaction were captured during

the usability testing.

Intended Users

Intended administrative users of the EHRUT are medical directors, system administrators, or

implementation and support team members working in an environment supporting ambulatory patient

care. In these functions they may be working for the vendor or for Value Added Resellers (VARs) supporting smaller clinics that don’t have their own system administrators, or they may be working for

provider organizations directly. Users may have varying levels of experience in the clinical setting and

may have varying years of experience performing administrative functions with the EHRUT. They can

be expected to have received training on the EHRUT, and to be frequent users (weekly or daily use).

Specifically for the administrators' tasks, setting and changing the drug interaction warnings and

activating Health Maintenance rules is part of implementing the EHR and customizing it for a particular

site. It can be expected that these functions may be used infrequently by the majority of users on

customer provider sites as setups do not often change and the majority of the setup is done when the

software is implemented in the facility. Since Practice Partner is a product that supports relatively small medical practices, most often the task of implementing, training, and configuring/customizing goes to

the vendor, VARs or third-party support organizations. The participants reflect these groups of users.

Method

Participants

A total of 9 participants were tested on the EHRUT for the administrator role in the ambulatory setting.

Participants were associated with the McKesson Implementation and Support team, four different VARs

and two customer sites. Ten participants had been recruited, but technical difficulties with the test administrator's PC prohibited data collection with one participant. All participants were recruited through

an email sent out to various customers. Participants had no direct connection to the development of

the product.

Participants were recruited through an open email to several customers, so there was no control over

responses to the email. All participants had experience configuring previous versions of Practice

Partner, training users on the application and creating various templates. The experience with the

application ranged from 3.5 years to 7 years. A demographics questionnaire was sent out in an email


and had to be received by the facilitator before the session began; the questionnaire is included as

Appendix A: Demographic Questionnaire. Table 4 lists all 9 participants included in the analysis by characteristics such as gender, age, education, and EMR experience. Participant

names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual

identities.

Table 4. Participant Demographics

| ID | Gender | Age | Education | Title | Years with Practice Partner | % Time Configuring System | % Time Building Templates |
|----|--------|-----|-----------|-------|-----------------------------|---------------------------|---------------------------|
| 1 | M | 31-40 | Some college | Account Mgr. | (missing) | 10 | 20 |
| 2 | M | 31-40 | BS | Tier 2 Technical Support | 6.5 | 80 | 0 |
| 3 | F | 51-60 | MA | Technical Course Developer | 3.5 | 25 | 0 |
| 4 | M | 41-50 | Associate | Senior Support Analyst | 6 | 15 | 10 |
| 5 | M | 41-50 | MD | Supervisor, optimization, training & process mgmt | 3.5 | 15 | 15 |
| 6 | M | 51-60 | BS | Technical Consultant | 3.5 | 2 | 2 |
| 7 | M | 51-60 | BS | Implementation Specialist | 7 | 30 | 30 |
| 8 | F | 51-60 | HS | Sr. Practice Mgr & EMR Trainer | 6 | 25 | 35 |
| 9 | F | 31-40 | BA | EMR Analyst | 6 | 25 | 25 |

All participants indicated that they rarely created conditional logic files as needed for the complex

decision support task. A short, self-paced training PowerPoint presentation was provided to

participants before the study, to review this functionality. These training modules were similar to what

end-users would usually receive when introduced to new functionality. The training also included some

explanation of the software used to capture the session data.

Participants were scheduled for one-hour sessions, which provided time between sessions for a debrief by the administrator and for getting the system ready for the next participant. A spreadsheet was used to keep

track of the participant schedule, and the participant demographics.


Study Design

Overall, the objective of this test was to uncover areas where the application performed well — that is,

effectively, efficiently, and with satisfaction — and areas where the application failed to meet the needs

of the participants. The data from this test may serve as a baseline for future tests with an updated

version of the same EHR and/or comparison with other EHRs provided the same tasks are used. In

short, this testing serves both as a means to record or benchmark current usability and as a means to identify

areas where improvements must be made. Specifically, the purpose of this test was to test the usability

of 2 of the 7 Meaningful Use Criteria that are defined in 45 CFR Part 170, RIN 0991-AB82,4 for

ambulatory use. The current study focused on the 2 MU criteria typically used by administrators. Two separate reports detail the findings from tasks associated with the MU criteria typically addressed by clinicians. Table 5 below cross-tabulates which task in the study of administrator tasks is associated

with which MU criterion and lists the risk associated with that task.

Table 5. Administrators' Tasks – same as Table 1, repeated here for easier reference

| Task | Meaningful Use Criterion Tested | Application | Risk |
|------|---------------------------------|-------------|------|
| 1. Configure drug-drug interaction severity level | 170.314(a)(2) – Drug-Drug and Drug-Allergy Interaction Checks | Practice Partner | High |
| 2. Change drug-drug warning severity level | 170.314(a)(2) – Drug-Drug and Drug-Allergy Interaction Checks | Practice Partner | High |
| 3. Create and activate simple demographics-based decision support (Health Maintenance alert) | 170.314(a)(8) – Clinical decision support | Practice Partner | High |
| 4. Create and activate complex demographics and drug-based decision support (Health Maintenance alert) | 170.314(a)(8) – Clinical decision support | Practice Partner | High |

During the usability test, participants interacted with the above listed functions within the McKesson

PracticePartner® VER. 11. Each participant used the system remotely in the exact same configuration,

and was provided with the same instructions. The system was evaluated for effectiveness, efficiency

and satisfaction as defined by measures collected and analyzed for each participant:

o Number of tasks successfully completed without assistance

o Time to complete tasks

4 45 CFR Part 170, RIN 0991-AB82. Health Information Technology: Standards, Implementation Specifications, and

Certification Criteria for Electronic Health Record Technology, 2014 Edition; Revisions to the Permanent Certification Program for Health Information Technology.


o Number of errors

o Path deviations

o Participant verbalizations (comments)

o Participant’s satisfaction ratings of the system

Procedures

Because the testing was conducted remotely, the participant and facilitator had to enter a Webex

meeting and connect to the conference call. Once the participant and facilitator were both in the

session, the participant was verbally instructed and oriented to the test situation (see Appendix B:

Administrator Script and General Instructions). Each participant had been sent an informed consent

form (Appendix C: Informed Consent Form) through email several days before the session and asked

to return the completed documents before the session. After the greeting, verbal instructions, and orientation, the demographics questionnaire was administered (Appendix A:

Demographic Questionnaire). Then the facilitator gave control of the screens to the participant, who took control by clicking the mouse; this allowed him/her to

step through a set of self-paced training slides. Participants spent between 5 and 10 minutes reviewing this

presentation of the software. The presentation also included usability test instructions and an

introduction to the Morae recording software (Appendix D: Usability Test Instructions).

One test administrator per participant ran the test; another test administrator attended the testing

sessions remotely, but remained silent. Data logging was achieved automatically by the Morae

recording software. Detailed data analysis and scoring of the recorded protocols was done after the

test procedure by coding the recording with more fine-grained task, action, and error markers.

The usability testing facilitator conducting the test was a senior, experienced usability practitioner (Ph.D. Cognitive Psychology with a Human Factors emphasis), with over 20 years of experience in the

field of User Experience Design and Research, and 1.5 years of experience developing EHRs (see

Appendix E: Experimenter's Short Biography).

For each task, the testing software provided an on-screen description of the task to be performed. See

"Appendix F: Participant Instructions for Each Task" for the complete text of each on-screen task

prompt.

Task timing began when the participant clicked the Start Task button. Task time was stopped when the

participant clicked the End Task button. Participants were instructed to click End Task when they had

successfully completed the task or when they could proceed no further.
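The start/stop convention described above can be expressed as a small reduction over an ordered event log. This is an illustrative sketch only; the event field names ("task", "marker", "time") are assumptions, not Morae's actual export format:

```python
from datetime import datetime

def task_times(events):
    """Compute per-task durations (in seconds) from ordered start/end markers.

    Each event is a dict with illustrative keys: "task" (task id),
    "marker" ("start" for the Start Task click, "end" for End Task),
    and "time" (an ISO-8601 timestamp).
    """
    starts, durations = {}, {}
    for ev in events:
        t = datetime.fromisoformat(ev["time"])
        if ev["marker"] == "start":
            starts[ev["task"]] = t
        elif ev["marker"] == "end" and ev["task"] in starts:
            durations[ev["task"]] = (t - starts.pop(ev["task"])).total_seconds()
    return durations
```

For example, a task started at 10:00:00 and ended at 10:01:25 yields 85.0 seconds, the granularity at which Table 2 reports task times.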

After each task the participant was given a short satisfaction questionnaire related to that task

(Appendix G: End-of-Task Questionnaires). After all tasks were completed, the participant was given a

System Usability Scale (SUS) Questionnaire (Appendix H: System Usability Scale (SUS)

Questionnaire).

Following the test, the administrator provided time for the participant to informally communicate

impressions about the usability of the EHRUT and about the testing process.


Test Locations

Participants were tested in their own facilities through a remote connection. While all participants tried

to create a quiet space for the sake of testing, their busy professions did not always allow for this.

Background noise in these test situations varied from completely quiet, to busy background

conversations. Given the usually hectic context of clinical practices and the use of EMRs, this

represented fair and realistic conditions under which to test the software.

Test Environment

The participants performed the test tasks on their own computer systems. However, the presentation of the information was done through the test administrator's system: an HP EliteBook Folio 9470m laptop, DPI setting normal (96 DPI), highest color quality (32-bit), 1366 x 768 pixel screen resolution, and a 60 Hz screen refresh rate. Audio was recorded through the

computer’s internal microphone. Participants interacted with the remote software through their

keyboard and mouse or touchpad.

Table 6. Test Environment, Technical Data

Connection: WAN. Dial-in into McKesson VPN, connection into a remote login server running the PracticePartner® VER. 11 usability test environment.

PracticePartner® VER. 11 Usability Test Environment: A special usability test environment was created using the generic data configuration used for the PracticePartner environment. The environment simulates the complete EHR, including all functions and backend support software that a real implementation would provide: data integration with external systems allowing for data flow from pharmacy solutions, integration with electronic data transfer to pharmacies, and integration with the InfoButton reference vendor. No other user groups had access to this environment, ensuring a pristine test environment with full control of the configuration of the test data.

It is important to note that while all efforts were made to create a fully functioning test environment, the configurations of menus and menu choices, applications, alerts, etc. were similar to, but not the same as, what any particular provider/customer would use. All of the customized functions differ from site to site, depending on hospital policies and implementation.

Our test environment is configured to show what the system is capable of and therefore presents more alerts, confirmation screens, list items, etc. than many facilities would choose to implement.

Generally, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation; however, testing remotely introduced a lag time. Data from tasks or trials where system response time slowed perceivably were excluded from the analysis. Additionally, participants were instructed not to change any of the default settings (such as font size).

EHR software and version: PracticePartner® Version 11

Testing software: Morae Recorder Version 3.3.2 (participant); Morae Manager Version 3.3.2 (data analysis)

Page 165: ONC HIT Certification Program Test Results Summary for 2014 Edition EHR Certification · Email: Tom.Reinecke@mckesson.com · Phone: 563-585-4773 · Developer/Vendor Contact: Tom Reinecke

Page 16 of 48

EHR: Practice Partner VER. 11, Part III – Administrator – Ambulatory September 2, 2013

Test Forms and Tools

During the usability test, various documents and instruments were used, including:

- Informed Consent
- Demographic Questionnaire
- Experimenter’s Verbal Instructions
- Training slides
- Observer’s Task Sheets
- Post-task surveys (presented through Morae)
- Post-test survey (System Usability Scale, SUS, presented through Morae)

Most of these materials are included in the appendices.

Usability Metrics

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health

Records, EHRs should support a process that provides a high level of usability for all users. The goal is

for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction.

To this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability

testing. The goals of the test were to assess:

- Effectiveness of the EHRUT, by measuring participant success rates and errors.
- Efficiency of the EHRUT, by measuring average task time and path deviations.
- Satisfaction with the EHRUT, by measuring ease-of-use ratings.

Data Scoring

The following table details how tasks were scored, errors evaluated, and the time data analyzed.


Table 7. Data Scoring: Measures & Rationale

Measures Rationale and Scoring

Effectiveness:

Task Success

A task is counted as a “Success” if the participant was able to achieve the correct outcome without assistance directed at the user interface. Direction was occasionally provided to help participants understand the task instructions, but never on how a task was to be accomplished.

A task is counted as a “Failure” if the participant was never able to find the correct action, or ended the task believing that they had accomplished all tasks without having done so.

The total number of successes is calculated for each task and then divided by the total number of times that task was attempted. The results are reported as a percentage.

Task times are recorded for successes. The observed task time divided by the optimal time for each task is a measure of efficiency relative to optimal.

Optimal task performance time, as benchmarked by the 3 users who finished the task without errors or deviations in the shortest amount of time, was calculated by averaging the times of these 3 users. In cases where more than 2 participants failed a task, the optimal task performance time was calculated using the 2 users who finished the task without errors or deviations in the shortest amount of time. No adjustments were made to the optimal time. Average group times were divided by the optimal performance time to measure how close to optimal efficiency each task was performed. We decided not to define time allotments because different adjustments would have to be made for new tasks and for existing task flows, and we could not find a generally accepted empirical or principled way to determine meaningful adjustments. All participants were allowed to finish the tasks, and the time ratios provide a measure of how much longer an average user would take to perform these tasks.

We felt this was a cleaner and less biased measure than defining task allotment times based on arbitrary time adjustments. Using the same adjustment (or none) across a number of studies also allows better comparison across studies than trying to compare rates or ratios that had all been derived with different adjustments.

Effectiveness:

Task Failures and

Errors

If the participant abandons the task, does not reach the correct answer, or performs it incorrectly, the task is counted as a “Failure.” For each task, the percentage of participants who successfully completed it is reported. The failure rate can be calculated by subtracting the percentage of successful completions from 100.

The total number of errors is calculated for each task and then divided by the total number of times that task was attempted. Not all deviations are counted as errors.

Errors are selections of wrong menu items or wrong list items, or entries in a wrong data format. If an error results in a number of corrective steps, only the first action, which produced the error, is counted as an error. All corrective steps are counted as deviations.

On a qualitative level, an enumeration of errors and error types is collected.

Efficiency:

Task Deviations

The participant’s path (i.e., steps) through the application is recorded. Deviations occur if the participant, for example, chooses a path to task completion which requires more steps than the optimal, fills out additional fields, or performs any steps to correct a previous error. This path is compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation.

Efficiency:

Task Time

Each task is timed from when the participant clicks the Start Task button until the participant clicks the End Task button. Only task times for tasks that are successfully completed are included in the average task time analysis. Average time per task is calculated for each task. Variance measures (standard deviation) are also calculated.

Satisfaction:

Task Rating

Each participant’s subjective impression of the ease of use of the application is measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant is asked to rate “Overall, this task was:” on a scale of 1 (Very Difficult) to 5 (Very Easy). If the task included a number of subtasks, a question addressing the ease of each subtask was included (see Appendix G: End-of-Task Questionnaires).

These data are averaged across participants. Common convention is that average ratings for systems judged easy to use should be 3.3 or above.

To measure participants’ confidence in and likeability of the [EHRUT] overall, the testing team administers the System Usability Scale (SUS) post-test questionnaire. Questions included, “I think I would like to use this system frequently,” “I thought the system was easy to use,” and “I would imagine that most people would learn to use this system very quickly.” See the full System Usability Scale questionnaire in "Appendix H: System Usability Scale (SUS) Questionnaire".
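The measures above reduce to a handful of simple computations. The following Python sketch is our own illustration of those formulas; the function names and example numbers are not part of the test protocol, and the SUS scoring follows the standard published method rather than anything specific to this study:

```python
from statistics import mean, stdev

def success_rate(outcomes):
    """Successes divided by attempts, as a percentage."""
    return 100.0 * sum(outcomes) / len(outcomes)

def optimal_time(error_free_times, n_benchmark=3):
    """Benchmark time: average of the n fastest error- and deviation-free
    completions (n=3 here; 2 when more than 2 participants failed)."""
    return mean(sorted(error_free_times)[:n_benchmark])

def deviation_ratio(observed, optimal):
    """Observed steps (or time) divided by the optimal value."""
    return observed / optimal

def task_time_stats(success_times):
    """Mean and standard deviation over successful completions only."""
    return mean(success_times), stdev(success_times)

def sus_score(responses):
    """Standard SUS scoring: ten 1-5 ratings; odd-numbered items
    contribute (rating - 1), even-numbered items (5 - rating);
    the sum is scaled by 2.5 onto a 0-100 range."""
    contributions = [(r - 1) if i % 2 == 0 else (5 - r)
                     for i, r in enumerate(responses)]
    return sum(contributions) * 2.5

# Illustrative numbers (not taken from the study data):
print(success_rate([True] * 9))                   # 100.0
print(deviation_ratio(27, 14))                    # ~1.93, a path deviation ratio
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```

A deviation ratio of 1.0 means the participant matched the optimal path or time exactly; larger values indicate proportionally more steps or more time than the benchmark.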


Results

Data Analysis and Reporting

The results of the usability test were calculated according to the methods specified in the Usability

Metrics section above.

Table 8. Usability Data by Task and Subtask - same as Table 2, repeated here for easier reference

Tasks and subtasks | N | Task Success (%) | Path Deviation (Observed / Optimal) | Task Time: Mean Observed in Seconds (SD) | Optimal Time (Seconds) | Time Deviation (Observed / Optimal) | Errors: Mean (SD) | Task Rating (5=Easy): Mean (SD)
1 Configure drug-drug interaction severity level | 9 | 100 | 1.93 | 85.26 (94.29) | 28.25 | 3.02 | .78 | 4.56 (1.01)
  Navigation | 9 | 100 | 2.08 | 52.93 (80.54) | 14.67 | 3.60 | .56 | 4.44 (0.53)
  Configuration | 9 | 100 | 1.61 | 32.32 (17.63) | 13.58 | 2.38 | .22 | 4.78 (0.44)
2 Change drug-drug interaction severity level | 9 | 100 | 1.07 | 30.53 (15.23) | 18.85 | 1.62 | 0 | 4.89 (0.33)
  Navigation | 9 | 100 | 1 | 12.94 (7.48) | 9.32 | 1.39 | 0 | 4.89 (0.33)
  Configuration | 9 | 100 | 1.22 | 17.6 (8.73) | 9.52 | 1.85 | 0 | 4.89 (0.33)
3 Create and activate demographics-based decision support (Health Maintenance alert) | 9 | 100 | 1.36 | 113.43 (43.66) | 75.12 | 1.5 | .55 | 4.44 (.73)
  Navigation | 9 | 100 | 1.28 | 20.21 (13.23) | 15.38 | 1.31 | .11 | 4.67 (0.5)
  Activation of rule | 9 | 100 | 1.67 | 28.24 (7.34) | 25.57 | 1.11 | 0 | 4.33 (0.71)
  Create specific overdue dates | 9 | 100 | 1.24 | 64.99 (37.3) | 35.17 | 1.85 | .44 | 4.44 (.73)
4 Create and activate complex demographics and drug-based decision support (Health Maintenance alert) | 9 | 22 | NA, see Discussion of the Findings | 12 minutes, 38 seconds | NA | NA | NA | 3.00 (1.41)


The results from the System Usability Scale scored the subjective satisfaction with the system, based on performance with these tasks, at 68. According to Tullis and Albert (2008), raw scores under 60 represent systems with poor usability; scores over 80 would be considered above average.5 The details of our results are summarized in Tables 8 and 9 (identical to Tables 2 and 3). Usability judgments were also collected at the end of each major task. The averages obtained from those task-based usability scores varied between 3.11 and 4.89. The first three tasks all scored over 4.4, with each of their subtasks scoring over 4, well above the recommendation put forward in NISTIR 7742.6 NISTIR 7742 states that “common convention is that average ratings for systems judged easy to use should be 3.3 or above.” The final task presented a real difficulty to all participants; in fact, only two participants completed it successfully. This task was rated 3.11. We give special attention to the analysis of this task, which asked the participants not only to implement a complex health maintenance rule but also to write a file with conditional logic, something that appeared to lie outside what any of the participants saw as their normal job duties. The struggle with this final task noticeably influenced how the participants felt about the system and may have had an impact on their final assessment, as expressed in the SUS score.

Various additional recommended metrics, in accordance with the examples set forth in the NIST Guide

to the Processes Approach for Improving the Usability of Electronic Health Records, were used to

evaluate the usability of the EHRUT. Following is a summary of the performance and rating data

collected on the EHRUT.

In the following table, Table 9, the usability metrics are reported grouped by MU criterion. Please note that the MU criteria related to clinician tasks were tested with two other user groups (nurses and physicians). Those results are summarized in 2 separate reports.

Table 9. Usability Data by Meaningful Use Criteria – same as Table 3, repeated here for easier reference

Meaningful Use Criteria and Tasks | N | Task Success (%) | Path Deviation (Observed / Optimal) | Task Time: Mean Observed in Seconds (SD) | Time Deviation (Observed / Optimal) | Errors: Mean (SD) | Task Rating (5=Easy): Mean (SD)

170.314(a)(2) – Drug-Drug and Drug-Allergy Interaction Checks
Configure drug-drug interaction severity level | 9 | 100 | 1.93 | 85.26 (94.29) | 3.02 | .78 | 4.56 (1.01)
Change drug-drug interaction severity level | 9 | 100 | 1.07 | 30.53 (15.23) | 1.62 | 0 | 4.89 (0.33)

170.314(a)(8) – Clinical decision support
Create and activate simple demographics-based decision support (Health Maintenance alert) | 9 | 100 | 1.36 | 113.43 (43.66) | 1.5 | .55 | 4.44 (.73)
Create and activate complex demographics and drug-based decision support (Health Maintenance alert) | 9 | 22 | NA, see Discussion of the Findings | 12 minutes, 38 seconds | NA | NA | 3.00 (1.41)

5 Tullis, T. & Albert, W. (2008). Measuring the User Experience. Burlington, MA: Morgan Kaufmann (p. 149).

6 Schumacher, Robert M., and Lowry, Svetlana Z. (2010). Customized Common Industry Format Template for Electronic Health Record Usability Testing. National Institute of Standards and Technology, NISTIR 7742.


Discussion of the Findings

This section presents the study findings, organized by Meaningful Use Criteria.

170.314(a)(2) – Drug-drug, drug allergy interaction checks

Test contexts

Drug-drug, drug-allergy interaction configurations were tested twice. In Task 1 the participants were

asked to turn the configurations on and set the alerts to the most sensitive level. Task 2 tests the

usability of changing drug-allergy alerts to appear only for the highest level of drug-drug interactions.

Task 1 – Configure drug-drug interaction severity level

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(2) – Drug-drug, drug-allergy interaction checks

Task Description

This task asked participants to configure drug-drug interaction alerts such that all alerts will show, even

alerts of the least severity.

Risk Analysis

High. Configuring alerts for drug-drug interactions based on severity allows clinicians to adjust prescribed medications based on identified issues. Depending upon the severity level for which the alert is configured, too many alerts may appear for the clinician, resulting in alert fatigue. On the other hand, if the alerts don’t show important interactions, they are ineffective. Depending on the situation, administrators should be able to configure this setting without error and without confusing the levels.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.

Exclusions

None


Quantitative Summary

Table 10. Usability Data for Task 1 – Configure drug-drug interaction

Tasks and subtasks | N | Task Success (%) | Path Deviation (Observed / Optimal) | Task Time: Mean Observed in Seconds (SD) | Optimal Time (Seconds) | Time Deviation (Observed / Optimal) | Errors: Mean (SD) | Task Rating (5=Easy): Mean (SD)
1 Configure drug-drug interaction severity level | 9 | 100 | 1.93 | 85.26 (94.29) | 28.25 | 3.02 | .78 | 4.56 (1.01)
  Navigation | 9 | 100 | 2.08 | 52.93 (80.54) | 14.67 | 3.60 | .56 | 4.44 (0.53)
  Configuration | 9 | 100 | 1.61 | 32.32 (17.63) | 13.58 | 2.38 | .22 | 4.78 (0.44)

Participant Errors

Table 11. Participant Errors for Task 1 – Configure drug-drug interaction

Participant | Error Severity | Error Description
3 | Minor | Navigation error: participant explores menu options.
6 | Minor | Navigation error: participant explores the wrong menu.
3 | Minor | Navigation error: participant explores the wrong tab in the correct screen.
8 | Medium | Selected the wrong severity level (the second to lowest). This participant never opened the drop-down and seemed to assume that ‘1’ was the lowest level (in fact, ‘0’ is the lowest level). This would still display most interaction warnings, but it is not the lowest level, so in a situation where a clinic wanted to display everything, this presents a medium amount of risk.
9 | Minor | Navigation error. This participant had never used this function before and explored many menus, resorting to the help files to eventually find the correct menu, screen, and controls.
6 | Medium | This participant selected another control and set it to high. This increases the amount of documentation shown, something we didn’t ask for. When asked later, this participant did not understand this function. Changing this function did not appear to create a safety risk; however, ‘superstitious’ use of controls can always create unwanted states and therefore carries some amount of risk.
9 | Minor | Navigation error. Another sequence of exploring menus by the participant unfamiliar with this function.


Effectiveness

All participants performed this task successfully. All except one participant seemed very familiar with this task. The variances in steps (deviations) and task times were mostly due to this participant’s long navigation sequences. Two other participants also briefly accessed the wrong menus or tabs but found the functions quite quickly after that.

Efficiency

Eight of the nine participants quickly found the prescription screen and made the necessary adjustments. One participant struggled to find the correct menu; however, she used the provided online help files to eventually locate the controls she was looking for. It is mostly due to this last participant’s long exploration sequence that the navigation step deviation ratio was 2.08 and the time deviation was 3.02. While every participant set the severity level to a low level, one participant kept the second to lowest (a medium-severity mistake), and eight participants also set a second control to the lowest level, which was not necessary. This second drop-down list controls the level of documentation to display. When asked after the end of the study why they set this control, participants seemed confused about its actual function; they appeared to want to ‘make sure’ that the alert really displayed.

Comments

Comments are listed only if a participant provided a written comment.

Table 12. Participant Comments for Task 1 – Configure drug-drug interaction

Participant | Do you have any additional comments?
5 | Would be helpful if the description for the levels in the "Rx Defaults" tab is more descriptive (eg- 0- Lowest- All interactions- minor & major, ......)
6 | Instead of a scale of 1 - 5 how about low severety, medium severity and high severity as drop downs (or similar)

Satisfaction

Participants perceived this familiar task as easy (4.56).

Major Findings

Users familiar with the application had no difficulty finding the function in the menu structure. For a user unfamiliar with this function, however, the menu labels provided little guidance. The help file eventually revealed the navigation path, but the participant had to search for the path even there. It might be good to review the help text and make the pointers to the navigation path more obvious. The task revealed two more significant issues. As expressed in the participant comments and some of the observations, participants were slightly confused by the severity levels expressed in the drop-down controls. Most importantly, a second control on the same screen (level of documentation to display) is poorly understood and poorly labeled.


Task 2 – Change drug-allergy warning interventions

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(2) – Drug-drug, drug-allergy interaction checks

Task Description

This task asked participants to change the configuration for displaying drug-allergy alerts based on

severity of the allergy.

Risk Analysis

High. Configuring alerts for drug-allergy interactions based on allergy severity allows clinicians to

adjust prescribed medications based on identified issues. Depending upon the severity level for which

the alert is configured, too many alerts may appear for the clinician, resulting in alert fatigue.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.

Exclusions

None

Quantitative Summary

Table 13. Usability Data for Task 2 – Configure drug-allergy warning intervention

Tasks and subtasks | N | Task Success (%) | Path Deviation (Observed / Optimal) | Task Time: Mean Observed in Seconds (SD) | Optimal Time (Seconds) | Time Deviation (Observed / Optimal) | Errors: Mean (SD) | Task Rating (5=Easy): Mean (SD)
2 Change drug-drug interaction severity level | 9 | 100 | 1.07 | 30.53 (15.23) | 18.85 | 1.62 | 0 | 4.89 (0.33)
  Navigation | 9 | 100 | 1 | 12.94 (7.48) | 9.32 | 1.39 | 0 | 4.89 (0.33)
  Configuration | 9 | 100 | 1.22 | 17.6 (8.73) | 9.52 | 1.85 | 0 | 4.89 (0.33)

Errors

Table 14. Participant Errors for Task 2 – Configure drug-allergy warning intervention

Participant | Error Severity | Error Description
– | NA | Participants made no errors during this task.


Effectiveness

All participants performed this task successfully. After the navigation path had been discovered during Task 1, no navigation errors were committed. This task also did not lead to any errors in changing the severity to the desired level, nor to any unnecessary additional changes to the documentation control.

Efficiency

This task was performed with high efficiency. The overall step deviation ratio was 1.07, and the overall

time deviation was 1.62. After learning where to find the function (or refreshing their memory on where

to find it), all participants found the function quickly and performed the task with high efficiency.

Comments

Comments are listed only if a participant provided a written comment.

Table 15. Participant Comments for Task 2 – Configure drug-allergy warning intervention

Participant | Do you have any additional comments?
5 | Better descriptions for the levels on the "Rx Defaults- Drug Interactions tab" (eg- 3-Highest- Only Major)

Satisfaction

The satisfaction ratings were high, 4.89 across the whole task.

Major Findings

This task was performed without problems. However, the observations from Task 1 apply here as well. What this second task showed is that users have no problem changing the severity settings; however, they still indicated that slightly more direct labels for the controls would be helpful.

170.314(a)(8) – Clinical decision support

Test contexts

Creation of clinical decision support interventions was tested in 2 tasks. Task 3 tests the usability of creating an alert based on a patient's demographic data and asks participants to configure a specific set of overdue alerts. Task 4 tests the configuration of an alert that combines demographic- and medication-based criteria. This task asked users not only to manipulate forms in the Practice Partner application, but also to create and load external rule files.


Task 3 – Create and activate demographics-based decision support (Health

Maintenance alert)

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(8) – Clinical decision support

Task Description

This task asked participants to create an alert based on a complex demographic (female and between

11 and 18 years of age). It then asked participants to create alerts at ages 12, 14, 16, 18 years.

Risk Analysis

High. Creating alerts can help clinicians to recognize potential issues for patients and allow them to

address those issues during an encounter. However, alerts must be used with care as too many can

lead to alert fatigue, resulting in clinicians missing important alerts due to the possibility of false alarms.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.

Exclusions

None

Quantitative Summary

Table 16. Usability Data for Task 3 – Create and activate alert

Tasks and subtasks | N | Task Success (%) | Path Deviation (Observed / Optimal) | Task Time: Mean Observed in Seconds (SD) | Optimal Time (Seconds) | Time Deviation (Observed / Optimal) | Errors: Mean (SD) | Task Rating (5=Easy): Mean (SD)
5 Create and activate demographics-based decision support (Health Maintenance alert) | 9 | 100 | 1.36 | 113.43 (43.66) | 75.12 | 1.5 | .55 | 4.44 (.73)
  Navigation | 9 | 100 | 1.28 | 20.21 (13.23) | 15.38 | 1.31 | .11 | 4.67 (0.5)
  Activation of rule | 9 | 100 | 1.67 | 28.24 (7.34) | 25.57 | 1.11 | 0 | 4.33 (0.71)
  Create specific overdue dates | 9 | 100 | 1.24 | 64.99 (37.3) | 35.17 | 1.85 | .44 | 4.44 (.73)


Errors

Table 17. Participant Errors for Task 3 – Create and activate alert

Participant | Error Severity | Error Description
3 | Minor | Initially wrong use of the special format of the age/month field; figures out how to use it.
7 | Minor | Navigation error. Picked the wrong menu and went down the wrong path, but quickly corrected and found the desired function.
7 | Minor | Initially wrong use of the special format of the age/month field; figures out how to use it.
9 | Minor | Wrong use of the special format of the age/month field.
9 | Minor | Second wrong use of the special format. Used another rule as an example for how to use it and corrected her mistake.

Effectiveness

All participants performed this task successfully. One participant committed a minor navigation error that was quickly corrected. A specially formatted field that asks participants to enter age as year/month led three participants to make minor mistakes with this format. The system feedback helped them figure out what was needed, and one participant looked at other rules to see how this field had been filled out. This situation was handled well by the user interface: the system did not accept wrongly formatted dates and gently guided participants to enter the correct information.

Efficiency

Participants performed this task quite efficiently. The overall step deviation ratio was 1.36, and the overall time deviation ratio was 1.5. The slight struggle with the special age format led to a somewhat longer time deviation of 1.85, but even this subtask was completed quite quickly.

Participant Comments

Comments are listed only if a participant provided a written comment.

Table 18. Participant Comments for Task 3 – Create and activate alert

Participant | Do you have any additional comments?
5 | None …
7 | Didn’t test it…


Satisfaction

Although participants performed this task without major errors or inefficiencies, the overall satisfaction rating for this task was a little lower than for Tasks 1 and 2 (4.44). This may have been because participants perceived this as a cognitively more challenging task (it had more sub-goals). A few participants also mentioned that they would usually check to make sure the rule really worked. We did not ask them to do this in our setting, but not being able to may have left them unsure whether they had done the task correctly. This is an issue of providing sufficient feedback after the task has been completed. We come back to this issue when discussing Task 4.

Major Findings

In general, participants had little difficulty with this task; the user interface provided good guidance when it came to entering the correct format for the age. The only possible issue is providing better feedback about whether a new rule had the desired effect.

Task 4 – Create and activate complex decision support

Analysis Notes

Linked Meaningful Use Criteria

170.314(a)(8) – Clinical decision support

Task Description

This task asked participants to create a complex rule using demographic and medication information. The task was complicated by a number of things:

1. The system comes with a number of pre-configured complex rules, and creating new ones does not happen very often. Our participants told us that they may have had to modify an existing rule, but never had to create one from scratch.
2. To create a new rule, a conditional rule file must be written using a system-specific syntax and elements. The participants were not familiar with writing these statements.
3. The rule file then has to be stored in a specific location.
4. The rule file has to be added to a health maintenance trigger that is defined by system controls. In our case the demographic rule was specified in the rule file, but the medication rule was specified through the user interface.
5. The rule file then has to be added to the medication trigger to result in the combination of the two criteria.
6. After creation, the alert should be tested to make sure that it works as desired.

We provided training on an example problem that was analogous to the problem in the usability test.

The only difference between the two problems was the name of the medication for which to use the rule

and the specifics of the demographic information to use as a trigger. Participants were allowed to use

the logic statement of the example during the usability test to help them figure out the correct syntax,

because that is how non-programmers often approach a programming problem (use working examples

to modify).
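Practice Partner’s actual rule-file syntax is proprietary and is not reproduced here. Purely as an illustration of the kind of conditional logic Task 4 required (a demographic condition combined with a medication trigger), a generic sketch might look like the following; the field names, the drug name, and the age range are hypothetical examples of our own, not the study’s rule file:

```python
from dataclasses import dataclass, field

@dataclass
class Patient:
    sex: str                      # "F" or "M"
    age_years: int
    medications: list = field(default_factory=list)

def health_maintenance_rule(patient, drug_name):
    """Hypothetical combined rule: fire the alert only when the
    demographic condition AND the medication trigger both hold,
    mirroring how Task 4 joined a rule-file condition with a
    medication trigger set through the user interface."""
    demographic_ok = patient.sex == "F" and 11 <= patient.age_years <= 18
    medication_ok = drug_name in patient.medications
    return demographic_ok and medication_ok

p = Patient(sex="F", age_years=14, medications=["drugX"])
print(health_maintenance_rule(p, "drugX"))  # True: both conditions hold
print(health_maintenance_rule(p, "drugY"))  # False: medication trigger fails
```

The point of the sketch is step 6 of the task description: logic like this is easy to get backwards (e.g., confusing greater-than with less-than), which is why testing the alert after creation matters.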


Despite these supports, only 2 participants (out of 9) successfully created a rule. Since most of the time was spent writing the logic rule and trying different ways to load it into the existing rule, we did not attempt a keystroke-level analysis of this task. We discuss the findings in narrative form and suggest solutions to some of the discovered problems.

Risk Analysis

High. Correctly targeted alerts help clinicians who need the information receive it while treating patients, while also keeping the information from clinicians who do not need it, thereby reducing the possibility of alert fatigue.

Scoring Notes

All participants followed instructions as expected. There were no difficulties resulting in exclusions.

Exclusions

None

Quantitative Summary

Table 19. Usability Data for Task 4 – Create complex alert rule

Task: 5. Create and activate complex demographics- and drug-based decision support (Health Maintenance alert)

N: 9

Task Success: 22% (2 of 9 participants)

Path Deviation (Observed / Optimal): NA; see discussion below in the results section

Task Time, Mean Observed (SD): 12 minutes, 38 seconds

Task Time, Optimal and Deviations (Observed / Optimal): NA; see discussion below in the results section

Errors, Mean (SD): NA; see discussion below in the results section

Task Rating (5 = Easy), Mean (SD): 3.00 (0.78)


Error and Failure Analysis

Table 20. Participant Errors for Task 4 – Causes for overall task failures

Rule file logic (High Severity): 3 participants failed. They confused larger than with smaller than, or male with female.

Rule file syntax (High Severity): 5 participants failed. They did not obey spacing rules or misspelled rule elements.

Rule file location (High Severity): 0 participants failed.

Choosing the correct template to attach the rule to (High Severity): 7 participants struggled; they started with the wrong template (demographic) but eventually figured out they had to start with a medication template. 2 participants failed after starting with the wrong type of template.

Attaching the rule in the correct manner (High Severity): 6 participants failed. They entered the rule in the wrong field, applied the wrong path name, or made additional changes to the template logic.

Participant Comments

Comments are listed only if a participant provided a written comment.

Table 21. Participant Comments for Task 4 – Create complex alert rule

Participant 1: Conditional Logic Threads are difficult for me.

Participant 5: It would be great to have a "Browse" next to the Rule file line on the HM procedure setup screen, so one does not have to type in the name of the file and can browse and attach it. This will minimize errors and save time.

Participant 7: Need ability to browse to pick the rule file.

Participant 6: Did not know I needed to create a rule file. The main screen was not clear on whether or not I could do what I wanted given the out-of-the-box controls.

Effectiveness

Only 2 of our 9 participants succeeded in this task, and even these two struggled for a long time with wrong starts before eventually arriving at the correct logic and syntax, selecting the correct template to attach the rule to, and attaching it in the correct manner.

Efficiency

Even the two participants who succeeded struggled tremendously with this task. They needed an average of 12.5 minutes to finish and exhibited long sequences of navigating back and forth between the different task components, the example files, the folder containing the rule file (to help remember its name), and the template names, templates, and procedure forms to help them figure out the task. There were effectively too many task steps to count in any meaningful way.

Satisfaction

The high degree of difficulty is expressed in low satisfaction scores (average of 3.0).

Major Findings

The experimenters were surprised by how much the participants struggled with this task, given that a problem with a very similar structure had been reviewed during the training session, and participants had even been asked to save that information and use it as a 'crutch' during this task.

1. Administrators had an extremely difficult time writing conditional logic statements.

a. They failed to express the correct logic

b. They failed to use the correct syntax

2. It was difficult to find the correct ways to attach the rule file into the templates:

a. It was difficult to understand what part of the complex rule (demographics vs. medication) to

include in the rule, and what ‘trigger’ template to connect this rule to

b. Even when the correct template and procedure had been selected, attaching the rule was

difficult

i. because the correct field was 'greyed out', making it look like it could not be edited (2 participants selected another field that was white and appeared much more 'editable');

ii. because participants expected to enter the whole path name, when only the file name was needed. The error message did not explain how to fix the problem.

3. Experimental constraints. Several participants mentioned that they would ideally ‘test’ the rule by

creating a test patient and making sure that it appeared in certain test situation that they would

mock up. This was outside the realm of the usability study but may have led to a higher success

rate eventually.
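One possible remedy for finding 2.b.ii above is for the attachment field to accept either a bare rule-file name or a full path, and to fail with an actionable message. A minimal sketch, assuming a hypothetical helper and rule-file directory (these names are not Practice Partner's actual API):

```python
# Hedged sketch of a fix for finding 2.b.ii: normalize whatever the user
# types into the bare file name the trigger expects. The helper name and
# RULE_DIR value are illustrative assumptions, not the system's real API.
import ntpath  # Windows-style path semantics regardless of host platform

RULE_DIR = r"C:\ppart\hmrules"  # assumed, site-specific location

def normalize_rule_reference(user_input: str) -> str:
    """Accept a bare rule-file name or a full path; return the bare name."""
    name = ntpath.basename(user_input.strip())
    if not name:
        raise ValueError(
            "Enter the rule file name (for example 'bhcg.txt'); the file "
            "must already be saved in " + RULE_DIR + ".")
    return name

print(normalize_rule_reference(r"C:\ppart\hmrules\bhcg.txt"))  # bhcg.txt
print(normalize_rule_reference("bhcg.txt"))                    # bhcg.txt
```

With this kind of normalization, the 6 participants who applied the wrong path name would have succeeded rather than failed.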


List of Appendices

The following appendices include supplemental data for this usability test report.

Appendix A: Demographic Questionnaire

Appendix B: Administrator Script and General Instructions

Appendix C: Informed Consent Form

Appendix D: Usability Test Instructions

Appendix E: Experimenter's Short Biography

Appendix F: Participant Instructions for Each Task

Appendix G: End-of-Task Questionnaires

Appendix H: System Usability Scale (SUS) Questionnaire


Appendix A: Demographic Questionnaire


Appendix B: Administrator Script and General Instructions

Before participant arrives:

o Start remote desktop application and log into test environment

o Wait to log into the applications until the participant is ready (they would otherwise time out)

o Start Morae Recorder, but don’t click the red button yet

o Open the training presentation; leave in presentation mode

When participant arrives:

o Welcome Participant

o Make sure participant has filled out Informed Consent sheet, make sure address is legible.

o Make sure participant has filled out Demographics sheet

o Get Participant started on training presentation

Verbal Instructions:

- “Thank you for participating in this usability study today. During this study you will be doing some tasks that are very familiar to you. But you will also perform some new tasks. To familiarize yourself with these new functions, please walk through this short PowerPoint.

- You will notice that our system is similar to the one you are using, but it is not the same.

Some of the configurations will be slightly different from what you are used to, simply

because this is a somewhat limited test system.

- The PowerPoint will also tell you more about participating in a usability study, and show

you how to interact with the recording software.

- You can always ask me questions during these training slides.”

When participant is done with training presentation:

o Start Morae (press red button)

o Log into applications

Verbal Instructions:

- Please click anywhere on your screen to take control of the system.

- Remember that this system may be configured slightly differently from what you are used to.


- You start by clicking this button here. You can also move this window around, so that

you can see the system better. If you like, you can use these printed instructions and

these Navigation Instructions to follow along.

- When you have started, I won’t be able to answer any more questions. Do you have any questions before you start?”


Appendix C: Informed Consent Form


Appendix D: Usability Test Instructions

Introduction to the Study

• Thank you for participating in this usability study.

• Your participation is very valuable and critical for certification.

• Usability testing is required by ARRA 2; vendors cannot certify their EHRs unless they have been evaluated by our end users

• Your feedback will help us improve the usability of Horizon Ambulatory Care and other EMR functions.

• During the study, you will be performing a few routine tasks using HAC, most of which will be very

familiar to you.

• This presentation provides a review of the routine tasks.

• The following screens present an overview of features in self-guided presentation mode.

• Please take 10 minutes to familiarize yourself with these new functions.

• You can use the page up and down functions to navigate through the presentation.

• If you want to see a part again, you can always back up to previous screens by using the Page Up keys

on your keyboard.

Functionality Review

Study Instructions

• Thank you for participating in this study. Your input is very important.

• Our session today will last about 30 minutes. During that time you will use HAC and Clinicals

Configuration Tool to perform tasks.

Objective of the Study

• The purpose of the study is to uncover:

o Areas where the application performs well; that is, effectively, efficiently, and with satisfaction

o Areas where the application does not fully meet your needs.

• Please note that we are not testing you; we are testing the application. If you have difficulty, this means only that something needs to be improved in the application.

• The study administrator will be here in case you need specific help with the procedures, but the

administrator will not provide help in how to use the application.

Starting the Study

• The system will show an instruction window. The window displays over the EHR screens you will be

using to complete the task.


• Click the Start button to start the study.

• Within the instruction window, the system provides instructions for completing the first task.

• After reading the on-screen instructions, click the Start Task button.

• The instruction window will collapse vertically so that it will not obscure the application.

• You can click the Show Instructions button in the instruction window at any time to expand the

window and read the instructions again.

Performing the Task

• Using the EHR screens, begin performing the task. Please try to complete the tasks on your own

following the on-screen instructions very closely.

• You can also move and reposition the instruction window.

• If you place the mouse cursor focus in the instruction window, reposition the mouse cursor back in the

application window before continuing the task.

Indicating Task Completion

• When you are satisfied that you have completed the task, or when you have completed as much as you

can, click the End Task button.

• The system will open a short survey with questions about the task you have just completed.

• After answering the Participant Survey questions for the task, click the Done button.

• The system will present instructions for the next task.

• Between some tasks, the study administrator may adjust the application to start the next task at the

right location.

Completing the Study

• After you complete the Participant Task Evaluation for the final task, the system will open a System

Usability Scale Questionnaire to enable you to record your overall impressions of the application


including all the tasks you performed. Note that you will need to scroll vertically to view and answer all

of the questions.

• After completing the questionnaire, click the Done button to complete the study.

• The instruction window shows a message indicating that the study is complete. Tell the study

administrator that you have completed the study.

• The administrator will click the OK button and save your work.


Appendix E: Experimenter's Short Biography

Marita Franzke received her Ph.D. in Cognitive Psychology after completing a dissertation on Exploration-based Learning of Display-Based User Interfaces at the University of Colorado. She has worked in the field of

Human Computer Interaction since 1991, conducting research on usability and adoption of new end user

technologies in the fields of telecommunication (U S WEST Advanced Technologies, Media One Labs, and

AT&T Broadband Labs), Educational Software (Pearson Knowledge Technologies) and EMR Technologies

(McKesson Provider Technologies). She has 21 years of experience in the field of user research and one year

experience in the field of EMR usability.

Gergana Pavlova joined McKesson Practice Partner over 5 years ago as a technical writer. A year ago she

joined the Practice Partner Product Management team to help with the functional design requirements and

user interface designs of the Meaningful Use Stage 2 Incentive Program capabilities. She has an MS in

English and certificates in User-Centered Design in Business Analysis from the University of Washington.


Appendix F: Participant Instructions for Each Task

Introduction

You are the administrator for the McKesson Test Clinic.

In this session you will be configuring the drug-drug interaction settings and a few Health

Maintenance alerts for this clinic.

If you are ready to proceed, please click 'Start'.

Task 1

Task Instructions:

Adjust the severity for drug-drug interaction checks.

Select the lowest setting to display all interventions (both high significance and

low significance interventions).

Task 2

Task Instructions:

Your practice has changed the policy.

Adjust the drug severity to allow only the highest significance interactions to display.

Task 3

Now activate a new Health Maintenance Procedure:

Add “Depression Screen” to the Health Maintenance procedures applicable to 11-18 year old girls, then activate overdue checking for that procedure at ages 12, 14, 16, and 18.


Task 4

Task Instructions:

Add a Health Maintenance procedure “Beta HCG Test” (B-HCG) for female patients who are 55 years old and younger and who are taking ACCUTANE.

Beta HCG testing should be done every month.


Appendix G: End-of-Task Questionnaires

1 - Configure Drug Interaction

1. Finding the Prescription Default screen was ...

very difficult 1 2 3 4 5 very easy

2. Allowing all interactions to display was ...

very difficult 1 2 3 4 5 very easy

3. Overall this task was ...

very difficult 1 2 3 4 5 very easy

4. Do you have any additional comments?

Comments:

2 - Adjust Drug Interaction

1. Finding the Prescription Defaults was ...

very difficult 1 2 3 4 5 very easy

2. Adjusting the severity setting to high was ...

very difficult 1 2 3 4 5 very easy

3. Overall this task was ...

very difficult 1 2 3 4 5 very easy

4. Do you have any additional comments about this task?

Comments:


3 – Create/Activate HM Alert

1. Finding the correct screen to accomplish this task was ...

very difficult 1 2 3 4 5 very easy

2. Activating this Health Maintenance alert was ...

very difficult 1 2 3 4 5 very easy

3. Creating the overdue alert was ...

very difficult 1 2 3 4 5 very easy

4. Overall this task was ...

very difficult 1 2 3 4 5 very easy

5. Do you have any additional comments?

Comments:

4 - Create/Activate HM Alert (Complex Alert including Creating of Conditional Rule)

1. Creating the text file with the rule was ...

very difficult 1 2 3 4 5 very easy

2. Saving the rule file in the correct location was ...

very difficult 1 2 3 4 5 very easy

3. Finding the screen to activate the Health Maintenance report was ...

very difficult 1 2 3 4 5 very easy

4. Activating the Health Maintenance Alert was ...

very difficult 1 2 3 4 5 very easy


5. Overall, this task was ...

very difficult 1 2 3 4 5 very easy

6. Do you have any additional comments about this task?

Comments:


Appendix H: System Usability Scale (SUS) Questionnaire

This questionnaire was presented to the participant after the completion of all tasks and task-based

questionnaires.

Thank you for participating in this study. The following questions address your experience with all the tasks in this study.

1. I think that I would like to use this system frequently

Strongly Disagree 1 2 3 4 5 Strongly Agree

2. I found the system unnecessarily complex

Strongly Disagree 1 2 3 4 5 Strongly Agree

3. I thought that the system was easy to use

Strongly Disagree 1 2 3 4 5 Strongly Agree

4. I think that I would need the support of a technical person to be able to use this system

Strongly Disagree 1 2 3 4 5 Strongly Agree

5. I found the various functions in this system were well integrated

Strongly Disagree 1 2 3 4 5 Strongly Agree

6. I thought there was too much inconsistency in this system

Strongly Disagree 1 2 3 4 5 Strongly Agree

7. I would imagine that most people would learn to use this system very quickly

Strongly Disagree 1 2 3 4 5 Strongly Agree

8. I found the system very cumbersome to use

Strongly Disagree 1 2 3 4 5 Strongly Agree

9. I felt very confident using the system

Strongly Disagree 1 2 3 4 5 Strongly Agree

10. I needed to learn a lot of things before I could get going with this system

Strongly Disagree 1 2 3 4 5 Strongly Agree
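Responses to the ten items above are conventionally converted to a single 0-100 score using the standard SUS rule: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5. A minimal sketch:

```python
# Standard System Usability Scale (SUS) scoring: odd items score
# (response - 1), even items score (5 - response); total * 2.5 yields
# a score on a 0-100 scale.

def sus_score(responses: list[int]) -> float:
    """responses: ten 1-5 ratings in questionnaire order (items 1-10)."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # i=0 is item 1 (odd)
                for i, r in enumerate(responses))
    return total * 2.5

# A participant answering 3 (neutral) on every item scores 50.
print(sus_score([3] * 10))  # 50.0
```

Note that a SUS score is not a percentage; 50 corresponds to uniformly neutral responses, not to "half satisfied."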
