
Test Results Summary for 2014 Edition EHR Certification 13-2397-R-0027-PRA 1.1 February 23, 2016

Page 1 of 13 ©InfoGard 2016. May be reproduced only in its original entirety, without revision

ONC HIT Certification Program Test Results Summary for 2014 Edition EHR Certification

Part 1: Product and Developer Information

1.1 Certified Product Information

Product Name: My Vision Express

Product Version: 2014

Domain: Ambulatory

Test Type: Complete

1.2 Developer/Vendor Information

Developer/Vendor Name: Insight Software, LLC

Address: 3050 Universal Blvd, Suite 120 Weston FL 33331

Website: www.MyVisionExpress.com

Email: [email protected]

Phone: (877) 882-7456

Developer/Vendor Contact: Eduardo Martinez

Part 2: ONC-Authorized Certification Body Information

2.1 ONC-Authorized Certification Body Information

ONC-ACB Name: InfoGard Laboratories, Inc.

Address: 709 Fiero Lane Suite 25 San Luis Obispo, CA 93401

Website: www.infogard.com

Email: [email protected]

Phone: (805) 783-0810

ONC-ACB Contact: Adam Hardcastle

This test results summary is approved for public release by the following ONC-Authorized Certification Body Representative:

Adam Hardcastle

EHR Certification Body Manager ONC-ACB Authorized Representative Function/Title

2/23/16

Signature and Date


2.2 Gap Certification
The following identifies the criterion or criteria certified via gap certification:

§170.314 (a)(1) (a)(17) (d)(5) (d)(9)

(a)(6) (b)(5)* (d)(6) (f)(1)

(a)(7) (d)(1) (d)(8)

*Gap certification allowed for Inpatient setting only

No gap certification

2.3 Inherited Certification
The following identifies the criterion or criteria certified via inherited certification:

§170.314 (a)(1) (a)(14) (c)(3) (f)(1) (a)(2) (a)(15) (d)(1) (f)(2) (a)(3) (a)(16) Inpt. only (d)(2) (f)(3) (a)(4) (a)(17) Inpt. only (d)(3) (f)(4) Inpt. only (a)(5) (b)(1) (d)(4)

(f)(5) Optional & Amb. only (a)(6) (b)(2) (d)(5)

(a)(7) (b)(3) (d)(6) (f)(6) Optional &

Amb. only (a)(8) (b)(4) (d)(7) (a)(9) (b)(5) (d)(8) (g)(1) (a)(10) (b)(6) Inpt. only (d)(9) Optional (g)(2) (a)(11) (b)(7) (e)(1) (g)(3) (a)(12) (c)(1) (e)(2) Amb. only (g)(4)

(a)(13) (c)(2) (e)(3) Amb. only

No inherited certification


Part 3: NVLAP-Accredited Testing Laboratory Information

Report Number: 13-2397-R-0027 V1.2
Test Date(s): Aug. 15-16, 19-22, 26-27, Sept. 10, 16-17, 21-22, 25-26, 2013

3.1 NVLAP-Accredited Testing Laboratory Information

ATL Name: InfoGard Laboratories, Inc.

Accreditation Number: NVLAP Lab Code 100432-0

Address: 709 Fiero Lane Suite 25, San Luis Obispo, CA 93401

Website: www.infogard.com

Email: [email protected]

Phone: (805) 783-0810

ATL Contact: Milton Padilla

For more information on scope of accreditation, please reference http://ts.nist.gov/Standards/scopes/1004320.htm

Part 3 of this test results summary is approved for public release by the following Accredited Testing Laboratory Representative:

Milton Padilla

EHR Test Body Manager ATL Authorized Representative Function/Title

2/23/16

Signature and Date

3.2 Test Information

3.2.1 Additional Software Relied Upon for Certification

Additional Software: DrFirst for Electronic Prescribing

Applicable Criteria: §170.314(b)(3)

Functionality provided by Additional Software: Medication Management and eRx functions

No additional software required

3.2.2 Test Tools

Test Tool Version


Cypress 2.2.0
ePrescribing Validation Tool 1.0.2
HL7 CDA Cancer Registry Reporting Validation Tool
HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool
HL7 v2 Immunization Information System (IIS) Reporting Validation Tool 1.6.0
HL7 v2 Laboratory Results Interface (LRI) Validation Tool 1.6.0
HL7 v2 Syndromic Surveillance Reporting Validation Tool 1.6.0
Transport Testing Tool 168
Direct Certificate Discovery Tool 2.1

No test tools required

3.2.3 Test Data

Alteration (customization) to the test data was necessary and is described in Appendix [insert appendix letter]

No alteration (customization) to the test data was necessary

3.2.4 Standards

3.2.4.1 Multiple Standards Permitted
The following identifies the standard(s) that have been successfully tested where more than one standard is permitted:

Criterion # Standard Successfully Tested

(a)(8)(ii)(A)(2)

§170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain

§170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(13)

§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

§170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree


Criterion # Standard Successfully Tested

(a)(15)(i)

§170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain

§170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(16)(ii)

§170.210(g) Network Time Protocol Version 3 (RFC 1305)

§170.210(g) Network Time Protocol Version 4 (RFC 5905)

(b)(2)(i)(A)

§170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions

§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(b)(7)(i)

§170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions

§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(e)(1)(i) Annex A of the FIPS Publication 140-2 • 128-bit AES • SHA-1

(e)(1)(ii)(A)(2)

§170.210(g) Network Time Protocol Version 3 (RFC 1305)

§170.210(g) Network Time Protocol Version 4 (RFC 5905)

(e)(3)(ii) Annex A of the FIPS Publication 140-2 • 128-bit AES • SHA-1

Common MU Data Set (15)

§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

§170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT-4)

None of the criteria and corresponding standards listed above are applicable
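The (e)(1)(i) and (e)(3)(ii) rows above reference Annex A of FIPS Publication 140-2 with 128-bit AES and SHA-1. As a rough illustration only, and not the product's certified cryptographic module, the following Python sketch shows how a record might be encrypted with AES-128 and hashed with SHA-1 using the third-party cryptography package; the function name and key handling are assumptions made for the example.

# Minimal sketch (illustrative only): 128-bit AES encryption plus a SHA-1 digest,
# the two FIPS 140-2 Annex A algorithms listed above. Not the certified module.
import hashlib
import os

from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def encrypt_and_digest(plaintext: bytes, key: bytes):
    """Encrypt with AES-128-CBC and return (iv, ciphertext, sha1_hex)."""
    assert len(key) == 16                              # 128-bit key per the Annex A listing
    iv = os.urandom(16)                                # fresh IV for each record
    padder = padding.PKCS7(128).padder()               # pad to the AES block size
    padded = padder.update(plaintext) + padder.finalize()
    encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    ciphertext = encryptor.update(padded) + encryptor.finalize()
    sha1_hex = hashlib.sha1(plaintext).hexdigest()     # SHA-1 integrity digest of the original
    return iv, ciphertext, sha1_hex

if __name__ == "__main__":
    iv, ct, digest = encrypt_and_digest(b"end-user device record", os.urandom(16))
    print(len(ct), digest)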

3.2.4.2 Newer Versions of Standards
The following identifies the newer version of a minimum standard(s) that has been successfully tested:


Newer Version Applicable Criteria

No newer version of a minimum standard was tested

3.2.5 Optional Functionality

Criterion # Optional Functionality Successfully Tested

(a)(4)(iii) Plot and display growth charts

(b)(1)(i)(B) Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)

(b)(1)(i)(C) Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols)

(b)(2)(ii)(B) Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)

(b)(2)(ii)(C) Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols)

(f)(3) Ambulatory setting only – Create syndrome-based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario)

Common MU Data Set (15)

Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature)

Common MU Data Set (15)

Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR 162.1002(c)(3): ICD-10-PCS)

No optional functionality tested


3.2.6 2014 Edition Certification Criteria* Successfully Tested

Criteria # Version

Criteria # Version

TP** TD*** TP TD (a)(1) (c)(3) 1.5 2.2.0 (a)(2) 1.2 (d)(1)

(a)(3) 1.2 1.4 (d)(2) 1.3 (a)(4) 1.4 1.3 (d)(3) 1.3 (a)(5) 1.4 1.3 (d)(4) 1.2 (a)(6) 1.3 1.4 (d)(5) (a)(7) 1.3 1.3 (d)(6) (a)(8) 1.2 (d)(7) 1.2 (a)(9) 1.3 1.3 (d)(8) (a)(10) 1.2 1.4 (d)(9) Optional (a)(11) 1.3

(e)(1) 1.7 1.4 (a)(12) 1.3 (e)(2) Amb. only 1.2 1.5 (a)(13) 1.2 (e)(3) Amb. only 1.3 (a)(14) 1.2 (f)(1) (a)(15) 1.5 (f)(2) 1.3 (a)(16) Inpt. only (f)(3) 1.3 (a)(17) Inpt. only (f)(4) Inpt. only

(b)(1) 1.6 1.3 (f)(5) Optional &

Amb. only (b)(2) 1.4 1.5

(b)(3) 1.4 (f)(6) Optional &

Amb. only (b)(4) 1.3 1.4

(b)(5) 1.4 1.2 (g)(1) (b)(6) Inpt. only (g)(2) 1.6 1.8 (b)(7) 1.4 1.5 (g)(3) 1.3

(c)(1) 1.5 2.2.0 (g)(4) 1.2

(c)(2) 1.5 2.2.0

*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)


3.2.7 2014 Clinical Quality Measures*

Type of Clinical Quality Measures Successfully Tested: Ambulatory Inpatient No CQMs tested

*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures)

Ambulatory CQMs CMS ID Version CMS ID Version CMS ID Version CMS ID Version

2 90 136 155 v1

22 117 137 156

50 122 138 v1 157

52 123 139 158

56 124 140 159

61 125 141 160

62 126 142 v1 161

64 127 143 v1 163

65 128 144 164

66 129 145 165 v2

68 v2 130 146 v2 166

69 131 v1 147 167 v1

74 132 v1 148 169

75 133 v1 149 177

77 134 153 179

82 135 154 v2 182

Inpatient CQMs CMS ID Version CMS ID Version CMS ID Version CMS ID Version

9 71 107 172

26 72 108 178

30 73 109 185

31 91 110 188

32 100 111 190

53 102 113

55 104 114

60 105 171


3.2.8 Automated Numerator Recording and Measure Calculation

3.2.8.1 Automated Numerator Recording

Automated Numerator Recording Successfully Tested (a)(1) (a)(9) (a)(16) (b)(6)

(a)(3) (a)(11) (a)(17) (e)(1)

(a)(4) (a)(12) (b)(2) (e)(2)

(a)(5) (a)(13) (b)(3) (e)(3)

(a)(6) (a)(14) (b)(4)

(a)(7) (a)(15) (b)(5)

Automated Numerator Recording was not tested

3.2.8.2 Automated Measure Calculation

Automated Measure Calculation Successfully Tested (a)(1) (a)(9) (a)(16) (b)(6)

(a)(3) (a)(11) (a)(17) (e)(1)

(a)(4) (a)(12) (b)(2) (e)(2)

(a)(5) (a)(13) (b)(3) (e)(3)

(a)(6) (a)(14) (b)(4)

(a)(7) (a)(15) (b)(5)

Automated Measure Calculation was not tested

3.2.9 Attestation

Attestation Forms (as applicable) Appendix

Safety-Enhanced Design* A

Quality Management System** B

Privacy and Security C

*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (b)(3), (b)(4)
**Required for every EHR product


Appendix A: Safety-Enhanced Design
An inaccurate description of the summative usability testing measures used for Effectiveness, Efficiency, and Satisfaction was provided in the "Results" section of the report. The information provided in the table of results data did not match the results described for the measures of Effectiveness, Efficiency, and Satisfaction.

The following required data was missing: Display color settings


This document is intended to describe the User-Centered Design processes used for the selected criteria set forth by the Stage 2 (2014) ONC Certification Test Method, 170.314(g)(3) Safety-Enhanced Design.

UCD Design Analysis: My Vision Express 2014

Eduardo Martinez, Insight Software, LLC


Table of Contents

UCD Standard – NISTIR 7741
    UCD Standard Reference
    UCD Standard Description
One Standard for All
UCD Process Practice
    Understanding User Needs, Workflows and Work Environment
    Engage Users Early and Often
    Set User Performance Objective
    Design the User Interface from Known Human Behavior Principles and from Familiar Interface Patterns
    Conduct Usability Test
    Adapt Design with Iterative Test Results


UCD Standard – NISTIR 7741

UCD Standard Reference
(NISTIR 7741) NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records

Authors: Robert M. Schumacher; Svetlana Z. Lowry

Published: November 29, 2010

UCD Standard Description

“This document provides NIST guidance for those developing electronic health record (EHR) applications that need to know more about processes of user-centered design (UCD). An established UCD process ensures that designed EHRs are efficient, effective, and satisfying to the user. Following the guidance in this document will greatly increase the likelihood of achieving the goal of building usable user interfaces and a better user experience. One of the main purposes of this guide is to provide practical guidance on methods relating to UCD and usability testing. The intended audiences of this document are those with a role in determining the features and functions contained in the EHR and how those are represented in the user interface.”

Reference: http://www.nist.gov/manuscript-publication-search.cfm?pub_id=907313


Declaration of Use

One Standard for All
The NISTIR 7741 standard was chosen as the single User-Centered Design process for the following capabilities being certified for Stage 2 (2014):

• §170.314(a)(1) Computerized provider order entry
• §170.314(a)(2) Drug-drug, drug-allergy interaction checks
• §170.314(a)(6) Medication list
• §170.314(a)(7) Medication allergy list
• §170.314(a)(8) Clinical decision support
• §170.314(b)(3) Electronic prescribing
• §170.314(b)(4) Clinical information reconciliation

UCD Process Practice
The following practices were applied to each of the capabilities listed above. This helped make each capability effective, efficient, and satisfying for our software users.

Understanding User Needs, Workflows and Work Environment
Before designing or implementing any of the capabilities in the criterion set, a consultant was hired to assist with the requirements for certification and to help devise an initial requirement set for a successful workflow for our providers who seek to attest for meaningful use under Stage 1 (2011) and Stage 2 (2014).

Engage Users Early and Often
Many of our providers joined several discussion meetings to analyze the current workflow and how new capabilities could be integrated with ease. For each of these capabilities, a user group meeting was conducted to discuss need and integration.

Set User Performance Objective
To perform a complete usability test, user performance objectives must first be defined. Our internal test consisted of an in-staff professional and a new employee. Each was tested and timed using each capability. An average factor was calculated to estimate how much extra time is needed so that most users can be assumed to reach their objective successfully. This performance objective was later used to calculate the optimal time for each task in the usability test.
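As a minimal sketch of this kind of calculation, and not the actual procedure or data used by Insight Software, the following Python snippet derives an allotted (objective) time per task from benchmark timings; the task names, timings, and the 1.25 buffer factor (borrowed from the Data Scoring convention in the accompanying report) are assumptions.

# Illustrative sketch: deriving a per-task performance objective from benchmark timings.
from statistics import mean

# Timed benchmark runs (seconds) for an in-staff professional and a new employee.
benchmark_times = {
    "Setup and activate intervention": [180, 260],
    "Link and cite reference":         [100, 140],
    "Trigger intervention":            [280, 360],
}

BUFFER = 1.25  # extra time allowed so most users can still meet the objective

for task, times in benchmark_times.items():
    optimal = mean(times)            # benchmark ("optimal") task time
    allotted = optimal * BUFFER      # performance objective used for the usability test
    print(f"{task}: optimal {optimal:.0f}s, allotted {allotted:.0f}s")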

Design the User Interface from Known Human Behavior Principles and from Familiar Interface Patterns
To keep the program consistent as these capabilities were added, we continued to use the following features in each capability:

• Hot keys for saving records
• The use of table grids for row selections


• TAB key to traverse to the next field in the table
• The typical Add and Delete buttons in the same locations
• Adding features to the associated Patient Encounter sections for cohesiveness

With these features, we were able to maintain familiar and consistent behavior in the software while also taking human behavior principles into account.

Conduct Usability Test
Accompanying this document are seven UCD Analysis documents detailing the usability test conducted for each capability. For each usability test, the results are analyzed to reveal the successes and failures of each capability and its associated tasks.

Adapt Design with Iterative Test Results
The usability test results helped improve the workflow and our understanding of each capability's behavior. Much of the participant feedback was used to revisit the design and requirements for these capabilities.


INSIGHT SOFTWARE, LLC

User Centered Design Analysis

170.314(A)(8) Clinical Decision Support

Eduardo Martinez

7/3/2013

EHR Usability Test Report of My Vision Express 2014 Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports My Vision Express 2014 Date of Usability Test: June 29th 2013


Table of Contents

EXECUTIVE SUMMARY
METHOD
    PARTICIPANTS
    STUDY DESIGN
    TASKS
    PROCEDURES
    TEST LOCATION
    TEST ENVIRONMENT
    TEST FORMS AND TOOLS
    PARTICIPANT INSTRUCTIONS
    USABILITY METRICS
DATA SCORING
RESULTS
    DATA ANALYSIS AND REPORTING
EFFECTIVENESS
EFFICIENCY
SATISFACTION
MAJOR FINDINGS
AREAS FOR IMPROVEMENT
APPENDIX [A]: Informed Consent
APPENDIX [B]: Tasks/Questionnaire
    Task 1: First Impressions
    Task 2: Setup and Activate an Evidence-based Decision Support Intervention
    Task 3: Electronically Link and Cite the Diagnostic Reference Information
    Task 4: Trigger a Decision Support Intervention
    Task 5: Authorize a Decision Support Intervention
    Task 6: Validate permissions of a Decision Support Intervention
    Final Questions


EXECUTIVE SUMMARY
A usability test of My Vision Express 2014, Optical EHR, was conducted on 6/29/2013 in a remote web environment by Insight Software, LLC. The purpose of this test was to validate the usability of the current user interface and provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, 5 healthcare providers and optical managers matching the target demographic criteria served as participants and used the EHRUT in simulated but representative tasks. This study collected performance data on 5 tasks typically conducted on an EHR:

1. Setup and Activate an Evidence-based Decision Support Intervention
2. Electronically Link and Cite the Diagnostic Reference Information
3. Trigger a Decision Support Intervention
4. Authorize a Decision Support Intervention
5. Validate permissions of a Decision Support Intervention

During the 20 minute usability test, each participant was greeted by the administrator and asked to review and sign an informed consent/release form (included in Appendix [A]); they were instructed that they could withdraw at any time. Participants had prior experience with the EHR. The administrator introduced the test and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data electronically. The administrator did not give the participant assistance in how to complete the task.

The following types of data were collected for each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations
• Participant’s satisfaction ratings of the system

All participant data was de-identified – no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a post-test questionnaire. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.


The tasks for this test produced a high success rate with some path deviations, showing the test to be effective. The participants found the tasks fairly easy, with an average task rating of 2.2, which is below the 2.6 threshold, making the tasks satisfactory for the testing participants. Most of the average task times were under their projected optimal time, as most participants followed the designated path as closely as they could. Few errors were identified during the test. In addition to the performance data, the following qualitative observations were made:

- Major findings
Upon reviewing the participants’ comments, many found the workflow for setting up the Clinical Decision Support Intervention a bit cumbersome. The path deviation and single failure revealed that triggering the intervention required multiple steps to complete, some of them outside the optimal designated path.

- Areas for improvement
Based on the findings, any future usability testing for Clinical Decision Support will require a streamlined way of adding a clinical document citation. The Intervention setup also requires some attention to workflow. Participants referred to the setup of the Intervention as “cumbersome,” as it can act on any and all of the following conditions: (A) Problem list; (B) Medication list; (C) Medication allergy list; (D) Demographics; (E) Laboratory tests and values/results; and (F) Vital signs


METHOD

PARTICIPANTS

A total of 5 participants were tested on the EHRUT(s). Participants in the test were Optical Healthcare Professionals. Participants were recruited by Insight Software, LLC. In addition, participants had no direct connection to the development of or organization producing the EHRUT(s). Participants were not from the testing or supplier organization. Participants were given the opportunity to have the same orientation and level of training as the actual end users would have received. Recruited participants had a mix of backgrounds and demographic characteristics conforming to the recruitment screener. The following is a table of participants by characteristics, including demographics, professional experience, computing experience and user needs for assistive technology. Participant names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual identities.

Part ID | Gender | Age | Education | Occup./Role | Prof. Exp. | Compu. Exp. | Product Exp. | Assistive Tech Needs

1234 | Male | 54 | Doctorate | Optometric Phys. | 30 years | 15 years | 4 years | None
2345 | Female | 47 | Masters | Optometric IT | 20 years | 15 years | 5 years | None
3456 | Male | 49 | Doctorate | Optometrist | 25 years | 10 years | 1.5 years | None
4567 | Female | 36 | Masters | Optometric Trainer | 6 years | 7 years | 3.5 years | None
5678 | Male | 37 | Doctorate | Optometrist | 9 years | 12 years | 4.5 years | None

Table 1. Participant Description. 5 participants (matching the demographics in the section on Participants) were recruited and 5 participated in the usability test. No participants failed to show for the study.

STUDY DESIGN
Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or comparison with other EHRs, provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a way to identify areas where improvements must be made. During the usability test, participants interacted with one EHR. Each participant used the system in the same location and was provided with the same instructions. The system was evaluated for effectiveness, efficiency and satisfaction as defined by measures collected and analyzed for each participant:

• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations (comments)
• Participant’s satisfaction ratings of the system


TASKS
A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including:

1. Setup and Activate an Evidence-based Decision Support Intervention
2. Electronically Link and Cite the Diagnostic Reference Information
3. Trigger a Decision Support Intervention
4. Authorize a Decision Support Intervention
5. Validate permissions of a Decision Support Intervention

Tasks were selected based on their frequency of use, criticality of function, and those that may be most troublesome for users.

PROCEDURES
Upon connection, participants were greeted; their identity was verified and matched with a name on the participant schedule. Participants were then assigned a participant ID. To ensure that the test ran smoothly, two staff members participated in this test: the usability administrator and the data logger. The usability testing staff conducting the test were experienced usability practitioners. Each participant reviewed and signed an informed consent and release form (See Appendix [A]). The administrator moderated the session, including administering instructions and tasks. The administrator also monitored task times, obtained post-task rating data, and took notes on participant comments. A second person served as the data logger and took notes on task success, path deviations, number and type of errors, and comments. Participants were instructed to perform the tasks (see specific instructions below):

• As quickly as possible, making as few errors and deviations as possible.
• Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use.
• Without using a think aloud technique.

For each task, the participants were given a written copy of the task. Task timing began once the administrator finished reading the question. The task time was stopped once the participant indicated they had successfully completed the task. Following the session, the administrator thanked each individual for their participation. Participants' demographic information, task success rate, time on task, errors, deviations, and verbal responses were recorded into a spreadsheet.


TEST LOCATION
The test location was a cloud environment in which each participant had their own desktop to log into. All observers and the data logger worked from a remote location, using an online meeting conference through which they could see the participant’s screen and review the steps taken. To ensure that the environment was comfortable for users, the participants were able to connect from the comfort of their homes.

TEST ENVIRONMENT
The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted in a cloud environment in an online group meeting. For testing, the computer used was a Dell running Windows Server 2008 R2, accessed with RDP. The participants used RDP with a mouse and keyboard when interacting with the EHRUT. The My Vision Express software used a screen resolution of 1280 x 768, with all participants using a 20 inch monitor. The application was set up by Insight Software, LLC according to the vendor’s documentation describing the system set-up and preparation. The application itself ran against a SQL Server 2012 test database over a LAN connection. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any of the default system settings (such as control of font size or themes).


TEST FORMS AND TOOLS
During the usability test, various documents and instruments were used, including:

1. Informed Consent
2. Moderator’s Guide
3. Post-Test Questionnaire

Examples of these documents can be found in the Appendices. The Moderator’s Guide was devised so as to capture the required data. Online meeting room and shadow terminal session access was used to view the participant’s session as they went through each task.

PARTICIPANT INSTRUCTIONS
The administrator reads the following instructions aloud to each participant (also see the full moderator’s guide in Appendix [B]):

Thank you for participating in this study. Your input is very important. Our session today will last about 60 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible, making as few errors as possible. Please try to complete the tasks on your own, following the instructions very closely. Please note that we are not testing you; we are testing the system, so if you have difficulty, all this means is that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application. Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary, you are able to withdraw at any time during the testing.

Following the procedural instructions, participants were shown the EHR and, as their first task, were given 10 minutes to explore the system and make comments. Participants were then given 6 tasks to complete. Tasks are listed in the moderator’s guide in Appendix [B].

USABILITY METRICS
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability testing. The goals of the test were to assess:

1. Effectiveness of My Vision Express by measuring participant success rates and errors
2. Efficiency of My Vision Express by measuring the average task time and path deviations
3. Satisfaction with My Vision Express by measuring ease of use ratings


DATA SCORING
The following table (Table 2) details how tasks were scored, errors evaluated, and the time data analyzed.

Measures | Rationale and Scoring

Effectiveness: Task Success

A task was counted as a “Success” if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage. Task times were recorded for successes. Observed task times divided by the optimal time for each task is a measure of optimal efficiency. Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Moderator’s Guide must be operationally defined by taking multiple measures of optimal performance and multiplying by 1.25, which allows some time buffer because the participants are presumably not trained to expert performance. Thus, if expert, optimal performance on a task was 120 seconds, then the allotted task time was 150 seconds. This ratio should be aggregated across tasks and reported with mean and variance scores.

Effectiveness: Task Failure

If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a “Failure.” No task times were taken for errors. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations were counted as errors. This should also be expressed as the mean number of failed tasks per participant. On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency: Task Deviations

The participant’s path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation. It is strongly recommended that task deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing tasks.


Efficiency: Task Time

Each task was timed from when the administrator said “Begin” until the participant said, “Done.” If he or she failed to say “Done,” the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.

Satisfaction: Task Rating

Participant’s subjective impression of the ease of use of the application was measured by administering both a simple post-task question as well as a post-session questionnaire. After each task, the participant was asked to rate “Overall, this task was:” on a scale of 1 (Very Easy) to 5 (Very Difficult). These data are averaged across participants. Common convention is that average ratings for systems judged easy to use should be 2.6 or below.

Table 2. Details of how observed data were scored.
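As a rough illustration of the scoring conventions in Table 2, and not the laboratory's actual tooling, the following Python sketch computes the task success rate, the observed-to-optimal time ratio, the path deviation ratio, and the mean satisfaction rating against the 2.6 ease-of-use threshold; the sample records and field names are assumptions.

# Illustrative sketch of the Table 2 scoring conventions; sample data is invented.
from statistics import mean, pstdev

# One record per (participant, task) attempt.
attempts = [
    {"task": "Trigger intervention", "success": True,  "time_s": 310, "optimal_s": 320,
     "steps": 14, "optimal_steps": 12, "rating": 2},
    {"task": "Trigger intervention", "success": False, "time_s": 400, "optimal_s": 320,
     "steps": 19, "optimal_steps": 12, "rating": 3},
    {"task": "Link and cite",        "success": True,  "time_s": 110, "optimal_s": 120,
     "steps": 6,  "optimal_steps": 6, "rating": 2},
]

success_rate = sum(a["success"] for a in attempts) / len(attempts)
time_ratios = [a["time_s"] / a["optimal_s"] for a in attempts if a["success"]]   # successes only
deviation_ratios = [a["steps"] / a["optimal_steps"] for a in attempts]           # observed / optimal steps
avg_rating = mean(a["rating"] for a in attempts)

print(f"success rate: {success_rate:.0%}")
print(f"time ratio: mean {mean(time_ratios):.2f}, sd {pstdev(time_ratios):.2f}")
print(f"path deviation ratio: mean {mean(deviation_ratios):.2f}")
print(f"satisfaction: {avg_rating:.1f} (2.6 threshold {'met' if avg_rating <= 2.6 else 'not met'})")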

RESULTS

DATA ANALYSIS AND REPORTING
The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. Participants who failed to follow session and task instructions had their data excluded from the analyses. Some issues were identified when a couple of participants were attempting to electronically prescribe: at the time of the test, connectivity to the interface was sporadic, causing an error to appear.


EFFECTIVENESS
Based on the results, the average success rate across all tasks was calculated to be 0.9, rendering the test and content effective. Among the 5 tasks, Task 3 (‘Trigger a Decision Support Intervention’) and Task 5 (‘Validate permissions of a Decision Support Intervention’) both revealed an error for the same single participant, which caused an increase in path deviation. The error was due to an ill-prepared test patient who was not set up to be older than 18.

EFFICIENCY
Based on the time averages for these tasks, all task averages were very close to their optimal times. The average deviation ratio for the time it took all participants to complete the tasks in this test was 0.92. The majority of the participants completed their tasks at or before the designated optimal time.

SATISFACTION
Based on the task ratings, the average task rating across the tasks was 2.2. This average is below the 2.6 threshold for satisfaction, indicating the tasks were satisfactory for the participants. Task 3 (‘Trigger a Decision Support Intervention’) and Task 5 (‘Validate permissions of a Decision Support Intervention’) both had higher task ratings, coming very close to, if not equal to, the threshold. These tasks proved to be more difficult than the other tasks.

MAJOR FINDINGS
Upon reviewing the participants’ comments, many found the workflow for setting up the Clinical Decision Support Intervention a bit cumbersome. The path deviation and single failure revealed that triggering the intervention required multiple steps to complete, some of them outside the optimal designated path.

AREAS FOR IMPROVEMENT
Based on the findings, any future usability testing for Clinical Decision Support will require a streamlined way of adding a clinical document citation. The Intervention setup also requires some attention to workflow. Participants referred to the setup of the Intervention as “cumbersome,” as it can act on any and all of the following conditions:


(A) Problem list; (B) Medication list; (C) Medication allergy list; (D) Demographics; (E) Laboratory tests and values/results; and (F) Vital signs


APPENDIX [A]: Informed Consent

Test Company would like to thank you for participating in this study. The purpose of this study is to evaluate an electronic health records system. If you decide to participate, you will be asked to perform several tasks using the prototype and give your feedback. The study will last about 60 minutes. At the conclusion of the test, you will be compensated for your time.

Agreement
I understand and agree that as a voluntary participant in the present study conducted by Test Company I am free to withdraw consent or discontinue participation at any time. I understand and agree to participate in the study conducted and videotaped by Test Company. I understand and consent to the use and release of the videotape by Test Company. I understand that the information and videotape are for research purposes only and that my name and image will not be used for any purpose other than research. I relinquish any rights to the videotape and understand the videotape may be copied and used by Test Company without further permission. I understand and agree that the purpose of this study is to make software applications more useful and usable in the future. I understand and agree that the data collected from this study may be shared outside of Test Company and with Test Company’s client. I understand and agree that data confidentiality is assured, because only de-identified data – i.e., identification numbers, not names – will be used in analysis and reporting of the results. I agree to immediately raise any concerns or areas of discomfort with the study administrator. I understand that I can leave at any time.

Please check one of the following:
YES, I have read the above statement and agree to be a participant.
NO, I choose not to participate in this study.

Signature: _____________________________________ Date: ____________________


APPENDIX [B]: Tasks/Questionnaire

Orientation (1 minute)
Thank you for participating in this study. Our session today will last 60 minutes. During that time you will take a look at an electronic health record system. I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Do not do anything more than asked. If you get lost or have difficulty, I cannot help you with anything to do with the system itself. Please save your detailed comments until the end of a task or the end of the session as a whole, when we can discuss freely. I did not have any involvement in its creation, so please be honest with your opinions. The product you will be using today is My Vision Express 2014. Some of the data may not make sense as it is placeholder data. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time.

Preliminary Questions (5 minutes)
What is your job title / appointment?
How long have you been working in this role?
What are some of your main responsibilities?
Tell me about your experience with electronic health records.


Task 1: First Impressions (120 Seconds)

This is the application you will be working with. Have you heard of it? ____Yes ____No
If so, tell me what you know about it.
Please give a brief description of your first impression of My Vision Express 2014.

Notes / Comments:


Task 2: Setup and Activate an Evidence-based Decision Support Intervention (250 Seconds)

This task will have you set up Clinical Decision Support in the software to allow real-time interventions based on evidence such as the following: (A) Problem list; (B) Medication list; (C) Medication allergy list; (D) Demographics; (E) Laboratory tests and values/results; and (F) Vital signs

Test Scenario
Problem (SNOMED Code): Glacoma – 23986001
Demographic: Patients over 18 years

Success:
__ Easily completed
__ Completed with difficulty or help :: Describe below
__ Not completed
Comments:

Task Time: ________ Seconds

Optimal Path:
SETUP:
Click File
Click Setup
Click Exam
Click Alerts
A screen will appear, Click Add
Insert ‘Warning’ in the description
Insert ‘Glacoma Side Effects’ in the Message field
For the age, enter ‘>’ in the operator field and ‘18’ in the value field
Click the ‘Detail’ button
A response window will appear with tabs for each evidence type above
Click the Problem tab
Click Add
Select the SNOMED code of the problem from the drop down
Click Close
Click Save to the response
Click Save for the Alert
**By default the addition of the alert will include all security groups

__ Correct
__ Minor Deviations / Cycles :: Describe below
__ Major Deviations :: Describe below
Comments:

Observed Errors and Verbalizations:
Comments:

Rating: Overall, this task was: ______
Scale: “Very Easy” (1) to “Very Difficult” (5)


Task 3: Electronically Link and Cite the Diagnostic Reference Information (150 Seconds)

This task will have you set up Clinical Decision Support in the software to link and cite the diagnostic reference information used for real-time interventions.

Test Scenario
Document Source: C:\Glacoma.pdf

Success:
__ Easily completed
__ Completed with difficulty or help :: Describe below
__ Not completed
Comments:

Task Time: ________ Seconds

Optimal Path:
SETUP CITATION:
Click on the ‘…’ button in the line of the Citation
A directory browse window will appear; either enter the file path above or traverse in the ‘Look In’ feature to get to the C Drive and select the ‘Glacoma.PDF’
Enter the following for the Developer: ‘Agency for Health Care Research’
Enter the following for the Funding Source: ‘AHRQ No. 12’

__ Correct
__ Minor Deviations / Cycles :: Describe below
__ Major Deviations :: Describe below
Comments:

Observed Errors and Verbalizations:
Comments:

Rating: Overall, this task was: ______
Scale: “Very Easy” (1) to “Very Difficult” (5)


Task 4: Trigger a Decision Support Intervention (400 Seconds)

This task will ask you to attempt to trigger a real-time intervention, based on evidence such as the scenario below, with an authorized user.

Test Scenario
Problem (SNOMED Code): Glacoma – 23986001
Demographic: Patients over 18 years

Success:
__ Easily completed
__ Completed with difficulty or help :: Describe below
__ Not completed
Comments:

Task Time: ________ Seconds

Optimal Path:
VERIFY:
Open Patient Module
Search for Patient ‘John Doe’ Patient # 115 (Patient is over the age of 18)
After Opening, Click the Health tab
Click the Problems tab
Click the ‘Add’ button
Type in the SNOMED Code for Glacoma into the Search
Select the first Glacoma Disorder which will be listed
Click Select
The Problem will be listed in the Patient’s Problem List
Click Save
An Alert will appear showing the message which was set up and a link to the document which was added to the software.
OPEN:
Click the ‘Yes’ button
Verify the Glacoma citation previously linked

__ Correct
__ Minor Deviations / Cycles :: Describe below
__ Major Deviations :: Describe below
Comments:

Observed Errors and Verbalizations:
Comments:

Rating: Overall, this task was: ______
Scale: “Very Easy” (1) to “Very Difficult” (5)


Task 5: Authorize a Decision Support Intervention (200 Seconds)
This task will have you authorize a Clinical Decision Support intervention so that its real-time, evidence-based alerts are shown only to a certain security group.
Success: __ Easily completed __ Completed with difficulty or help :: Describe below __ Not completed
Comments:
Task Time: ________ Seconds
Optimal Path (SETUP): Click File, click Setup, click Exam, click Alerts. A screen will appear; click the ‘Glaucoma Side Effects’ intervention. In the Security Groups listing in the bottom left, uncheck the ‘Reception’ security group. Click Save for the Alert. Log off the software session.
RELOG: On the login screen, enter ‘front’ for the user name, enter ‘desk’ for the password, and click Login.
__ Correct __ Minor Deviations / Cycles :: Describe below __ Major Deviations :: Describe below
Comments:
Observed Errors and Verbalizations:
Comments:
Rating: Overall, this task was: ______ Scale: “Very Easy” (1) to “Very Difficult” (5)
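As an illustration of what Tasks 5 and 6 verify, the short sketch below (hypothetical group names and function; not the product's internal API) shows security-group gating of an intervention: once ‘Reception’ is unchecked, a user logging in under that group should see no alert.

def alert_visible_to(user_group, alert_groups):
    # An intervention is shown only to the security groups it remains assigned to.
    return user_group in alert_groups

alert_groups = {"Physicians", "Technicians"}  # 'Reception' removed in this task; group names are invented
print(alert_visible_to("Physicians", alert_groups))   # True: alert still shown (as in Task 4)
print(alert_visible_to("Reception", alert_groups))    # False: no alert for the 'front' user (verified in Task 6)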


Task 6: Validate Permissions of a Decision Support Intervention (360 Seconds)
This task will ask you to attempt to trigger a real-time intervention, based on evidence such as the items below, with an unauthorized user.
Test Scenario: Problem (SNOMED code): Glaucoma – 23986001; Demographic: patients over 18 years
Success: __ Easily completed __ Completed with difficulty or help :: Describe below __ Not completed
Comments:
Task Time: ________ Seconds
Optimal Path (VERIFY): Open the Patient Module and search for patient ‘John Doe’, Patient # 115. After opening, click the Health tab, then the Problems tab, and click the ‘Add’ button. Type the SNOMED code for Glaucoma into the Search, select the first Glaucoma disorder listed, and click Select. The problem will be listed in the patient’s Problem List. Click Save. No alert should be displayed.
__ Correct __ Minor Deviations / Cycles :: Describe below __ Major Deviations :: Describe below
Comments:
Observed Errors and Verbalizations:
Comments:
Rating: Overall, this task was: ______ Scale: “Very Easy” (1) to “Very Difficult” (5)


Final Questions (10 Minutes)
What was your overall impression of this system?
What aspects of the system did you like most?
What aspects of the system did you like least?
Were there any features that you were surprised to see?
What features did you expect to encounter but did not see? That is, is there anything that is missing in this application?
Compare this system to other systems you have used.
Would you recommend this system to your colleagues?


INSIGHT SOFTWARE, LLC

User Centered Design Analysis

170.314(a)(1) Computerized Provider Order Entry

Eduardo Martinez

7/3/2013

EHR Usability Test Report of My Vision Express 2014
Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports
My Vision Express 2014
Date of Usability Test: June 29, 2013


Table of Contents EXECUTIVE SUMMARY .................................................................................................................................. 2

METHOD ........................................................................................................................................................ 4

PARTICIPANTS ........................................................................................................................................... 4

STUDY DESIGN........................................................................................................................................... 4

TASKS ........................................................................................................................................................ 5

PROCEDURES ............................................................................................................................................. 5

TEST LOCATION ......................................................................................................................................... 5

TEST ENVIRONMENT ................................................................................................................................. 6

TEST FORMS AND TOOLS .......................................................................................................................... 7

PARTICIPANT INSTRUCTIONS .................................................................................................................... 7

USABILITY METRICS ................................................................................................................................... 7

DATA SCORING .............................................................................................................................................. 8

RESULTS ........................................................................................................................................................ 9

DATA ANALYSIS AND REPORTING ............................................................................................................. 9

EFFECTIVENESS ........................................................................................................................................... 10

EFFICIENCY .................................................................................................................................................. 10

SATISFACTION ............................................................................................................................................. 10

MAJOR FINDINGS ........................................................................................................................................ 10

AREAS FOR IMPROVEMENT ........................................................................................................................ 10

APPENDIX [A]: Informed Consent ............................................................................................................... 11

APPENDIX [B]: Tasks/Questionnaire ........................................................................................................... 12

Task 1: First Impressions ......................................................................................................................... 13

Task 2: Record a Medication ................................................................................................................... 14

Task 3: Change and Access a Medication ............................................................................................... 15

Task 4: Record a Laboratory Test ............................................................................................................ 16

Task 5: Change and Access a Laboratory Test ........................................................................................ 17

Task 6: Record a Radiology/Imaging Test ............................................................................................... 18

Task 7: Change and Access a Radiology/Imaging Test ............................................................................ 19

Final Questions ........................................................................................................................................ 20


EXECUTIVE SUMMARY
A usability test of My Vision Express 2014, an optical EHR, was conducted on 6/29/2013 in a remote web environment by Insight Software, LLC. The purpose of this test was to validate the usability of the current user interface and provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, 5 healthcare providers and optical managers matching the target demographic criteria served as participants and used the EHRUT in simulated, but representative, tasks. This study collected performance data on 6 tasks typically conducted on an EHR:

1. Record a Medication
2. Change and Access a Medication
3. Record a Laboratory Test
4. Change and Access a Laboratory Test
5. Record a Radiology/Imaging Test
6. Change and Access a Radiology/Imaging Test

During the 20 minute usability test, each participant was greeted by the administrator and asked to review and sign an informed consent/release form; they were instructed that they could withdraw at any time. Participants had prior experience with the EHR. The administrator introduced the test and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data electronically. The administrator did not give the participant assistance in how to complete the task. The following types of data were collected for each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations
• Participant’s satisfaction ratings of the system
All participant data was de-identified – no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a posttest questionnaire. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.


The overall results revealed the Computerized Provider Order Entry analysis to be satisfactory. Some deviations were observed, causing the majority of the failures, but the tasks listed above were on the whole completed successfully. The general task ratings suggested the tasks to be fairly easy, with an average task rating of 2.1. In addition to the performance data, the following qualitative observations were made:
- Major findings

Some of the participants showed unfamiliarity with the Laboratory and Imaging Test sections of the software. The code sets associated with the Laboratory Tests and Medications were unclear to some participants, as some code sets have changed from previous versions of the software.

- Areas for improvement
The task results revealed a need for a streamlined workflow for Laboratory Test and Rad./Imaging Test entry. Future tests will require a more specific path designation for testers to follow so that they can complete their tasks with fewer path deviations.


METHOD

PARTICIPANTS

A total of 5 participants were tested on the EHRUT(s). Participants in the test were Optical Healthcare Professionals. Participants were recruited by Insight Software, LLC. In addition, participants had no direct connection to the development of or organization producing the EHRUT(s). Participants were not from the testing or supplier organization. Participants were given the opportunity to have the same orientation and level of training as the actual end users would have received. Recruited participants had a mix of backgrounds and demographic characteristics conforming to the recruitment screener. The following is a table of participants by characteristics, including demographics, professional experience, computing experience and user needs for assistive technology. Participant names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual identities.

Part ID | Gender | Age | Education | Occup./Role | Prof. Exp. | Compu. Exp. | Product Exp. | Assistive Tech Needs
1234 | Male | 54 | Doctorate | Optometric Phys. | 30 years | 15 years | 4 years | None
2345 | Female | 47 | Masters | Optometric IT | 20 years | 15 years | 5 years | None
3456 | Male | 49 | Doctorate | Optometrist | 25 years | 10 years | 1.5 years | None
4567 | Female | 36 | Masters | Optometric Trainer | 6 years | 7 years | 3.5 years | None
5678 | Male | 37 | Doctorate | Optometrist | 9 years | 12 years | 4.5 years | None

Table 1. Participant Description. 5 participants (matching the demographics in the section on Participants) were recruited and 5 participated in the usability test. No participants failed to show for the study.

STUDY DESIGN
Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or for comparison with other EHRs, provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a way to identify areas where improvements must be made. During the usability test, participants interacted with one EHR. Each participant used the system in the same location and was provided with the same instructions. The system was evaluated for effectiveness, efficiency and satisfaction as defined by measures collected and analyzed for each participant:

• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations (comments)
• Participant’s satisfaction ratings of the system


TASKS
A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including:

1. Record a Medication
2. Change and Access a Medication
3. Record a Laboratory Test
4. Change and Access a Laboratory Test
5. Record a Radiology/Imaging Test
6. Change and Access a Radiology/Imaging Test

Tasks were selected based on their frequency of use, criticality of function, and those that may be most troublesome for users.

PROCEDURES
Upon connection, participants were greeted; their identity was verified and matched with a name on the participant schedule. Participants were then assigned a participant ID. To ensure that the test ran smoothly, two staff members participated in this test: the usability administrator and the data logger. The usability testing staff conducting the test were experienced usability practitioners. Each participant reviewed and signed an informed consent and release form (see Appendix [A]). The administrator moderated the session, including administering instructions and tasks. The administrator also monitored task times, obtained post-task rating data, and took notes on participant comments. A second person served as the data logger and took notes on task success, path deviations, number and type of errors, and comments. Participants were instructed to perform the tasks (see specific instructions below):

• As quickly as possible making as few errors and deviations as possible.
• Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use.
• Without using a think aloud technique.

For each task, the participants were given a written copy of the task. Task timing began once the administrator finished reading the question. The task time was stopped once the participant indicated they had successfully completed the task. Following the session, the administrator thanked each individual for their participation. Participants' demographic information, task success rate, time on task, errors, deviations, and verbal responses were recorded into a spreadsheet.


TEST LOCATION
The test location was a cloud environment where each participant had their own desktop to log into. All observers and the data logger worked from a remote location, using an online meeting conference through which they could see the participant’s screen and review the process that was taken. To ensure that the environment was comfortable for users, the participants were able to connect from the comfort of their homes.

TEST ENVIRONMENT
The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted in a cloud environment in an online group meeting. For testing, the computer used was a Dell running Windows Server 2008 R2 with RDP. The participants used RDP, a mouse, and a keyboard when interacting with the EHRUT. The My Vision Express software used a screen resolution of 1280 x 768, with all participants using a 20 inch monitor. The application was set up by Insight Software, LLC according to the vendor’s documentation describing the system set-up and preparation. The application itself was running on a Windows server with SQL Server 2012, using a test database over a LAN connection. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any of the default system settings (such as control of font size or themes).


TEST FORMS AND TOOLS
During the usability test, various documents and instruments were used, including:

1. Informed Consent
2. Moderator’s Guide
3. Post-Test Questionnaire

Examples of these documents can be found in the Appendices. The Moderator’s Guide was devised so as to be able to capture the required data. An online meeting room and shadow terminal session access were used to view the participant’s session as they went through each task.

PARTICIPANT INSTRUCTIONS
The administrator read the following instructions aloud to each participant (also see the full moderator’s guide in Appendix [B]):
Thank you for participating in this study. Your input is very important. Our session today will last about 60 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible making as few errors as possible. Please try to complete the tasks on your own following the instructions very closely. Please note that we are not testing you; we are testing the system. Therefore, if you have difficulty, all this means is that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application. Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary, you are able to withdraw at any time during the testing.
Following the procedural instructions, participants were shown the EHR and, as their first task, were given 10 minutes to explore the system and make comments. Participants were then given 6 tasks to complete. Tasks are listed in the moderator’s guide in Appendix [B].

USABILITY METRICS
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability testing. The goals of the test were to assess:

1. Effectiveness of My Vision Express by measuring participant success rates and errors
2. Efficiency of My Vision Express by measuring the average task time and path deviations
3. Satisfaction with My Vision Express by measuring ease of use ratings


DATA SCORING
The following table (Table 2) details how tasks were scored, errors evaluated, and the time data analyzed.
Measures | Rationale and Scoring

Effectiveness: Task Success

A task was counted as a “Success” if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage. Task times were recorded for successes. Observed task time divided by the optimal time for each task is a measure of optimal efficiency. Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Moderator’s Guide must be operationally defined by taking multiple measures of optimal performance and multiplying by 1.25, which allows some time buffer because the participants are presumably not trained to expert performance. Thus, if expert, optimal performance on a task was 120 seconds, then the allotted task time was 150 seconds. This ratio should be aggregated across tasks and reported with mean and variance scores.

Effectiveness: Task Failure

If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a “Failure.” No task times were taken for errors. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations would be counted as errors. This should also be expressed as the mean number of failed tasks per participant. On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency: Task Deviations

The participant’s path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation. It is strongly recommended that task deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing tasks.


Efficiency: Task Time

Each task was timed from when the administrator said “Begin” until the participant said, “Done.” If he or she failed to say “Done,” the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.

Satisfaction: Task Rating

Participant’s subjective impression of the ease of use of the application was measured by administering both a simple post-task question as well as a post-session questionnaire. After each task, the participant was asked to rate “Overall, this task was:” on a scale of 1 (Very Easy) to 5 (Very Difficult). These data are averaged across participants. Common convention is that average ratings for systems judged easy to use should be 2.6 or below.

Table 2. Details of how observed data were scored.
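As a worked illustration of the scoring rules in Table 2, the following sketch uses made-up numbers (not the recorded study data) to show how the success rate, the 1.25 time buffer, the path deviation ratio, and the mean/variance of task times and ratings are computed.

from statistics import mean, stdev

optimal_time = 120                      # expert benchmark, in seconds
allotted_time = optimal_time * 1.25     # 150 seconds, per the buffer rule above

successes, attempts = 4, 5
success_rate = successes / attempts     # effectiveness, reported as a percentage

observed_steps, optimal_steps = 12, 9
deviation_ratio = observed_steps / optimal_steps   # efficiency: ratio of path deviation

task_times = [95, 110, 130, 102]        # successful completions only
ratings = [2, 1, 3, 2, 2]               # post-task ratings, 1 (Very Easy) to 5 (Very Difficult)

print(f"Allotted time: {allotted_time:.0f} s, success rate: {success_rate:.0%}")
print(f"Path deviation ratio: {deviation_ratio:.2f}")
print(f"Mean task time: {mean(task_times):.1f} s (SD {stdev(task_times):.1f})")
print(f"Mean rating: {mean(ratings):.1f} (satisfactory if 2.6 or below)")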

RESULTS

DATA ANALYSIS AND REPORTING
The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. Participants who failed to follow session and task instructions had their data excluded from the analyses. Some issues were identified when a couple of participants were attempting to electronically prescribe.


EFFECTIVENESS
Based on the findings in this test, the mean success rate across all the tasks was found to be 0.8. This reveals a decent success rate but notes a few failures among the 5 tasks. The failures were predominantly found within the tasks requiring initial data entry. One task in particular, Record a Rad./Imaging Test, showed a higher path deviation than the other test cases. Some of the participants were familiar with the Radiology and Imaging Test data but unfamiliar with the attributes associated with adding a Test to the EHR. Upon saving the record, a prompt about a missing attribute caused a significant path deviation in completing the task.

EFFICIENCY
Based on the observations for these tasks, the participants showed they were able to accomplish most of the tasks within the assumed optimal time. The Record a Rad./Imaging Test task showed the highest task time deviation, which in turn caused the task to result in a failure in some cases. The remaining tasks showed the participants’ understanding of and familiarity with the Medication and Laboratory sections of the software.

SATISFACTION
The task rating scale rated 1 as very easy and 5 as very difficult. A value of 2.6 or below would render the task satisfactory among the set of participants in this group. Most of the tasks rated below 2.6, but some of the tasks were very close to the threshold. Feedback revealed a dislike of standardized code sets, such as LOINC codes for Laboratory Results. The majority of the participants found the tasks very easy to accomplish.

MAJOR FINDINGS
Tasks 1 and 2, the Medication Data related tasks, were found to be the easiest for all the participants. The remaining tasks, pertaining to Laboratory and Imaging Tests, were the more difficult tasks for the majority of the participants. Some participants were unfamiliar with these sections of the EHR and deviated from the designated path for the tasks. This ultimately contributed to some of the failures in these tasks. Those who showed familiarity were under the optimal time and completed their tasks with a satisfactory rating.

AREAS FOR IMPROVEMENT
Upon completion of the test and review of the results, data entry for Rad./Imaging and Laboratory Tests requires attention to the data entry workflow. Some participants described the process as not optimal, and the association from a code set to a test name should be streamlined. The participants related better to the name of a Laboratory or Rad./Imaging test than to its standardized code representation. This observation suggests that selection of the test being performed should focus on, but not be limited to, selection by name.


APPENDIX [A]: Informed Consent

Test Company would like to thank you for participating in this study. The purpose of this study is to evaluate an electronic health records system. If you decide to participate, you will be asked to perform several tasks using the prototype and give your feedback. The study will last about 60 minutes. At the conclusion of the test, you will be compensated for your time.
Agreement
I understand and agree that as a voluntary participant in the present study conducted by Test Company I am free to withdraw consent or discontinue participation at any time. I understand and agree to participate in the study conducted and videotaped by Test Company. I understand and consent to the use and release of the videotape by Test Company. I understand that the information and videotape are for research purposes only and that my name and image will not be used for any purpose other than research. I relinquish any rights to the videotape and understand the videotape may be copied and used by Test Company without further permission. I understand and agree that the purpose of this study is to make software applications more useful and usable in the future. I understand and agree that the data collected from this study may be shared outside of Test Company and with Test Company’s client. I understand and agree that data confidentiality is assured, because only de-identified data – i.e., identification numbers, not names – will be used in analysis and reporting of the results. I agree to immediately raise any concerns or areas of discomfort with the study administrator. I understand that I can leave at any time.
Please check one of the following:
YES, I have read the above statement and agree to be a participant.
NO, I choose not to participate in this study.
Signature: _____________________________________ Date: ____________________


APPENDIX [B]: Tasks/Questionnaire
Orientation (1 minute)
Thank you for participating in this study. Our session today will last 60 minutes. During that time you will take a look at an electronic health record system. I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Do not do anything more than asked. If you get lost or have difficulty, I cannot answer questions or help you with anything to do with the system itself. Please save your detailed comments until the end of a task or the end of the session as a whole, when we can discuss freely. I did not have any involvement in its creation, so please be honest with your opinions. The product you will be using today is My Vision Express 2014. Some of the data may not make sense as it is placeholder data. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time.
Preliminary Questions (5 minutes)
What is your job title / appointment?
How long have you been working in this role?
What are some of your main responsibilities?
Tell me about your experience with electronic health records.


Task 1: First Impressions (120 Seconds)
This is the application you will be working with. Have you heard of it? ____Yes ____No
If so, tell me what you know about it.
Please give a brief description of your first impression of My Vision Express 2014.
Notes / Comments:


Task 2: Record a Medication (50 Seconds)
This task will ask you to search for a specific patient and add a Medication. Suggested Medication – RXCUI List:

• Xalatan - 542527
• Travatan - 285032
• Pataday - 686437
• Patanol - 992116
• Vigamox - 404473

Success: __ Easily completed __ Completed with difficulty or help :: Describe below __ Not completed
Comments:
Task Time: ________ Seconds
Optimal Path (ADD): Open the Patient Module and search for patient ‘John Doe’, Patient # 115. After opening, click the Health tab, then the Medications tab, and click the ‘Add’ button. A Search window will open; search for any of the above listed Medications. After selecting a Medication, select physician ‘Ben Franklin’ and an expiration date.
__ Correct __ Minor Deviations / Cycles :: Describe below __ Major Deviations :: Describe below
Comments:
Observed Errors and Verbalizations:
Comments:
Rating: Overall, this task was: ______ Scale: “Very Easy” (1) to “Very Difficult” (5)


Task 3: Change and Access a Medication (30 Seconds)
This task will ask you to search for a specific patient and change and revisit a Medication. Suggested Medication – RXCUI List:

• Xalatan - 542527
• Travatan - 285032
• Pataday - 686437
• Patanol - 992116
• Vigamox - 404473

Success: __ Easily completed __ Completed with difficulty or help :: Describe below __ Not completed
Comments:
Task Time: ________ Seconds
Optimal Path (CHANGE): Open the Patient Module and search for patient ‘John Doe’, Patient # 115. After opening, click the Health tab, then the Medications tab. Double click on the Medication just added, change the dosage, and click OK.
ACCESS: Verify the change is properly represented in the list of medications.
__ Correct __ Minor Deviations / Cycles :: Describe below __ Major Deviations :: Describe below
Comments:
Observed Errors and Verbalizations:
Comments:
Rating: Overall, this task was: ______ Scale: “Very Easy” (1) to “Very Difficult” (5)


Task 4: Record a Laboratory Test (80 Seconds)
This task will ask you to add a Laboratory Test. Suggested Laboratory LOINC code List:

• 2093-3 – Cholesterol in Serum or Plasma
• 2888-6 – Protein in Urine
• 718-7 – Hemoglobin in Blood
• 2947-0 – Sodium in Blood
• 6298-4 – Potassium in Blood

Success: __ Easily completed __ Completed with difficulty or help :: Describe below __ Not completed
Comments:
Task Time: ________ Seconds
Optimal Path (ADD): For the same patient, ‘John Doe’, click the Health tab, then the Laboratory tab, and click the ‘Add’ button. A new line will appear; select a value for Test Type, enter one of the Lab codes listed above into the LOINC field, select the ‘Time Type’ as Routine, and click Save.
__ Correct __ Minor Deviations / Cycles :: Describe below __ Major Deviations :: Describe below
Comments:
Observed Errors and Verbalizations:
Comments:
Rating: Overall, this task was: ______ Scale: “Very Easy” (1) to “Very Difficult” (5)


Task 5: Change and Access a Laboratory Test (30 Seconds)
This task will ask you to change and access a Laboratory Test. Suggested Laboratory LOINC code List:

• 2093-3 – Cholesterol in Serum or Plasma
• 2888-6 – Protein in Urine
• 718-7 – Hemoglobin in Blood
• 2947-0 – Sodium in Blood
• 6298-4 – Potassium in Blood

Success: __ Easily completed __ Completed with difficulty or help :: Describe below __ Not completed
Comments:
Task Time: ________ Seconds
Optimal Path (CHANGE): For the same patient, ‘John Doe’, click the Health tab, then the Laboratory tab. Change the ‘Time Type’ to Custom and click Save.
IDENTIFY: Identify that the change is properly represented in the list of Laboratory Orders.
__ Correct __ Minor Deviations / Cycles :: Describe below __ Major Deviations :: Describe below
Comments:
Observed Errors and Verbalizations:
Comments:
Rating: Overall, this task was: ______ Scale: “Very Easy” (1) to “Very Difficult” (5)


Task 6: Record a Radiology/Imaging Test (85 Seconds)
This task will ask you to add a Radiology/Imaging (Diagnostic) Test.
Success: __ Easily completed __ Completed with difficulty or help :: Describe below __ Not completed
Comments:
Task Time: ________ Seconds
Optimal Path (ADD): For the same patient, ‘John Doe’, click the Health tab, then the Diagnostic tab, and click the ‘Add’ button. A new line will appear; select the ‘Imaging’ value for Test Type, select the ‘Time Type’ as Routine, and click Save.
__ Correct __ Minor Deviations / Cycles :: Describe below __ Major Deviations :: Describe below
Comments:
Observed Errors and Verbalizations:
Comments:
Rating: Overall, this task was: ______ Scale: “Very Easy” (1) to “Very Difficult” (5)


Task 7: Change and Access a Radiology/Imaging Test (30 Seconds)
This task will ask you to change and access a Radiology/Imaging (Diagnostic) Test.
Success: __ Easily completed __ Completed with difficulty or help :: Describe below __ Not completed
Comments:
Task Time: ________ Seconds
Optimal Path (CHANGE): For the same patient, ‘John Doe’, click the Health tab, then the Diagnostic tab. Change the ‘Test Status’ to Inactive and click Save.
IDENTIFY: Identify that the change is properly represented in the list of Diagnostic Tests.
__ Correct __ Minor Deviations / Cycles :: Describe below __ Major Deviations :: Describe below
Comments:
Observed Errors and Verbalizations:
Comments:
Rating: Overall, this task was: ______ Scale: “Very Easy” (1) to “Very Difficult” (5)


Final Questions (10 Minutes)
What was your overall impression of this system?
What aspects of the system did you like most?
What aspects of the system did you like least?
Were there any features that you were surprised to see?
What features did you expect to encounter but did not see? That is, is there anything that is missing in this application?
Compare this system to other systems you have used.
Would you recommend this system to your colleagues?


INSIGHT SOFTWARE, LLC

User Centered Design Analysis

170.314(a)(2) Drug-Drug, Drug-Allergy Interaction Check

Eduardo Martinez

7/3/2013

EHR Usability Test Report of My Vision Express 2014
Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports
My Vision Express 2014
Date of Usability Test: June 29, 2013


Table of Contents EXECUTIVE SUMMARY .................................................................................................................................. 2

METHOD ........................................................................................................................................................ 4

PARTICIPANTS ........................................................................................................................................... 4

STUDY DESIGN........................................................................................................................... 4

TASKS ........................................................................................................................................................ 5

PROCEDURES ............................................................................................................................................. 5

TEST LOCATION ......................................................................................................................................... 6

TEST ENVIRONMENT ................................................................................................................................. 6

TEST FORMS AND TOOLS .......................................................................................................................... 7

PARTICIPANT INSTRUCTIONS .................................................................................................................... 7

USABILITY METRICS ................................................................................................................................... 7

DATA SCORING .............................................................................................................................................. 8

RESULTS ........................................................................................................................................................ 9

DATA ANALYSIS AND REPORTING ............................................................................................................. 9

EFFECTIVENESS ........................................................................................................................................... 10

EFFICIENCY .................................................................................................................................................. 10

SATISFACTION ............................................................................................................................................. 10

MAJOR FINDINGS ........................................................................................................................................ 10

AREAS FOR IMPROVEMENT ........................................................................................................................ 10

APPENDIX [A]: Informed Consent ............................................................................................................... 11

APPENDIX [B]: Tasks/Questionnaire ........................................................................................................... 12

Task 1: First Impressions ......................................................................................................................... 13

Task 2: Record a Medication ................................................................................................................... 14

Task 3: Change and Access a Medication ............................................................................................... 15

Final Questions ........................................................................................................................................ 16


EXECUTIVE SUMMARY
A usability test of My Vision Express 2014, an optical EHR, was conducted on 6/29/2013 in a remote web environment by Insight Software, LLC. The purpose of this test was to validate the usability of the current user interface and provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, 5 healthcare providers and optical managers matching the target demographic criteria served as participants and used the EHRUT in simulated, but representative, tasks. This study collected performance data on 2 tasks typically conducted on an EHR:

1. Indicate Drug-Drug and Drug-Allergy Intervention
2. Adjust Intervention Severity Level

During the 20 minute usability test, each participant was greeted by the administrator and asked to review and sign an informed consent/release form (included in Appendix [A]); they were instructed that they could withdraw at any time. Participants had prior experience with the EHR. The administrator introduced the test and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data electronically. The administrator did not give the participant assistance in how to complete the task. The following types of data were collected for each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations
• Participant’s satisfaction ratings of the system
All participant data was de-identified – no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a posttest questionnaire. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.

The overall results show the Task Success with a mean of 0.6. This was directly affected by the path deviation caused by the error encountered by a couple of participants. The path deviations averaged 1.8, which is very high for this test. Added workarounds were taken by the participants to finish the tasks. The overall task rating was 2.1, under the 2.6 threshold, classifying the tasks as satisfactory. In addition to the performance data, the following qualitative observations were made:
- Major findings

The error encountered by a couple of test participants was the major finding for this test. Those who did not encounter the error found the test easy and straightforward. The error was later identified to be a sporadic issue in the test environment related to electronic prescribing. The tasks led the participants to a revamped version of the electronic prescription module. The participants adapted quickly to the changes and, for those without the error, accomplished the task successfully.

- Areas for improvement
Upon completion of the test and review of the feedback from participants, the test was very straightforward, but the errors encountered need to be avoided. The change to the electronic prescribing module will require a transition document for those who were used to the format in previous versions.


METHOD

PARTICIPANTS

A total of 5 participants were tested on the EHRUT(s). Participants in the test were Optical Healthcare Professionals. Participants were recruited by Insight Software, LLC. In addition, participants had no direct connection to the development of or organization producing the EHRUT(s). Participants were not from the testing or supplier organization. Participants were given the opportunity to have the same orientation and level of training as the actual end users would have received. Recruited participants had a mix of backgrounds and demographic characteristics conforming to the recruitment screener. The following is a table of participants by characteristics, including demographics, professional experience, computing experience and user needs for assistive technology. Participant names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual identities.

Part ID | Gender | Age | Education | Occup./Role | Prof. Exp. | Compu. Exp. | Product Exp. | Assistive Tech Needs
1234 | Male | 54 | Doctorate | Optometric Phys. | 30 years | 15 years | 4 years | None
2345 | Female | 47 | Masters | Optometric IT | 20 years | 15 years | 5 years | None
3456 | Male | 49 | Doctorate | Optometrist | 25 years | 10 years | 1.5 years | None
4567 | Female | 36 | Masters | Optometric Trainer | 6 years | 7 years | 3.5 years | None
5678 | Male | 37 | Doctorate | Optometrist | 9 years | 12 years | 4.5 years | None

Table 1. Participant Description. 5 participants (matching the demographics in the section on Participants) were recruited and 5 participated in the usability test. No participants failed to show for the study.

STUDY DESIGN
Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or for comparison with other EHRs, provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a way to identify areas where improvements must be made. During the usability test, participants interacted with one EHR. Each participant used the system in the same location and was provided with the same instructions. The system was evaluated for effectiveness, efficiency and satisfaction as defined by measures collected and analyzed for each participant:

• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations (comments)
• Participant’s satisfaction ratings of the system


TASKS
A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including:

1. Indicate Drug-Drug and Drug-Allergy Intervention
2. Adjust Intervention Severity Level

Tasks were selected based on their frequency of use, criticality of function, and those that may be most troublesome for users.
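For orientation only, the sketch below (invented drug names, interaction pairs, and severity levels; not the EHRUT's interaction database or API) illustrates the kind of check exercised by these tasks: a new prescription is compared against the patient's active medications and recorded allergies, and only hits at or above the configured severity level are surfaced as interventions.

DRUG_INTERACTIONS = {("aspirin", "warfarin"): "major"}   # illustrative pair only
SEVERITY_ORDER = ["minor", "moderate", "major"]

def check_new_rx(new_drug, active_meds, allergies, min_severity="moderate"):
    # Compare the new prescription against active medications and allergies,
    # keeping only interactions at or above the configured severity (Task 2 adjusts this level).
    alerts = []
    for med in active_meds:
        severity = DRUG_INTERACTIONS.get(tuple(sorted((new_drug, med))))
        if severity and SEVERITY_ORDER.index(severity) >= SEVERITY_ORDER.index(min_severity):
            alerts.append(f"Drug-drug interaction ({severity}): {new_drug} + {med}")
    if new_drug in allergies:
        alerts.append(f"Drug-allergy interaction: patient is allergic to {new_drug}")
    return alerts

print(check_new_rx("aspirin", ["warfarin"], ["penicillin"]))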

PROCEDURES
Upon connection, participants were greeted; their identity was verified and matched with a name on the participant schedule. Participants were then assigned a participant ID. To ensure that the test ran smoothly, two staff members participated in this test: the usability administrator and the data logger. The usability testing staff conducting the test were experienced usability practitioners. Each participant reviewed and signed an informed consent and release form (see Appendix [A]). The administrator moderated the session, including administering instructions and tasks. The administrator also monitored task times, obtained post-task rating data, and took notes on participant comments. A second person served as the data logger and took notes on task success, path deviations, number and type of errors, and comments. Participants were instructed to perform the tasks (see specific instructions below):

• As quickly as possible making as few errors and deviations as possible.
• Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use.
• Without using a think aloud technique.

For each task, the participants were given a written copy of the task. Task timing began once the administrator finished reading the question. The task time was stopped once the participant indicated they had successfully completed the task. Following the session, the administrator thanked each individual for their participation. Participants' demographic information, task success rate, time on task, errors, deviations, and verbal responses were recorded into a spreadsheet.


TEST LOCATION
The test location was a cloud environment where each participant had their own desktop to log into. All observers and the data logger worked from a remote location, using an online meeting conference through which they could see the participant’s screen and review the process that was taken. To ensure that the environment was comfortable for users, the participants were able to connect from the comfort of their homes.

TEST ENVIRONMENT
The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted in a cloud environment in an online group meeting. For testing, the computer used was a Dell running Windows Server 2008 R2 with RDP. The participants used RDP, a mouse, and a keyboard when interacting with the EHRUT. The My Vision Express software used a screen resolution of 1280 x 768, with all participants using a 20 inch monitor. The application was set up by Insight Software, LLC according to the vendor’s documentation describing the system set-up and preparation. The application itself was running on a Windows server with SQL Server 2012, using a test database over a LAN connection. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any of the default system settings (such as control of font size or themes).


TEST FORMS AND TOOLS
During the usability test, various documents and instruments were used, including:

1. Informed Consent
2. Moderator’s Guide
3. Post-Test Questionnaire

Examples of these documents can be found in the Appendices. The Moderator’s Guide was devised so as to be able to capture the required data. An online meeting room and shadow terminal session access were used to view the participant’s session as they went through each task.

PARTICIPANT INSTRUCTIONS
The administrator read the following instructions aloud to each participant (also see the full moderator’s guide in Appendix [B]):
Thank you for participating in this study. Your input is very important. Our session today will last about 60 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible making as few errors as possible. Please try to complete the tasks on your own following the instructions very closely. Please note that we are not testing you; we are testing the system. Therefore, if you have difficulty, all this means is that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application. Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary, you are able to withdraw at any time during the testing.
Following the procedural instructions, participants were shown the EHR and, as their first task, were given 10 minutes to explore the system and make comments. Participants were then given 2 tasks to complete. Tasks are listed in the moderator’s guide in Appendix [B].

USABILITY METRICS According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability testing. The goals of the test were to assess:

1. Effectiveness of My Vision Express by measuring participant success rates and errors 2. Efficiency of My Vision Express by measuring the average task time and path deviations 3. Satisfaction with My Vision Express by measuring ease of use ratings


DATA SCORING The following table (Table 2) details how tasks were scored, errors evaluated, and the time data analyzed; a brief illustrative calculation based on these scoring rules follows Table 2. Each measure below is listed with its rationale and scoring.

Effectiveness: Task Success

A task was counted as a "Success" if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage. Task times were recorded for successes. Observed task time divided by the optimal time for each task is a measure of optimal efficiency. Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Moderator's Guide were operationally defined by taking multiple measures of optimal performance and multiplying by 1.25, which allows some buffer time because the participants are presumably not trained to expert performance. Thus, if expert optimal performance on a task was 120 seconds, then the allotted task time was 150 seconds. This ratio should be aggregated across tasks and reported with mean and variance scores.

Effectiveness: Task Failure

If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a "Failure." No task times were recorded for failures. The total number of failures was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations were counted as errors. This should also be expressed as the mean number of failed tasks per participant. On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency: Task Deviations

The participant’s path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation. It is strongly recommended that task deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing tasks.


Efficiency: Task Time

Each task was timed from when the administrator said “Begin” until the participant said, “Done.” If he or she failed to say “Done,” the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.

Satisfaction: Task Rating

Participants' subjective impressions of the ease of use of the application were measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate "Overall, this task was:" on a scale of 1 (Very Easy) to 5 (Very Difficult). These data are averaged across participants. Common convention is that average ratings for systems judged easy to use should be 2.6 or below.

Table 2. Details of how observed data were scored.
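The scoring arithmetic in Table 2 can be expressed compactly. The following Python sketch is illustrative only and is not part of the test tooling; the function names and sample values are hypothetical, but the formulas follow the table: allotted time as 1.25 times the benchmarked expert time, success rate as successes divided by attempts (reported as a percentage), path deviation as observed steps divided by optimal steps, task-time mean with standard deviation and standard error over successful attempts only, and the conventional 2.6 satisfaction threshold.

import statistics

def allotted_time(expert_seconds: float) -> float:
    # Allotted task time: benchmarked expert time multiplied by 1.25 (buffer for non-experts).
    return expert_seconds * 1.25

def success_rate(successes: int, attempts: int) -> float:
    # Task success: successes divided by attempts, reported as a percentage.
    return 100.0 * successes / attempts

def deviation_ratio(observed_steps: int, optimal_steps: int) -> float:
    # Path deviation: observed steps divided by optimal steps.
    return observed_steps / optimal_steps

def time_stats(success_times):
    # Mean, standard deviation, and standard error of task time over successful attempts only.
    mean = statistics.mean(success_times)
    sd = statistics.stdev(success_times) if len(success_times) > 1 else 0.0
    se = sd / len(success_times) ** 0.5
    return mean, sd, se

def satisfactory(mean_rating: float) -> bool:
    # Conventional threshold: a mean ease-of-use rating of 2.6 or below.
    return mean_rating <= 2.6

# Hypothetical example: a task benchmarked at 120 seconds of expert time,
# attempted by 5 participants, 3 of whom succeeded.
print(allotted_time(120))                 # 150.0 seconds allotted
print(success_rate(3, 5))                 # 60.0 (percent)
print(deviation_ratio(9, 5))              # 1.8 path-deviation ratio
print(time_stats([95.0, 110.0, 130.0]))   # mean, SD, SE of successful task times
print(satisfactory(2.1))                  # True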

RESULTS

DATA ANALYSIS AND REPORTING The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. Participants who failed to follow session and task instructions had their data excluded from the analyses. Some issues were identified when a couple of participants were attempting to prescribe electronically.


EFFECTIVENESS Based on the findings in this test, the success rate was severely hindered by the errors encountered by a couple of participants. With a success rate of 0.6, the tasks were not shown to be effective. An unexplained error in the electronic prescribing section of the software produced a mean error rate of 0.4 and an average path deviation of 1.8. Not all participants were affected by the error; for those unaffected, the tasks were completed successfully.

EFFICIENCY Based on the task time and deviation results, the suggested path was observed to be followed very closely, resulting in average times under the optimal time threshold. This showed the tasks to be efficient along the proposed path.

SATISFACTION The average task rating was measured as 2.1, just under the 2.6 rating threshold for satisfaction. Though satisfactory, the average was skewed by the failures caused by the error, which ultimately produced a high path deviation.

MAJOR FINDINGS The error encountered by a couple of test participants was the major finding of this test. Those who did not encounter the error found the test easy and straightforward. The error was later identified as a sporadic issue in the test environment related to electronic prescribing. The tasks led the participants to a revamped version of the electronic prescription module. The participants adapted quickly to the changes and, for those who did not hit the error, accomplished the tasks successfully.

AREAS FOR IMPROVEMENT Upon completion of the test and review of the feedback from participants, the test was found to be very straightforward, but the errors encountered need to be avoided. The change to the electronic prescribing module will require a transition document for users accustomed to the format of previous versions.


APPENDIX [A]: Informed Consent

Test Company would like to thank you for participating in this study. The purpose of this study is to evaluate an electronic health records system. If you decide to participate, you will be asked to perform several tasks using the prototype and give your feedback. The study will last about 60 minutes. At the conclusion of the test, you will be compensated for your time.
Agreement
I understand and agree that as a voluntary participant in the present study conducted by Test Company I am free to withdraw consent or discontinue participation at any time.
I understand and agree to participate in the study conducted and videotaped by Test Company.
I understand and consent to the use and release of the videotape by Test Company.
I understand that the information and videotape are for research purposes only and that my name and image will not be used for any purpose other than research.
I relinquish any rights to the videotape and understand the videotape may be copied and used by Test Company without further permission.
I understand and agree that the purpose of this study is to make software applications more useful and usable in the future.
I understand and agree that the data collected from this study may be shared outside of Test Company and with Test Company's client.
I understand and agree that data confidentiality is assured, because only de-identified data – i.e., identification numbers, not names – will be used in analysis and reporting of the results.
I agree to immediately raise any concerns or areas of discomfort with the study administrator.
I understand that I can leave at any time.
Please check one of the following:
YES, I have read the above statement and agree to be a participant.
NO, I choose not to participate in this study.
Signature: _____________________________________ Date: ____________________


APPENDIX [B]: Tasks/Questionnaire
Orientation (1 minute)
Thank you for participating in this study. Our session today will last 60 minutes. During that time you will take a look at an electronic health record system. I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Do not do anything more than asked. If you get lost or have difficulty, I cannot help you with anything to do with the system itself. Please save your detailed comments until the end of a task or the end of the session as a whole, when we can discuss freely. I did not have any involvement in its creation, so please be honest with your opinions. The product you will be using today is My Vision Express 2014. Some of the data may not make sense as it is placeholder data. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time.
Preliminary Questions (5 minutes)
What is your job title / appointment?
How long have you been working in this role?
What are some of your main responsibilities?
Tell me about your experience with electronic health records.


Task 1: First Impressions (120 Seconds)
This is the application you will be working with. Have you heard of it? ____Yes ____No
If so, tell me what you know about it.
Please give a brief description of your first impression of My Vision Express 2014.
Notes / Comments:


Task 2: Indicate a Drug-Drug and Drug-Allergy Intervention (200 Seconds)
This task will lead you to e-prescribe a medication to a patient and cause an intervention to be triggered.
Success: __ Easily completed __ Completed with difficulty or help :: Describe below __ Not completed
Comments:
Task Time: ________ Seconds
Optimal Path (INTERACT): Open Patient Module > Search for Patient 'Gisele Bündchen' Patient # 4 (if not already opened) > After opening, click the Health tab > Click the Medications tab > Click the e-Prescribe button > A new window will open in full-screen mode showing the Dr. First Portal for the patient > The patient information will display for verification; under the patient information section is an 'Add a Medication' field > Enter the medication Ibuprofen, click Find > Select the 400 MG selection > An Allergy Alert will be displayed: "Allergy Alert! Gisele Bündchen (02/16/1965) is listed as having an allergic reaction (Abdominal Cramps) to aspirin. Patients with this allergy are generally allergic to the drug you have just prescribed, as they are both related to the Salicylates group. Proceed with extreme caution"
__ Correct __ Minor Deviations / Cycles :: Describe below __ Major Deviations :: Describe below
Comments:
Observed Errors and Verbalizations:
Comments:
Rating: Overall, this task was: ______ Scale: "Very Easy" (1) to "Very Difficult" (5)


Task 3: Adjust Intervention Level (100 Seconds)
This task will ask you to adjust the intervention level and confirm the intervention behavior.
Success: __ Easily completed __ Completed with difficulty or help :: Describe below __ Not completed
Comments:
Task Time: ________ Seconds
Optimal Path (CHANGE): Login as an administrator > Open Patient Module > Search for Patient 'Gisele Bündchen' Patient # 4 (if not already opened) > After opening, click the Health tab > Click the Medications tab > Click the e-Prescribe button > Click the Options link in the top banner > Click Preference Practice > Change the option for 'When checking for Drug-Drug Interactions, Show:' to "Severe and Contraindicated Only" > Click 'Make these Changes'
(INTERACT): Click the Medications link in the top banner > Enter the medication Ibuprofen, click Find > Select the 400 MG selection > No alert should be displayed.
__ Correct __ Minor Deviations / Cycles :: Describe below __ Major Deviations :: Describe below
Comments:
Observed Errors and Verbalizations:
Comments:
Rating: Overall, this task was: ______ Scale: "Very Easy" (1) to "Very Difficult" (5)


Final Questions (10 Minutes)
What was your overall impression of this system?
What aspects of the system did you like most?
What aspects of the system did you like least?
Were there any features that you were surprised to see?
What features did you expect to encounter but did not see? That is, is there anything that is missing in this application?
Compare this system to other systems you have used.
Would you recommend this system to your colleagues?


INSIGHT SOFTWARE, LLC

User Centered Design Analysis

170.314(b)(3) Electronic Prescription

Eduardo Martinez

7/3/2013

EHR Usability Test Report of My Vision Express 2014
Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports
My Vision Express 2014
Date of Usability Test: June 29th, 2013


Table of Contents
EXECUTIVE SUMMARY
METHOD
PARTICIPANTS
STUDY DESIGN
TASKS
PROCEDURES
TEST LOCATION
TEST ENVIRONMENT
TEST FORMS AND TOOLS
PARTICIPANT INSTRUCTIONS
USABILITY METRICS
DATA SCORING
RESULTS
DATA ANALYSIS AND REPORTING
EFFECTIVENESS
EFFICIENCY
SATISFACTION
MAJOR FINDINGS
AREAS FOR IMPROVEMENT


EXECUTIVE SUMMARY A usability test of My Vision Express, 2014, Optical EHR was conducted on 6/29/2013 in a remote web environment by Insight Software, LLC. The purpose of this test was to test and validate the usability of the current user interface, and provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, 5 healthcare providers and optical managers matching the target demographic criteria served as participants and used the EHRUT in simulated, but representative tasks. This study collected performance data on 1 task typically conducted on an EHR:

1. Electronically Create a Prescription
During the 20 minute usability test, each participant was greeted by the administrator and asked to review and sign an informed consent/release form (included in Appendix [A]); they were instructed that they could withdraw at any time. Participants had prior experience with the EHR. The administrator introduced the test, and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data electronically. The administrator did not give the participant assistance in how to complete the task. The following types of data were collected for each participant: • Number of tasks successfully completed within the allotted time without assistance • Time to complete the tasks • Number and types of errors • Path deviations • Participant’s verbalizations • Participant’s satisfaction ratings of the system All participant data was de-identified – no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a posttest questionnaire. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.

The overall results show Task Success with a mean of 0.6. This was directly affected by the path and time deviations caused by the error encountered by a couple of participants. The path deviations averaged 1.4, which is very high for this test. Additional workarounds were taken by the participants to finish the tasks. The overall task rating was 2.6, equal to the 2.6 threshold, classifying the tasks as marginally satisfactory.


In addition to the performance data, the following qualitative observations were made: - Major findings

The error encountered by a couple of test participants was the major finding of this test. Those who did not encounter the error found the test easy and straightforward. The error was later identified as a sporadic issue in the test environment related to electronic prescribing. The tasks led the participants to a revamped version of the electronic prescription module. The participants adapted quickly to the changes and, for those who did not hit the error, accomplished the task successfully.

- Areas for improvement Upon completion of the test and review of the feedback from participants, the test was found to be very straightforward, but the errors encountered need to be avoided. The change to the electronic prescribing module will require a transition document for users accustomed to the format of previous versions.


METHOD

PARTICIPANTS

A total of 5 participants were tested on the EHRUT(s). Participants in the test were Optical Healthcare Professionals. Participants were recruited by Insight Software, LLC. In addition, participants had no direct connection to the development of or organization producing the EHRUT(s). Participants were not from the testing or supplier organization. Participants were given the opportunity to have the same orientation and level of training as the actual end users would have received. Recruited participants had a mix of backgrounds and demographic characteristics conforming to the recruitment screener. The following is a table of participants by characteristics, including demographics, professional experience, computing experience and user needs for assistive technology. Participant names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual identities.

Part ID Gender Age Education Occup./Role Prof. Exp. Compu. Exp. Product Exp. Assistive Tech Needs

1234  Male    54  Doctorate  Optometric Phys.    30 years  15 years  4 years    None
2345  Female  47  Masters    Optometric IT       20 years  15 years  5 years    None
3456  Male    49  Doctorate  Optometrist         25 years  10 years  1.5 years  None
4567  Female  36  Masters    Optometric Trainer   6 years   7 years  3.5 years  None
5678  Male    37  Doctorate  Optometrist          9 years  12 years  4.5 years  None

Table 1. Participant Description. 5 participants (matching the demographics in the section on Participants) were recruited and 5 participated in the usability test. No participants failed to show for the study.

STUDY DESIGN Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or comparison with other EHRs provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a means to identify areas where improvements must be made. During the usability test, participants interacted with one EHR. Each participant used the system in the same location, and was provided with the same instructions. The system was evaluated for effectiveness, efficiency and satisfaction as defined by measures collected and analyzed for each participant (an illustrative sketch of such a per-participant record follows the list below):

• Number of tasks successfully completed within the allotted time without assistance • Time to complete the tasks • Number and types of errors • Path deviations • Participant’s verbalizations (comments) • Participant’s satisfaction ratings of the system
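The measures in the list above can be thought of as one record per participant per task. The following Python sketch shows one possible shape for such a record; the field names and example values are hypothetical and are given only to illustrate the data described here, not to document the study's actual logging tooling.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ParticipantTaskRecord:
    # Per-participant, per-task measures mirroring the bulleted list above.
    participant_id: str
    task_name: str
    completed_without_assistance: bool
    time_seconds: Optional[float]        # None when the task was not completed
    error_count: int
    path_deviation_ratio: float          # observed steps / optimal steps
    verbalizations: List[str] = field(default_factory=list)
    ease_rating: Optional[int] = None    # 1 (Very Easy) to 5 (Very Difficult)

# Example entry as a data logger might capture it (values are illustrative).
record = ParticipantTaskRecord(
    participant_id="1234",
    task_name="Electronically Create a Prescription",
    completed_without_assistance=True,
    time_seconds=225.0,
    error_count=0,
    path_deviation_ratio=1.1,
    verbalizations=["Found the e-Prescribe button quickly."],
    ease_rating=2,
)
print(record)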


TASKS A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including:

1. Electronically Create a Prescription
Tasks were selected based on their frequency of use, criticality of function, and those that may be most troublesome for users.

PROCEDURES Upon connection, participants were greeted; their identity was verified and matched with a name on the participant schedule. Participants were then assigned a participant ID. To ensure that the test ran smoothly, two staff members participated in this test: the usability administrator and the data logger. The usability testing staff conducting the test were experienced usability practitioners. Each participant reviewed and signed an informed consent and release form (See Appendix [A]). The administrator moderated the session, including administering instructions and tasks. The administrator also monitored task times, obtained post-task rating data, and took notes on participant comments. A second person served as the data logger and took notes on task success, path deviations, number and type of errors, and comments. Participants were instructed to perform the tasks (see specific instructions below):

• As quickly as possible making as few errors and deviations as possible.
• Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use.
• Without using a think aloud technique.

For each task, the participants were given a written copy of the task. Task timing began once the administrator finished reading the question. The task time was stopped once the participant indicated they had successfully completed the task. Following the session, the administrator thanked each individual for their participation. Participants' demographic information, task success rate, time on task, errors, deviations, and verbal responses were recorded into a spreadsheet.


TEST LOCATION The test location was a cloud environment where each participant had their own desktop to log into. All observers and the data logger worked from remote locations, using an online meeting conference that showed the participant's screen so the steps taken could be reviewed. To ensure that the environment was comfortable for users, participants were able to connect from their own homes.

TEST ENVIRONMENT The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted in a cloud environment through an online group meeting. For testing, the host computer was a Dell server running Windows Server 2008 R2, accessed via RDP. The participants used an RDP session with mouse and keyboard when interacting with the EHRUT. The My Vision Express software ran at a screen resolution of 1280 x 768, with all participants using a 20 inch monitor. The application was set up by Insight Software, LLC according to the vendor's documentation describing system set-up and preparation. The application itself ran on SQL Server 2012 using a test database over a LAN connection. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any of the default system settings (such as control of font size or themes).


TEST FORMS AND TOOLS During the usability test, various documents and instruments were used, including:

1. Informed Consent 2. Moderator’s Guide 3. Post-Test Questionnaire

Examples of these documents can be found in the Appendices. The Moderator's Guide was devised to capture the required data. Online meeting room and shadow terminal session access were used to view each participant's session as they went through each task.

PARTICIPANT INSTRUCTIONS The administrator read the following instructions aloud to each participant (also see the full moderator's guide in Appendix [B]): Thank you for participating in this study. Your input is very important. Our session today will last about 60 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible, making as few errors as possible. Please try to complete the tasks on your own, following the instructions very closely. Please note that we are not testing you; we are testing the system, so if you have difficulty, all this means is that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application. Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary, you are able to withdraw at any time during the testing. Following the procedural instructions, participants were shown the EHR and, as their first task, were given 10 minutes to explore the system and make comments. Participants were then given 1 task to complete. Tasks are listed in the moderator's guide in Appendix [B].

USABILITY METRICS According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability testing. The goals of the test were to assess:

1. Effectiveness of My Vision Express by measuring participant success rates and errors 2. Efficiency of My Vision Express by measuring the average task time and path deviations 3. Satisfaction with My Vision Express by measuring ease of use ratings


DATA SCORING The following table (Table 2) details how tasks were scored, errors evaluated, and the time data analyzed; a brief illustrative roll-up of these measures follows Table 2. Each measure below is listed with its rationale and scoring.

Effectiveness: Task Success

A task was counted as a "Success" if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage. Task times were recorded for successes. Observed task time divided by the optimal time for each task is a measure of optimal efficiency. Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Moderator's Guide were operationally defined by taking multiple measures of optimal performance and multiplying by 1.25, which allows some buffer time because the participants are presumably not trained to expert performance. Thus, if expert optimal performance on a task was 120 seconds, then the allotted task time was 150 seconds. This ratio should be aggregated across tasks and reported with mean and variance scores.

Effectiveness: Task Failure

If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a "Failure." No task times were recorded for failures. The total number of failures was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations were counted as errors. This should also be expressed as the mean number of failed tasks per participant. On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency: Task Deviations

The participant’s path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation. It is strongly recommended that task deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing tasks.


Efficiency: Task Time

Each task was timed from when the administrator said “Begin” until the participant said, “Done.” If he or she failed to say “Done,” the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.

Satisfaction: Task Rating

Participants' subjective impressions of the ease of use of the application were measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate "Overall, this task was:" on a scale of 1 (Very Easy) to 5 (Very Difficult). These data are averaged across participants. Common convention is that average ratings for systems judged easy to use should be 2.6 or below.

Table 2. Details of how observed data were scored.
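To complement the per-measure scoring rules in Table 2, the short Python sketch below illustrates how individual task records could be rolled up into the report-level means (task success, error rate, mean path deviation, mean time over successful attempts only, and mean satisfaction rating). The records and values are hypothetical, chosen only to mirror the shape of the figures reported in the Results section; this is not the study's actual analysis spreadsheet or tooling.

from statistics import mean

# Hypothetical per-participant records for a single task: whether the attempt
# succeeded, the deviation ratio (observed steps / optimal steps), the task
# time in seconds, and the post-task ease-of-use rating (1 = Very Easy, 5 = Very Difficult).
records = [
    {"success": True,  "deviation": 1.0, "time": 210.0, "rating": 2},
    {"success": True,  "deviation": 1.2, "time": 240.0, "rating": 2},
    {"success": True,  "deviation": 1.1, "time": 225.0, "rating": 2},
    {"success": False, "deviation": 2.0, "time": None,  "rating": 4},
    {"success": False, "deviation": 1.7, "time": None,  "rating": 3},
]

task_success   = mean(1.0 if r["success"] else 0.0 for r in records)   # e.g. 0.6
failure_rate   = 1.0 - task_success                                    # e.g. 0.4
path_deviation = mean(r["deviation"] for r in records)                 # mean deviation ratio
mean_time      = mean(r["time"] for r in records if r["success"])      # successful attempts only
mean_rating    = mean(r["rating"] for r in records)                    # compared against the 2.6 threshold

print(task_success, failure_rate, round(path_deviation, 2), mean_time, mean_rating)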

RESULTS

DATA ANALYSIS AND REPORTING The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. Participants who failed to follow session and task instructions had their data excluded from the analyses. Some issues were identified when a couple of participants were attempting to prescribe electronically.

EFFECTIVENESS Based on the findings in this test, the success rate was severely hindered by the errors encountered by a couple of participants. With a success rate of 0.6, the tasks were not shown to be effective. An unexplained error in the electronic prescribing section of the software produced a mean error rate of 0.4 and an average path deviation of 1.4. Not all participants were affected by the error; for those unaffected, this task was completed successfully.

EFFICIENCY Based on the task time and deviation results, the suggested path was observed to be followed, but the resulting average times were over the optimal time threshold. This showed the task to be inefficient due to errors and the elongated completion times.

SATISFACTION The average task rating was measured as 2.6, which is equal to the 2.6 rating threshold for satisfaction. Though meeting the satisfaction threshold, the average was skewed by the failures caused by the error, which ultimately produced a high path deviation and task time.

MAJOR FINDINGS The error encountered by a couple of test participants was the major finding of this test. Those who did not encounter the error found the test easy and straightforward. The error was later identified as a sporadic issue in the test environment related to the electronic prescribing module. The tasks led the participants to a revamped version of the electronic prescription module. The participants adapted quickly to the changes and, for those who did not hit the error, accomplished the task successfully.

AREAS FOR IMPROVEMENT Upon completion of the test and review of the feedback from participants, the test was found to be very straightforward, but the errors encountered need to be avoided. The change to the electronic prescribing module will require a transition document for users accustomed to the format of previous versions.


APPENDIX [A]: Informed Consent

Test Company would like to thank you for participating in this study. The purpose of this study is to evaluate an electronic health records system. If you decide to participate, you will be asked to perform several tasks using the prototype and give your feedback. The study will last about 60 minutes. At the conclusion of the test, you will be compensated for your time.
Agreement
I understand and agree that as a voluntary participant in the present study conducted by Test Company I am free to withdraw consent or discontinue participation at any time.
I understand and agree to participate in the study conducted and videotaped by Test Company.
I understand and consent to the use and release of the videotape by Test Company.
I understand that the information and videotape are for research purposes only and that my name and image will not be used for any purpose other than research.
I relinquish any rights to the videotape and understand the videotape may be copied and used by Test Company without further permission.
I understand and agree that the purpose of this study is to make software applications more useful and usable in the future.
I understand and agree that the data collected from this study may be shared outside of Test Company and with Test Company's client.
I understand and agree that data confidentiality is assured, because only de-identified data – i.e., identification numbers, not names – will be used in analysis and reporting of the results.
I agree to immediately raise any concerns or areas of discomfort with the study administrator.
I understand that I can leave at any time.
Please check one of the following:
YES, I have read the above statement and agree to be a participant.
NO, I choose not to participate in this study.
Signature: _____________________________________ Date: ____________________


APPENDIX [B]: Tasks/Questionnaire
Orientation (1 minute)
Thank you for participating in this study. Our session today will last 60 minutes. During that time you will take a look at an electronic health record system. I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Do not do anything more than asked. If you get lost or have difficulty, I cannot help you with anything to do with the system itself. Please save your detailed comments until the end of a task or the end of the session as a whole, when we can discuss freely. I did not have any involvement in its creation, so please be honest with your opinions. The product you will be using today is My Vision Express 2014. Some of the data may not make sense as it is placeholder data. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time.
Preliminary Questions (5 minutes)
What is your job title / appointment?
How long have you been working in this role?
What are some of your main responsibilities?
Tell me about your experience with electronic health records.


Task 1: First Impressions (120 Seconds)
This is the application you will be working with. Have you heard of it? ____Yes ____No
If so, tell me what you know about it.
Please give a brief description of your first impression of My Vision Express 2014.
Notes / Comments:


Task 2: Electronically Create a Prescription (400 Seconds) This task will introduce the electronic prescribing feature of the software. You will select a patient and enter the e-prescribing portal to electronically submit a prescription. Medication List:

• Xalatan • Travatan • Pataday • Patanol • Vigamox

Success: __ Easily completed __ Completed with difficulty or help :: Describe below __ Not completed
Comments:
Task Time: ________ Seconds
Optimal Path (IMPORT): Open Patient Module > Search for Patient 'Gisele Bündchen' Patient # 4 > After opening, click the Health tab > Click the Medications tab > Click the e-Prescribe button > A new window will open in full-screen mode showing the Dr. First Portal for the patient > The patient information will display for verification; under the patient information section is an 'Add a Medication' field > Enter one of the medications listed above, click Find > The medication prescription will show; enter the related signature, duration, and quantity you feel is typical > Click Continue > The medication list for Gisele Bündchen will be updated with the added medication > Close the window
(IDENTIFY): Closing the window will initiate a short synchronization period to update the current list of medications with the changes made in the Dr. First interface > Identify that the added medication is now listed for patient Gisele Bündchen.
__ Correct __ Minor Deviations / Cycles :: Describe below __ Major Deviations :: Describe below
Comments:
Observed Errors and Verbalizations:
Comments:
Rating: Overall, this task was: ______ Scale: "Very Easy" (1) to "Very Difficult" (5)


Final Questions (10 Minutes)
What was your overall impression of this system?
What aspects of the system did you like most?
What aspects of the system did you like least?
Were there any features that you were surprised to see?
What features did you expect to encounter but did not see? That is, is there anything that is missing in this application?
Compare this system to other systems you have used.
Would you recommend this system to your colleagues?


INSIGHT SOFTWARE, LLC

User Centered Design Analysis

170.314(a)(7) Medication Allergy List

Eduardo Martinez

7/3/2013

EHR Usability Test Report of My Vision Express 2014
Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports
My Vision Express 2014
Date of Usability Test: June 29th, 2013


Table of Contents
EXECUTIVE SUMMARY
METHOD
PARTICIPANTS
STUDY DESIGN
TASKS
PROCEDURES
TEST LOCATION
TEST ENVIRONMENT
TEST FORMS AND TOOLS
PARTICIPANT INSTRUCTIONS
USABILITY METRICS
DATA SCORING
RESULTS
DATA ANALYSIS AND REPORTING
EFFECTIVENESS
EFFICIENCY
SATISFACTION
MAJOR FINDINGS
AREAS FOR IMPROVEMENT
APPENDIX [A]: Informed Consent
APPENDIX [B]: Tasks/Questionnaire
Task 1: First Impressions
Task 2: Record a Medication Allergy for a Patient
Task 3: Change and Access Medication Allergy History
Final Questions


EXECUTIVE SUMMARY A usability test of My Vision Express, 2014, Optical EHR was conducted on 6/29/2013 in a remote web environment by Insight Software, LLC. The purpose of this test was to test and validate the usability of the current user interface, and provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, 5 healthcare providers and optical managers matching the target demographic criteria served as participants and used the EHRUT in simulated, but representative tasks. This study collected performance data on 2 tasks typically conducted on an EHR:

1. Record a Patient’s Medication Allergy 2. Change and Access Medication Allergy History

During the 20 minute usability test, each participant was greeted by the administrator and asked to review and sign an informed consent/release form (included in Appendix [A]); they were instructed that they could withdraw at any time. Participants had prior experience with the EHR. The administrator introduced the test, and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data electronically. The administrator did not give the participant assistance in how to complete the task. The following types of data were collected for each participant: • Number of tasks successfully completed within the allotted time without assistance • Time to complete the tasks • Number and types of errors • Path deviations • Participant’s verbalizations • Participant’s satisfaction ratings of the system All participant data was de-identified – no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a posttest questionnaire. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.

The tasks for this test produced a high success rate and revealed no errors, proving the subject matter effective. The participants found the tasks fairly easy, with an average task rating of 1.9, which is below the 2.6 threshold, making the tasks satisfactory for the testing participants. The average task times were very close to their projected optimal times, as most participants followed the designated path very closely. In addition to the performance data, the following qualitative observations were made: - Major findings

Upon reviewing the participants’ comments, many found the workflow for inputting medication allergies in task 1 to be inconsistent with the other related sections of the software. The path deviation and single failure revealed how the module could possibly hinder input efficiency.

- Areas for improvement

Based on the findings for this test, the workflow of the Medication Allergy section of the software requires some review. More than one participant commented on the workflow in the ‘Record a Patient’s Medication Allergy’ task. The consensus found the task fairly easy but emphasized consistency concerns.


METHOD

PARTICIPANTS

A total of 5 participants were tested on the EHRUT(s). Participants in the test were Optical Healthcare Professionals. Participants were recruited by Insight Software, LLC. In addition, participants had no direct connection to the development of or organization producing the EHRUT(s). Participants were not from the testing or supplier organization. Participants were given the opportunity to have the same orientation and level of training as the actual end users would have received. Recruited participants had a mix of backgrounds and demographic characteristics conforming to the recruitment screener. The following is a table of participants by characteristics, including demographics, professional experience, computing experience and user needs for assistive technology. Participant names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual identities.

Part ID Gender Age Education Occup./Role Prof. Exp. Compu. Exp. Product Exp. Assistive Tech Needs

1234  Male    54  Doctorate  Optometric Phys.    30 years  15 years  4 years    None
2345  Female  47  Masters    Optometric IT       20 years  15 years  5 years    None
3456  Male    49  Doctorate  Optometrist         25 years  10 years  1.5 years  None
4567  Female  36  Masters    Optometric Trainer   6 years   7 years  3.5 years  None
5678  Male    37  Doctorate  Optometrist          9 years  12 years  4.5 years  None

Table 1. Participant Description. 5 participants (matching the demographics in the section on Participants) were recruited and 5 participated in the usability test. No participants failed to show for the study.

STUDY DESIGN Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or comparison with other EHRs provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a means to identify areas where improvements must be made. During the usability test, participants interacted with one EHR. Each participant used the system in the same location, and was provided with the same instructions. The system was evaluated for effectiveness, efficiency and satisfaction as defined by measures collected and analyzed for each participant:

• Number of tasks successfully completed within the allotted time without assistance • Time to complete the tasks • Number and types of errors • Path deviations • Participant’s verbalizations (comments) • Participant’s satisfaction ratings of the system


TASKS A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including:

1. Record a Patient’s Medication Allergy 2. Change and Access Medication Allergy History

Tasks were selected based on their frequency of use, criticality of function, and those that may be most troublesome for users.

PROCEDURES Upon connection, participants were greeted; their identity was verified and matched with a name on the participant schedule. Participants were then assigned a participant ID. To ensure that the test ran smoothly, two staff members participated in this test: the usability administrator and the data logger. The usability testing staff conducting the test were experienced usability practitioners. Each participant reviewed and signed an informed consent and release form (See Appendix [A]). The administrator moderated the session, including administering instructions and tasks. The administrator also monitored task times, obtained post-task rating data, and took notes on participant comments. A second person served as the data logger and took notes on task success, path deviations, number and type of errors, and comments. Participants were instructed to perform the tasks (see specific instructions below):

• As quickly as possible making as few errors and deviations as possible.
• Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use.
• Without using a think aloud technique.

For each task, the participants were given a written copy of the task. Task timing began once the administrator finished reading the question. The task time was stopped once the participant indicated they had successfully completed the task. Following the session, the administrator thanked each individual for their participation. Participants' demographic information, task success rate, time on task, errors, deviations, and verbal responses were recorded into a spreadsheet.


TEST LOCATION The test location was a cloud environment where each participant had their own desktop to log into. All observers and the data logger worked from a remote location, using an online meeting conference through which they could see the participant’s screen and review the steps taken. To ensure that the environment was comfortable for users, participants were able to connect from the comfort of their own homes.

TEST ENVIRONMENT The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted in a cloud environment via an online group meeting. For testing, the computer used was a Dell running Windows Server 2008 R2, accessed over RDP. The participants used an RDP session with mouse and keyboard when interacting with the EHRUT. The My Vision Express software used a screen resolution of 1280 x 768, with all participants using a 20 inch monitor. The application was set up by Insight Software, LLC according to the vendor’s documentation describing the system set-up and preparation. The application itself ran against SQL Server 2012 on a Windows server, using a test database over a LAN connection. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any of the default system settings (such as control of font size or themes).


TEST FORMS AND TOOLS During the usability test, various documents and instruments were used, including:

1. Informed Consent
2. Moderator’s Guide
3. Post-Test Questionnaire

Examples of these documents can be found in the Appendices. The Moderator’s Guide was devised so as to capture the required data. An online meeting room and shadow terminal session access were used to view the participant’s session as they went through each task.

PARTICIPANT INSTRUCTIONS The administrator read the following instructions aloud to each participant (also see the full moderator’s guide in Appendix [B]): Thank you for participating in this study. Your input is very important. Our session today will last about 60 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible making as few errors as possible. Please try to complete the tasks on your own, following the instructions very closely. Please note that we are not testing you; we are testing the system, so if you have difficulty, all this means is that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application. Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary, you are able to withdraw at any time during the testing. Following the procedural instructions, participants were shown the EHR and, as their first task, were given 10 minutes to explore the system and make comments. Participants were then given 2 tasks to complete. Tasks are listed in the moderator’s guide in Appendix [B].

USABILITY METRICS According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability testing. The goals of the test were to assess:

1. Effectiveness of My Vision Express by measuring participant success rates and errors
2. Efficiency of My Vision Express by measuring the average task time and path deviations
3. Satisfaction with My Vision Express by measuring ease of use ratings


DATA SCORING The following table (Table 2) details how tasks were scored, errors evaluated, and the time data analyzed.

Measures | Rationale and Scoring

Effectiveness: Task Success

A task was counted as a “Success” if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage. Task times were recorded for successes. Observed task times divided by the optimal time for each task is a measure of optimal efficiency. Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Moderator’s Guide were operationally defined by taking multiple measures of optimal performance and multiplying by 1.25, which allows some time buffer because the participants are presumably not trained to expert performance. Thus, if expert optimal performance on a task was 120 seconds, then the allotted task time was 150 seconds. This ratio should be aggregated across tasks and reported with mean and variance scores.
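As an illustration of the scoring arithmetic described above, the following sketch (Python, with purely hypothetical function names and sample values not taken from this study) computes a per-task success percentage and derives an allotted task time from expert measurements using the 1.25 buffer:

    # Hypothetical illustration only; these helpers are not part of the test tooling.
    def success_rate(successes, attempts):
        """Percentage of attempts on a task counted as a Success."""
        return 100.0 * successes / attempts

    def allotted_time(expert_times_seconds):
        """Allotted task time: mean expert (optimal) time multiplied by a 1.25 buffer."""
        optimal = sum(expert_times_seconds) / len(expert_times_seconds)
        return optimal * 1.25

    # Expert performance of roughly 120 seconds yields a 150-second allotment.
    print(allotted_time([118, 120, 122]))          # 150.0
    print(success_rate(successes=4, attempts=5))   # 80.0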

Effectiveness: Task Failure

If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a “Failure.” No task times were taken for errors. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations would be counted as errors. This should also be expressed as the mean number of failed tasks per participant. On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency: Task Deviations

The participant’s path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation. It is strongly recommended that task deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing tasks.
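A minimal sketch of the path-deviation ratio described above, using hypothetical step lists rather than the actual optimal paths recorded for this study:

    # Hypothetical sketch: ratio of observed steps to optimal steps for one attempt.
    def deviation_ratio(observed_steps, optimal_steps):
        return len(observed_steps) / len(optimal_steps)

    # Illustrative step lists; not the actual optimal paths from the Moderator's Guide.
    optimal = ["Health tab", "Allergies tab", "Add", "Select allergy",
               "Select reaction", "Update Reactions", "Save"]
    observed = optimal[:3] + ["Wrong tab"] + optimal[3:]   # one extra, incorrect step

    print(round(deviation_ratio(observed, optimal), 2))    # 1.14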


Efficiency: Task Time

Each task was timed from when the administrator said “Begin” until the participant said, “Done.” If he or she failed to say “Done,” the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.
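The task-time summary statistics described above (mean, standard deviation, and standard error over successfully completed attempts) could be computed along these lines; the times shown are illustrative, not observed data:

    # Hypothetical sketch: task-time statistics over successfully completed attempts only.
    from statistics import mean, stdev
    from math import sqrt

    def time_stats(successful_times_seconds):
        m = mean(successful_times_seconds)
        sd = stdev(successful_times_seconds)            # sample standard deviation
        se = sd / sqrt(len(successful_times_seconds))   # standard error of the mean
        return m, sd, se

    # Illustrative times, not data from this study.
    print(time_stats([78.0, 85.0, 90.0, 95.0]))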

Satisfaction: Task Rating

Participant’s subjective impression of the ease of use of the application was measured by administering both a simple post-task question as well as a post-session questionnaire. After each task, the participant was asked to rate “Overall, this task was:” on a scale of 1 (Very Easy) to 5 (Very Difficult). These data are averaged across participants. Common convention is that average ratings for systems judged easy to use should be 2.6 or below.

Table 2. Details of how observed data were scored.

RESULTS

DATA ANALYSIS AND REPORTING The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. Participants who failed to follow session and task instructions had their data excluded from the analyses. Some issues were identified when a couple of participants attempted to electronically prescribe: at the time of the test, connectivity to the interface was sporadic, causing an error to appear.


EFFECTIVENESS Based on the test results, the two tasks were completed successfully with a mean success rate of 0.9. A path deviation ratio of 1.1 was calculated for the ‘Record a Patient’s Medication Allergy’ task. The participants whose path deviations were observed attempted to complete the task using a workflow familiar from the previous version of the software. There were no errors present in this test. The high success rate and stable path deviation indicate this test was effective.

EFFICIENCY Based on the time averages for these tasks, both tasks’ averages were very close to their optimal times. The average deviation was 0.98 for the time it took all participants to complete the tasks in this test. The ‘Record a Patient’s Medication Allergy’ task had one participant whose path deviation caused the completion time to exceed the optimal time. This participant’s attempt was counted as a failure due to the time overage. The majority of the participants completed their task at or before the designated optimal time.

SATISFACTION Based on the task ratings, the average task rating for the two tasks was 1.9. This average is below the 2.6 threshold for satisfaction, indicating the tasks were satisfactory for the participants. The ‘Record a Patient’s Medication Allergy’ task proved to be the more difficult of the two for the participants.
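A small sketch of how the post-task ratings could be averaged and compared against the 2.6 convention cited in the Data Scoring section; the ratings shown are illustrative only, since per-participant ratings are not published in this report:

    # Hypothetical sketch: average the 1 (Very Easy) to 5 (Very Difficult) ratings
    # and compare against the 2.6 ease-of-use convention.
    from statistics import mean

    def task_satisfactory(ratings, threshold=2.6):
        avg = mean(ratings)
        return avg, avg <= threshold

    # Illustrative ratings only; prints the average and whether it meets the threshold.
    print(task_satisfactory([2, 2, 1, 3, 2]))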

MAJOR FINDINGS Upon reviewing the participants’ comments, many found the workflow for inputting medication allergies in task 1 to be inconsistent with the other related sections of the software. The path deviation and single failure revealed how the module could possibly hinder input efficiency.

AREAS FOR IMPROVEMENT Based on the findings for this test, the workflow of the Medication Allergy section of the software requires some review. More than one participant commented on the workflow in the ‘Record a Patient’s Medication Allergy’ task. The consensus was that the task was fairly easy, but participants emphasized consistency concerns.


APPENDIX [A]: Informed Consent

Test Company would like to thank you for participating in this study. The purpose of this study is to evaluate an electronic health records system. If you decide to participate, you will be asked to perform several tasks using the prototype and give your feedback. The study will last about 60 minutes. At the conclusion of the test, you will be compensated for your time.

Agreement
I understand and agree that as a voluntary participant in the present study conducted by Test Company I am free to withdraw consent or discontinue participation at any time. I understand and agree to participate in the study conducted and videotaped by Test Company. I understand and consent to the use and release of the videotape by Test Company. I understand that the information and videotape are for research purposes only and that my name and image will not be used for any purpose other than research. I relinquish any rights to the videotape and understand the videotape may be copied and used by Test Company without further permission. I understand and agree that the purpose of this study is to make software applications more useful and usable in the future. I understand and agree that the data collected from this study may be shared outside of Test Company and with Test Company’s client. I understand and agree that data confidentiality is assured, because only de-identified data – i.e., identification numbers, not names – will be used in analysis and reporting of the results. I agree to immediately raise any concerns or areas of discomfort with the study administrator. I understand that I can leave at any time.

Please check one of the following:
____ YES, I have read the above statement and agree to be a participant.
____ NO, I choose not to participate in this study.

Signature: _____________________________________ Date: ____________________


APPENDIX [B]: Tasks/Questionnaire

Orientation (1 minute)
Thank you for participating in this study. Our session today will last 60 minutes. During that time you will take a look at an electronic health record system. I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Do not do anything more than asked. If you get lost or have difficulty, I cannot help you with anything to do with the system itself. Please save your detailed comments until the end of a task or the end of the session as a whole, when we can discuss freely. I did not have any involvement in its creation, so please be honest with your opinions. The product you will be using today is My Vision Express 2014. Some of the data may not make sense as it is placeholder data. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time.

Preliminary Questions (5 minutes)
What is your job title / appointment?
How long have you been working in this role?
What are some of your main responsibilities?
Tell me about your experience with electronic health records.


Task 1: First Impressions (120 Seconds)
This is the application you will be working with. Have you heard of it? ____Yes ____No
If so, tell me what you know about it.
Please give a brief description of your first impression of My Vision Express 2014.
Notes / Comments:


Task 2: Record a Medication Allergy for a Patient (100 Seconds)
This task will ask you to add a Medication Allergy. Medication Allergy List and Reaction:

• Apraclonidine – Diarrhea
• Nortriptyline – Chest Discomfort
• Insulin – Facial Swelling
• Ibuprofen – Hives
• Penicillin – Dizziness
• Codeine – Rash

Success:
__ Easily completed
__ Completed with difficulty or help :: Describe below
__ Not completed
Comments:

Task Time: ________ Seconds

Optimal Path:
ADD: For the same patient ‘John Doe’
Click the Health tab
Click the Allergies tab
Click the ‘Add’ button
A new record will appear; enter one of the above listed allergies from the allergy drop down
Select the reaction from the list
Click ‘Update Reactions’
Click Save

__ Correct
__ Minor Deviations / Cycles :: Describe below
__ Major Deviations :: Describe below
Comments:

Observed Errors and Verbalizations:
Comments:

Rating: Overall, this task was: ______
Scale: “Very Easy” (1) to “Very Difficult” (5)


Task 3: Change and Access Medication Allergy History (30 Seconds)
This task will ask you to Change and Access a Medication Allergy History.

Success:
__ Easily completed
__ Completed with difficulty or help :: Describe below
__ Not completed
Comments:

Task Time: ________ Seconds

Optimal Path:
For the same patient ‘John Doe’
Click the Health tab
Click the Allergies tab
CHANGE:
Click the ‘Show All Reactions’ radio button for this record
Select another Reaction of your choosing
Click ‘Update Reactions’
Click Save
ACCESS:
Identify that the change is properly represented in the list of Medication Allergies
Click Save
IDENTIFY HISTORY:
Toggle between Active and All allergies using the radio button right under the tab label to see past history of inactive allergies from previous encounters

__ Correct
__ Minor Deviations / Cycles :: Describe below
__ Major Deviations :: Describe below
Comments:

Observed Errors and Verbalizations:
Comments:

Rating: Overall, this task was: ______
Scale: “Very Easy” (1) to “Very Difficult” (5)


Final Questions (10 Minutes)
What was your overall impression of this system?
What aspects of the system did you like most?
What aspects of the system did you like least?
Were there any features that you were surprised to see?
What features did you expect to encounter but did not see? That is, is there anything that is missing in this application?
Compare this system to other systems you have used.
Would you recommend this system to your colleagues?


INSIGHT SOFTWARE, LLC

User Centered Design Analysis

170.314(A)(6) Medication List

Eduardo Martinez

7/3/2013

EHR Usability Test Report of My Vision Express 2014 Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports My Vision Express 2014 Date of Usability Test: June 29th 2013


Table of Contents

EXECUTIVE SUMMARY
METHOD
  PARTICIPANTS
  STUDY DESIGN
  TASKS
  PROCEDURES
  TEST LOCATION
  TEST ENVIRONMENT
  TEST FORMS AND TOOLS
  PARTICIPANT INSTRUCTIONS
  USABILITY METRICS
DATA SCORING
RESULTS
  DATA ANALYSIS AND REPORTING
EFFECTIVENESS
EFFICIENCY
SATISFACTION
MAJOR FINDINGS
AREAS FOR IMPROVEMENT
APPENDIX [A]: Informed Consent
APPENDIX [B]: Tasks/Questionnaire
  Task 1: First Impressions
  Task 2: Record a Patient’s Medication
  Task 3: Change and Access Medication History
  Final Questions


EXECUTIVE SUMMARY A usability test of My Vision Express, 2014, Optical EHR was conducted on 6/29/2013 in a remote web environment by Insight Software, LLC. The purpose of this test was to test and validate the usability of the current user interface, and provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, 5 healthcare providers and optical managers matching the target demographic criteria served as participants and used the EHRUT in simulated, but representative tasks. This study collected performance data on 2 tasks typically conducted on an EHR:

1. Record a Patient’s Medication
2. Change and Access Medication History

During the 20 minute usability test, each participant was greeted by the administrator and asked to review and sign an informed consent/release form (included in Appendix [A]); they were instructed that they could withdraw at any time. Participants had prior experience with the EHR. The administrator introduced the test, and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data electronically. The administrator did not give the participant assistance in how to complete the task. The following types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations
• Participant’s satisfaction ratings of the system

All participant data was de-identified – no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a posttest questionnaire. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.

The tasks for this test produced a high success rate and revealed no errors, indicating the tested functionality was effective. The participants found the tasks easy, with an average task rating of 1.6, which is below the 2.6 threshold, making the tasks satisfactory for the testing participants. The average task times were very close to their projected optimal times, as most participants followed the designated path very closely.


In addition to the performance data, the following qualitative observations were made: - Major findings

Upon reviewing the participants’ comments, many found the workflow for inputting medications in task 1 to be inconsistent with the other related sections of the software. The path deviation and single failure revealed how the module could possibly hinder input efficiency.

- Areas for improvement
Based on the findings for this test, the workflow of the Medication section of the software requires some review. More than one participant commented on the RXCUI code standard in the ‘Record a Patient’s Medication’ task. The consensus was that the task was very easy, but participants emphasized the need for a consistent workflow.


METHOD

PARTICIPANTS

A total of 5 participants were tested on the EHRUT(s). Participants in the test were Optical Healthcare Professionals. Participants were recruited by Insight Software, LLC. In addition, participants had no direct connection to the development of or organization producing the EHRUT(s). Participants were not from the testing or supplier organization. Participants were given the opportunity to have the same orientation and level of training as the actual end users would have received. Recruited participants had a mix of backgrounds and demographic characteristics conforming to the recruitment screener. The following is a table of participants by characteristics, including demographics, professional experience, computing experience and user needs for assistive technology. Participant names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual identities.

Part ID | Gender | Age | Education | Occup./Role | Prof. Exp. | Compu. Exp. | Product Exp. | Assistive Tech Needs
1234 | Male | 54 | Doctorate | Optometric Phys. | 30 years | 15 years | 4 years | None
2345 | Female | 47 | Masters | Optometric IT | 20 years | 15 years | 5 years | None
3456 | Male | 49 | Doctorate | Optometrist | 25 years | 10 years | 1.5 years | None
4567 | Female | 36 | Masters | Optometric Trainer | 6 years | 7 years | 3.5 years | None
5678 | Male | 37 | Doctorate | Optometrist | 9 years | 12 years | 4.5 years | None

Table 1. Participant Description. 5 participants (matching the demographics in the section on Participants) were recruited and 5 participated in the usability test. No participants failed to show for the study.

STUDY DESIGN Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or comparison with other EHRs provided the same tasks are used. In short, this testing serves not only as a means to record or benchmark current usability, but also to identify areas where improvements must be made. During the usability test, participants interacted with one EHR. Each participant used the system in the same location, and was provided with the same instructions. The system was evaluated for effectiveness, efficiency and satisfaction as defined by measures collected and analyzed for each participant:

• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations (comments)
• Participant’s satisfaction ratings of the system


TASKS A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including:

1. Record a Patient’s Medication
2. Change and Access Medication History

Tasks were selected based on their frequency of use, criticality of function, and those that may be most troublesome for users.

PROCEDURES Upon connection, participants were greeted; their identity was verified and matched with a name on the participant schedule. Participants were then assigned a participant ID. To ensure that the test ran smoothly, two staff members participated in this test: the usability administrator and the data logger. The usability testing staff conducting the test were experienced usability practitioners. Each participant reviewed and signed an informed consent and release form (See Appendix [A]). The administrator moderated the session, including administering instructions and tasks. The administrator also monitored task times, obtained post-task rating data, and took notes on participant comments. A second person served as the data logger and took notes on task success, path deviations, number and type of errors, and comments. Participants were instructed to perform the tasks (see specific instructions below):

• As quickly as possible making as few errors and deviations as possible.
• Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use.
• Without using a think aloud technique.

For each task, the participants were given a written copy of the task. Task timing began once the administrator finished reading the question. The task time was stopped once the participant indicated they had successfully completed the task. Following the session, the administrator thanked each individual for their participation. Participants' demographic information, task success rate, time on task, errors, deviations, and verbal responses were recorded into a spreadsheet.


TEST LOCATION The test location was a cloud environment where each participant had their own desktop to log into. All observers and the data logger worked from a remote location, using an online meeting conference through which they could see the participant’s screen and review the steps taken. To ensure that the environment was comfortable for users, participants were able to connect from the comfort of their own homes.

TEST ENVIRONMENT The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted in a cloud environment via an online group meeting. For testing, the computer used was a Dell running Windows Server 2008 R2, accessed over RDP. The participants used an RDP session with mouse and keyboard when interacting with the EHRUT. The My Vision Express software used a screen resolution of 1280 x 768, with all participants using a 20 inch monitor. The application was set up by Insight Software, LLC according to the vendor’s documentation describing the system set-up and preparation. The application itself ran against SQL Server 2012 on a Windows server, using a test database over a LAN connection. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any of the default system settings (such as control of font size or themes).


TEST FORMS AND TOOLS During the usability test, various documents and instruments were used, including:

1. Informed Consent
2. Moderator’s Guide
3. Post-Test Questionnaire

Examples of these documents can be found in the Appendices. The Moderator’s Guide was devised so as to capture the required data. An online meeting room and shadow terminal session access were used to view the participant’s session as they went through each task.

PARTICIPANT INSTRUCTIONS The administrator read the following instructions aloud to each participant (also see the full moderator’s guide in Appendix [B]): Thank you for participating in this study. Your input is very important. Our session today will last about 60 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible making as few errors as possible. Please try to complete the tasks on your own, following the instructions very closely. Please note that we are not testing you; we are testing the system, so if you have difficulty, all this means is that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application. Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary, you are able to withdraw at any time during the testing. Following the procedural instructions, participants were shown the EHR and, as their first task, were given 10 minutes to explore the system and make comments. Participants were then given 2 tasks to complete. Tasks are listed in the moderator’s guide in Appendix [B].

USABILITY METRICS According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability testing. The goals of the test were to assess:

1. Effectiveness of My Vision Express by measuring participant success rates and errors
2. Efficiency of My Vision Express by measuring the average task time and path deviations
3. Satisfaction with My Vision Express by measuring ease of use ratings


DATA SCORING The following table (Table 2) details how tasks were scored, errors evaluated, and the time data analyzed.

Measures | Rationale and Scoring

Effectiveness: Task Success

A task was counted as a “Success” if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage. Task times were recorded for successes. Observed task times divided by the optimal time for each task is a measure of optimal efficiency. Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Moderator’s Guide were operationally defined by taking multiple measures of optimal performance and multiplying by 1.25, which allows some time buffer because the participants are presumably not trained to expert performance. Thus, if expert optimal performance on a task was 120 seconds, then the allotted task time was 150 seconds. This ratio should be aggregated across tasks and reported with mean and variance scores.
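The observed-to-optimal time ratio mentioned at the end of this paragraph, aggregated with mean and variance, could be computed as in this hypothetical sketch (the times are illustrative, not study data):

    # Hypothetical sketch: observed/optimal time ratios aggregated across attempts,
    # reported with mean and variance. Times below are illustrative only.
    from statistics import mean, variance

    def efficiency_ratios(observed_times, optimal_time):
        return [t / optimal_time for t in observed_times]

    ratios = efficiency_ratios([110, 126, 118], optimal_time=120)
    print(round(mean(ratios), 2), round(variance(ratios), 4))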

Effectiveness: Task Failure

If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a “Failure.” No task times were taken for errors. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations would be counted as errors. This should also be expressed as the mean number of failed tasks per participant. On a qualitative level, an enumeration of errors and error types should be collected.
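A brief sketch of the failure metrics described above (per-task failure rate and mean failed tasks per participant), using hypothetical attempt records rather than the study's actual data:

    # Hypothetical sketch: per-task failure rate and mean failed tasks per participant.
    # The attempt records below are illustrative, not results from this study.
    from collections import defaultdict

    attempts = [
        # (participant_id, task, succeeded)
        ("P1", "task2", True),  ("P1", "task3", True),
        ("P2", "task2", False), ("P2", "task3", True),
        ("P3", "task2", True),  ("P3", "task3", True),
    ]

    def failure_rate(records, task):
        rows = [r for r in records if r[1] == task]
        return sum(1 for _, _, ok in rows if not ok) / len(rows)

    def mean_failed_per_participant(records):
        failed = defaultdict(int)
        for pid, _, ok in records:
            if not ok:
                failed[pid] += 1
        participants = {pid for pid, _, _ in records}
        return sum(failed.values()) / len(participants)

    print(failure_rate(attempts, "task2"))          # 1 failure out of 3 attempts
    print(mean_failed_per_participant(attempts))    # 1 failed task over 3 participants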

Efficiency: Task Deviations

The participant’s path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation. It is strongly recommended that task deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing tasks.


Efficiency: Task Time

Each task was timed from when the administrator said “Begin” until the participant said, “Done.” If he or she failed to say “Done,” the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.

Satisfaction: Task Rating

Participant’s subjective impression of the ease of use of the application was measured by administering both a simple post-task question as well as a post-session questionnaire. After each task, the participant was asked to rate “Overall, this task was:” on a scale of 1 (Very Easy) to 5 (Very Difficult). These data are averaged across participants. Common convention is that average ratings for systems judged easy to use should be 2.6 or below.

Table 2. Details of how observed data were scored.

RESULTS

DATA ANALYSIS AND REPORTING The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. Participants who failed to follow session and task instructions had their data excluded from the analyses. Some issues were identified when a couple of participants attempted to electronically prescribe: at the time of the test, connectivity to the interface was sporadic, causing an error to appear.


EFFECTIVENESS Based on the test results, the two tasks were effective with a mean success rate of 0.87. A path deviation ratio of 1.1 was calculated for the ‘Record a Patient’s Medication’ task. The participants whose path deviations were observed attempted to complete the task using a workflow familiar from the previous version of the software. There were no errors present in this test. The high success rate and stable path deviation indicate this test was effective.

EFFICIENCY Based on the time averages for these tasks, both tasks’ averages were very close to their optimal times. The average deviation was 0.89 for the time it took all participants to complete the tasks in this test. The majority of the participants completed their task at or before the designated optimal time.

SATISFACTION Based on the task ratings, the average task rating for the two tasks was 1.6. This average is below the 2.6 threshold for satisfaction, indicating the tasks were satisfactory for the participants. The ‘Record a Patient’s Medication’ task proved to be the more difficult of the two for the participants.

MAJOR FINDINGS Upon reviewing the participants’ comments, many found the workflow for inputting medications in task 1 to be inconsistent with the other related sections of the software. The path deviation and single failure revealed how the module could possibly hinder input efficiency.

AREAS FOR IMPROVEMENT Based on the findings for this test, the workflow of the Medication section of the software requires some review. More than one participant commented on the RXCUI code standard in the ‘Record a Patient’s Medication’ task. The consensus was that the task was very easy, but participants emphasized the need for a consistent workflow.


APPENDIX [A]: Informed Consent

Test Company would like to thank you for participating in this study. The purpose of this study is to evaluate an electronic health records system. If you decide to participate, you will be asked to perform several tasks using the prototype and give your feedback. The study will last about 60 minutes. At the conclusion of the test, you will be compensated for your time.

Agreement
I understand and agree that as a voluntary participant in the present study conducted by Test Company I am free to withdraw consent or discontinue participation at any time. I understand and agree to participate in the study conducted and videotaped by Test Company. I understand and consent to the use and release of the videotape by Test Company. I understand that the information and videotape are for research purposes only and that my name and image will not be used for any purpose other than research. I relinquish any rights to the videotape and understand the videotape may be copied and used by Test Company without further permission. I understand and agree that the purpose of this study is to make software applications more useful and usable in the future. I understand and agree that the data collected from this study may be shared outside of Test Company and with Test Company’s client. I understand and agree that data confidentiality is assured, because only de-identified data – i.e., identification numbers, not names – will be used in analysis and reporting of the results. I agree to immediately raise any concerns or areas of discomfort with the study administrator. I understand that I can leave at any time.

Please check one of the following:
____ YES, I have read the above statement and agree to be a participant.
____ NO, I choose not to participate in this study.

Signature: _____________________________________ Date: ____________________


APPENDIX [B]: Tasks/Questionnaire

Orientation (1 minute)
Thank you for participating in this study. Our session today will last 60 minutes. During that time you will take a look at an electronic health record system. I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Do not do anything more than asked. If you get lost or have difficulty, I cannot help you with anything to do with the system itself. Please save your detailed comments until the end of a task or the end of the session as a whole, when we can discuss freely. I did not have any involvement in its creation, so please be honest with your opinions. The product you will be using today is My Vision Express 2014. Some of the data may not make sense as it is placeholder data. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time.

Preliminary Questions (5 minutes)
What is your job title / appointment?
How long have you been working in this role?
What are some of your main responsibilities?
Tell me about your experience with electronic health records.


Task 1: First Impressions (120 Seconds)
This is the application you will be working with. Have you heard of it? ____Yes ____No
If so, tell me what you know about it.
Please give a brief description of your first impression of My Vision Express 2014.
Notes / Comments:


Task 2: Record a Patient’s Medication (100 Seconds)
This task will ask you to Search for a specific patient and add a Medication. Suggested Medication – RXCUI List:

• Xalatan – 542527
• Travatan – 285032
• Pataday – 686437
• Patanol – 992116
• Vigamox – 404473

Success:
__ Easily completed
__ Completed with difficulty or help :: Describe below
__ Not completed
Comments:

Task Time: ________ Seconds

Optimal Path:
ADD: Open Patient Module
Search for Patient ‘John Doe’ Patient # 115
After opening, Click the Health tab
Click the Medications tab
Click the ‘Add’ button
A Search window will open; search for any of the above listed Medications
After adding a Medication, select physician ‘Ben Franklin’ and an expiration date

__ Correct
__ Minor Deviations / Cycles :: Describe below
__ Major Deviations :: Describe below
Comments:

Observed Errors and Verbalizations:
Comments:

Rating: Overall, this task was: ______
Scale: “Very Easy” (1) to “Very Difficult” (5)


Task 3: Change and Access Medication History (30 Seconds)
This task will ask you to Search for a specific patient and Add, Change and Revisit a Medication.

Success:
__ Easily completed
__ Completed with difficulty or help :: Describe below
__ Not completed
Comments:

Task Time: ________ Seconds

Optimal Path:
ACCESS: Open Patient Module
Search for Patient ‘John Doe’ Patient # 115
After opening, Click the Health tab
Click the Medications tab
CHANGE:
Double Click on the Medication just added
Change the Status to ‘Inactive’
Click Ok
IDENTIFY HISTORY:
Toggle between Active and All medications using the radio button right under the tab label to see past history of inactive medications from previous encounters

__ Correct
__ Minor Deviations / Cycles :: Describe below
__ Major Deviations :: Describe below
Comments:

Observed Errors and Verbalizations:
Comments:

Rating: Overall, this task was: ______
Scale: “Very Easy” (1) to “Very Difficult” (5)


Final Questions (10 Minutes)
What was your overall impression of this system?
What aspects of the system did you like most?
What aspects of the system did you like least?
Were there any features that you were surprised to see?
What features did you expect to encounter but did not see? That is, is there anything that is missing in this application?
Compare this system to other systems you have used.
Would you recommend this system to your colleagues?


INSIGHT SOFTWARE, LLC

User Centered Design Analysis

170.314(B)(4) Clinical Information Reconciliation

Eduardo Martinez

7/3/2013

EHR Usability Test Report of My Vision Express 2014 Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports My Vision Express 2014 Date of Usability Test: June 29th 2013


Table of Contents

EXECUTIVE SUMMARY
METHOD
  PARTICIPANTS
  STUDY DESIGN
  TASKS
  PROCEDURES
  TEST LOCATION
  TEST ENVIRONMENT
  TEST FORMS AND TOOLS
  PARTICIPANT INSTRUCTIONS
  USABILITY METRICS
DATA SCORING
RESULTS
  DATA ANALYSIS AND REPORTING
EFFECTIVENESS
EFFICIENCY
SATISFACTION
MAJOR FINDINGS
AREAS FOR IMPROVEMENT
APPENDIX [A]: Informed Consent
APPENDIX [B]: Tasks/Questionnaire
  Task 1: First Impressions
  Task 2: Import Clinical Summary Document and Consolidate
  Task 3: Commit and Verify Consolidated Data
  Final Questions


EXECUTIVE SUMMARY A usability test of My Vision Express, 2014, Optical EHR was conducted on 6/29/2013 in a remote web environment by Insight Software, LLC. The purpose of this test was to test and validate the usability of the current user interface, and provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, 5 healthcare providers and optical managers matching the target demographic criteria served as participants and used the EHRUT in simulated, but representative tasks. This study collected performance data on 2 tasks typically conducted on an EHR:

1. Import Clinical Document and Consolidate List
2. Commit and Verify Consolidated Data

During the 20 minute usability test, each participant was greeted by the administrator and asked to review and sign an informed consent/release form (included in Appendix [A]); they were instructed that they could withdraw at any time. Participants had prior experience with the EHR. The administrator introduced the test, and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data electronically. The administrator did not give the participant assistance in how to complete the task. The following types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations
• Participant’s satisfaction ratings of the system

All participant data was de-identified – no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a posttest questionnaire. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.

The tasks for this test produced a high success rate with few path deviations, indicating the test was effective. The participants found the tasks fairly easy, with an average task rating of 2.0, which is below the 2.6 threshold, making the tasks satisfactory for the participants. The average task times were under their projected optimal times, as most participants followed the designated path closely. In addition to the performance data, the following qualitative observations were made:

- Major findings

Upon reviewing the comments and results, the Clinical Information Reconciliation process was successful. One test computer did not have the import document in the expected location for the participant to find, which accounted for the only failure in the test, observed during Task 1. Most participants commented positively on the new feature in the software.

- Areas for improvement

Based on the findings, any future usability testing for Clinical Information Reconciliation will require an automated import of clinical documents, avoiding possible errors with misplaced files. The consolidation module allows the user to see and compare which problems, medications, and allergies exist and match the observations on each list. Matching items would be best sorted to the top of each list for ease of identification, creating a better workflow for consolidation.

METHOD

PARTICIPANTS

A total of 5 participants were tested on the EHRUT(s). Participants in the test were Optical Healthcare Professionals. Participants were recruited by Insight Software, LLC. In addition, participants had no direct connection to the development of the EHRUT(s) or to the organization producing it, and were not from the testing or supplier organization. Participants were given the opportunity to have the same orientation and level of training as the actual end users would have received. Recruited participants had a mix of backgrounds and demographic characteristics conforming to the recruitment screener. The following is a table of participants by characteristics, including demographics, professional experience, computing experience, and user needs for assistive technology. Participant names were replaced with Participant IDs so that an individual's data cannot be tied back to individual identities.

Part ID | Gender | Age | Education | Occup./Role | Prof. Exp. | Compu. Exp. | Product Exp. | Assistive Tech Needs
1234 | Male | 54 | Doctorate | Optometric Phys. | 30 years | 15 years | 4 years | None
2345 | Female | 47 | Masters | Optometric IT | 20 years | 15 years | 5 years | None
3456 | Male | 49 | Doctorate | Optometrist | 25 years | 10 years | 1.5 years | None
4567 | Female | 36 | Masters | Optometric Trainer | 6 years | 7 years | 3.5 years | None
5678 | Male | 37 | Doctorate | Optometrist | 9 years | 12 years | 4.5 years | None

Table 1. Participant Description. 5 participants (matching the demographics in the section on Participants) were recruited and 5 participated in the usability test. No participants failed to show for the study.

STUDY DESIGN

Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or comparison with other EHRs, provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a means to identify areas where improvements must be made. During the usability test, participants interacted with one EHR. Each participant used the system in the same location and was provided with the same instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:

• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant's verbalizations (comments)
• Participant's satisfaction ratings of the system

TASKS

A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including:

1. Import Clinical Document and Consolidate List
2. Commit and Verify Consolidated Data

Tasks were selected based on their frequency of use, criticality of function, and potential to be troublesome for users.

PROCEDURES

Upon connection, participants were greeted; their identity was verified and matched with a name on the participant schedule. Participants were then assigned a participant ID. To ensure that the test ran smoothly, two staff members participated in this test: the usability administrator and the data logger. The usability testing staff conducting the test were experienced usability practitioners. Each participant reviewed and signed an informed consent and release form (see Appendix [A]). The administrator moderated the session, including administering instructions and tasks. The administrator also monitored task times, obtained post-task rating data, and took notes on participant comments. A second person served as the data logger and took notes on task success, path deviations, number and type of errors, and comments. Participants were instructed to perform the tasks (see specific instructions below):

• As quickly as possible, making as few errors and deviations as possible.
• Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use.
• Without using a think-aloud technique.

For each task, the participants were given a written copy of the task. Task timing began once the administrator finished reading the question. The task time was stopped once the participant indicated they had successfully completed the task. Following the session, the administrator thanked each individual for their participation. Participants' demographic information, task success rate, time on task, errors, deviations, and verbal responses were recorded into a spreadsheet.

TEST LOCATION

The test location was a cloud environment where each participant had their own desktop to log into. All observers and the data logger worked from a remote location, using an online meeting conference that allowed them to see the participant's screen and review the steps taken. To ensure that the environment was comfortable for users, the participants were able to connect from their own homes.

TEST ENVIRONMENT

The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted in a cloud environment through an online group meeting. For testing, participants used a Dell computer running Windows Server 2008 R2, accessed over RDP. The participants used RDP with a mouse and keyboard when interacting with the EHRUT. The My Vision Express software used a screen resolution of 1280 x 768, with all participants using a 20 inch monitor. The application was set up by Insight Software, LLC according to the vendor's documentation describing the system set-up and preparation. The application itself was running against SQL Server 2012, using a test database over a LAN connection. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any of the default system settings (such as control of font size or themes).

TEST FORMS AND TOOLS

During the usability test, various documents and instruments were used, including:

1. Informed Consent
2. Moderator's Guide
3. Post-Test Questionnaire

Examples of these documents can be found in the Appendices. The Moderator's Guide was devised to capture the required data. An online meeting room and shadow terminal session access were used to view the participant's session as they went through each task.

PARTICIPANT INSTRUCTIONS

The administrator read the following instructions aloud to each participant (also see the full moderator's guide in Appendix [B]):

Thank you for participating in this study. Your input is very important. Our session today will last about 60 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible, making as few errors as possible. Please try to complete the tasks on your own, following the instructions very closely. Please note that we are not testing you; we are testing the system. Therefore, if you have difficulty, all this means is that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application. Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. I did not have any involvement in its creation, so please be honest with your opinions. All of the information that you provide will be kept confidential, and your name will not be associated with your comments at any time. Should you feel it necessary, you are able to withdraw at any time during the testing.

Following the procedural instructions, participants were shown the EHR and, as their first task, were given 10 minutes to explore the system and make comments. Participants were then given 2 tasks to complete. Tasks are listed in the moderator's guide in Appendix [B].

USABILITY METRICS

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the test were to assess:

1. Effectiveness of My Vision Express by measuring participant success rates and errors
2. Efficiency of My Vision Express by measuring the average task time and path deviations
3. Satisfaction with My Vision Express by measuring ease of use ratings

DATA SCORING

The following table (Table 2) details how tasks were scored, errors evaluated, and the time data analyzed. Each measure is listed with its rationale and scoring.

Effectiveness: Task Success

A task was counted as a "Success" if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage. Task times were recorded for successes. Observed task times divided by the optimal time for each task is a measure of optimal efficiency. Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Moderator's Guide were operationally defined by taking multiple measures of optimal performance and multiplying by 1.25, which allows some time buffer because the participants are presumably not trained to expert performance. Thus, if expert, optimal performance on a task was 120 seconds, then the allotted task time was 150 seconds. This ratio should be aggregated across tasks and reported with mean and variance scores. (A calculation sketch covering these measures follows Table 2.)

Effectiveness: Task Failure

If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a "Failure." No task times were taken for errors. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations were counted as errors. This should also be expressed as the mean number of failed tasks per participant. On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency: Task Deviations

The participant’s path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation. It is strongly recommended that task deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing tasks.

Efficiency: Task Time

Each task was timed from when the administrator said “Begin” until the participant said, “Done.” If he or she failed to say “Done,” the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.

Satisfaction: Task Rating

Participant’s subjective impression of the ease of use of the application was measured by administering both a simple post-task question as well as a post-session questionnaire. After each task, the participant was asked to rate “Overall, this task was:” on a scale of 1 (Very Easy) to 5 (Very Difficult). These data are averaged across participants. Common convention is that average ratings for systems judged easy to use should be 2.6 or below.

Table 2. Details of how observed data were scored.
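To make the scoring concrete, the sketch below works through the measures in Table 2 in Python, using illustrative tallies for a single task. The variable values are hypothetical placeholders, not the actual study data; the 1.25 time buffer, deviation ratio, and 2.6 satisfaction threshold follow the definitions above.

```python
from statistics import mean, stdev
from math import sqrt

# Illustrative (not actual) per-task tallies for one task.
attempts = 5
successes = 4
expert_times = [118, 122, 120]        # repeated expert measurements, seconds
observed_times = [95, 110, 102, 99]   # successful completions only, seconds
observed_steps = 14
optimal_steps = 10
ratings = [2, 1, 3, 2, 2]             # 1 = Very Easy ... 5 = Very Difficult

# Effectiveness: successes (or failures) divided by attempts
success_rate = successes / attempts            # reported as a percentage
failure_rate = (attempts - successes) / attempts

# Allotted time = expert (optimal) time x 1.25 buffer
allotted_time = mean(expert_times) * 1.25

# Efficiency: path deviation ratio and task-time statistics
deviation_ratio = observed_steps / optimal_steps   # 1.0 = followed the optimal path
mean_time = mean(observed_times)
sd_time = stdev(observed_times)
se_time = sd_time / sqrt(len(observed_times))
time_vs_optimal = mean_time / mean(expert_times)

# Satisfaction: a mean rating at or below 2.6 is treated as satisfactory
mean_rating = mean(ratings)
satisfactory = mean_rating <= 2.6

print(f"success {success_rate:.0%}, allotted {allotted_time:.0f}s, "
      f"deviation {deviation_ratio:.2f}, mean time {mean_time:.0f}s (SE {se_time:.1f}), "
      f"rating {mean_rating:.1f} ({'satisfactory' if satisfactory else 'needs work'})")
```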

RESULTS

DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. Participants who failed to follow session and task instructions had their data excluded from the analyses. Some issues were identified when a couple of participants attempted to prescribe electronically: at the time of the test, connectivity to the interface was sporadic, causing an error to appear.

EFFECTIVENESS

Based on the test results, the two tasks were effective, with a mean success rate of 0.9. A path deviation ratio of 1.4 was calculated for the 'Import Clinical Document and Consolidate List' task. The participants whose path deviations were observed were unclear where the import document was stored in the test environment. One error was noted in this test, attributable to a misplaced XML file on one test computer. The high success rate and stable path deviation indicate this test was effective.
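As a worked illustration of how these figures can arise (the per-task counts are not stated explicitly in this summary; this assumes, per the narrative, one failure out of five attempts on Task 1 and none on Task 2):

```python
# Hypothetical reconstruction: one failure on Task 1, none on Task 2 (5 participants each).
task1_success = 4 / 5            # 0.8
task2_success = 5 / 5            # 1.0
mean_success = (task1_success + task2_success) / 2
print(mean_success)              # 0.9, matching the reported mean success rate

# A 1.4 deviation ratio means the observed path had ~40% more steps than optimal,
# e.g. 14 observed steps against a 10-step optimal path (step counts illustrative).
print(14 / 10)                   # 1.4
```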

EFFICIENCY

Based on the average time for each task, most participants completed the assigned tasks before the allotted optimal time. Task 2, 'Commit and Verify Consolidated Data', averaged roughly three-quarters of the calculated optimal time. This indicates the participants completed the task about as quickly as the expert time benchmarked prior to the test. These findings indicate the test demonstrated efficient performance.
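A minimal sketch of that efficiency ratio, using hypothetical times chosen only to reproduce the roughly three-quarters figure (the underlying observed and optimal times for this task are not listed in the summary):

```python
# Hypothetical numbers for illustration only; they are not the study's measured times.
observed_mean_time = 150   # seconds (illustrative)
optimal_time = 200         # seconds (illustrative; the questionnaire allots 200 s for this task)
print(observed_mean_time / optimal_time)   # 0.75 -> about three-quarters of the optimal time
```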

SATISFACTION

Based on the task ratings, the average task rating for the two tasks was 2.0. This average is below the 2.6 threshold for satisfaction, indicating the tasks were satisfactory for the participants. The 'Import Clinical Summary' task proved to be the more difficult of the two for the participants, yielding the higher task rating.

MAJOR FINDINGS

Upon reviewing the comments and results, the Clinical Information Reconciliation process was successful. One test computer did not have the import document in the expected location for the participant to find, which accounted for the only failure in the test, observed during Task 1. Most participants commented positively on the new feature in the software.

AREAS FOR IMPROVEMENT

Based on the findings, any future usability testing for Clinical Information Reconciliation will require an automated import of clinical documents, avoiding possible errors with misplaced files. The consolidation module allows the user to see and compare which problems, medications, and allergies exist and match the observations on each list. Matching items would be best sorted to the top of each list for ease of identification, creating a better workflow for consolidation.

APPENDIX [A]: Informed Consent

Test Company would like to thank you for participating in this study. The purpose of this study is to evaluate an electronic health records system. If you decide to participate, you will be asked to perform several tasks using the prototype and give your feedback. The study will last about 60 minutes. At the conclusion of the test, you will be compensated for your time.

Agreement

I understand and agree that as a voluntary participant in the present study conducted by Test Company I am free to withdraw consent or discontinue participation at any time. I understand and agree to participate in the study conducted and videotaped by Test Company. I understand and consent to the use and release of the videotape by Test Company. I understand that the information and videotape are for research purposes only and that my name and image will not be used for any purpose other than research. I relinquish any rights to the videotape and understand the videotape may be copied and used by Test Company without further permission. I understand and agree that the purpose of this study is to make software applications more useful and usable in the future. I understand and agree that the data collected from this study may be shared outside of Test Company and with Test Company's client. I understand and agree that data confidentiality is assured, because only de-identified data – i.e., identification numbers, not names – will be used in analysis and reporting of the results. I agree to immediately raise any concerns or areas of discomfort with the study administrator. I understand that I can leave at any time.

Please check one of the following:
___ YES, I have read the above statement and agree to be a participant.
___ NO, I choose not to participate in this study.

Signature: _____________________________________ Date: ____________________

APPENDIX [B]: Tasks/Questionnaire

Orientation (1 minute)

Thank you for participating in this study. Our session today will last 60 minutes. During that time you will take a look at an electronic health record system. I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Do not do anything more than asked. If you get lost or have difficulty, I cannot help you with anything to do with the system itself. Please save your detailed comments until the end of a task or the end of the session as a whole, when we can discuss freely. I did not have any involvement in its creation, so please be honest with your opinions. The product you will be using today is My Vision Express 2014. Some of the data may not make sense, as it is placeholder data. All of the information that you provide will be kept confidential, and your name will not be associated with your comments at any time.

Preliminary Questions (5 minutes)

What is your job title / appointment?
How long have you been working in this role?
What are some of your main responsibilities?
Tell me about your experience with electronic health records.

Task 1: First Impressions (120 Seconds)

This is the application you will be working with. Have you heard of it? ____ Yes ____ No
If so, tell me what you know about it.
Please give a brief description of your first impression of My Vision Express 2014.

Notes / Comments:

Task 2: Import Clinical Summary Document and Consolidate (400 Seconds)

This task will introduce the import feature for clinical information from a standardized health information file structure.

Import file path: C:\import.xml

Success:
__ Easily completed
__ Completed with difficulty or help :: Describe below
__ Not completed
Comments:

Task Time: ________ Seconds

Optimal Path:
Import:
1. Open the Patient Module.
2. Search for patient 'Gisele Bündchen' (Patient # 4).
3. After opening the record, click the Health tab.
4. Click the HL7 CCD/CCR tab.
5. Click the Import button.
6. A directory browse window will appear; either enter the file path above or traverse the 'Look In' feature to the C drive and select 'Import.XML'.
IDENTIFY:
7. A small wizard will appear to assist with reconciliation. It will first ask you to confirm that the patient in the imported file is the same as the person on the selected patient record; click 'OK', then click 'Accept'.
8. Existing and importing Medication lists will show side by side; unselect any medication you do not wish to import. Items highlighted in dark grey will be merged to prevent duplicates. Click NEXT.
9. Existing and importing Problem lists will show side by side; unselect any problems you do not wish to import. Items highlighted in dark grey will be merged to prevent duplicates. Click NEXT.
10. Existing and importing Allergy lists will show side by side; unselect any allergies you do not wish to import. Items highlighted in dark grey will be merged to prevent duplicates. Click NEXT.
11. Review the information that will be imported.

__ Correct
__ Minor Deviations / Cycles :: Describe below
__ Major Deviations :: Describe below
Comments:

Observed Errors and Verbalizations:
Comments:

Rating: Overall, this task was: ______
Scale: "Very Easy" (1) to "Very Difficult" (5)

Task 3: Commit and Verify Consolidated Data (200 Seconds)

This task asks you to review the imported list and verify the consolidated data committed to the Medication, Problem, and Allergy sections of the software.

Success:
__ Easily completed
__ Completed with difficulty or help :: Describe below
__ Not completed
Comments:

Task Time: ________ Seconds

Optimal Path:
Import:
1. Review the information that will be imported and click Import.
VERIFY:
2. Click on the Medications tab and identify the medications imported.
3. Click on the Allergies tab and identify the allergies imported.
4. Click on the Problems tab and identify the problems imported.

__ Correct
__ Minor Deviations / Cycles :: Describe below
__ Major Deviations :: Describe below
Comments:

Observed Errors and Verbalizations:
Comments:

Rating: Overall, this task was: ______
Scale: "Very Easy" (1) to "Very Difficult" (5)

Final Questions (10 Minutes)

What was your overall impression of this system?
What aspects of the system did you like most?
What aspects of the system did you like least?
Were there any features that you were surprised to see?
What features did you expect to encounter but did not see? That is, is there anything missing in this application?
Compare this system to other systems you have used.
Would you recommend this system to your colleagues?


Appendix B: Quality Management System


Appendix C: Privacy and Security


Test Results Summary Document History

Version | Description of Change | Date
V1.0 | Initial release | November 13, 2013
V1.1 | Updated Safety-Enhanced Design report | February 23, 2016

END OF DOCUMENT