
Test Results Summary for 2014 Edition EHR Certification
14-2655-R-0079-PRA Version 1.1, February 28, 2016

©2016 InfoGard. May be reproduced only in its original entirety, without revision

ONC HIT Certification Program Test Results Summary for 2014 Edition EHR Certification

Part 1: Product and Developer Information

1.1 Certified Product Information
Product Name: electronic Clinical Record (eCR)
Product Version: 2.8
Domain: Ambulatory
Test Type: Complete

1.2 Developer/Vendor Information
Developer/Vendor Name: TenEleven Group
Address: 6489 Transit Rd., East Amherst, NY 14051
Website: www.10e11.com
Email: [email protected]
Phone: (716) 810-9755
Developer/Vendor Contact: Alex Alexander

Part 2: ONC-Authorized Certification Body Information

2.1 ONC-Authorized Certification Body Information
ONC-ACB Name: InfoGard Laboratories, Inc.
Address: 709 Fiero Lane Suite 25, San Luis Obispo, CA 93401
Website: www.infogard.com
Email: [email protected]
Phone: (805) 783-0810
ONC-ACB Contact: Adam Hardcastle

This test results summary is approved for public release by the following ONC-Authorized Certification Body Representative:

Adam Hardcastle, EHR Certification Body Manager
ONC-ACB Authorized Representative, Function/Title
Signature and Date: 2/28/2016


2.2 Gap Certification

The following identifies criterion or criteria certified via gap certification:

§170.314: (a)(1), (a)(6), (a)(7), (a)(17), (b)(5)*, (d)(1), (d)(5), (d)(6), (d)(8), (d)(9), (f)(1)

*Gap certification allowed for Inpatient setting only

No gap certification

2.3 Inherited Certification

The following identifies criterion or criteria certified via inherited certification:

§170.314: (a)(1), (a)(2), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(8), (a)(9), (a)(10), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16) Inpt. only, (a)(17) Inpt. only, (b)(1), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6) Inpt. only, (b)(7), (c)(1), (c)(2), (c)(3), (d)(1), (d)(2), (d)(3), (d)(4), (d)(5), (d)(6), (d)(7), (d)(8), (d)(9) Optional, (e)(1), (e)(2) Amb. only, (e)(3) Amb. only, (f)(1), (f)(2), (f)(3), (f)(4) Inpt. only, (f)(5) Optional & Amb. only, (f)(6) Optional & Amb. only, (g)(1), (g)(2), (g)(3), (g)(4)

No inherited certification


Part 3: NVLAP-Accredited Testing Laboratory Information

3.1 NVLAP-Accredited Testing Laboratory Information
ATL Name: InfoGard Laboratories, Inc.
Accreditation Number: NVLAP Lab Code 100432-0
Address: 709 Fiero Lane Suite 25, San Luis Obispo, CA 93401
Website: www.infogard.com
Email: [email protected]
Phone: (805) 783-0810
ATL Contact: Milton Padilla

For more information on scope of accreditation, please reference http://ts.nist.gov/Standards/scopes/1004320.htm

3.2 Test Information
Report Number: 14-2655-R-0079 V1.3
Test Date(s): August 21, 2014 - September 18, 2014
Location of Testing: Remote

3.2.1 Additional Software Relied Upon for Certification

Additional Software: MaxMD Direct Mail
Applicable Criteria: 170.314(b)(1), (b)(2), (e)(1)
Functionality provided by Additional Software: Direct messaging

Additional Software: DrFirst e-prescribing
Applicable Criteria: 170.314(a)(1), (a)(2), (a)(10), (b)(3)
Functionality provided by Additional Software: Medication orders, list, eRx, drug checks

Part 3 of this test results summary is approved for public release by the following Accredited Testing Laboratory Representative:

Milton Padilla, EHR Test Body Manager
ATL Authorized Representative, Function/Title
Signature and Date: 2/28/2016


3.2.2 Test Tools

Test Tool | Version
Cypress | v2.4.1
ePrescribing Validation Tool | v1.0.4
HL7 CDA Cancer Registry Reporting Validation Tool |
HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool |
HL7 v2 Immunization Information System (IIS) Reporting Validation Tool | v1.8.0
HL7 v2 Laboratory Results Interface (LRI) Validation Tool | v1.7.0
HL7 v2 Syndromic Surveillance Reporting Validation Tool | v1.8.0
Transport Testing Tool | v179
Direct Certificate Discovery Tool | v3.0.2

3.2.3 Test Data

Alteration (customization) to the test data was necessary and is described in Appendix
No alteration (customization) to the test data was necessary

3.2.4 Standards

3.2.4.1 Multiple Standards Permitted

The following identifies the standard(s) that has been successfully tested where more than one standard is permitted:

(a)(8)(ii)(A)(2)
§170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
§170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(13)
§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
§170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree

(a)(15)(i)
§170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
§170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(16)(ii)
§170.210(g) Network Time Protocol Version 3 (RFC 1305)
§170.210(g) Network Time Protocol Version 4 (RFC 5905)

(b)(2)(i)(A)
§170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(b)(7)(i)
§170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(e)(1)(i)
Annex A of the FIPS Publication 140-2: AES 128 and SHA-1

(e)(1)(ii)(A)(2)
§170.210(g) Network Time Protocol Version 3 (RFC 1305)
§170.210(g) Network Time Protocol Version 4 (RFC 5905)

(e)(3)(ii)
Annex A of the FIPS Publication 140-2: AES 128 and SHA-1

Common MU Data Set (15)
§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
§170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT-4)
Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature)
Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR 162.1002(c)(3): ICD-10-PCS)

None of the criteria and corresponding standards listed above are applicable

3.2.4.2 Newer Versions of Standards

The following identifies the newer version of a minimum standard(s) that has been successfully tested:

Newer Version | Applicable Criteria

No newer version of a minimum standard was tested

3.2.5 Optional Functionality

Criterion # | Optional Functionality Successfully Tested
(a)(4)(iii) | Plot and display growth charts
(b)(1)(i)(B) | Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
(b)(1)(i)(C) | Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols)
(b)(2)(ii)(B) | Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
(b)(2)(ii)(C) | Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols)
(f)(3) | Ambulatory setting only – Create syndrome-based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario)

No optional functionality tested

3.2.6 2014 Edition Certification Criteria* Successfully Tested

§170.314: (a)(1), (a)(2), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(8), (a)(9), (a)(10), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16) Inpt. only, (a)(17) Inpt. only, (b)(1), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6) Inpt. only, (b)(7), (c)(1), (c)(2), (c)(3), (d)(1), (d)(2), (d)(3), (d)(4), (d)(5), (d)(6), (d)(7), (d)(8), (d)(9) Optional, (e)(1), (e)(2) Amb. only, (e)(3) Amb. only, (f)(1), (f)(2), (f)(3), (f)(4) Inpt. only, (f)(5) Optional & Amb. only, (f)(6) Optional & Amb. only, (g)(1), (g)(2), (g)(3), (g)(4)

*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)


3.2.7 2014 Clinical Quality Measures*

Type of Clinical Quality Measures Successfully Tested: Ambulatory / Inpatient / No CQMs tested

Ambulatory CQMs (CMS ID, with version where noted):
2 v3, 22, 50, 52, 56, 61, 62, 64, 65, 66, 68 v3, 69, 74, 75, 77, 82, 90, 117, 122, 123, 124, 125, 126 v2, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136 v3, 137 v2, 138 v2, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 153, 154, 155, 156 v2, 157, 158, 159, 160, 161, 163, 164, 165, 166, 167, 169 v2, 177 v2, 179, 182

Inpatient CQMs (CMS ID):
9, 26, 30, 31, 32, 53, 55, 60, 71, 72, 73, 91, 100, 102, 104, 105, 107, 108, 109, 110, 111, 113, 114, 171, 172, 178, 185, 188, 190

*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures)


3.2.8 Automated Numerator Recording and Measure Calculation

3.2.8.1 Automated Numerator Recording
Automated Numerator Recording was not tested

3.2.8.2 Automated Measure Calculation
Automated Measure Calculation was not tested

3.2.9 Attestation

Attestation Forms (as applicable) | Appendix
Safety-Enhanced Design* | A
Quality Management System** | B
Privacy and Security | C

*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (b)(3), (b)(4)
**Required for every EHR product


Appendix A: Safety Enhanced Design

The following required data was missing:
• Describe facilities
• Computer Type
• Operating System
• Interaction style, e.g. a mouse and keyboard
• Description of the display including screen size, resolution and color settings
• Set up of environment completed by (vendor or test laboratory)
• System platform
• Type of system (training/test database)
• Connection (LAN/WAN)
• Description of the intended users

TenEleven Group Incorporated 6489 Transit Road East Amherst New York 14051

(tel) 716.810.9755 (fax) 716.580.3134 (web) www.tenelevengroup.com

User Centered Design

• §170.314(a)(1) Computerized provider order entry
• §170.314(a)(2) Drug-drug, drug-allergy interaction checks
• §170.314(a)(6) Medication list
• §170.314(a)(7) Medication allergy list
• §170.314(a)(8) Clinical decision support
• §170.314(b)(3) Electronic prescribing
• §170.314(b)(4) Clinical information reconciliation

Our entire eCR (electronic Clinical Record) product, including the 7 areas above, has been created using the principles of User Centered Design (UCD). Our process, Agile Scrum, is similar to the process described in ISO 9241-11. We focus on the following processes during software development:

• specify context of use
• specify user requirements
• produce design solutions
• evaluate design against requirements

All eCR implementations start with a work flow study in which the customer works with TenEleven staff to document the customer's processes. This step is integral to ensuring that the customer is able to utilize the software in the most efficient way possible. Enhancements are added to the system as needed to meet the customer's unique processes.

TenEleven uses Agile Scrum, an internationally recognized development methodology, for software development. All changes and enhancements to eCR are created based upon a user story written using the template "As a <type of user> I want <some goal> so that <some reason>". The story is always centered on what the user wants, why they want it, and how to allow them to reach the goal efficiently. Our development cycles last 3 weeks, and we have a "Releasable" product every 4 cycles, or 12 weeks. All stories in our Product Backlog are prioritized in regularly scheduled Agile Scrum meetings, and the highest-priority stories are placed in the next sprint to be completed by the end of the 3-week cycle. A sprint will ordinarily consist of a mix of both customer-specific stories and Product Strategy items that TenEleven has decided to do to improve our product functionality for all of our customers. At the end of each cycle we have a "Potentially Releasable" product that we are able to give to a customer for testing.
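The user-story template quoted above can be illustrated with a short sketch; the role, goal, and reason below are invented examples for illustration only, not stories from TenEleven's backlog:

```python
# Hypothetical illustration of the "As a <type of user> I want <some goal>
# so that <some reason>" template; the story content is invented.
TEMPLATE = "As a {role} I want {goal} so that {reason}"

def make_story(role: str, goal: str, reason: str) -> str:
    """Fill the user-story template with concrete content."""
    return TEMPLATE.format(role=role, goal=goal, reason=reason)

print(make_story(
    role="clinical staff member",
    goal="to reconcile medications from an incoming CCDA",
    reason="the patient's chart stays accurate",
))
```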


All of our customers are provided with a test eCR system where they can test the story, once it is completed at the end of a Sprint, to make sure the development fully meets their requirements. We will make any necessary changes based upon the customer's input, and the change will be released to production only after the customer has signed off on the functionality.

TenEleven follows the stringent process of Agile Scrum to ensure that all functionality added to eCR meets the needs of our customers. Workflow and efficiency are always primary considerations that allow us to create a user friendly and efficient product.

EHR Usability Test Report

Product: TenEleven eCR

Version: 2

Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports

Date of Usability Test: July 2014

Date of Report: July 2014

Report Prepared By: Alex P. Alexander

Note: The following study was developed using the NISTIR 7742 template as a guide for reporting our findings: Customized Common Industry Format Template for Electronic Health Record Usability Testing.

Executive Summary

A usability test of the TenEleven eCR, a complete EHR, was conducted during the month of July 2014 by TenEleven Group Inc. The purpose of this test was to assess the usability of the 2014 EHR user interface, and to provide evidence of this usability.

During the usability test, 5 providers (EHR users) served as participants and used the eCR in simulated, but representative, tasks.

This study collected performance data on tasks associated with 4 Meaningful Use criteria. These tasks were associated with the following test procedures:

• §170.314(a)(1) CPOE – Only for Labs and Radiology orders (Med Orders covered in DrFirst document)
• §170.314(a)(8) Clinical Decision Support
• §170.314(a)(16) Electronic Medication Administration Record
• §170.314(b)(4) Clinical Information Reconciliation

Task Detail

• CPOE
  o Med Orders: See Dr. First documentation
  o Lab Orders
    - Click on Lab Orders from the Day to Day menu
    - Select a Patient
    - Select the lab test(s) required
    - Fill out the Note and any other information needed
    - Select the Send button to send the order
  o Radiology Orders
    - Click on Radiology Orders from the Day to Day menu
    - Select a Patient
    - Select the radiology test(s) required
    - Fill out the Note and any other information needed
    - Select the Send button to send the order
• Clinical Decision Support
  o Click on Clinical Decision Support from Day to Day menu
  o Create a Clinical Decision Support Rule – Diagnosis, Gender - F, Age 30-40, Care Suggestion, Source, then click Save
  o Bring up patient chart as given by administrator
  o Give patient Diagnosis that was in rule just created
  o Acknowledge Care Suggestion was delivered
• Electronic Medication Administration Record
  o Open Patient Details from Day to Day menu
  o Search for patient designated by administrator
  o Click on e-prescribing tab
  o Prescribe 1 medication with a frequency of TID
  o Open patient chart and open eMAR – Routine Meds
  o Administer designated medication (sign for it)
  o Designate a "Medication Not Given" then document reason
  o Administer a medication then designate as "Signed in error" then designate Error Reason
  o Click "Save"
• Clinical Information Reconciliation
  o Click on Clinical Information Reconciliation from Day to Day menu
  o Search for specific patient name given by administrator
  o Choose patient
  o Choose Diagnoses to add/keep in chart
  o Choose Medications to add/keep in chart
  o Choose Medication Allergies to add/keep in chart

Each usability test was conducted by the user and the data logger by remote log-in to the eCR on-site testing environment. Participants of each one-on-one usability test were asked to sign a release form prior to testing. All participants were current users of the EHR software. The administrator instructed participants to complete a series of tasks, and collected the following information from each test, along with post-test data submitted via a participant questionnaire:

• Time to complete the task
• Number and types of errors
• Path deviations
• Participants' verbalizations
• Tasks completed in the allotted time

The following is a summary of the performance and rating data collected on the EHR.

Task | Participants | Task Success | Time to Complete (Avg) | Errors Total | Errors Avg (Total Observed/Completed Task) | Deviations Total (Observed/Optimal) | Deviations Avg (Observed/Optimal)
1. Clinical Decision Support | 4 | 4 | 3 mins | 3 | 15% | 26/23 | 1.13
2. eMAR | 4 | 4 | 4 mins | 5 | 18.5% | 38/34 | 1.12
3. Clinical Information Reconciliation | 4 | 4 | 2 mins | 3 | 12.5% | 12/9 | 1.33
4. CPOE – Labs & Radiology | 1 | 1 | 1 min | 2 | 8.7% | 28/23 | 1.22

In addition to the performance data, the following qualitative observations were made:

1. Major Findings
2. Areas for Improvement

Introduction

The EHR tested for this study was the TenEleven eCR Version 2, a complete EHR. This system is designed to represent a complete and comprehensive digital version of a patient's paper chart with real-time, patient-centered records. This system complies with all 2011 NIST certification criteria and is being submitted for verification of compliance with 2014 NIST certification criteria. The usability testing was conducted with every effort to represent realistic exercises and conditions.

The purpose of this study was to test and validate the usability of the current user interface, and provide evidence of usability in the EHR. To this end, measures of effectiveness and efficiency (time to perform tasks; total number of deviations; total number of errors; etc.) were captured during the usability testing.

Method

Participants

A total of 5 participants were tested on the eCR. Participants in the test were individuals who work within an ambulatory and inpatient healthcare environment. Participants were contacted by TenEleven. In addition, participants had no direct connection to the development of the eCR. Participants were not from TenEleven. All participants had the same level of training as all other actual end users.

The following is a table of participants by characteristics, including demographics, user role, and product experience. Participant names were replaced with Participant IDs so that an individual's data cannot be tied back to individual identities. A summary of the participant demographics can be found in the Appendix.

Part ID | Gender | Age | Education | Occupation/Role | Professional Experience (Years) | Computer Experience (Years) | Product Experience (Years) | Assistive Technology Needs
1 | F | 35-40 | BS/BA | Program Director | 15+ | 15+ | 3 | None
2 | F | 35-40 | BS/BA | VP of Clinical Services/CCO | 15+ | 15+ | 1 | None
3 | F | 40-45 | BS/BA | Executive Vice President, Nursing Services | 15+ | 15+ | 1 | None
4 | F | 45-50 | BS/BA | Operations Coordinator, Behavioral Health Services | 15+ | 15+ | 1 | None
5 | F | 45-50 | BS/BA | IT Applications Manager | 15+ | 15+ | 4 | None

Participants were advised to allocate 60 minutes for the test, although the test might end early. The added time was to ensure initial connection to the webex, and to provide enough time for administrator instructions and time between tasks.

Study Design

Overall, the objective of this test was to uncover areas where the application performed effectively, efficiently, and with satisfaction, and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the eCR, provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a way to identify areas where improvements must be made.

During the usability test, participants interacted with the eCR. Each participant was provided the same set of instructions. The system was evaluated for effectiveness and efficiency as defined by measures collected and analyzed for each participant:

• Time to complete the task
• Number and types of errors
• Path deviations
• Participants' verbalizations
• Tasks completed in the allotted time

A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including:

• Add a clinical decision support rule
• Administer medications using the eMAR
• Perform clinical information reconciliation
• Order a Lab and a Radiology Test

Tasks were selected based on their frequency of use, criticality of function, and potential to be troublesome for users.

Procedures

Upon commencement of the test, participants were greeted, remotely logged in, and their identity was verified. Participants were then assigned a participant ID. All participants signed an informed consent form prior to the testing. The test administrator provided the instructions for each test and noted all comments from the participants, while the data logger noted all times, deviations, and errors.

Participants were instructed to perform the tasks:

• After listening to the instructions from the testing administrator• As quickly as possible

• Without assistance

Task timing began after the completion of the verbal instructions from the administrator, and after an acknowledgement from the participant that they were ready to begin. The task time was stopped once the participant indicated they had successfully completed the task.

Following the test, the administrator gave the participant the post-test questionnaire and then thanked them for their time.

The TenEleven staff member responsible for logging data recorded all participants' demographic information, task success rate, time on task, errors, and deviations into a spreadsheet.

Test Location

The test was administered in a webex setting where participants were isolated from other participants in the study. Only the test administrator was on the call with the participants while the study was being administered.

Test Environment

The computers used for the testing were PCs running Windows 7, Windows 8, or a thin client. Users also used a mouse and keyboard while interacting with the EHR. The TenEleven application is primarily a client-server solution with some web components, and all computers were running on high-speed internet connections.

Test Forms and Tools

None

Participant Instructions

The Administrator read the following instructions aloud to each participant:

“Thank you for participating in today’s usability study of the TenEleven eCR. In a few minutes, you will be asked to perform a series of tasks and complete a user survey. Please attempt to complete the task as quickly as possible. The purpose of this study is for TenEleven to obtain information on where enhancements are needed in the application based on how quickly and easily the task is being performed in the eCR system.

When it is time to perform the task, I will state the instructions and then tell you to begin. Once you have completed the task, please say ‘Done’. After you have completed the task, I will ask for feedback on the actions you had taken during the task. You will be given a specified amount of time to complete the task. This time will not be communicated to you, as we are interested in seeing how long each task does take for you to perform.”

Usability Metrics

The goals of this test were to assess:

1. The efficiency of the TenEleven eCR by measuring the length of time it takes for a user to complete the task, and the success of task completion.
2. The efficiency of the TenEleven eCR by measuring the path deviations taken by the user during the task.
3. The effectiveness of the TenEleven eCR by measuring the number and types of errors experienced by the user during the task.
4. The satisfaction of the user with the TenEleven eCR by logging their comments on the task.

Data Scoring

The table below details how each task was scored.

Measure | Rationale and Scoring
Task Time | Timing started when the administrator said ‘Begin’. The time ended when the participant said ‘Done’. In the event that the participant finished and did not say ‘Done’, the administrator stopped the clock when it was clear the participant had completed the task. Task time was only counted if the participant completed the task in the allotted time.
Errors | The task resulted in an error if the participant failed to finish the task in the allotted time, or if they became ‘stuck’ and could not proceed without asking for assistance. Task time was not counted when the task resulted in an error.
Path Deviations | Path deviations were recorded as actions taken during the task that were not part of the necessary actions needed to complete the task. We calculated path deviations by taking the total number of observed deviations and dividing that number by the total number of steps taken using an optimal path.
Task Success | The task was considered a success if the participant completed the task in the allotted time.
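The path-deviation arithmetic described above can be sketched in a few lines. This is an illustrative sketch only; the study logged its data in a spreadsheet, and the (observed, optimal) step counts below are the figures reported in the results table:

```python
# Illustrative sketch of the path-deviation scoring described above.
# Deviation average = observed steps taken / steps on the optimal path.

def deviation_ratio(observed_steps: int, optimal_steps: int) -> float:
    """Return the observed/optimal step ratio, rounded as in the report."""
    return round(observed_steps / optimal_steps, 2)

# (observed, optimal) step counts from the results table.
tasks = {
    "Clinical Decision Support": (26, 23),
    "eMAR": (38, 34),
    "Clinical Information Reconciliation": (12, 9),
    "CPOE - Labs & Radiology": (28, 23),
}

for name, (observed, optimal) in tasks.items():
    print(f"{name}: {deviation_ratio(observed, optimal)}")
```

Run as-is, this reproduces the Deviations Avg (Observed/Optimal) column: 1.13, 1.12, 1.33, and 1.22.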

Results

Data Analysis and Reporting

The results of the usability test were calculated according to the methods specified in the Usability Metrics section above.

The testing results for the TenEleven eCR are detailed below. The table identifies the tasks performed and the performance level for each task.

Task | Participants | Task Success | Time to Complete (Avg) | Errors Total | Errors Avg (Total Observed/Completed Task) | Deviations Total (Observed/Optimal) | Deviations Avg (Observed/Optimal)
1. Clinical Decision Support | 4 | 4 | 3 mins | 3 | 15% | 26/23 | 1.13
2. eMAR | 4 | 4 | 4 mins | 5 | 18.5% | 38/34 | 1.12
3. Clinical Information Reconciliation | 4 | 4 | 2 mins | 3 | 12.5% | 12/9 | 1.33
4. CPOE – Labs & Radiology | 1 | 1 | 1 min | 2 | 8.7% | 28/23 | 1.22

Effectiveness

Participants in the study made errors in the execution of the tasks; there were no functionality errors. The bulk of the errors occurred because the piece of functionality was new to the user. These errors are expected until formal training can occur for each piece of new functionality. The Clinical Decision Support (CDS) and Clinical Information Reconciliation (CIR) functionality are new components to TenEleven eCR users. Both pieces of functionality are slated for future enhancements to make them more intuitive, easier to use, and require fewer clicks.

Efficiency

Participants in the study, for the most part, followed the optimal paths to complete the assigned tasks. However, there were exceptions. As mentioned above, the CDS and CIR functionality are newer components of the TenEleven eCR, and it was expected that users might not be fully aware of the most optimal paths to take when performing the tasks. While reconciling the Problem list, participants were hesitant to import medical problems (Diagnoses) into their TenEleven eCR system, as this has the potential to disturb their billing.

Satisfaction

4 out of 5 users expressed that they were “Very Satisfied” with the TenEleven eCR. 1 out of 5 users gave a rating of “2.5” out of 5.

Major Findings

The study showed no major findings. Some of the feedback pointed to refinement of some current functionality, as well as more widespread use of our Blog so users can brainstorm together with TenEleven experts.

Areas for Improvement

TenEleven's plan is to implement a prioritized list of user suggestions for enhancements.

Appendices

The following appendices include supporting data for this usability study.

1. Consent Form

TenEleven eCR Usability Study

Practice Consent

TenEleven would like to thank you for your participation in this study. The purpose of the study is to evaluate the usability of the TenEleven eCR system. Your participation in this study will include performing a specific set of tasks within the TenEleven eCR and completing a short survey following the study. The study should take approximately 60 minutes. The information collected by TenEleven during the study is for research purposes only. Your participation in this study is voluntary, so you are free to withdraw at any point during the study.

By signing below, I agree to participate in the study.

Name of Practice

Name of Participant

The TenEleven eCR User Role

Experience w/ the TenEleven eCR (Yrs)

Date of Study

Location

Signature Date Printed Name Date

2. Post-Test Questionnaire

TenEleven eCR Usability Study Post-Test Questionnaire

1. How would you prefer we communicate new enhancements to your practice?
☐ Email ☐ Members Area Page ☐ Newsletter ☐ We do not care to be notified
☐ Other ___________________________________________________________________________________

2. What is your preferred method of working with the TenEleven eCR Support department?
☐ Phone ☐ Email ☐ Online Chat (if this were available)
☐ Other ___________________________________________________________________________________

3. If you could change one part of the TenEleven eCR, what would you change?

4. If you could add one piece of functionality to the TenEleven eCR, what would you add?

5. On a scale of 1-5, how would you rate your overall satisfaction with the TenEleven eCR?
☐ 1 (Very Dissatisfied) ☐ 2 (Somewhat Dissatisfied) ☐ 3 ☐ 4 (Somewhat Satisfied) ☐ 5 (Very Satisfied)

3. Moderator Guide

TenEleven eCR Usability Study Moderator Guide
Tasks to perform during study

Functionality | Task | Goal | Optimal Paths
1. Implement a Clinical Decision Support Rule | Use the Clinical Decision Support screen to implement a new Clinical Decision Support rule | 3 minutes | 23
2. Perform electronic medication administration | Use the Clinical Information EMAR interface to perform medication administration for a patient. | 4 minutes | 34
3. Perform Clinical Information Reconciliation | Use the Clinical Information Reconciliation to choose items to include from CCDA to chart | 2 minutes | 9
4. CPOE – Labs and Radiology Tests | Use the Order interface for Labs and Radiology tests | 1 minute | 23

1. Designated Task Times

Task | Time Designated (minutes)
1. Implement a Clinical Decision Support Rule | 3
2. Perform electronic medication administration | 4
3. Perform Clinical Information Reconciliation | 2
4. CPOE – Labs and Radiology Tests | 1

2. Participant Demographics

Gender
Male | 0
Female | 5

The TenEleven eCR User Role
Provider | 1
Provider Agent | 0
Clinical Staff | 1
Non-Clinical Staff | 3
Front Desk | 0

Years of Experience w/the TenEleven eCR
<1 | 0
<2 | 3
>2 | 2

DrFirst, Inc. - [email protected] - 866-263-6511 (Office) - 301-610-7543 (fax)

9420 Key West Avenue, Suite 230 - Rockville, MD 20850

Follow DrFirst on our Blog: http://blog.drfirst.com/ | Check out our videos at http://www.youtube.com/drfirst

EHR Usability Test Report

Product: Rcopia

Version: 3

Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports

Date of Usability Test: June 2013

Date of Report: June 2013

Report Prepared By:
DrFirst, Inc
Jason Aquilante, Product Manager
301-231-9510
[email protected]
9420 Key West Ave, Suite 230
Rockville, MD 20850

Note: The following study was developed using the NISTIR 7742 template as a guide for reporting our findings: Customized

Common Industry Format Template for Electronic Health Record Usability Testing.


Executive Summary

A usability test of Rcopia, version 3, a modular EHR, was conducted throughout the month of June 2013 by DrFirst, Inc. The purpose of this test was to test and validate the usability of the current user interface, and to provide evidence of usability.

During the usability test, 10 healthcare IT users served as participants and used the EHR in simulated, but representative tasks.

This study collected performance data on 11 tasks typically conducted on an EHR. The tasks conducted were related to the

following:

Computerized Provider Order Entry

Drug-Drug/Drug-Allergy Alerts

Updating a Patient’s Medication List

Updating a Patient’s Allergy List

E-Prescribing

Clinical Information Reconciliation

During the 45 minute one-on-one usability test, each participant was greeted by the administrator and asked to review and

sign an informed consent/release form (included in Appendix). Participants had prior experience with the EHR. The

administrator introduced the test, and instructed participants to complete a series of tasks (given one at a time) using the EHR.

During the testing, the administrator timed the test and, along with the data logger, recorded user performance data on

paper. The administrator did not give the participant assistance in how to complete the task.

The following types of data were collected for each participant:

Time to complete each task

Number and types of errors

Path deviations

Participants’ verbalizations

Tasks completed in the allotted time.

All participant data was de-identified – no correspondence could be made from the identity of the participant to the data

collected. Following the conclusion of the testing, participants were asked to complete a post-test questionnaire. The

following is a summary of the performance and rating data collected on the EHR.


Task | Participants | Task Success | Time to Complete (Avg) | Errors Total | Errors Avg (Observed/Completed) | Impact of Error | Risk/Priority | Deviations Total (Observed/Optimal) | Deviations Avg (Observed/Optimal)
1. Create Rx from Fav. List | 10 | 10 | 4.1 | 0 | 0 | 3 | 0/8 | 30/30 | 1
2. Create Rx from Drug Search | 10 | 7 | 28.7 | 3 | 30% | 4 | 1.2/4 | 50/50 | 1
3. Renew 2 Active Meds | 10 | 10 | 3.8 | 0 | 0 | 3 | 0/8 | 11/10 | 1.1
4. Set preference to show all warnings | 10 | 3 | 25.2 | 7 | 70% | 2 | 2.1/2 | 33/30 | 1.1
5. Prescribe medication that displays Alert | 10 | 8 | 16.6 | 2 | 20% | 4 | .8/6 | 60/60 | 1
6. Add Medication through drug search | 10 | 7 | 34.9 | 3 | 30% | 4 | 1.2/4 | 40/40 | 1
7. Stop 1 medication | 10 | 10 | 11.7 | 0 | 0 | 2 | 0/8 | 21/20 | 1.05
8. Add common allergy | 10 | 9 | 21.7 | 1 | 10% | 3 | .3/7 | 30/30 | 1
9. Prescribe controlled Rx to EPCS pharmacy | 10 | 5 | 38.8 | 5 | 50% | 3 | 1.5/3 | 106/90 | 1.17
10. Prescribe medication through Rx Report | 10 | 7 | 12.3 | 3 | 30% | 3 | .9/5 | 20/20 | 1
11. Merge two patient records | 10 | 3 | 108 | 7 | 70% | 4 | 2.8/1 | 94/80 | 1.175

Risk Analysis and Priority Assignment

To establish the risk associated with user errors, DrFirst performed a risk analysis that assigned a risk value to each task. Tasks with the highest risk values were prioritized in descending order.

Methodology:

An impact value of a user error was assigned by the scale below:

Impact of Error Column:

1- No risk to patient, if error not known/re-evaluated

2- Little risk to patient, if error not known/ re-evaluated

3- Some risk to patient, if error not known/ re-evaluated

4- Significant risk to patient, if error not known/ re-evaluated


The error average (probability of an error occurring) observed in UCD testing was multiplied by the impact value to derive a

risk value. Highest priority was assigned to the highest risk value with the lowest priority assigned to the lowest risk value.
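As a sketch, the methodology above can be reproduced directly from the results table. The following Python illustration uses our own variable names; error averages and impact values are copied from the table (with one assumption: the impact for task 4 is taken as 3, which reproduces its listed risk value of 2.1).

```python
# Risk = error average (probability of a user error) x impact value (1-4).
# Tasks are then ranked by risk in descending order; tied risks share a rank.
# Data for tasks 1..11, from the results table (task 4's impact taken as 3,
# which matches its listed risk of 2.1).
error_avg = [0.0, 0.3, 0.0, 0.7, 0.2, 0.3, 0.0, 0.1, 0.5, 0.3, 0.7]
impact    = [3,   4,   3,   3,   4,   4,   2,   3,   3,   3,   4]

risk = [round(e * i, 2) for e, i in zip(error_avg, impact)]

# Dense ranking: highest risk -> priority 1; equal risks get the same priority.
levels = sorted(set(risk), reverse=True)
priority = [levels.index(r) + 1 for r in risk]

# Task 11 (merge two patient records): risk 2.8, priority 1.
```

Running this yields exactly the Risk/Priority column of the results table, e.g. 2.8/1 for the merge task and 0/8 for the three error-free tasks.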

In addition to the performance data, the following qualitative observations were made:

1. Major Findings

2. Areas for Improvement

Introduction

The EHR tested for this study was Rcopia, version 3, a modular EHR. Designed to present medical information to healthcare

providers in ambulatory healthcare settings, the EHR allows providers to electronically prescribe medications to the pharmacy.

The system has evolved to allow practice users to maintain a list of a patient’s medications, allergies and problems. The

usability testing attempted to represent realistic exercises and conditions.

The purpose of this study was to test and validate the usability of the current user interface, and provide evidence of usability

in the EHR. To this end, measures of effectiveness and efficiency (time to perform tasks; total number of deviations; total

number of errors; etc) were captured during the usability testing.

Method

Participants

A total of 10 participants were tested on the EHR. Participants in the test were individuals who work within an ambulatory

healthcare environment. Participants were contacted by DrFirst, Inc staff to participate in the study. In addition, participants

had no direct connection to the development of the EHR. Participants were not from DrFirst, Inc. All participants had the

same level of training as all other actual end users.

The following is a table of participants by characteristics, including demographics, user role, and product experience.

Participant names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual identities. A

summary of the participant demographics can be found in the Appendix.


# | Participant ID | Gender | User Role | Professional Experience (Years) | Product Experience (Years)
1 | 20131 | M | Provider Agent | 6 | .5
2 | 20132 | F | Provider Agent | 4 | 3
3 | 20133 | F | Provider Agent | 4 | 3
4 | 20134 | F | Provider Agent | 3 | 3
5 | 20135 | F | Provider Agent | 15 | 1
6 | 20136 | M | Provider Agent | 7 | 2
7 | 20137 | F | Provider Agent | 6 | 2
8 | 20138 | F | Provider Agent | 11 | 1.5
9 | 20139 | F | Clinical Staff | 25 | 1.5
10 | 201210 | M | Provider Agent | 20 | 1.5

100% of all participants recruited for the test showed up to participate in the test.

Participants were advised that the test would take 45 minutes, but to allocate 60 minutes for the test. The added 15 minutes
was to provide enough time for administrator instructions and time between tasks. A spreadsheet was used to track
participant schedules, and included each participant’s demographic characteristics.

Study Design

Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently,

and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test

may serve as a baseline for future tests with an updated version of the same EHR and/or comparison with other EHRs provided

the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a way to
identify areas where improvements must be made.

During the usability test, participants interacted with 1 EHR. Each participant was provided the same set of instructions. The

system was evaluated for effectiveness and efficiency as defined by measures collected and analyzed for each participant.

Time to complete each task

Number and types of errors

Path deviations

Participants’ verbalizations

Tasks completed in the allotted time.

A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with

this EHR, including:

Create Rx from Fav. List

Create Rx from Drug Search

Renew 2 Active Meds


Set preference to show all warnings

Prescribe medication that displays Alert

Add Medication through drug search.

Stop 1 medication

Add common allergy

Prescribe controlled Rx to EPCS pharmacy

Prescribe medication through Rx Report

Merge two patient records

Tasks were selected based on their frequency of use, criticality of function, and those that may be most troublesome for users.

Procedures

Upon arrival, participants were greeted and their identity was verified and matched with the participant’s name on the

schedule. Participants were then assigned a participant ID. All participants signed an informed consent form prior to the

testing. The test administrator witnessed each participant’s signing of the consent form.

To ensure that the test ran smoothly, two DrFirst staff members participated in the administration of the test. The test

administrator provided the instructions for each test, and noted all comments from the participants, while the data logger

noted all times, deviations and errors.

Participants were instructed to perform the tasks:

After listening to the instructions from the testing administrator

As quickly as possible

Without assistance

Task timing began after the completion of the verbal instructions from the administrator, and after an acknowledgement from

the participant that they were ready to begin. The task time was stopped once the participant indicated they had successfully

completed the task.

Following the test, the administrator gave the participant the post-test questionnaire, and then thanked them for their time.

The DrFirst staff member responsible for logging data recorded all participants’ demographic information, task success rate,

time on task, errors, and deviations into a spreadsheet.

Test Location

The test was administered in a setting where participants were isolated from other participants in the study. Only the test

administrator and logger were with the participants while the study was being administered. To ensure that the environment

was comfortable for users, noise levels were kept to a minimum with the ambient temperature within a normal range.


Test Environment

The computers used for the testing were PCs running on Windows XP or Windows 7. Users also used a mouse and keyboard

while interacting with the EHR. The Rcopia application is a web-based solution, so all computers were running on high speed

internet connections. All participants indicated that system performance during the test was what they were used to seeing

during their typical work day.

Test Forms and Tools

During the usability test, various instruments and documents were used, including:

Informed consent

Moderator guide

Post-test questionnaire

Examples of these documents can be found in the Appendix section.

Participant Instructions

The Administrator read the following instructions aloud to each participant:

“Thank you for participating in today’s usability study of Rcopia. In a few minutes, you will be asked to perform a series of

tasks and complete a user survey. Please attempt to complete each task as quickly as possible. The idea behind this study is

for DrFirst to obtain information on where enhancements are needed in the application based on how quickly, and easily,

tasks are being performed in Rcopia.

When it is time to perform each task, I will state the instructions and then tell you to begin. Once you have completed the

task, please say ‘Done’. After you have completed the task, I will ask for feedback on the actions you had taken during the

task. You will be given a specified amount of time to complete each task. This time will not be communicated to you as we

are interested in seeing how long each task does take for you to perform.”

Usability Metrics

The goals of this test were to assess:

1. The efficiency of Rcopia by measuring the length of time it takes for a user to complete specific tasks; and the total

number of tasks successfully completed during the study.

2. The efficiency of Rcopia by measuring the path deviations taken by the user during the tasks.

3. The effectiveness of Rcopia by measuring the number and types of errors experienced by the user during the tasks.

4. The satisfaction of the user with Rcopia by logging their comments on the tasks


Data Scoring

The following details how each task was scored.

Task Time: Timing started when the administrator said ‘Begin’, and ended when the participant said ‘Done’. In the event that the participant finished and did not say ‘Done’, the administrator stopped the clock when it was clear the participant had completed the task. Task times were only counted if the participant completed the task in the allotted time. The average time per task was calculated for each task.

Errors: The task resulted in an error if the participant failed to finish the task in the allotted time, or if they became ‘stuck’ and could not proceed without asking for assistance. Task time was not counted when the task resulted in an error. We calculated the error % for each task by taking the total number of errors for each task and dividing that number by the total attempts at the task.

Path Deviations: Path deviations were recorded as actions taken during the task that were not part of the necessary actions needed to complete the task. We calculated path deviations by taking the total number of observed deviations and dividing that number by the total number of steps taken using an optimal path.

Task Success: A task was considered a success if the participant completed the task in the allotted time. To calculate the task success rate, we divided the total number of successful tasks by the total number of tasks completed. The time designated for each task was determined by taking the optimal time to complete the task and multiplying it by a factor of 1.25 to allow for those users that may not have been fully trained on the application.
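The scoring measures above can be illustrated with a small worked example. The Python sketch below uses our own variable names; the figures are those reported for task 9 (“Prescribe controlled Rx to EPCS pharmacy”) in the results tables.

```python
# Worked example of the scoring measures for task 9
# ("Prescribe controlled Rx to EPCS pharmacy"); figures from the report.
attempts = 10                   # participants who attempted the task
successes = 5                   # completed within the allotted time
errors = attempts - successes   # failing to finish in time counts as an error

error_pct = errors / attempts         # 0.5 -> the 50% in the results table
success_rate = successes / attempts   # 0.5

# Deviation average = observed steps / optimal-path steps.
observed_steps, optimal_steps = 106, 90
deviation_avg = observed_steps / optimal_steps   # ~1.18 (the table reports 1.17)

# Designated time = optimal task time x 1.25; task 9 was allotted 53 seconds,
# implying an optimal time of 53 / 1.25 = 42.4 seconds.
optimal_time_s = 53 / 1.25
```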

Results

Data Analysis and Reporting

The results of the usability test were calculated according to the methods specified in the Usability Metrics section above.

The testing results for Rcopia are detailed below. The table identifies the tasks performed and the performance level for each task.


Task | Participants | Task Success | Time to Complete (Avg) | Errors Total | Errors Avg (Observed/Completed) | Deviations Total (Observed/Optimal) | Deviations Avg (Observed/Optimal)
1. Create Rx from Fav. List | 10 | 10 | 4.1 | 0 | 0 | 30/30 | 1
2. Create Rx from Drug Search | 10 | 7 | 28.7 | 3 | 30% | 50/50 | 1
3. Renew 2 Active Meds | 10 | 10 | 3.8 | 0 | 0 | 11/10 | 1.1
4. Set preference to show all warnings | 10 | 3 | 25.2 | 7 | 70% | 33/30 | 1.1
5. Prescribe medication that displays Alert | 10 | 8 | 16.6 | 2 | 20% | 60/60 | 1
6. Add Medication through drug search | 10 | 7 | 34.9 | 3 | 30% | 40/40 | 1
7. Stop 1 medication | 10 | 10 | 11.7 | 0 | 0 | 21/20 | 1.05
8. Add common allergy | 10 | 9 | 21.7 | 1 | 10% | 30/30 | 1
9. Prescribe controlled Rx to EPCS pharmacy | 10 | 5 | 38.8 | 5 | 50% | 106/90 | 1.17
10. Prescribe medication through Rx Report | 10 | 7 | 12.3 | 3 | 30% | 20/20 | 1
11. Merge two patient records | 10 | 3 | 108 | 7 | 70% | 94/80 | 1.175
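As a reader-side convenience (not part of the original analysis), the table above also yields an aggregate success rate across all tasks; the task success counts below are copied from the table.

```python
# Aggregate task success across the 11 tasks (10 attempts each).
successes = [10, 7, 10, 3, 8, 7, 10, 9, 5, 7, 3]
attempts_per_task = 10

overall_success_rate = sum(successes) / (attempts_per_task * len(successes))
# 79 successful completions out of 110 attempts, roughly 72%
```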

Effectiveness

Participants in the study experienced errors in all but 3 tasks. However, the bulk of the errors occurred in three areas: Practice

Preference Page; Pharmacy search page; and the Merge Application. Errors found in these areas were of no surprise to

DrFirst. The Merge application is a newer component to DrFirst users. The practice preference page and the pharmacy search

page are two areas that are focal points for future enhancements. Each of these pages presents usability issues for end users;

and requirements for enhancements are being drafted.

Efficiency

Participants in the study, for the most part, followed the optimal paths to complete the assigned tasks. However, there were

two exceptions: Pharmacy Search and the Merge Application. As mentioned above, the Merge Application is a newer

component to Rcopia and it was expected that users may not be fully aware of the most optimal paths to take when

performing the tasks. An abnormal number of deviations were also observed when participants were asked to prescribe a


medication to an EPCS-enabled pharmacy. Some participants elected to take extra steps to change the pharmacy, while
others forgot to select a pharmacy labeled with a ‘C’, which indicates a pharmacy that accepts Controlled Substances.

Satisfaction

9 out of 10 users expressed they were “Very Satisfied” with Rcopia.

Major Findings

The study showed no major findings. Overall, the participants verbalized their happiness with the system, but many did stress

they would like to see some minor enhancements. Their use of the system has not decreased over time; and most feel the

functionality allows them to execute on their job functions without causing additional work.

Areas for Improvement

The study did verify two assumptions: enhancements are needed to optimize usability, and there needs to be a better means of
communicating newer features that may help improve patient safety and clinical workflows.

As mentioned previously, we are planning to modify our practice preference (and user preference) page, as well as our
pharmacy search area. We want to be sure our users can quickly (but efficiently) navigate through these areas so as not to
impede their prescription writing process.

Additionally, we need to find a better means of communication with our users to notify them of new features. Our post-study

questionnaire revealed that 8 out of 10 participants were not aware of a particular feature within Rcopia. Improving

communication with users will be discussed internally at DrFirst so we can make sure our users are aware of features that will

benefit them clinically and operationally.


Appendices The following appendices include supporting data for this usability study.

1. Sample Consent Form

Rcopia Usability Study

Practice Consent

DrFirst would like to thank you for your participation in this study. The purpose of the study is to evaluate the usability of the Rcopia

electronic prescribing system. Your participation in this study will include performing specific tasks within Rcopia and completing a short

survey following the study. The study should take approximately 30 minutes. The information collected by DrFirst during the study is for

research purposes only. Your participation in this study is voluntary, so you are free to withdraw at any point during the study.

By signing below, I agree to participate in the study.

Name of Practice

Name of Participant

Rcopia User Role

Experience w/Rcopia (Yrs)

Date of Study

Location

Signature Date Printed Name Date


2. Sample Post-Test Questionnaire

Rcopia User Survey

1. How would you prefer we communicate new enhancements to your practice?

Email Members Area Page Newsletter We do not care to be notified

2. What is your preferred method of working with DrFirst Support department?

Phone Email Online Chat (if this were available)

3. What is your preferred method for creating a prescription?

Drug Search Favorites

4. How long does it take to enter a new patient in Rcopia (Demographic data only)?

0-3 minutes 3-5 minutes >5 Minutes

5. Which of the following Rcopia tasks would you use on a mobile device?

Viewing a patient’s record

eRx

Updating a Patient’s Med List, Allergy List, Problem List

I would not use Rcopia on a Mobile device

6. If you could change one part of Rcopia, what would you change?

7. If you could add one piece of functionality to Rcopia, what would you add?

8. On a scale of 1-5, how would you rate your overall satisfaction with Rcopia?

1 (Very Dissatisfied) 2 (Somewhat Dissatisfied) 3 4 (Somewhat Satisfied) 5 (Very Satisfied)

9. Which of the following functions are available through DrFirst/Rcopia? (You may check more than 1 option)

Patient Education Materials

Electronically Prescribe Controlled Substances

Prescription Coupons for distributing to Patients

Secure email messaging


3. Moderator Guide

Rcopia Usability Study

Computerized Provider Order Entry

Tasks to perform during study:

Preparation | Task | Goal | Optimal Paths
1. No Pending Rx’s 2. No Active Medications 3. No Allergies | Create Rx from Favorite List | 6 seconds | 3
1. No Pending Rx’s 2. No Active Medications 3. No Allergies | Create Rx from Drug Search | 34 seconds | 5
1. Create 2 Active Medications | Renew 2 active medications | 8 seconds | 1

Clinical Alerts

Tasks to perform during study

Preparation | Task | Goal | Optimal Paths
NONE | Set preference to have all warnings be shown for Drug-Allergy interactions | 27 seconds | 3
1. Create Allergy for patient | Prescribe medication that is known to be an allergy for patient. | 20 seconds | 6

Updating Medication List

Tasks to perform during study

Preparation | Task | Goal | Optimal Paths
1. No Pending Rx’s 2. No Active Medications 3. No Allergies | Add Medication through drug search | 39 seconds | 4
1. No Pending Rx’s 2. No Active Medications 3. No Allergies | Stop 1 medication | 16 seconds | 2


Updating Allergy List

Tasks to perform during study

Preparation | Task | Goal | Optimal Paths
1. No Pending Rx’s 2. No Active Medications 3. No Allergies | Add common allergy | 31 seconds | 3

eRx

Tasks to perform during study

Preparation | Task | Goal | Optimal Paths
1. No Pending Rx’s 2. No Active Medications 3. No Allergies | Use print w/o sending to prescribe controlled substance to EPCS enabled pharmacy. | 53 seconds | 9
1. Make sure at least one pending Rx for current patient | Prescribe medication through Rx Report | 22 seconds | 2

Clinical Information Reconciliation

Tasks to perform during study

Preparation | Task | Goal | Optimal Paths
1. Make sure a patient record has been sent through Akario to provider | Merge patient record from two different sources. | 120 seconds | 8


4. Designated Task Times

Task | Time Designated (seconds)
1. Create Rx from Fav. List | 6
2. Create Rx from Drug Search | 34
3. Renew 2 Active Meds | 8
4. Set preference to show all warnings | 27
5. Prescribe medication that displays Alert | 20
6. Add Medication through drug search | 39
7. Stop 1 medication | 16
8. Add common allergy | 31
9. Prescribe controlled Rx to EPCS pharmacy | 53
10. Prescribe medication through Rx Report | 22
11. Merge two patient records | 120


5. Participant Demographics

Gender
Male | 3
Female | 7

Rcopia User Role
Provider | 0
Provider Agent | 9
Clinical Staff | 1
Non-Clinical Staff | 0
Front Desk | 0

Years of Experience w/Rcopia
<1 | 1
<2 | 4
>2 | 5

Test Results Summary for 2014 Edition EHR Certification14-2655-R-0079-PRA Version 1.1, February 28, 2016

©2016 InfoGard. May be reproduced only in its original entirety, without revision 10

Appendix B: Quality Management System


Appendix C: Privacy and Security


Test Results Summary Document History

Version | Date | Description of Change
V1.0 | 11/21/2014 | Initial release
V1.1 | 2/28/2016 | Updated Safety-Enhanced Design report

END OF DOCUMENT