ACIAR: MOBILE ACQUIRED DATA (MAD) PILOT
Presentation to ACIAR, 27 October 2015
1
PT Collins Higgins Consulting
• MAD Pilot Overview
• Evaluation Objectives & Methodology
• Objective #1: Should ACIAR promote going digital?
• Objective #2: If so, with which app?
• Objective #3: If so, how should ACIAR promote the use of apps?
What were the lessons learned?
• Recommendations
2
PILOT EVALUATION OBJECTIVES
The evaluation collected various forms of data from different stakeholders against these three objectives.
Evaluation objectives, in simple words, with a simple answer:
1. Do the apps realize the expected benefits for the relationships between user groups? (Should ACIAR promote going digital?)
Answer: Yes. Apps enrich relationships between farmers, researchers and project leaders. Training required.
2. Which of the two apps realizes the expected benefits more effectively (user-friendliness and specific features)? (If so, with which app?)
Answer: CommCare. For this specific pilot it was CommCare; most likely a "horses for courses" approach when selecting apps for other projects.
3. What are the key lessons learned for each user group in order for ACIAR to replicate/scale up MAD apps? (If so, how should ACIAR promote the use of apps?)
Answer: Technical preparation is essential. Support projects with technical and operational planning for DDCA implementation.
3
MAD Pilot Research Phases
Phase 1 (Feb-Apr 2015): Identify ACIAR’s research information needs and shortlist apps for piloting.
Phase 2 (July-Sept 2015): Implement a pilot and evaluate the apps.
Phase 3 (October 2015): Present findings and recommendations from the pilot.
4
PHASE 1: RESEARCH NEEDS & APP SELECTION
[Diagram: progression from paper-based data collection, to digital mobile-based data collection (DDCAs), to integration with data management, analysis and visualization software through APIs. Stakeholders along the data value chain: smallholders, field researchers, researchers, project leaders, RPMs, ACIAR. Stages: data collection, data collation, data analysis, data reporting; covering both project data and M&E data. MAD Pilot: BSIP.]
5
APP ASSESSMENT METHODOLOGY
2 APPS TO PILOT
• 14 apps reviewed by Kopernik for the ITT catalogue + 3 new apps suggested by ACIAR
• 7 ‘Top Recommendations’ according to the ITT catalogue’s rigorous assessment
• 5 top-rated apps using a modified assessment matrix suited to ACIAR’s perceived research information needs
• 2 apps to pilot based on ACIAR’s confirmed research information needs
6
MAD PILOT’s ASSESSMENT CRITERIA
CRITERION DEFINITION
1. Usability Richness and user-friendliness of features offered
2. Scalability Ability to handle multiple surveys and questions, numerous data collectors and users, and high volumes of simultaneous data transmission; flexibility in using the tool for different purposes and geographical contexts
3. Affordability Cost of annual subscription plans and pricing structure to accommodate scale-up
4. Security Degrees of data security measures employed
5. User Support Levels and types of user/system support provided
7
8
USAGE OF DIGITAL DATA COLLECTION APPS
Apps can be classified along two axes: single-purpose vs multi-purpose, and one-off vs monitoring use. Examples from the ACIAR project context:
• Single-purpose, one-off: needs assessments; household registration surveys; farmer registration survey
• Single-purpose, monitoring: output monitoring surveys; cattle weight monitoring survey
• Multi-purpose, one-off: baseline survey using data from the household registration survey; cattle registration survey based on data from the farmer registration survey
• Multi-purpose, monitoring: cattle weight monitoring survey using data from the cattle feed survey; output monitoring surveys using data from other surveys that measure different outputs
Benefits increase toward multi-purpose, monitoring use.
Based on the assessment and discussions, the mobile-acquired data pilot will focus on SurveyCTO and CommCare.
TWO COMPREHENSIVE CASE MANAGEMENT APPS
9
PTCHC: Pilot Design & Management
• Simulated Project Leader: David McGill
• Logistics and Coordination: Teddy Kristedi & Komang Budinata
• DDCA testing: Kate Janetski (Cocoa Care/Salesforce)
• Pilot Management: Stuart Higgins
10
11
Kopernik: Design & Evaluation (+ survey building)
• App selection
• Survey building
• Evaluation design & management
• Pilot evaluation
Team: Prita Raditiarini, Yustika Muharastri, Vivien Ayun, Amber Gregory, Tomo Hamakawa, Edwin Mulianto
Udayana University: Field Researchers and Senior Researchers
• Each team was led by a Senior Researcher.
• Field Researchers were recent graduates in their respective fields (3 animal husbandry and 3 veterinary sciences).
• They have passive English skills (reading and writing).
• Previous experience includes field interviews, research and community service work, as well as on-the-job training in a vet clinic.
12
3 Villages
• Gerokgak
• Sanggalangit
• Musi
13
• Surveys
14
Survey and Pilot Design
15
Aims
• To simulate a typical ACIAR project
• To test the different features of the applications
• To test the ability to link surveys and ‘cascade’ information and data
• To build complexity to test the case management capacity

The Survey Plan: Metrics
• # households + education
• # animals / farm size
• Production
• Welfare / production
• Farm profitability
• Types/amounts (researcher interest)
16
Farmer Stories + General Stories + Animal Stories + Cattle management
17
• The stories provide depth to the data. Put together by the Indonesian team and entered in Bahasa Indonesia; a topic of interest to the local team.
• Testing different features; testing case management.
Looks simple, right?
18
Process
• Prepared first on paper
• Converted to the apps by Kopernik staff (Vivien/Amber)
• Requires clarity – knowing what the data is for!

Pilot statistics
• 10 days training
• 3 weeks field work
• 80 registered farmers
• 183 registered cattle
• 566 animal health surveys
• 344 animal growth surveys
• 3 days feedback/evaluation
19
20
Note
• Broke SurveyCTO
• Became a killer criterion
Survey and Pilot Design
• MAD Pilot Overview
• Evaluation Objectives & Methodology
• Objective #1: Should ACIAR promote going digital?
• Objective #2: If so, with which app?
• Objective #3: If so, how should ACIAR promote the use of apps?
What were the lessons learned?
• Recommendations
21
EVALUATION METHODOLOGY
Kopernik’s evaluation team, comprising Tomo Hamakawa, Prita Raditiarini, and Yustika Muharastri, engaged in various activities at key points in the pilot to collect information against the three evaluation objectives, as summarized below.

Evaluation activities by objective and phase (baseline, fieldwork 1, midline, fieldwork 2, endline, report writing):
1. Benefits
• Baseline: researcher profiles; paper-based survey
• Fieldwork 1: all researchers requested to fill in a daily feedback form capturing their experiences using the apps, collected at interim
• Midline: derived benefits
• Fieldwork 2: daily feedback form, collected at end-line
• Endline: derived benefits
• Report writing: synthesize findings and prepare final report
2. App Comparison
• Midline: app features
• Endline: app features; app comparison
3. Lessons Learned
• Midline: lessons learned
• Endline: lessons learned
22
23
Evaluating farmer impressions of being surveyed using DDCA.
24
Evaluating Field Researcher impressions of performing surveys using DDCA.
• MAD Pilot Overview
• Evaluation Objectives & Methodology
• Objective #1: Should ACIAR promote going digital?
• Objective #2: If so, with which app?
• Objective #3: If so, how should ACIAR promote the use of apps?
What were the lessons learned?
• Recommendations
25
EVALUATION OBJECTIVE #1: BENEFIT SUMMARY
Among the 8 expected benefits of the apps, those involving the project leader, researchers, and farmers were clearly realized.

8 expected benefits by user group (numbers of respondents across baseline, fieldwork 1, midline, fieldwork 2 and endline):
• ACIAR (3 respondents at two evaluation points): 1. Access to reliable cascading data; 2. Access to cascading data in real time
• Project Leader (1 respondent at each point): 3. Improved data access*; 4. Improved data quality*; 5. Improved survey management
• Researchers (8 respondents at each point, plus daily feedback forms during both fieldwork phases): 6. Improved engagement; 7. Improved access to knowledge; 8. Reduced survey time
• Farmers (16 and 15 respondents)
26
SURVEY DURATION COMPARISON
The pilot included a round of paper-based data collection for farmer registration and household income during the training. This allows us to compare survey duration between paper-based and app-based collection: not in a scientifically rigorous way, but as a rough estimate.
Paper-based
• Conducted during training
• Sample of 8 farmers
• Caveat: researchers were not yet fully familiar with the survey
App-based
• CommCare and SurveyCTO
• Sample of around 80 farmers
• Durations automatically recorded by the app
27
SURVEY DURATION: APP vs PAPER
[Charts: average minutes to conduct interviews, paper vs app (paper-based n=8; app-based n=38-41).
• Farmer Registration: paper 11 min interview + 10 min data entry; SurveyCTO 9 min; CommCare 7 min
• Household Income: paper 17 min interview + 10 min data entry; SurveyCTO 8 min; CommCare 11 min
• Total duration: paper 28 min interview + 20 min data entry; SurveyCTO 17 min; CommCare 18 min]
28
Time to conduct interviews (average number of minutes).
Both apps reduced survey time compared to paper-based surveys, with the gains coming mostly from eliminating the separate data-entry step. Researchers were slightly faster completing CommCare surveys than SurveyCTO surveys.
63% reduction
57% reduction
[Charts: average survey time in minutes, SurveyCTO vs CommCare.
• Farmer-level, per farmer (n=38-41): Farmer Registration, Household Income, Cattle Management surveys
• Cattle-level, per animal (n=84-297): Cattle Registration, Animal Feed/Growth Survey, Animal Health Check]
Comparing survey durations between CommCare and SurveyCTO, times were similar for all cattle-related and farmer-related surveys, with the exception of the Cattle Management survey.
Survey time (average number of minutes).
29
DDCA SURVEY DURATION: COMMCARE vs. SURVEYCTO
SURVEY TIME SPENT (FARMERS)
How many times were you visited by researchers? (n=15) Number of people (%):
• 3 times: 34%
• 4 times: 33%
• 5 times: 13%
• 6 times: 20%
Average: 4.2 times

How long did each visit take? (n=15) Number of people (%):
• 15 minutes: 13%
• 20 minutes: 20%
• 30 minutes: 40%
• 60 minutes: 13%
• 90 minutes: 7%
• 120 minutes: 7%
Average: 40 minutes
Generally, the farmers were excited, interested, and comfortable being interviewed using digital applications. However, the surveys took up their time, especially due to recurring visits in Fieldwork Phase 1.
30
31
Farmers’ experiences in survey process
32
FARMERS’ OVERALL EXPERIENCE
How would you rate your experience overall? (n=15) Number of people (%):
• Excellent: 20%
• Good: 53%
• Fair: 27%
Comments
“The surveys seemed efficient and can be conducted anywhere. I felt comfortable being interviewed while I’m working in the cattle shed or making tipat at home.”
“I am amazed when the researcher estimates my cattle weight automatically using digital app. I can see the result directly thus I can estimate my cattle selling price!”
“I was happy to get feedback about cattle feed from the researcher. It is very useful as an advice to fatten my cattle especially in the dry season like this.”
“At first, I did not realize that my cattle had physical health problems. After seeing the mark-up photo from the survey, I can clearly see that my cattle’s neck was bitten by flies.”
“The researchers came to my home and conducted interviews too many times. I am afraid that they will use the data for not-so-good purposes.”
Positive
Negative
The majority of the farmers had an overall positive experience being part of the pilot.
Source: Farmer end-line survey
33
FARMERS’ EXPERIENCE IN THE SURVEY PROCESS
Did you get involved in any activity in the survey? (n=15) Number of people (%):
• Yes: 33%
• No: 67%
If yes, what kind of activity? (n=5) Number of people (%):
• Take photo: 40%
• Drawing: 60%
Most farmers didn’t get involved in the survey activities because they were shy and afraid to touch the tablets. After being persuaded by researchers, 5 farmers tried to draw a map of their house and take pictures. Using tablets was a new thing for the farmers.
Source: Farmer end-line survey
34
Left: Farmer making a ‘free drawing’ on the tablet.
Above: Farmer capturing an image of their livestock on the tablet.
35
Researchers’ experiences in survey process
Every day in the pilot, the evaluation team asked researchers to answer a simple question based on their experience that day.
Do you agree with this statement:“I think mobile apps make surveys easier”
36
RESEARCHERS’ OVERALL ASSESSMENT
Do you agree with this statement: “I think mobile apps make surveys easier” (n=8) Number of people
[Charts: number of researchers agreeing each day, for Fieldwork #1 (days 1-8, August 8-15, 2015) and Fieldwork #2 (days 9-14, August 20-25, 2015).]
• Starting from a low baseline, researchers’ affinity for digital apps grew as the pilot progressed. However, it decreased on day 9, when researchers switched apps and faced technical difficulties using the new app.
• Some respondents did not complete the surveys on days 6, 7, 13, and 14.
Source: Daily Feedback Form w.1 & 2
37
Do apps improve farmer-researcher engagement?
38
Do apps give farmers access to information? Two examples:
• Body condition score (BCS)
• Animal growth rate
39
40
Animal Health Check Feedback: Body Condition Score (BCS)
• Field researchers score cattle against photos pre-loaded into the survey.
• Results are submitted and analysed by the Project Leader overnight.
• Feedback is provided to farmers the next day using the tablet.
[Images: survey question using photos to assist; feedback to farmers.]
Animal growth check feedback:
1. Measure length.
2. Measure girth.
3. Enter the data into the survey containing the pre-loaded formula.
4. The animal’s weight is calculated in the survey and compared to the farmer’s estimated weight.
5. Feedback is given to the farmer in real time.
45
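The slides do not state which weight formula was pre-loaded into the survey, so the calculation step can only be sketched. The sketch below uses a Schaeffer-style heart-girth estimator as a stand-in (an assumption, not the pilot’s documented formula):

```python
def estimate_weight_kg(girth_cm: float, length_cm: float) -> float:
    """Estimate cattle live weight from heart girth and body length.

    Schaeffer-style heart-girth formula (illustrative assumption; the
    actual formula pre-loaded into the pilot survey is not documented
    in these slides): weight (kg) ~= girth^2 * length / 10840.
    """
    return girth_cm ** 2 * length_cm / 10840.0

# As in the survey: compare the calculated weight with the farmer's
# own estimate and feed the difference back in real time.
calculated = estimate_weight_kg(girth_cm=170.0, length_cm=140.0)
farmer_estimate = 350.0
difference = calculated - farmer_estimate
```

In the pilot this arithmetic lived inside the app’s calculation-logic field, so researchers only entered the two measurements and read the result back to the farmer.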
FARMERS’ EXPERIENCE IN INFORMATION TRANSFER
Do you find being surveyed to be an interesting experience? (n=15) Number of people (%):
• Yes: 93%
• No: 7%
Why? Please explain your answer (n=15) Number of people (%):
• Yes, information: 60%
• Yes, app features: 27%
• Yes, experience: 6%
• No, time: 7%
Most farmers found being surveyed an interesting experience because of the information they received on their cattle’s body condition (weight and health), cattle feed, and animal diseases.
Source: Farmer end-line survey
46
47
FARMER-RESEARCHER INTERACTION
Daily question to researchers: How did the app help or improve your interaction with farmers? (n=8; multiple responses) Number of people
[Chart: daily response counts for days 9-14 in four categories: Nothing; Informative for farmers; Interactive; Efficient (reduced time).]
Similarly, from the researchers’ perspective, the app helped make their engagement with farmers informative and interactive.
Source: Daily Feedback Form week 2
In this period, data analysis on cattle weight and condition was provided to the farmers.
Do apps improve data quality?
48
SIMPLIFYING DATA CONCEPTS
In the original evaluation design, two more data-related concepts, data management and data cleaning, were included among the expected benefits of apps. Since data management is a general idea and data cleaning is an intermediate step toward data quality, these two were removed. This evaluation focused on data access and data quality.
[Diagram: DATA MANAGEMENT is a general phrase referring to data-related tasks. Real-time DATA ACCESS makes DATA CLEANING possible, and data cleaning improves DATA QUALITY. Real-time data access and an immediate data cleaning process allow rapid confirmation and correction while data collection is still ongoing. Good SURVEY MANAGEMENT improves the collected data, reducing the need for data cleaning and improving data quality.]
49
50
PERCEIVED DATA ACCURACY
Do you think apps improve data accuracy? (n=8) Number of people (%):
• Midline: Yes 100%, No 0% (Source: Researcher mid-line survey)
• Endline: Yes 100%, No 0% (Source: Researcher end-line survey)
In both midline and endline surveys, all researchers found that using apps improved data completeness and accuracy. The apps prevent blank answers and do not allow researchers to move on to the next page if the previous page has not been completed.
Why? Please give examples:
• “Error notifications prevent missing data or wrong data input.” (SR)
• “The data can become even more accurate because it could be edited before being uploaded.” (SR)
• “Wrong calculations for estimating land area could be corrected immediately.” (SR)
• “GPS ensures field researchers actually conduct surveys in the field.” (SR)
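The validation behaviour researchers praise here is configured at form-building time. A hypothetical XLSForm-style fragment (the spreadsheet format SurveyCTO forms are written in; CommCare expresses the same logic through its web builder) sketches how; the field names are illustrative, not taken from the pilot surveys:

```text
type             | name         | label                       | constraint          | constraint_message      | relevance
integer          | cattle_count | How many cattle do you own? | . >= 0 and . <= 200 | Enter a number 0 to 200 |
select_one yesno | sells_cattle | Do you sell cattle?         |                     |                         | ${cattle_count} > 0
```

The constraint column rejects out-of-range answers on the device (the “error notifications” quoted above), while the relevance column hides the follow-up question unless it applies (skip logic).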
Do apps improve survey management?
51
52
UPSTREAM BENEFITS IN THE DATA VALUE CHAIN
DATA COLLECTION AND CLEANING
UPLOAD DATA
REVIEW
DATA CLEANING
Researchers Project Leaders RPMs ACIAR CORPORATE
By using digital apps, project leaders could access data as soon as researchers connected to the internet, enabling adaptive management.
• Real-time data access.
• The project leader is able to clean data immediately, even while researchers are still in the field, allowing rapid corrections and confirmation.
• The project leader cited specific instances of adaptive management: he checked the dataset at the end of the day and, on the next morning, praised some researchers for their accuracy and pace, while cautioning others to emphasize thoroughness.
APP BENEFITS
53
Notes: very (very) short time frames? Keep an eye on their approach? Discuss survey times/objectives. Wulan’s times are more encouraging.
Do apps improve access to reliable, real-time cascading data?
54
55
DOWNSTREAM BENEFITS IN THE DATA VALUE CHAIN
CASCADING DATA
DATA AGGREGATION DATA RE-INTERPRETATION REPORT
Researchers Project Leaders RPMs ACIAR Corporate
The project leader is able to provide high-level analysis in real time to RPMs and ACIAR, demonstrating that app usage can contribute to cascading data.
• Real-time data export features
• The project leader analyzed the dataset and developed a high-level Excel-based dashboard for real-time updates (see right image).
• RPMs could re-interpret/re-configure the data based on the dashboard provided by the project leader; RPMs do not need to access raw data.
APP BENEFITS
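The Excel dashboard itself is not reproduced in the deck, but the kind of high-level aggregate it carried can be sketched from exported submission records. The field names below are hypothetical, not the pilot’s schema:

```python
from collections import Counter

# Hypothetical submission metadata as exported from the app server.
submissions = [
    {"survey": "animal_health", "researcher": "R1"},
    {"survey": "animal_health", "researcher": "R2"},
    {"survey": "animal_growth", "researcher": "R1"},
]

# Per-survey totals: the sort of summary figure a project leader can
# pass upstream to RPMs without sharing the raw dataset.
totals = Counter(s["survey"] for s in submissions)
```

Feeding such aggregates (rather than raw records) to RPMs matches the slide’s point that RPMs do not need to access raw data.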
EVALUATION OBJECTIVE #1: BENEFIT SUMMARY
Among the 8 expected benefits of the apps, those involving the project leader, researchers, and farmers were clearly realized.

ACIAR (benefits 1-2: access to reliable cascading data; access to cascading data in real time)
• The project leader could access the entire dataset and provide high-level summaries to ACIAR on a real-time basis, but ACIAR’s requirements regarding cascading data need further clarification.
• Depending on the nature of ACIAR’s requirements, digital data collection apps could be combined with a simple Excel-based dashboard or integrated with a more advanced analytical platform/CRM.

Project Leader (benefits 3-5: improved data access*; improved data quality*; improved survey management)
• Real-time access to the complete dataset enables project leaders to review and communicate with researchers for cleaning purposes.
• Both the project leader and researchers perceive digital apps to improve data quality due to data validation and error notifications.
• The project leader cited instances of adaptive management: he checked the dataset at the end of the day and, on the next morning, praised some researchers for their accuracy and pace, while cautioning others to emphasize thoroughness.

Researchers and Farmers (benefits 6-8: improved engagement; improved access to knowledge; reduced survey time)
• The interactive features of the apps improved researcher-farmer engagement from both perspectives.
• Farmers enjoyed learning about their own cattle’s condition, diseases, etc., especially in comparison to their neighbors.
• The apps realized an estimated 57-63% time reduction compared to paper-based surveys.
56
57
EVALUATION OBJECTIVE 1: SUMMARY
Objective 1: Do the apps realize the expected benefits for the relationships between user groups? (In simple words: should ACIAR promote going digital?)
YES
• The expected benefits of the apps were realized strongly in the ‘upstream’ of the data value chain, i.e. among farmers, researchers, and project leaders.
• Given the apps’ inherent limitations in ‘downstream’ analysis/reporting, the benefits are less clear on that end. More clarity from ACIAR on cascading data requirements would further help assess the apps’ benefits for reporting purposes.
• MAD Pilot Overview
• Evaluation Objectives & Methodology
• Objective #1: Should ACIAR promote going digital?
• Objective #2: If so, with which app?
• Objective #3: If so, how should ACIAR promote the use of apps?
What were the lessons learned?
• Recommendations
58
Both apps tested in this pilot were chosen because they are very powerful tools. Their relative strengths and similarities, including their case management functionality, make them both very competitive applications.
This pilot set out to determine which of the two apps is best suited to the environment in which ACIAR projects operate: remote and rural areas, often with limited internet connectivity.
EVALUATION OBJECTIVE # 2: WHICH APP?
CommCare and SurveyCTO are generally equal in terms of features, with some differences. A number of app-unique features were not tested in this pilot.
EVALUATION OBJECTIVE # 2: WHICH APP?
60
Common features
• Data input: text, numeric; single/multiple choice; date/time; audio (input/output); video (input/output); image (input/output); GPS coordinate
• Logic functions: calculation logic; skip logic; answer limits & validations
• Data collector: single-form data sharing across user accounts
• Data analyzer: data export formats (.tsv/.csv/.xlsx/.xml)

Unique to CommCare
• Form builder type: web-based, drag and drop
• Data input: QR code scanner; signature
• Data collector: historical data saved on device, available for skip logic and validation

Unique to CommCare but not tested in pilot
• SMS data collection; SMS messaging; audio messaging

Unique to SurveyCTO
• Form builder type: import XML/CSV (manual)
• Data input: annotated image; RFID

Unique to SurveyCTO but not tested in pilot
• Built-in single-chart and basic-map-based views; question timer; audio audit; speed limits; automated data quality checker
61
COMPARISON OF COMMON FEATURES
FEATURE | COMMCARE | SURVEYCTO
Text, numeric | Excellent, responsive text and number input (FR, SR) | Good, responsive text and number input, but touch screen sensitivity is too high (FR, SR)
Single/multiple choice | Good, no problems found (FR, SR) | Good, no problems found (FR, SR)
Date/time | Good, but could be better if date and time were on the same page as the researcher’s name (FR, DFF) | Good (FR)
Audio (input/output) | Poor audio due to recording errors and very low volume (FR, SR) | Poor audio due to recording errors (FR)
Video (input/output) | Good video quality; hard to upload long videos (FR, SR) | Good video quality; hard to upload long videos (FR, SR)
Image (input/output) | Excellent (FR, SR) | Good image quality but poor crop feature (FR)
GPS coordinate | Slow and inaccurate GPS due to poor mobile coverage (FR, SR) | Slow and inaccurate GPS due to poor mobile coverage (FR, SR)
Calculation logic | Excellent (FR, SR) | Excellent (FR, SR)
Skip logic | Excellent (FR, SR) | Excellent (FR, SR)
Answer limits & validations | Very clear error notifications, highlighting the erroneous part (FR) | Shows error notifications, but sometimes unclear (FR)
Single-form data sharing across user accounts | Good (FR) | Good (FR)
Data export formats | Better data export (cases/forms), easy to filter and clean (PL) | Challenging dashboard and data export (PL)
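The export comparison can be made concrete: once a .csv export is downloaded from either app’s server, filtering and cleaning is straightforward in ordinary tools. A minimal sketch using the standard library; the column names are hypothetical, not the pilot’s actual export schema:

```python
import csv
import io

# Hypothetical extract of an exported farmer-registration .csv; in
# practice this file would be downloaded from CommCare HQ or the
# SurveyCTO server.
exported = io.StringIO(
    "farmer_id,village,cattle_count\n"
    "F001,Gerokgak,3\n"
    "F002,Musi,\n"          # missing value, to be flagged for cleaning
    "F003,Sanggalangit,5\n"
)

rows = list(csv.DictReader(exported))
# Flag incomplete records so the project leader can send them back to
# researchers while they are still in the field.
incomplete = [r["farmer_id"] for r in rows if not r["cattle_count"]]
```

This is the “easy to filter and clean” workflow the project leader describes, done outside the apps once the data is exported.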
62
APP-UNIQUE FEATURES REVIEW

COMMCARE-UNIQUE FEATURES
• Form builder (web-based, drag and drop): clean form-building interface with comprehensive tools, yet building forms online requires fast and stable internet.
• QR code scanner: difficult to scan when the cattle keep moving (FR, SR).
• Signature: good feature to improve interaction with farmers, but has no eraser tool, so edits are not possible (FR).
• Historical data saved on device, available for skip logic and validation: excellent feature for monitoring work progress in a poor mobile coverage area (FR, SR).

SURVEYCTO-UNIQUE FEATURES
• Form builder (import XML/CSV, manual): basic interface with very few form-building tools available, hence most form building is completed in Excel.
• Annotated image: excellent (FR, SR).
• RFID: good, easy to scan, but if the RFID chip is missing, the survey cannot proceed to the following sections (FR, SR).

The annotated image feature in SurveyCTO was found helpful, and RFID is easier to scan than QR codes. The ‘signature’ feature in CommCare was the most interesting feature for researchers and farmers; it improves interaction while conducting the survey.
63
RESEARCHERS’ DAILY PERCEPTION
Do you agree with this statement: “I think mobile apps make surveys easier” (n=4) Number of people
[Charts: daily agreement over days 1-14 for User Group #1 (CommCare switched to SurveyCTO) and User Group #2 (SurveyCTO switched to CommCare), across Fieldwork 1 and 2.]
Monitoring researchers’ daily perception of the apps reveals two points, both indicating CommCare’s relative fit to ACIAR’s needs:
1. In the change from day 8 to day 9, when researchers switched apps, the SurveyCTO-to-CommCare transition had a smoother landing than vice versa.
2. CommCare users in both groups and SurveyCTO users in group #2 underwent positive changes in perception, while SurveyCTO users in group #1 never became fully convinced as a group, even towards the end.
Source: Daily Feedback Form w.1 & 2
SELECTED APP BY RESEARCHERS
64
“Which app do you prefer to use in the future?” (n=8) Number of people (%):
• SurveyCTO: 0%
• CommCare: 100%

“Why CommCare?”
• “Fewer problems, easier.” (FR)
• “Easier to use, better error notifications.” (FR)
• “Easier, simpler, one page with many questions, good structure.” (FR)
• “It works well and is less challenging (not so many problems).” (SR)
• “Rarely found any problems using CommCare; no need to send data many times.” (SR)
• “More practical and easy, not so many problems, no need to scan animal codes many times, the form interface is more structured, video could be uploaded easily.” (FR)
• “It’s easier to use, but it would probably be better if CommCare used RFID. SurveyCTO was hard to use because it needs to upload the data before moving on to the next step, but RFID is better than QR codes.” (FR)
Source: Researcher end-line survey
ELIMINATION OF TAROWORKS AND IFORMBUILDER
TaroWorks and iFormBuilder, two apps with functions similar to the shortlisted pair, were not used in the pilot for the following reasons.
TaroWorks
• Its Salesforce/CRM integration is an advantage, yet it makes a head-to-head comparison with the other apps difficult.
• Given the Salesforce integration, TaroWorks is prohibitively expensive in the longer-term roll-out phase.
67
68
Salesforce/TaroWorks vs CommCare
• A team member with previous Salesforce/TaroWorks experience, but not familiar with CommCare, compared the two apps.
• Focus on survey building, user support and app features.
• No field testing of Salesforce/TaroWorks in the pilot.
Pros and cons of building surveys for TaroWorks in Salesforce
Pros:
• Clean, streamlined survey-building page
• Wide range of question types
• Extensive online help directory
• Arrange questions into sections
• Preview surveys online easily
• Deploy surveys in multiple languages
• Very stable system with very little downtime
• Regional offices to expedite help
• One-on-one support available via phone, Skype, email
• Errors are easy to interpret
• Large range of apps that can be incorporated into TaroWorks via Salesforce
Cons:
• Requires Salesforce apps to incorporate more advanced features such as video/audio outputs
• Salesforce manages relationships, so new farmers have to be synced in order to auto-populate to other surveys on the tablet; once synced, they remain available for selection
• Calculations cannot yet be added directly into the survey
Pros and cons of survey building in CommCare
Pros:
• User-friendly interface
• Easy navigation around CommCare HQ
• Incorporate calculations/formulas into questions
• Extensive choice of question types
• Extensive online wiki help directory
• Format questions into folders and groups
• Add photos and videos to questions easily
• CloudCare: test surveys on desktop
• Export form details and XML script
• Deploy surveys in multiple languages
• Large number of features to add to surveys
• Deploying surveys to tablets is very easy, with clear instructions provided
Cons:
• Frequent downtime due to server errors
• CommCare is generally slow at loading and saving
• Time difference with the US can make technical help and fixing system issues slow
• One-on-one help limited
• Errors are technical and difficult to interpret
• Survey formatting/setup can be time-consuming
70
71
TaroWorks/Salesforce comments
• TaroWorks is built on the back of the Salesforce system, an extremely powerful data management and customer relationship management (CRM) tool with endless reporting and data-collection capabilities.
• Salesforce is quite a complex system to set up. This includes setting up the field mapping of surveys to display certain ways in the Salesforce CRM, linking to reporting, etc., and would require time, dedication and extensive learning by someone able to think from a database perspective.
• With power comes the effort to establish and operate.
EVALUATION OBJECTIVE 2: SUMMARY
Objective 2: Which of the two apps realizes the expected benefits more effectively? (In simple words: if so, with which app?)
• The two apps performed close to par in terms of usability and common features.
• In the very specific environment in which ACIAR projects operate, CommCare proved more suitable due to its ability to sync surveys locally on mobile devices.
• TaroWorks/Salesforce has a survey-syncing feature similar to SurveyCTO’s. Its CRM is incredibly powerful.
• CommCare has automatic data export options, giving opportunities for linking with live reporting options (e.g. a dashboard). SurveyCTO requires manually downloading Excel spreadsheets from its website.
• CommCare’s survey-building process is very user-friendly.
72
• MAD Pilot Overview
• Evaluation Objectives & Methodology
• Objective #1: Should ACIAR promote going digital?
• Objective #2: If so, with which app?
• Objective #3: If so, how should ACIAR promote the use of apps?
What were the lessons learned?
• Recommendations
73
74
EVALUATION OBJECTIVE #3: LESSONS LEARNED
User Group Lessons Specific to User Group
ACIAR
• RPM-Project Leader relationship: The relationship between RPMs and project leaders will not drastically change with the introduction of apps, though this will depend on project size and the RPM’s/project leader’s willingness to engage with the apps.
• Questions around cascading data: Selected data automatically cascading from project to ACIAR is technically possible but, depending on project size and complexity, is most likely not practical.
• Adaptive management: The apps enable project leaders and RPMs to make more timely course corrections. This adaptive management capacity is an area worthy of further exploration.
• Stories of Change: There is an opportunity for ACIAR’s communication team to engage more closely with project leaders/RPMs if the ‘story survey and categorization’ feature is adopted by projects.
• Budget for app training: Future ACIAR projects choosing to use DDCA should be allocating funds for DDCA survey design and training.
This pilot revealed a number of key lessons that ACIAR should keep in mind as it promotes the use of apps to its partner organizations. The following tables summarize the key lessons per user group.
75
User Group Lessons Specific to User Group
Project Leader
• Hidden, killer criteria: Conducting pilots of this kind is worthwhile since they reveal critical criteria that end up outweighing all other benefits. These criteria are often difficult to pick up in desktop research and thus ‘hidden’ until tested in real field conditions. Pilot free versions first with the intention to upgrade.
• Technical preparation: A smooth implementation of DDCA requires an important step: technical preparation. This involves checking the mobile device settings including memory card, data usage, audio levels, etc.
• Monitor field activities: To harness the power of the apps and improve survey quality, time is required to monitor surveys and interact with the teams in real time.
• Budget and training: To harness the power of the apps requires an additional investment in time and training. Depending on the complexity and duration of a project, determine whether a turnkey solution is best or whether to invest time in learning all app features.
• Stories of Change: Allocate budget and training to empower field researchers to collect stories from the field using the apps.
• Prudent use: It is easy to get caught up in the power of DDCA and collect large amounts of data. Prudent use of the technology is necessary to ensure data volumes are relevant and manageable.
EVALUATION OBJECTIVE #3: LESSONS LEARNED
76
User Group Lessons Specific to User Group
Field Researcher
• Hidden, killer criteria: Field researchers very quickly identify which DDCA suits a particular requirement for a survey. Recommend consistent surveying (formal or informal) of field researchers’ attitudes to the app.
• Team communication: Field researchers use the tablets to coordinate with each other. In the MAD pilot they used “Line”. Very powerful tool for survey management in the field.
• Data usage: Be mindful of data usage for non-survey purposes.
• Ease of uptake: The uptake of technical use of DDCAs was surprisingly fast during training. Shift the focus of training to why data is being collected using DDCAs. Spend time showing field researchers how the data is presented in Excel.
• Hardware and software selection: The size of screen is an important element for information sharing.
• Incomplete surveys: Field researchers would save surveys as 'incomplete', then review, edit and upload them at the end of the day. This gives them the ability to become storytellers.
EVALUATION OBJECTIVE #3: LESSONS LEARNED
77
User Group Lessons Specific to User Group
Farmers
• Information sharing: DDCAs do have the ability to share information back to farmers in near real time. Both the farmers and the field researchers found sharing information a rewarding experience. Real two-way communication.
• Interactivity valuable: Farmers appreciate and enjoy the features of the apps. Enjoyment = learning.
• Farmers use DDCAs: With minimal training, farmers could use the apps to record data, e.g. selected farmers using herd books or farmer diaries.
EVALUATION OBJECTIVE #3: LESSONS LEARNED
78
3
What are the key lessons learned for each user group in order for ACIAR to replicate/scale up MAD apps?
If so, how should ACIAR promote the use of apps?
• A smooth implementation of DDCAs requires technical preparation of the devices deployed.
• Conducting pilots of this kind is worthwhile since they reveal critical criteria that could end up outweighing all other benefits.
• Auto-cascading of select data from project to ACIAR is technically possible but not practical.
• The apps enable project leaders and RPMs to make course corrections efficiently.
• Apps have the capacity to lay the groundwork for capturing stories from the field.
• The relationship between RPMs and Project leaders will likely not change with the introduction of apps.
Key Evaluation Objectives In simple words Key Evaluation Findings
EVALUATION OBJECTIVE #3: SUMMARY
• MAD Pilot Overview
• Evaluation Objectives & Methodology
• Objective #1: Should ACIAR promote going digital?
• Objective #2: If so, with which app?
• Objective #3: If so, how should ACIAR promote the use of apps?
What were the lessons learned?
• Key Evaluation Findings and Recommendations
79
RECOMMENDATIONS FOR ACIAR
80
PROJECT COMPLEXITY | DDCA | ANALYTICS

Low | CommCare or any DDCA* | None
Not tested in this pilot.

Medium/High | CommCare, SurveyCTO | Excel dashboard
Key findings:
• The expected benefits of the apps were realized strongly in the 'upstream' of the data value chain; 'downstream' benefits are less clear given the apps' inherent limitations in analysis/reporting.
• For ACIAR's purposes, CommCare was confirmed to be the more flexible and versatile app in the specific context of the pilot and the environment of its project.
• Key lessons learned included the following: 'killer' criteria, technical preparation, adaptive management, and stories of change.
Recommendations for ACIAR:
• In future DDCA roll-outs, shift focus from downstream benefits to upstream: understand requirements for analytics and cascading data.
• For all levels of project complexity, promote CommCare to partners since it is the most flexible and versatile. Even in projects with a high level of uncertainty, CommCare will not close doors but leave options open.
• Before at-scale promotion, conduct field tests to identify 'killer' criteria.
• Further explore the mechanisms for DDCAs to contribute rich narratives from the field to ACIAR's communications unit.
• Support partners to ensure technical preparation for DDCA use is adequate.
• Encourage adaptive management between RPMs and project leaders through use of DDCAs.

High | CommCare, TaroWorks (?) | CRM: Salesforce (?)
CRMs and Salesforce not tested in this pilot.
81
FINAL THOUGHTS
• Not if, but when: DDCAs are already being adopted by projects with or without ACIAR involvement/support.
• ACIAR is not a service provider; ACIAR is best-placed to set up a pathway for project-to-project learning.
• Project leaders focus on choosing the right app for their project needs.
• ACIAR could support capacity-building for projects through a Masterclass series on DDCAs, focusing on aligning DDCA capabilities with ACIAR reporting and communication requirements.
• How can ACIAR support projects to determine their DDCA capacity-building needs? Ranging from DIY to turnkey solutions (or somewhere in between)?
Key Evaluation Objectives In simple words Key Evaluation Findings
1
Do the apps realize the expected benefits for the relationships between user groups?
Should ACIAR promote going digital?
• The expected benefits of the apps were realized strongly in the ‘upstream’ of the data value chain, i.e. among farmers, researchers, and project leaders.
• Given the apps’ inherent limitation in ‘downstream’ analysis/reporting, the benefits are less clear on that end. More clarity from ACIAR on cascading data requirements in the future would further contribute to assessing the apps’ benefits for reporting purposes.
2
Which of the two apps realizes the expected benefits more effectively?
If so, with which app?
• The two apps performed close to par in terms of usability and common features.
• For ACIAR’s purposes, CommCare emerged as the better-suited app, mainly due to its ability to synchronize surveys locally on mobile devices.
• TaroWorks/Salesforce has a survey-syncing feature similar to SurveyCTO's. Its CRM is very powerful.
• CommCare has automatic data export options for linking with live reporting options (e.g. dashboards). SurveyCTO requires manually downloading Excel spreadsheets from its website.
• CommCare has a more user-friendly drag-and-drop survey building process.
3
What are the key lessons learned for each user group in order for ACIAR to replicate/scale up MAD apps?
If so, how should ACIAR promote the use of apps?
• A smooth implementation of DDCAs requires technical preparation of the devices deployed.
• Conducting pilots of this kind is worthwhile since they reveal critical criteria that could end up outweighing all other benefits.
• Auto-cascading of select data from project to ACIAR is technically possible but not practical.
• The apps enable project leaders and RPMs to make course corrections efficiently.
• Apps have the capacity to lay the groundwork for capturing stories from the field.
• The relationship between RPMs and Project leaders will likely not change with the introduction of apps.
82
KEY EVALUATION FINDINGS: SUMMARY