LFA OSDV Data Reporting Template (EN) - Transcript
OSDV Data Reporting Template
Checklist to Assess Program/Project Data Quality
Number of Regional Aggregation Sites
Number of District Aggregation Sites
Number of Service Delivery Sites
Version: April 2012
Important notes for the use of this spreadsheet:
1. If you are using MS Excel 2007 or 2010, please click 'Enable Content' to enable macros when the security warning appears the first time you open this OSDV tool. If you are using an earlier version of Excel, you will need to ensure that your macro security is set to something less than 'High': with the spreadsheet open, go to the 'Tools' pull-down menu, select 'Macro', then 'Security', and choose 'Medium'. Close Excel and re-open the file. The next time you open the file, select 'Enable Macros' for the application to work as designed.
2. On the START Page (this page), please select number of intermediate aggregation sites (IAS) and Service Delivery Points (SDPs) that you plan to review from the dropdown lists above. IAS are typically the district level health unit of the Ministry of Health.
3. Kindly ensure that background information is provided as per the worksheet labelled "Information_Page", i.e. Country, Grant number, indicator assessed, reporting period assessed, etc. DO NOT RENAME the worksheets; however, you can indicate the site name within the worksheet at cell D2. For example, if your first service point corresponds to a site named "Health Clinic A", put this name in cell D2 of the worksheet labelled "Service Point 1". Do not change the name of the worksheet; leave it as "Service Point 1", and so on.
B – INSTRUCTIONS FOR USE OF THE OSDV Data Reporting Template
1. Determine Purpose
The OSDV checklist can be used for:
Initial assessment of M&E systems established by new implementing partners (or in decentralized systems) to collect, manage and report data.
Routine supervision of data management and reporting systems and data quality at various levels. For example, routine supervision visits may include checking on a certain time period worth of data (e.g. one day, one week or one month) at the service site level, whereas periodic assessments (e.g. quarterly, biannually or annually) could be carried out at all levels to assess the functioning of the entire Program/project’s M&E system.
Periodic assessment by donors of the quality of data being provided to them (this use of the DQA could be more frequent and more streamlined than official data quality audits that use the DQA for Auditing, but less frequent than routine monitoring of data).
Preparation for a formal data quality audit.
2. Level/Site Selection. Select levels and sites to be included (depending on the purpose and resources available). Once the purpose has been determined, the second step in the OSDV is to decide what levels of the data-collection and reporting system will be included in the assessment - service sites, intermediate aggregation levels, and/or central M&E unit. The levels should be determined once the appropriate reporting levels have been identified and “mapped” (e.g., there are 100 sites providing the services in 10 districts. Reports from sites are sent to districts, which then send aggregated reports to the M&E Unit). In some cases, the data flow will include more than one intermediate level (e.g. regions, provinces or states, or multiple levels of program organizations).
3. Identify indicators, data sources and reporting period. The OSDV is designed to assess the quality of data and underlying systems related to indicators that are reported to programs or donors. It is necessary to select one or more indicators – or at least program areas – to serve as the subject of the OSDV. This choice will be based on the list of reported indicators. For example, a program focusing on treatment for HIV may report indicators of numbers of people on ART. Another program may focus on meeting the needs of orphans or vulnerable children, therefore the indicators for that program would be from the OVC program area. A malaria program might focus on providing insecticide-treated bed nets (ITN) or on treating people for malaria – or on both of those activities.
4. Conduct site visits. During the site visits, the relevant sections of the appropriate checklists in the Excel file are filled out (e.g. the service site checklist at service sites, etc). These checklists are completed following interviews of relevant staff and reviews of site documentation. Using the drop down lists on the HEADER page of this workbook, select the appropriate number of Intermediate Aggregation Levels (IAL) and Service Delivery Points (SDP) to be reviewed. The appropriate number of worksheets will automatically appear in the OSDV workbook (up to 12 SDP and 4 IALs).
5. Review outputs and findings. The OSDV outputs need to be reviewed for each site visited. Site-specific summary findings, in the form of recommendations, are noted at each site visited.
6. The OSDV checklists exist in MS Excel format and responses should be entered directly into the spreadsheets on the computer if possible. Alternatively, the checklists can be printed and completed by hand but the data must then be entered into the Excel tool to allow the production of summary statistics for each site and level of the reporting system.
7. A 'Global Dashboard' shows statistics aggregated across and within levels to highlight overall strengths and weaknesses in the reporting system. For this analysis questions are grouped by the applicable dimension of data quality (e.g. accuracy or reliability) and the number of responses by type of response (e.g. 'Yes - completely', 'Partly' etc.) are plotted as a percentage of all responses. A table of survey questions and their associated dimensions of data quality can be found on the 'Dimensions of data quality' tab in this workbook.
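The dashboard's tally described above can be sketched in a few lines of Python. This is an illustrative sketch only; the dimension names and responses below are placeholders, not values taken from the workbook, which performs this aggregation internally.

```python
from collections import Counter

# Illustrative responses grouped by data-quality dimension. The actual
# question-to-dimension mapping is on the 'Dimensions of data quality' tab.
responses = {
    "Accuracy":    ["Yes - completely", "Partly", "Yes - completely", "No - not at all"],
    "Reliability": ["Partly", "Partly", "Yes - completely", "N/A"],
}

def response_percentages(answers):
    """Share of each response type as a percentage of all responses."""
    counts = Counter(answers)
    total = sum(counts.values())
    return {resp: round(100 * n / total, 1) for resp, n in counts.items()}

for dimension, answers in responses.items():
    print(dimension, response_percentages(answers))
```

Plotting these percentages per dimension, as the dashboard does, makes it easy to see which dimension of data quality (e.g. accuracy or reliability) drives most of the "Partly" and "No" responses.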
C – BACKGROUND INFORMATION – OSDV
Country:
Grant Number:
Related Grants:
Name of Program/project:
Indicator Reviewed:
Reporting Period Assessed: Start Date (DD/MM/YYYY) End date (DD/MM/YYYY)
LFA Name:
Assessment Team: Name Title Email
Primary contact:
Indicator Sequence Number on OSDV Word Template
Service Point 1 Page 4
Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
B - Recounting reported Results:
4
5
6 N/A
7
C - Cross-check reported results with other data sources:
8 List the documents used for performing the cross-checks.
9 Describe the cross-checks performed.
10 What are the reasons for the discrepancy (if any) observed?
Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Recount results from source documents, compare the verified numbers to the site reported numbers and explain discrepancies (if any).
Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A] If the LFA is unable to recount due to missing or unavailable source documents, write N/A (to represent Not Available). Only write 0 (i.e. zero) if the verified number was 0.
Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B] If no report was found at the site, please write "N/A". Only write 0 (i.e. zero) if 0 was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards).
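The recount comparison above reduces to a ratio of recounted [A] to reported [B] results. A minimal Python sketch of that calculation, applying the N/A and zero conventions described above (the function name is illustrative, and treating a genuine zero-over-zero as a perfect match is an assumption not stated in the tool):

```python
def verification_factor(recounted, reported):
    """Ratio of recounted [A] to reported [B] results.

    'N/A' is returned when either figure could not be obtained
    (missing source documents or no site report found).
    """
    if recounted == "N/A" or reported == "N/A":
        return "N/A"
    if reported == 0:
        # Assumed convention: both genuinely zero counts as a match.
        return 1.0 if recounted == 0 else "N/A"
    return recounted / reported

print(verification_factor(95, 100))  # recount slightly below the reported figure
```

A factor below 1 suggests over-reporting by the site; a factor above 1 suggests under-reporting. Either way, the reasons for the discrepancy should be recorded in row 7.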
Service Point 1 Page 5
Part 2: Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
II- Indicator Definitions and Reporting Guidelines
Relevant staff have a clear understanding of …
4
5
6
7
III - Data-collection and Reporting Forms and Tools
8
9
10
11
IV- Data Management Processes
12
13
14
15 Is supervision provided to SDPs by a higher level?
V - Links with National Reporting System
20
21
PHPM Data Assessment
PHPM Data Assessment - Service Delivery Sites and/or Peripheral Stores
Answer codes:
Reviewer’s comments
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
All relevant staff have received training on the data management processes and tools.
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
The data collected on the source document has sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc).
The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
When applicable, data are reported through a single channel of the national information systems.
The system records information about where the service is delivered (i.e. region, district, ward, etc.)
Yes completely – 3, Partly – 2, Not at all – 1
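These numeric codes can be averaged into a section score. A hedged sketch of that calculation (the mapping keys and function name are illustrative; the workbook presumably computes its own summaries via built-in formulas):

```python
# Numeric answer codes from the sheet:
# Yes completely = 3, Partly = 2, Not at all = 1; N/A responses are excluded.
SCORES = {"Yes - completely": 3, "Partly": 2, "No - not at all": 1}

def section_score(answers):
    """Average numeric score for one checklist section, ignoring N/A."""
    scored = [SCORES[a] for a in answers if a in SCORES]
    return round(sum(scored) / len(scored), 2) if scored else None

print(section_score(["Yes - completely", "Partly", "N/A"]))  # 2.5
```

A section averaging close to 3 indicates the systems component is largely in place, while averages near 1 flag areas needing strengthening measures.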
Service Point 1 Page 6
Inventory Management Record Keeping
Service Point 6 Page 16
Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
B - Recounting reported Results:
4
5
6 N/A
7
C - Cross-check reported results with other data sources:
8 List the documents used for performing the cross-checks.
9 Describe the cross-checks performed.
10 What are the reasons for the discrepancy (if any) observed?
Answer Codes: Yes - completely
Partly
No - not at all
N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Recount results from source documents, compare the verified numbers to the site-reported numbers, and explain any discrepancies.
Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A] If the LFA is unable to recount due to missing or unavailable source documents, write N/A (to represent Not Available). Only write 0 (i.e. zero) if the verified number was 0.
Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B] If no report was found at the site, please write "N/A". Only write 0 (i.e. zero) if 0 was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards).
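The verification rule above — recount [A], reported [B], ratio [A/B], with N/A kept distinct from a true zero — can be sketched in code. This is an illustrative helper, not part of the Excel tool; the function name is hypothetical and `None` stands in for the template's "N/A".

```python
from typing import Optional

def verification_ratio(recounted: Optional[int], reported: Optional[int]) -> Optional[float]:
    """Ratio of recounted [A] to site-reported [B] numbers.

    Returns None (the template's N/A) when either figure is unavailable,
    e.g. missing source documents or no report found at the site.
    """
    if recounted is None or reported is None:
        return None
    if reported == 0:
        # Both figures zero means perfect agreement; otherwise the
        # ratio is undefined and treated as N/A.
        return 1.0 if recounted == 0 else None
    return recounted / reported

# 95 clients recounted from registers vs 100 reported by the site
assert verification_ratio(95, 100) == 0.95
assert verification_ratio(None, 100) is None  # source documents missing
assert verification_ratio(0, 0) == 1.0        # true zero on both sides
```

A ratio below 1 suggests over-reporting by the site; above 1 suggests under-reporting.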
Service Point 6 Page 17
Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Answer Codes: Yes - completely
Partly
No - not at all
N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Part 2: Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
II - Indicator Definitions and Reporting Guidelines
Relevant staff have a clear understanding of …
4
5
6
7
III - Data-collection and Reporting Forms and Tools
8
9
10
11
IV - Data Management Processes
12
13
14
15 Is supervision provided to SDPs by the higher level?
V - Links with National Reporting System
20
21
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
All relevant staff have received training on the data management processes and tools.
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
The data collected on the source document have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
When applicable, data are reported through a single channel of the national information systems.
The system records information about where the service is delivered (e.g., region, district, ward).
Service Point 7 Page 18
Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
B - Recounting reported Results:
4
5
6 N/A
7
C - Cross-check reported results with other data sources:
8 List the documents used for performing the cross-checks.
9 Describe the cross-checks performed.
10 What are the reasons for the discrepancy (if any) observed?
Answer Codes: Yes - completely
Partly
No - not at all
N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Recount results from source documents, compare the verified numbers to the site-reported numbers, and explain any discrepancies.
Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A] If the LFA is unable to recount due to missing or unavailable source documents, write N/A (to represent Not Available). Only write 0 (i.e. zero) if the verified number was 0.
Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B] If no report was found at the site, please write "N/A". Only write 0 (i.e. zero) if 0 was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards).
Service Point 7 Page 19
Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Answer Codes: Yes - completely
Partly
No - not at all
N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Part 2: Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
II - Indicator Definitions and Reporting Guidelines
Relevant staff have a clear understanding of …
4
5
6
7
III - Data-collection and Reporting Forms and Tools
8
9
10
11
IV - Data Management Processes
12
13
14
15 Is supervision provided to SDPs by the higher level?
V - Links with National Reporting System
20
21
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
All relevant staff have received training on the data management processes and tools.
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
The data collected on the source document have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
When applicable, data are reported through a single channel of the national information systems.
The system records information about where the service is delivered (e.g., region, district, ward).
Service Point 8 Page 20
Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
B - Recounting reported Results:
4
5
6 N/A
7
C - Cross-check reported results with other data sources:
8 List the documents used for performing the cross-checks.
9 Describe the cross-checks performed.
10 What are the reasons for the discrepancy (if any) observed?
Answer Codes: Yes - completely
Partly
No - not at all
N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Recount results from source documents, compare the verified numbers to the site-reported numbers, and explain any discrepancies.
Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A] If the LFA is unable to recount due to missing or unavailable source documents, write N/A (to represent Not Available). Only write 0 (i.e. zero) if the verified number was 0.
Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B] If no report was found at the site, please write "N/A". Only write 0 (i.e. zero) if 0 was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards).
Service Point 8 Page 21
Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Answer Codes: Yes - completely
Partly
No - not at all
N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Part 2: Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
II - Indicator Definitions and Reporting Guidelines
Relevant staff have a clear understanding of …
4
5
6
7
III - Data-collection and Reporting Forms and Tools
8
9
10
11
IV - Data Management Processes
12
13
14
15 Is supervision provided to SDPs by the higher level?
V - Links with National Reporting System
20
21
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
All relevant staff have received training on the data management processes and tools.
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
The data collected on the source document have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
When applicable, data are reported through a single channel of the national information systems.
The system records information about where the service is delivered (e.g., region, district, ward).
Service Point 9 Page 22
Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
B - Recounting reported Results:
4
5
6 N/A
7
C - Cross-check reported results with other data sources:
8 List the documents used for performing the cross-checks.
9 Describe the cross-checks performed.
10 What are the reasons for the discrepancy (if any) observed?
Answer Codes: Yes - completely
Partly
No - not at all
N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Recount results from source documents, compare the verified numbers to the site-reported numbers, and explain any discrepancies.
Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A] If the LFA is unable to recount due to missing or unavailable source documents, write N/A (to represent Not Available). Only write 0 (i.e. zero) if the verified number was 0.
Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B] If no report was found at the site, please write "N/A". Only write 0 (i.e. zero) if 0 was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards).
Service Point 9 Page 23
Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Answer Codes: Yes - completely
Partly
No - not at all
N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Part 2: Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
II - Indicator Definitions and Reporting Guidelines
Relevant staff have a clear understanding of …
4
5
6
7
III - Data-collection and Reporting Forms and Tools
8
9
10
11
IV - Data Management Processes
12
13
14
15 Is supervision provided to SDPs by the higher level?
V - Links with National Reporting System
20
21
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
All relevant staff have received training on the data management processes and tools.
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
The data collected on the source document have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
When applicable, data are reported through a single channel of the national information systems.
The system records information about where the service is delivered (e.g., region, district, ward).
Service Point 10 Page 24
Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
B - Recounting reported Results:
4
5
6 N/A
7
C - Cross-check reported results with other data sources:
8 List the documents used for performing the cross-checks.
9 Describe the cross-checks performed.
10 What are the reasons for the discrepancy (if any) observed?
Answer Codes: Yes - completely
Partly
No - not at all
N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Recount results from source documents, compare the verified numbers to the site-reported numbers, and explain any discrepancies.
Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A] If the LFA is unable to recount due to missing or unavailable source documents, write N/A (to represent Not Available). Only write 0 (i.e. zero) if the verified number was 0.
Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B] If no report was found at the site, please write "N/A". Only write 0 (i.e. zero) if 0 was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards).
Service Point 10 Page 25
Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Answer Codes: Yes - completely
Partly
No - not at all
N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Part 2: Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
II - Indicator Definitions and Reporting Guidelines
Relevant staff have a clear understanding of …
4
5
6
7
III - Data-collection and Reporting Forms and Tools
8
9
10
11
IV - Data Management Processes
12
13
14
15 Is supervision provided to SDPs by the higher level?
V - Links with National Reporting System
20
21
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
All relevant staff have received training on the data management processes and tools.
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
The data collected on the source document have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
When applicable, data are reported through a single channel of the national information systems.
The system records information about where the service is delivered (e.g., region, district, ward).
Service Point 11 Page 26
Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
B - Recounting reported Results:
4
5
6 N/A
7
C - Cross-check reported results with other data sources:
8 List the documents used for performing the cross-checks.
9 Describe the cross-checks performed.
10 What are the reasons for the discrepancy (if any) observed?
Answer Codes: Yes - completely
Partly
No - not at all
N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Recount results from source documents, compare the verified numbers to the site-reported numbers, and explain any discrepancies.
Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A] If the LFA is unable to recount due to missing or unavailable source documents, write N/A (to represent Not Available). Only write 0 (i.e. zero) if the verified number was 0.
Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B] If no report was found at the site, please write "N/A". Only write 0 (i.e. zero) if 0 was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards).
Service Point 11 Page 27
Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Answer Codes: Yes - completely
Partly
No - not at all
N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Part 2: Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
II - Indicator Definitions and Reporting Guidelines
Relevant staff have a clear understanding of …
4
5
6
7
III - Data-collection and Reporting Forms and Tools
8
9
10
11
IV - Data Management Processes
12
13
14
15 Is supervision provided to SDPs by the higher level?
V - Links with National Reporting System
20
21
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
All relevant staff have received training on the data management processes and tools.
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
The data collected on the source document have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
When applicable, data are reported through a single channel of the national information systems.
The system records information about where the service is delivered (e.g., region, district, ward).
Service Point 12 Page 28
Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
B - Recounting reported Results:
4
5
6 N/A
7
C - Cross-check reported results with other data sources:
8 List the documents used for performing the cross-checks.
9 Describe the cross-checks performed.
10 What are the reasons for the discrepancy (if any) observed?
Answer Codes: Yes - completely
Partly
No - not at all
N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Recount results from source documents, compare the verified numbers to the site-reported numbers, and explain any discrepancies.
Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A] If the LFA is unable to recount due to missing or unavailable source documents, write N/A (to represent Not Available). Only write 0 (i.e. zero) if the verified number was 0.
Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B] If no report was found at the site, please write "N/A". Only write 0 (i.e. zero) if 0 was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from Register to Patient Treatment Cards).
Service Point 12 Page 29
Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Answer Codes: Yes - completely
Partly
No - not at all
N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Part 2: Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
II - Indicator Definitions and Reporting Guidelines
Relevant staff have a clear understanding of …
4
5
6
7
III - Data-collection and Reporting Forms and Tools
8
9
10
11
IV - Data Management Processes
12
13
14
15 Is supervision provided to SDPs by the higher level?
V - Links with National Reporting System
20
21
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
All relevant staff have received training on the data management processes and tools.
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
The data collected on the source document has sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc., if the indicator specifies disaggregation by these characteristics).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc).
The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
When applicable, data are reported through a single channel of the national information systems.
The system records information about where the service is delivered (e.g., region, district, ward).
District Site 1 Page 30
Data Verification and System Assessment Sheet - District Site
District Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
A - Recounting reported Results:
1
2
3 N/A
4
B - Reporting Performance:
5
6
7 -
8
9 -
10
11 -
Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
Answer Codes: Yes - completely
Partly / No - not at all / N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain discrepancies (if any).
Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A] If the LFA is unable to re-aggregate due to missing or unavailable reports, write "N/A" (Not Available).
What aggregated result was contained in the summary report prepared by the district (and submitted to the next reporting level)? [B] If no report was sent, write "N/A"; but write 0 (i.e., zero) if zero was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Review availability, completeness, and timeliness of reports from all Service Delivery Sites. How many reports should there have been from all Sites? How many are there? Were they received on time? Are they complete?
How many reports should there have been from all service sites? [A]
How many reports are there? [B]
Calculate % Available Reports [B/A]
Check the dates on the reports received. How many reports were received on time? (i.e., received by the due date). [C]
Calculate % On time Reports [C/B]
How many reports were complete? (i.e., complete means that the report contained all the required indicator data, the dates of the period reported on and the signature of the signing authority). [D]
Calculate % Complete Reports [D/B]
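The [A/B] verification ratio and the reporting-performance percentages defined above can be sketched as a short calculation. The figures below are illustrative placeholders, not data from any review.

```python
# Illustrative figures only; a real review uses the counts gathered on site.
recounted = 118        # [A] re-aggregated from Service Delivery Point reports
reported = 125         # [B] figure in the district's summary report

expected_reports = 12  # [A] reports due from all service sites
received_reports = 11  # [B] reports actually on file
on_time_reports = 9    # [C] received by the due date
complete_reports = 10  # [D] containing all required data, dates and signature

# Data verification: ratio of recounted to reported results [A/B].
# A ratio below 1.0 suggests over-reporting; above 1.0, under-reporting.
verification_factor = recounted / reported

# Reporting performance percentages, as defined on the checklist.
pct_available = 100 * received_reports / expected_reports   # [B/A]
pct_on_time = 100 * on_time_reports / received_reports      # [C/B]
pct_complete = 100 * complete_reports / received_reports    # [D/B]

print(f"Verification factor: {verification_factor:.2f}")
print(f"Available: {pct_available:.0f}%  On time: {pct_on_time:.0f}%  Complete: {pct_complete:.0f}%")
```

Note that availability is computed against the number of reports expected [A], while timeliness and completeness are computed against the reports actually received [B].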
There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
All relevant staff have received training on the data management processes and tools.
II- Indicator Definitions and Reporting Guidelines
Relevant staff have clear understanding of …
4
5
6
7
III- Data-collection and Reporting Forms / Tools
8
9
10
IV- Data Management Processes
11
12
13
14
15
16
17
18
V - Links with National Reporting System
19
20
PHPM Data Assessment
PHPM Data Assessment - Service Delivery Sites and/or Peripheral Stores
Answer codes:
Reviewer’s comments
Inventory Management Record Keeping
1
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc).
If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc).
There is a written procedure to address late, incomplete, inaccurate and missing reports; including following-up with service points on data quality issues.
If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved.
When applicable, the data are reported through a single channel of the national reporting system.
The system records information about where the service is delivered (e.g., region, district, ward).
Yes completely – 3, Partly – 2, Not at all – 1
Stock cards and/or bin cards are available for each product on a tracer list of medicines stored in the facility.
2
3
4
5
Stock cards and/or bin cards record at least the following information: name of the drug, dose, form, batch number, expiry date, transactions by date, minimum and maximum stock levels, and buffer stock.
Information in the stock cards and/or bin cards is filled in correctly: all cells contain the requested information and the numbers add up correctly for a tracer list of medicines.
Stock balances on stock cards and/or bin cards match the quantities of the products actually in the store, for a tracer list of medicines.
There are written records of temperature monitoring of the storeroom(s) and other storing locations (cold room, refrigerator or other). Recorded temperature monitoring data should be available for review.
District Site 2 Page 33
Data Verification and System Assessment Sheet - District Site
District Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
A - Recounting reported Results:
1
2
3 N/A
4
B - Reporting Performance:
5
6
7 -
8
9 -
10
11 -
Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
Answer Codes: Yes - completely
Partly / No - not at all / N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain discrepancies (if any).
Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A] If the LFA is unable to re-aggregate due to missing or unavailable reports, write "N/A" (Not Available).
What aggregated result was contained in the summary report prepared by the district (and submitted to the next reporting level)? [B] If no report was sent, write "N/A"; but write 0 (i.e., zero) if zero was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Review availability, completeness, and timeliness of reports from all Service Delivery Sites. How many reports should there have been from all Sites? How many are there? Were they received on time? Are they complete?
How many reports should there have been from all service sites? [A]
How many reports are there? [B]
Calculate % Available Reports [B/A]
Check the dates on the reports received. How many reports were received on time? (i.e., received by the due date). [C]
Calculate % On time Reports [C/B]
How many reports were complete? (i.e., complete means that the report contained all the required indicator data, the dates of the period reported on and the signature of the signing authority). [D]
Calculate % Complete Reports [D/B]
There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
All relevant staff have received training on the data management processes and tools.
II- Indicator Definitions and Reporting Guidelines
Relevant staff have clear understanding of …
4
5
6
7
III- Data-collection and Reporting Forms / Tools
8
9
10
IV- Data Management Processes
11
12
13
14
15
16
17
18
V - Links with National Reporting System
19
20
PHPM Data Assessment
PHPM Data Assessment - Service Delivery Sites and/or Peripheral Stores
Answer codes:
Reviewer’s comments
Inventory Management Record Keeping
1
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc).
If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc).
There is a written procedure to address late, incomplete, inaccurate and missing reports; including following-up with service points on data quality issues.
If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved.
When applicable, the data are reported through a single channel of the national reporting system.
The system records information about where the service is delivered (e.g., region, district, ward).
Yes completely – 3, Partly – 2, Not at all – 1
Stock cards and/or bin cards are available for each product on a tracer list of medicines stored in the facility.
2
3
4
5
Stock cards and/or bin cards record at least the following information: name of the drug, dose, form, batch number, expiry date, transactions by date, minimum and maximum stock levels, and buffer stock.
Information in the stock cards and/or bin cards is filled in correctly: all cells contain the requested information and the numbers add up correctly for a tracer list of medicines.
Stock balances on stock cards and/or bin cards match the quantities of the products actually in the store, for a tracer list of medicines.
There are written records of temperature monitoring of the storeroom(s) and other storing locations (cold room, refrigerator or other). Recorded temperature monitoring data should be available for review.
District Site 3 Page 36
Data Verification and System Assessment Sheet - District Site
District Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
A - Recounting reported Results:
1
2
3 N/A
4
B - Reporting Performance:
5
6
7 -
8
9 -
10
11 -
Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
Answer Codes: Yes - completely
Partly / No - not at all / N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain discrepancies (if any).
Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A] If the LFA is unable to re-aggregate due to missing or unavailable reports, write "N/A" (Not Available).
What aggregated result was contained in the summary report prepared by the district (and submitted to the next reporting level)? [B] If no report was sent, write "N/A"; but write 0 (i.e., zero) if zero was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Review availability, completeness, and timeliness of reports from all Service Delivery Sites. How many reports should there have been from all Sites? How many are there? Were they received on time? Are they complete?
How many reports should there have been from all service sites? [A]
How many reports are there? [B]
Calculate % Available Reports [B/A]
Check the dates on the reports received. How many reports were received on time? (i.e., received by the due date). [C]
Calculate % On time Reports [C/B]
How many reports were complete? (i.e., complete means that the report contained all the required indicator data, the dates of the period reported on and the signature of the signing authority). [D]
Calculate % Complete Reports [D/B]
There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
All relevant staff have received training on the data management processes and tools.
II- Indicator Definitions and Reporting Guidelines
Relevant staff have clear understanding of …
4
5
6
7
III- Data-collection and Reporting Forms / Tools
8
9
10
IV- Data Management Processes
11
12
13
14
15
16
17
18
V - Links with National Reporting System
19
20
PHPM Data Assessment
PHPM Data Assessment - Service Delivery Sites and/or Peripheral Stores
Answer codes:
Reviewer’s comments
Inventory Management Record Keeping
1
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc).
If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc).
There is a written procedure to address late, incomplete, inaccurate and missing reports; including following-up with service points on data quality issues.
If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved.
When applicable, the data are reported through a single channel of the national reporting system.
The system records information about where the service is delivered (e.g., region, district, ward).
Yes completely – 3, Partly – 2, Not at all – 1
Stock cards and/or bin cards are available for each product on a tracer list of medicines stored in the facility.
2
3
4
5
Stock cards and/or bin cards record at least the following information: name of the drug, dose, form, batch number, expiry date, transactions by date, minimum and maximum stock levels, and buffer stock.
Information in the stock cards and/or bin cards is filled in correctly: all cells contain the requested information and the numbers add up correctly for a tracer list of medicines.
Stock balances on stock cards and/or bin cards match the quantities of the products actually in the store, for a tracer list of medicines.
There are written records of temperature monitoring of the storeroom(s) and other storing locations (cold room, refrigerator or other). Recorded temperature monitoring data should be available for review.
District Site 4 Page 39
Data Verification and System Assessment Sheet - District Site
District Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
A - Recounting reported Results:
1
2
3 N/A
4
B - Reporting Performance:
5
6
7 -
8
9 -
10
11 -
Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
Answer Codes: Yes - completely
Partly / No - not at all / N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain discrepancies (if any).
Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A] If the LFA is unable to re-aggregate due to missing or unavailable reports, write "N/A" (Not Available).
What aggregated result was contained in the summary report prepared by the district (and submitted to the next reporting level)? [B] If no report was sent, write "N/A"; but write 0 (i.e., zero) if zero was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Review availability, completeness, and timeliness of reports from all Service Delivery Sites. How many reports should there have been from all Sites? How many are there? Were they received on time? Are they complete?
How many reports should there have been from all service sites? [A]
How many reports are there? [B]
Calculate % Available Reports [B/A]
Check the dates on the reports received. How many reports were received on time? (i.e., received by the due date). [C]
Calculate % On time Reports [C/B]
How many reports were complete? (i.e., complete means that the report contained all the required indicator data, the dates of the period reported on and the signature of the signing authority). [D]
Calculate % Complete Reports [D/B]
There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
All relevant staff have received training on the data management processes and tools.
II- Indicator Definitions and Reporting Guidelines
Relevant staff have clear understanding of …
4
5
6
7
III- Data-collection and Reporting Forms / Tools
8
9
10
IV- Data Management Processes
11
12
13
14
15
16
17
18
V - Links with National Reporting System
19
20
PHPM Data Assessment
PHPM Data Assessment - Service Delivery Sites and/or Peripheral Stores
Answer codes:
Reviewer’s comments
Inventory Management Record Keeping
1
2
3
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc).
If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc).
There is a written procedure to address late, incomplete, inaccurate and missing reports; including following-up with service points on data quality issues.
If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved.
When applicable, the data are reported through a single channel of the national reporting system.
The system records information about where the service is delivered (e.g., region, district, ward).
Yes completely – 3, Partly – 2, Not at all – 1
Stock cards and/or bin cards are available for each product on a tracer list of medicines stored in the facility.
Stock cards and/or bin cards record at least the following information: name of the drug, dose, form, batch number, expiry date, transactions by date, minimum and maximum stock levels, and buffer stock.
Information in the stock cards and/or bin cards is filled in correctly: all cells contain the requested information and the numbers add up correctly for a tracer list of medicines.
4
5
Stock balances on stock cards and/or bin cards match the quantities of the products actually in the store, for a tracer list of medicines.
There are written records of temperature monitoring of the storeroom(s) and other storing locations (cold room, refrigerator or other). Recorded temperature monitoring data should be available for review.
District Site 5 Page 42
Data Verification and System Assessment Sheet - District Site
District Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
A - Recounting reported Results:
1
2
3 N/A
4
B - Reporting Performance:
5
6
7 -
8
9 -
10
11 -
Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
Answer Codes: Yes - completely
Partly / No - not at all / N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain discrepancies (if any).
Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A] If the LFA is unable to re-aggregate due to missing or unavailable reports, write "N/A" (Not Available).
What aggregated result was contained in the summary report prepared by the district (and submitted to the next reporting level)? [B] If no report was sent, write "N/A"; but write 0 (i.e., zero) if zero was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Review availability, completeness, and timeliness of reports from all Service Delivery Sites. How many reports should there have been from all Sites? How many are there? Were they received on time? Are they complete?
How many reports should there have been from all service sites? [A]
How many reports are there? [B]
Calculate % Available Reports [B/A]
Check the dates on the reports received. How many reports were received on time? (i.e., received by the due date). [C]
Calculate % On time Reports [C/B]
How many reports were complete? (i.e., complete means that the report contained all the required indicator data, the dates of the period reported on and the signature of the signing authority). [D]
Calculate % Complete Reports [D/B]
There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
All relevant staff have received training on the data management processes and tools.
II- Indicator Definitions and Reporting Guidelines
Relevant staff have clear understanding of …
4
5
6
7
III- Data-collection and Reporting Forms / Tools
8
9
10
IV- Data Management Processes
11
12
13
14
15
16
17
18
V - Links with National Reporting System
19
20
PHPM Data Assessment
PHPM Data Assessment - Service Delivery Sites and/or Peripheral Stores
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc).
If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc).
There is a written procedure to address late, incomplete, inaccurate and missing reports; including following-up with service points on data quality issues.
If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved.
When applicable, the data are reported through a single channel of the national reporting system.
The system records information about where the service is delivered (e.g., region, district, ward).
District Site 6 Page 44
Data Verification and System Assessment Sheet - District Site
District Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
A - Recounting reported Results:
1
2
3 N/A
4
B - Reporting Performance:
5
6
7 -
8
9 -
10
11 -
Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
Answer Codes: Yes - completely
Partly / No - not at all / N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain discrepancies (if any).
Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A] If the LFA is unable to re-aggregate due to missing or unavailable reports, write "N/A" (Not Available).
What aggregated result was contained in the summary report prepared by the district (and submitted to the next reporting level)? [B] If no report was sent, write "N/A"; but write 0 (i.e., zero) if zero was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Review availability, completeness, and timeliness of reports from all Service Delivery Sites. How many reports should there have been from all Sites? How many are there? Were they received on time? Are they complete?
How many reports should there have been from all service sites? [A]
How many reports are there? [B]
Calculate % Available Reports [B/A]
Check the dates on the reports received. How many reports were received on time? (i.e., received by the due date). [C]
Calculate % On time Reports [C/B]
How many reports were complete? (i.e., complete means that the report contained all the required indicator data, the dates of the period reported on and the signature of the signing authority). [D]
Calculate % Complete Reports [D/B]
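The verification ratio [A/B] and the reporting-performance percentages [B/A], [C/B] and [D/B] above are simple arithmetic. The following Python sketch (function and variable names are illustrative, not part of the template) shows the calculations, including the template's convention of treating a missing or zero reported value as N/A:

```python
def verification_ratio(recounted, reported):
    """Ratio of the recounted result [A] to the reported result [B].
    Returns None when the reported value is zero or missing,
    mirroring the template's "N/A" convention."""
    if not reported:
        return None
    return recounted / reported


def reporting_performance(expected, received, on_time, complete):
    """Percent available [B/A], on-time [C/B], and complete [D/B] reports."""
    return {
        "available_pct": 100.0 * received / expected,
        "on_time_pct": 100.0 * on_time / received,
        "complete_pct": 100.0 * complete / received,
    }


# Example: 12 expected monthly reports; 10 received, 9 on time, 8 complete.
perf = reporting_performance(expected=12, received=10, on_time=9, complete=8)
```

In the example, availability is 83.3% (10/12), timeliness 90% (9/10) and completeness 80% (8/10).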
There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
All relevant staff have received training on the data management processes and tools.
District Site 6 Page 45
II- Indicator Definitions and Reporting Guidelines
Relevant staff have clear understanding on …
4
5
6
7
III- Data-collection and Reporting Forms / Tools
8
9
10
IV- Data Management Processes
11
12
13
14
15
16
17
18
V - Links with National Reporting System
19
20
PHPM Data Assessment
PHPM Data Assessment - Service Delivery Sites and/or Peripheral Stores
Answer codes:
Reviewer’s comments
Inventory Management Record Keeping
1
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc).
If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc).
There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues.
If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved.
When applicable, the data are reported through a single channel of the national reporting system.
The system records information about where the service is delivered (i.e. region, district, ward, etc.)
Yes - completely – 3, Partly – 2, Not at all – 1
Stock cards and/or bin cards are available for each product of a tracer list of medicines stored in the facility
2
3
4
5
Stock cards and/or bin cards record at least the following information: name of the drug, dose, form, batch number, expiry date, transactions by date, minimum and maximum stock levels, and buffer stock.
Information in the stock cards and/or bin cards is correctly filled: all cells contain the information requested and the numbers add correctly for a tracer list of medicines.
Stock balances on stock cards and/or bin cards match the quantities of the products actually in the store, for a tracer list of medicines.
There are written records of temperature monitoring of the storeroom(s) and other storing locations (cold room, refrigerator or other). Recorded temperature monitoring data should be available for review.
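The stock-card check above amounts to reconciling recorded balances against a physical count for each product on the tracer list. A minimal Python sketch (function name and example products are hypothetical):

```python
def stock_card_discrepancies(tracer_list):
    """Compare recorded stock-card balances with physical counts for a
    tracer list of medicines; returns the products whose balances differ.
    `tracer_list` maps product name -> (card_balance, physical_count)."""
    return {
        product: {"card": card, "counted": counted}
        for product, (card, counted) in tracer_list.items()
        if card != counted
    }


# Example tracer list (illustrative values):
issues = stock_card_discrepancies({
    "Artemether/Lumefantrine 20/120mg": (240, 240),
    "Cotrimoxazole 480mg": (1000, 940),
})
# Only 'Cotrimoxazole 480mg' is flagged: the card shows 1000, the count found 940.
```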
District Site 7 Page 47
Data Verification and System Assessment Sheet - District Site
District Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
A - Recounting reported Results:
1
2
3 N/A
4
B - Reporting Performance:
5
6
7 -
8
9 -
10
11 -
Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
Answer Codes: Yes - completely, Partly, No - not at all, N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain discrepancies (if any).
Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A] If the LFA is unable to re-aggregate due to missing or unavailable reports, write "N/A" (Not Available).
What aggregated result was contained in the summary report prepared by the district (and submitted to the next reporting level)? [B] If no report was sent, please write "N/A", but write 0 (i.e., zero) if 0 was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Review availability, completeness, and timeliness of reports from all Service Delivery Sites. How many reports should there have been from all Sites? How many are there? Were they received on time? Are they complete?
How many reports should there have been from all service sites? [A]
How many reports are there? [B]
Calculate % Available Reports [B/A]
Check the dates on the reports received. How many reports were received on time? (i.e., received by the due date). [C]
Calculate % On time Reports [C/B]
How many reports were complete? (i.e., complete means the report contained all required indicator data, the dates of the period reported on, and the signature of the signing authority). [D]
Calculate % Complete Reports [D/B]
There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
All relevant staff have received training on the data management processes and tools.
II- Indicator Definitions and Reporting Guidelines
Relevant staff have clear understanding on …
4
5
6
7
III- Data-collection and Reporting Forms / Tools
8
9
10
IV- Data Management Processes
11
12
13
14
15
16
17
18
V - Links with National Reporting System
19
20
PHPM Data Assessment
PHPM Data Assessment - Service Delivery Sites and/or Peripheral Stores
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc).
If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc).
There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues.
If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved.
When applicable, the data are reported through a single channel of the national reporting system.
The system records information about where the service is delivered (i.e. region, district, ward, etc.)
District Site 8 Page 49
Data Verification and System Assessment Sheet - District Site
District Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
A - Recounting reported Results:
1
2
3 N/A
4
B - Reporting Performance:
5
6
7 -
8
9 -
10
11 -
Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
Answer Codes: Yes - completely, Partly, No - not at all, N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Recount results from the periodic reports sent from service sites to the District and compare to the value reported by the District. Explain discrepancies (if any).
Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A] If the LFA is unable to re-aggregate due to missing or unavailable reports, write "N/A" (Not Available).
What aggregated result was contained in the summary report prepared by the district (and submitted to the next reporting level)? [B] If no report was sent, please write "N/A", but write 0 (i.e., zero) if 0 was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Review availability, completeness, and timeliness of reports from all Service Delivery Sites. How many reports should there have been from all Sites? How many are there? Were they received on time? Are they complete?
How many reports should there have been from all service sites? [A]
How many reports are there? [B]
Calculate % Available Reports [B/A]
Check the dates on the reports received. How many reports were received on time? (i.e., received by the due date). [C]
Calculate % On time Reports [C/B]
How many reports were complete? (i.e., complete means the report contained all required indicator data, the dates of the period reported on, and the signature of the signing authority). [D]
Calculate % Complete Reports [D/B]
There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
All relevant staff have received training on the data management processes and tools.
II- Indicator Definitions and Reporting Guidelines
Relevant staff have clear understanding on …
4
5
6
7
III- Data-collection and Reporting Forms / Tools
8
9
10
IV- Data Management Processes
11
12
13
14
15
16
17
18
V - Links with National Reporting System
19
20
PHPM Data Assessment
PHPM Data Assessment - Service Delivery Sites and/or Peripheral Stores
Answer codes:
Reviewer’s comments
Inventory Management Record Keeping
1
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc).
If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc).
There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues.
If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved.
When applicable, the data are reported through a single channel of the national reporting system.
The system records information about where the service is delivered (i.e. region, district, ward, etc.)
Yes - completely – 3, Partly – 2, Not at all – 1
Stock cards and/or bin cards are available for each product of a tracer list of medicines stored in the facility
2
3
4
5
Stock cards and/or bin cards record at least the following information: name of the drug, dose, form, batch number, expiry date, transactions by date, minimum and maximum stock levels, and buffer stock.
Information in the stock cards and/or bin cards is correctly filled: all cells contain the information requested and the numbers add correctly for a tracer list of medicines.
Stock balances on stock cards and/or bin cards match the quantities of the products actually in the store, for a tracer list of medicines.
There are written records of temperature monitoring of the storeroom(s) and other storing locations (cold room, refrigerator or other). Recorded temperature monitoring data should be available for review.
Data Verification and System Assessment Sheet - Regional Site
Regional Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To
Component of the M&E System
Part 1: Data Verifications
A - Recounting reported Results:
1
2
3 N/A
4
B - Reporting Performance:
5
6
7 -
8
9 -
10
11 -
Answer Codes: Yes - completely, Partly, No - not at all, N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Recount results from the periodic reports sent from the Districts to the Region and compare to the value reported by the Region. Explain discrepancies (if any).
Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A] If the LFA is unable to re-aggregate due to missing or unavailable reports, write "N/A" (Not Available).
What aggregated result was contained in the summary report prepared by the Intermediate Aggregation Site (and submitted to the next reporting level)? [B] If no report was sent, please write "N/A", but write 0 (i.e., zero) if 0 was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Review availability, completeness, and timeliness of reports from all Districts within the Region. How many reports should there have been from all Districts? How many are there? Were they received on time? Are they complete?
How many reports should there have been from all Districts? [A]
How many reports are there? [B]
Calculate % Available Reports [B/A]
Check the dates on the reports received. How many reports were received on time? (i.e., received by the due date). [C]
Calculate % On time Reports [C/B]
How many reports were complete? (i.e., complete means the report contained all required indicator data, the dates of the period reported on, and the signature of the signing authority). [D]
Calculate % Complete Reports [D/B]
Data Verification and System Assessment Sheet - Regional Site
Regional Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To
Component of the M&E System
Answer Codes: Yes - completely, Partly, No - not at all, N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
II- Indicator Definitions and Reporting Guidelines
Relevant staff have clear understanding on …
4
5
6
7
III- Data-collection and Reporting Forms / Tools
8
9
10
IV- Data Management Processes
11
12
13
14
15
16
17
18
There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
All relevant staff have received training on the data management processes and tools.
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc).
If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc).
There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues.
If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved.
Data Verification and System Assessment Sheet - Regional Site
Regional Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To
Component of the M&E System
Answer Codes: Yes - completely, Partly, No - not at all, N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
20
PHPM Data Assessment – Central level
Answer codes:
Reviewer’s comments
The system records information about where the service is delivered (i.e. region, district, ward, etc.)
Answer Codes: Yes - completely - 3, Partly - 2, No - 1, N/A
Data Verification and System Assessment Sheet - Regional Site
Regional Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To
Component of the M&E System
Part 1: Data Verifications
A - Recounting reported Results:
1
2
3 N/A
4
B - Reporting Performance:
5
6
7 -
8
9 -
10
11 -
Answer Codes: Yes - completely, Partly, No - not at all, N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Recount results from the periodic reports sent from the Districts to the Region and compare to the value reported by the Region. Explain discrepancies (if any).
Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A] If the LFA is unable to re-aggregate due to missing or unavailable reports, write "N/A" (Not Available).
What aggregated result was contained in the summary report prepared by the Intermediate Aggregation Site (and submitted to the next reporting level)? [B] If no report was sent, please write "N/A", but write 0 (i.e., zero) if 0 was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Review availability, completeness, and timeliness of reports from all Districts within the Region. How many reports should there have been from all Districts? How many are there? Were they received on time? Are they complete?
How many reports should there have been from all Districts? [A]
How many reports are there? [B]
Calculate % Available Reports [B/A]
Check the dates on the reports received. How many reports were received on time? (i.e., received by the due date). [C]
Calculate % On time Reports [C/B]
How many reports were complete? (i.e., complete means the report contained all required indicator data, the dates of the period reported on, and the signature of the signing authority). [D]
Calculate % Complete Reports [D/B]
Data Verification and System Assessment Sheet - Regional Site
Regional Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To
Component of the M&E System
Answer Codes: Yes - completely, Partly, No - not at all, N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
II- Indicator Definitions and Reporting Guidelines
Relevant staff have clear understanding on …
4
5
6
7
III- Data-collection and Reporting Forms / Tools
8
9
10
IV- Data Management Processes
11
12
13
14
15
16
17
18
There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
All relevant staff have received training on the data management processes and tools.
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc).
If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc).
There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues.
If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved.
Data Verification and System Assessment Sheet - Regional Site
Regional Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To
Component of the M&E System
Answer Codes: Yes - completely, Partly, No - not at all, N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
20
PHPM Data Assessment – Central level
Answer codes:
Reviewer’s comments
The system records information about where the service is delivered (i.e. region, district, ward, etc.)
Answer Codes: Yes - completely - 3, Partly - 2, No - 1, N/A
Data Verification and System Assessment Sheet - Regional Site
Regional Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To
Component of the M&E System
Part 1: Data Verifications
A - Recounting reported Results:
1
2
3 N/A
4
B - Reporting Performance:
5
6
7 -
8
9 -
10
11 -
Answer Codes: Yes - completely, Partly, No - not at all, N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Recount results from the periodic reports sent from the Districts to the Region and compare to the value reported by the Region. Explain discrepancies (if any).
Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A] If the LFA is unable to re-aggregate due to missing or unavailable reports, write "N/A" (Not Available).
What aggregated result was contained in the summary report prepared by the Intermediate Aggregation Site (and submitted to the next reporting level)? [B] If no report was sent, please write "N/A", but write 0 (i.e., zero) if 0 was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Review availability, completeness, and timeliness of reports from all Districts within the Region. How many reports should there have been from all Districts? How many are there? Were they received on time? Are they complete?
How many reports should there have been from all Districts? [A]
How many reports are there? [B]
Calculate % Available Reports [B/A]
Check the dates on the reports received. How many reports were received on time? (i.e., received by the due date). [C]
Calculate % On time Reports [C/B]
How many reports were complete? (i.e., complete means the report contained all required indicator data, the dates of the period reported on, and the signature of the signing authority). [D]
Calculate % Complete Reports [D/B]
Data Verification and System Assessment Sheet - Regional Site
Regional Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To
Component of the M&E System
Answer Codes: Yes - completely, Partly, No - not at all, N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
II- Indicator Definitions and Reporting Guidelines
Relevant staff have clear understanding on …
4
5
6
7
III- Data-collection and Reporting Forms / Tools
8
9
10
IV- Data Management Processes
11
12
13
14
15
16
17
18
There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
All relevant staff have received training on the data management processes and tools.
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc.).
If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues.
If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved.
Data Verification and System Assessment Sheet - Regional Site
-
-
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
Part 2. Systems Assessment
Data Verification and System Assessment Sheet - Regional Site
Regional Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To
Component of the M&E System
Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
20
PHPM Data Assessment – Central level
Answer codes:
Reviewer’s comments
The system records information about where the service is delivered (i.e. region, district, ward, etc.)
Answer Codes: Yes - completely - 3 / Partly - 2 / No - 1 / N/A
Data Verification and System Assessment Sheet - Regional Site
-
-
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
PHPM Data Assessment – Central level
Reviewer’s comments
Data Verification and System Assessment Sheet - Regional Site
Regional Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To
Component of the M&E System
Part 1: Data Verifications
A - Recounting reported Results:
1
2
3 N/A
4
B - Reporting Performance:
5
6
7 -
8
9 -
10
11 -
Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
Recount results from the periodic reports sent from the Districts to the Region and compare to the value reported by the Region. Explain discrepancies (if any).
Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A] If the LFA is unable to re-aggregate due to missing or unavailable reports, write N/A (to represent Not Available).
What aggregated result was contained in the summary report prepared by the Intermediate Aggregation Site (and submitted to the next reporting level)? [B] If no report was sent, please write "N/A", but write 0 (i.e., zero) if 0 was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
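The recount check above boils down to the ratio of recounted [A] to reported [B] results. A minimal sketch of that arithmetic in Python (the function name and the N/A handling are illustrative, not part of the spreadsheet):

```python
def verification_factor(recounted, reported):
    """Ratio of recounted [A] to reported [B] results, expressed as a percentage.

    Returns None (the checklist's N/A) when the recount is unavailable or the
    reported value is missing or zero, since no ratio can be formed.
    """
    if recounted is None or not reported:
        return None
    return 100.0 * recounted / reported

# 95 cases recounted from the district reports vs. 100 reported by the Region:
verification_factor(95, 100)  # -> 95.0, i.e. a 5% discrepancy to explain
```

A factor below 100% suggests over-reporting at the aggregation site and a factor above 100% suggests under-reporting; either way, the discrepancy should be explained (data entry errors, arithmetic errors, missing source documents, other).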
Review availability, completeness, and timeliness of reports from all Districts within the Region. How many reports should there have been from all Districts? How many are there? Were they received on time? Are they complete?
How many reports should there have been from all Districts? [A]
How many reports are there? [B]
Calculate % Available Reports [B/A]
Check the dates on the reports received. How many reports were received on time? (i.e., received by the due date). [C]
Calculate % On time Reports [C/B]
How many reports were complete? (i.e., complete means that the report contained all the required indicator data, the dates of the period reported, and the signature of the signing authority). [D]
Calculate % Complete Reports [D/B]
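The three reporting-performance metrics above ([B/A], [C/B], [D/B]) can be computed together; a small sketch, with illustrative names:

```python
def reporting_performance(expected, received, on_time, complete):
    """% Available [B/A], % On Time [C/B] and % Complete [D/B] for one site.

    expected: reports due from all sub-units [A]
    received: reports actually on file [B]
    on_time:  reports received by the due date [C]
    complete: reports with all indicator data, dates and signature [D]
    """
    pct = lambda num, den: round(100.0 * num / den, 1) if den else None
    return {
        "% Available": pct(received, expected),
        "% On Time": pct(on_time, received),
        "% Complete": pct(complete, received),
    }

# 12 district reports due, 10 received, 9 of them on time, 8 complete:
reporting_performance(12, 10, 9, 8)
# {'% Available': 83.3, '% On Time': 90.0, '% Complete': 80.0}
```

Note that timeliness and completeness are rated against the reports actually received [B], not against the number expected [A], so a site can score 100% on time while missing reports entirely.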
Data Verification and System Assessment Sheet - Regional Site
-
-
Part 1: Data Verifications
A - Recounting reported Results:
B - Reporting Performance:
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
Data Verification and System Assessment Sheet - Regional Site
Regional Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To
Component of the M&E System
Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2
3
II- Indicator Definitions and Reporting Guidelines
Relevant staff have a clear understanding of …
4
5
6
7
III- Data-collection and Reporting Forms / Tools
8
9
10
IV- Data Management Processes
11
12
13
14
15
16
17
18
There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness and timeliness) received from sub-reporting levels (e.g., service points).
There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to the central M&E Unit).
All relevant staff have received training on the data management processes and tools.
… what they are supposed to report on.
… how (e.g., in what specific format) reports are to be submitted.
… to whom the reports should be submitted.
… when the reports are due.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
… The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
Feedback is systematically provided to all service points on the quality of their reporting (i.e., accuracy, completeness and timeliness).
If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc.).
If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with service points on data quality issues.
If data discrepancies have been uncovered in reports from service points, the Intermediate Aggregation Levels (e.g., districts or regions) have documented how these inconsistencies have been resolved.
Data Verification and System Assessment Sheet - Regional Site
-
-
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
Part 2. Systems Assessment
Data Verification and System Assessment Sheet - Regional Site
Regional Site/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To
Component of the M&E System
Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
20
PHPM Data Assessment – Central level
Answer codes:
Reviewer’s comments
The system records information about where the service is delivered (i.e. region, district, ward, etc.)
Answer Codes: Yes - completely - 3 / Partly - 2 / No - 1 / N/A
Data Verification and System Assessment Sheet - Regional Site
-
-
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
PHPM Data Assessment – Central level
Reviewer’s comments
National Level - M&E Unit
Data Verification and System Assessment Sheet - National Level M&E Unit
National Level M&E Unit/Organization:
Indicator Reviewed: -
Date of Review:
Reporting Period Verified: - To -
Component of the M&E System
Part 1: Data Verifications
A - Recounting reported Results:
1
2
3 N/A
4
B - Reporting Performance:
5
6
7 -
8
9 -
10
11 -
Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
1
2 All staff positions dedicated to M&E and data management systems are filled.
3
4
5
6
Answer Codes: Yes - completely / Partly / No - not at all / N/A
REVIEWER COMMENTS (Please provide detail for each response not coded "Yes - Completely". Detailed responses will help guide strengthening measures.)
Recount results from the periodic reports sent from the intermediate aggregation sites to the National Level and compare to the value published by the National Program (or reported by the National Program to the Donor, if applicable). Explain discrepancies (if any).
Re-aggregate the numbers from the reports received from all Service Delivery Points. What is the re-aggregated number? [A] If the LFA is unable to re-aggregate due to missing or unavailable reports, write N/A (to represent Not Available).
What aggregated result was contained in the summary report prepared by the M&E Unit? [B] If no report was sent, please write "N/A", but write 0 (i.e., zero) if 0 was the actual number reported.
Calculate the ratio of recounted to reported numbers. [A/B]
What are the reasons for the discrepancy (if any) observed (i.e., data entry errors, arithmetic errors, missing source documents, other)?
Review availability, completeness, and timeliness of reports from all Intermediate Aggregation Sites. How many reports should there have been from all Aggregation Sites? How many are there? Were they received on time? Are they complete?
How many reports should there have been from all reporting entities (e.g., regions, districts, service points)? [A]
How many reports are there? [B]
Calculate % Available Reports [B/A]
Check the dates on the reports received. How many reports were received on time? (i.e., received by the due date). [C]
Calculate % On time Reports [C/B]
How many reports were complete? (i.e., complete means that the report contained all the required indicator data, the dates of the period reported, and the signature of the signing authority). [D]
Calculate % Complete Reports [D/B]
There is a documented organizational structure/chart that clearly identifies positions that have data management responsibilities at the M&E Unit. (Specify which Unit: e.g., MoH, NAP, GF, World Bank.)
A senior staff member (e.g., the Program Manager) is responsible for reviewing the aggregated numbers prior to the submission/release of reports from the M&E Unit.
There are designated staff responsible for reviewing the quality of data (i.e., accuracy, completeness, timeliness and confidentiality) received from sub-reporting levels (e.g., regions, districts, service points).
There is a training plan which includes staff involved in data-collection and reporting at all levels in the reporting process.
All relevant staff have received training on the data management processes and tools.
II- Indicator Definitions and Reporting Guidelines
7
8
9
10
III- Data-collection and Reporting Forms / Tools
11
12
13
14
15
16
IV- Data Management Processes
17
18
19
20
21
22
23
24
25
26
V- Links with National Reporting System
27
28
29
30
31
The M&E Unit has documented and shared the definition of the indicator(s) with all relevant levels of the reporting system (e.g., regions, districts, service points).
There is a description of the services that are related to each indicator measured by the Program/project.
There is a written policy that states for how long source documents and reporting forms need to be retained.
The M&E Unit has provided written guidelines to all reporting entities (e.g., regions, districts, service points) on reporting requirements and deadlines.
If multiple organizations are implementing activities under the Program/project, they all use the same reporting forms and report according to the same reporting timelines.
The M&E Unit has identified a standard source document (e.g., medical record, client intake form, register, etc.) to be used by all service delivery points to record service delivery.
The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels.
Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
The data collected by the M&E system has sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).
All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in case of computerized system).
The M&E Unit has clearly documented data aggregation, analysis and/or manipulation steps performed at each level of the reporting system.
Feedback is systematically provided to all sub-reporting levels on the quality of their reporting (i.e., accuracy, completeness and timeliness).
(If applicable) There are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data entry verification, etc.).
(If applicable) There is a written back-up procedure for when data entry or data processing is computerized.
… If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international confidentiality guidelines.
The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
There is a written procedure to address late, incomplete, inaccurate and missing reports, including following up with sub-reporting levels on data quality issues.
If data discrepancies have been uncovered in reports from sub-reporting levels, the M&E Unit has documented how these inconsistencies have been resolved.
The M&E Unit can demonstrate that regular supervisory site visits have taken place and that data quality has been reviewed.
When applicable, the data are reported through a single channel of the national reporting system.
When available, the relevant national forms/tools are used for data-collection and reporting.
Reporting deadlines are harmonized with the relevant timelines of the National Program (e.g., cut-off dates for monthly reporting).
The service sites are identified using ID numbers that follow a national system.
The system records information about where the service is delivered (i.e. region, district, ward, etc.)
SUMMARY TABLE
Assessment of Data Management and Reporting Systems
Rows (one per reporting level, each with an Average (per site) column): M&E Unit; Regional Level; Intermediate Aggregation Level Sites; Service Delivery Points/Organizations; plus an Average (per functional area) row.
Columns (functional areas): I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms / Tools; IV - Data Management Processes; V - Links with National Reporting System.
Answer codes: Yes, completely; Partly; No - not at all.
Color Code Key: green = 2.5 - 3.0; yellow = 1.5 - 2.5; red = < 1.5.
(Cells are blank or N/A in the empty template.)
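The summary table's color bands follow from the answer scores used elsewhere in the tool (Yes - completely = 3, Partly = 2, No = 1, with N/A items excluded from averages). A sketch of the banding logic, assuming the 2.5 boundary falls in the green band (the key lists overlapping ranges at 2.5):

```python
def area_average(scores):
    """Average of item scores (3 = Yes completely, 2 = Partly, 1 = No),
    skipping N/A items, which are recorded here as None."""
    rated = [s for s in scores if s is not None]
    return sum(rated) / len(rated) if rated else None

def color_code(avg):
    """Traffic-light band from the summary table's color-code key."""
    if avg is None:
        return "N/A"
    if avg >= 2.5:
        return "green"   # 2.5 - 3.0
    if avg >= 1.5:
        return "yellow"  # 1.5 - 2.5
    return "red"         # < 1.5

# Five assessment items for one functional area, one of them N/A:
color_code(area_average([3, 2, 2, None, 2]))  # -> 'yellow' (average 2.25)
```

The same banding applies whether the average is taken per site (across functional areas) or per functional area (across sites).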
Service Delivery Points Verification Factors
Columns: Service Delivery Points | Verified Result (V) | Reported Result (R) | Verification Factor (VF) | Absolute difference |100-VF| (AD)
No. of Sites: 0; V: 0; R: 0; Weighted Avg (AD): -
District Level Aggregation Summary Statistics
Columns: Data aggregation Level | Verified Result (V) | Reported Result (R) | Verification Factor (VF) | Absolute difference |100-VF|
No. of Districts: 0; V: 0; R: 0; Avg. Abs. Diff (SDP): -
Regional Level Aggregation Summary Statistics
Columns: Data aggregation Level | Verified Result (V) | Reported Result (R) | Verification Factor (VF) | Absolute difference |100-VF|
No. of Regions: 0; V: 0; R: 0; Avg. Abs. Diff (SDP): -
National Level Aggregation Summary Statistics
Columns: Data aggregation Level | Verified Result (V) | Reported Result (R) | Verification Factor (VF) | Absolute difference |100-VF|
V: Value Missing; R: Value Missing; VF: N/A
No. of Aggregation Points: 0
Average absolute difference for all Aggregation levels: 0.00%
Overall Verification Factor for the Indicator: -
Overall Data Quality Rating for the Indicator: -
(Cells shown as "-" read #DIV/0! in the blank template until data are entered.)
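The absolute-difference statistics above measure how far each verification factor strays from perfect agreement (VF = 100%). A sketch of that aggregation, with N/A levels excluded (function names are illustrative):

```python
def absolute_difference(vf):
    """AD = |100 - VF|: distance of a verification factor from 100%."""
    return abs(100.0 - vf)

def average_absolute_difference(factors):
    """Average AD across aggregation levels, skipping levels whose
    verification factor could not be computed (None = N/A)."""
    ads = [absolute_difference(vf) for vf in factors if vf is not None]
    return sum(ads) / len(ads) if ads else None

# VFs at SDP, district and national level; regional level N/A:
average_absolute_difference([95.0, 102.0, None, 100.0])  # (5 + 2 + 0) / 3
```

Because the difference is absolute, under- and over-reporting at different levels do not cancel each other out in the average.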
[Chart: Data Management Assessment - Global Aggregate Score. Plots average scores per functional area: M&E Structure, Functions and Capabilities; Indicator Definitions and Reporting Guidelines; Data-collection and Reporting Forms / Tools; Data Management Processes; Links with National Reporting System.]
[Chart: Data and Reporting Verifications - Global Aggregate Score. Series: % Available, % On Time, % Complete, Verification Factor.]
E: Assessment of Reporting Performance of the Grant Information Systems
District Level Summary Statistics
District Availability Timeliness Completeness
Average - - -
Regional Level Summary Statistics
Regional Availability Timeliness Completeness
Average - - -
National Level Summary Statistics
Availability Timeliness Completeness
- - - -
E: Assessment of Reporting Performance of the Grant Information Systems
G. Assessment of PHPM Data System
SUMMARY TABLE
Assessment of PHPM System
Central Stores or National Level
1 - N/A
Average Score per site