1
Performance Measurement and the OJJDP Data Collection Tool
presented at the
OJJDP National Grantee Orientation
April 6–7, 2010
2
CSR’s DCTAT & Performance Measurement Team
• Agnes Cholewa
• Ashley Hayward
• Mary Leonard
• Elizabeth Logan
• Ursula Murdaugh
• Monica Robbers
• Matt Watson
3
Outline
• Requirements
• Performance Measurement
• Data Collection
• Reporting Performance Measurement Data to OJJDP
4
Requirements
Projects are required to:
• Collect and report performance measurement data
• Participate in an OJJDP DCTAT training session
• Submit a report on these data to OJJDP semiannually
5
Performance Measures
• Concerned with collecting information to determine whether a program achieved its goals and objectives
• Information from performance measurement is used to improve the operation of the program
• Inputs, outputs, and outcomes are collected and reported
6
Performance Measurement vs. Evaluation
Feature    | Performance Measurement | Evaluation
Question   | How much?               | What does it mean?
Example    | Game score              | Game analysis
Offers     | A tally                 | Causality
Timeframe  | Continuous (ongoing)    | Interval (discrete)
Cost       | Less expensive          | More expensive

Performance measurement is necessary, but not sufficient, for evaluation.
7
Performance Measurement and Data Collection
• Performance measures and data collection are building blocks of evaluation
• Hard proof of what/how/when/why your program is doing
• Documentation supports sustainability efforts
• Specifically:
– Strengthens accountability
– Enhances decision-making (helps governments and communities determine effective resource use)
– Improves customer service
– Supports strategic planning and goal setting
8
Federal Initiatives on Performance Measurement
• Government Performance and Results Act (GPRA, 1993)
– Shift from accountability for process to accountability for results
– Programs must show effectiveness to justify funding
• Federal agency rating of programs
• President's Agenda – “Transparency and accountability a priority”
• Several State-level efforts also in place
9
Funding and Information Flows
[Diagram: funding ($) flows from Congress and OMB to OJJDP, then to Grantees and on to Communities; information flows back in the opposite direction.]
10
History of Performance Measurement at OJJDP
A Brief History…
• 2002 – JABG performance measures developed; JABG PART
• 2004 – DCTAT opened for JABG data reporting; JABG Report to Congress included quantitative performance data
• 2004/2005 – Title V Report to Congress included quantitative performance data; DCTAT opened for Title V and Formula Grants data reporting
• 2006 – PART of juvenile justice programs; DCTAT opened for TYP, EUDL BG, and Discretionary Grant data reporting; DCTAT opened for T-JADG data reporting
11
Office of Juvenile Justice and Delinquency Prevention
Mission/Purpose:
• Authorizing legislation is the Juvenile Justice and Delinquency Prevention Act of 2002
• Focus is on helping States and localities respond to juvenile risk behavior and delinquency
• Primary function of the agency is to provide program grant funding and to support research and technical assistance/training
• Long-term goal is prevention and reduction of juvenile crime and victimization
12
Diversity of Programs
• Formula, Block Grants for States
• Tribal Youth Programs
• Discretionary Competitive Programs
• Enforcing Underage Drinking Laws (Block and Discretionary Grants)
• Victimization Grants (Amber Alert, Internet safety)
• Congressional Earmark Grants
13
OJJDP Funding
OJJDP generally funds 4 types of programs/projects:
• Direct-Service Prevention
• Direct-Service Intervention
• System Improvement
• Research and Development
14
Development of Core Measures for OJJDP Programs
• A small number of measures that directly link to OJJDP’s core mission
• Comparability within and across programs
• A focus on quality services and youth outcomes
15
Snapshot of Discretionary Performance Measures
Category: Direct-Service Prevention / Direct-Service Intervention
Mandatory Measures:
1. Funds awarded for activity/program
2. Number of youth or families served
3. Implementation of an evidence-based program
4. Number served with an evidence-based program
5. Number who successfully complete the program
6. Number who exhibit desired change in targeted behavior
7. Number who offend/reoffend, or
8. Number victimized/re-victimized

Category: System Improvement
Mandatory Measures:
1. Funds awarded for activity
2. Implementation of an evidence-based program
3. Number served with an evidence-based program

Category: Research and Development
Mandatory Measures:
1. Funds awarded for activity
2. Funds awarded for evaluation
3. Number of programs (by type) evaluated
4. Number of final reports accepted
5. Number of training curricula accepted
16
Evidence-Based Programs*

27% of discretionary grantees are implementing one or more evidence-based programs (July–December 2009 reporting period).

*Definition: Programs and practices that have been shown, through rigorous evaluation and replication, to be effective at preventing or reducing juvenile delinquency or victimization, or related risk factors. Evidence-based programs or practices can come from many valid sources (e.g., Blueprints for Violence Prevention, OJJDP’s Model Programs Guide). Evidence-based practices may also include practices adopted by agencies, organizations, or staff that are generally recognized as “best practice” based on research literature and/or the degree to which the practice is based on a clear, well-articulated theory or conceptual framework for delinquency or victimization prevention and/or intervention.
17
OJJDP’s “Behavior” Measure Options
Percentage of program youth who exhibit a desired change in the targeted behavior. (Several options – select most relevant behavior)
Substance use, social competence, school attendance, GPA, GED, high school completion, job skills, employment status, family relationships, family functioning, antisocial behavior, gang-related activities
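As an illustration only, the behavior measure above is a simple percentage; this minimal sketch shows one way it might be computed (the function name and counts are hypothetical, not part of the DCTAT):

```python
def percent_desired_change(num_with_desired_change, num_tracked):
    """Percentage of program youth who exhibit the desired change
    in the targeted behavior (e.g., school attendance)."""
    if num_tracked == 0:
        return 0.0  # no youth tracked this period, so report zero
    return 100.0 * num_with_desired_change / num_tracked

# e.g., 42 of 60 tracked youth improved the targeted behavior
print(percent_desired_change(42, 60))  # 70.0
```

Pick one behavior per youth cohort and track the same denominator across reporting periods so the percentage stays comparable.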
18
Other Data Results
From the July–December 2009 Reporting Period:
• Number of Youth Served: 109,656
• Number of Youth Who Offend or Reoffend: 644
• Funds Used For:
– Direct-Service Prevention: $3,253,214
– Direct-Service Intervention: $1,719,168
– System Improvement: $1,163,804
– Research and Development: $336,584
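The four funding figures above can be cross-checked with a short sketch (the dictionary and variable names here are illustrative only):

```python
# Funds used by category, July-December 2009 reporting period
funds_used = {
    "Direct-Service Prevention": 3_253_214,
    "Direct-Service Intervention": 1_719_168,
    "System Improvement": 1_163_804,
    "Research and Development": 336_584,
}

total = sum(funds_used.values())
print(f"Total funds used: ${total:,}")  # Total funds used: $6,472,770
```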
19
OJJDP’s Performance Measures Website
http://ojjdp.ncjrs.org/grantees/pm/
20
Data Collection
21
Data Collection
• Need up-front planning
• Need a sense of what you are trying to accomplish
• What data will you collect and why?
• What data sources are available and which will you use?
• How will you use the data beyond just reporting it to OJJDP?
22
Purpose of Data Collection
• An ongoing process that keeps the project focused
• Provides the information needed to report on performance measures
• Data and data collection are the building blocks of performance evaluation
• Use data collection to enhance your ability to monitor and evaluate your program
23
Data Collection Standards
• Program documentation
– Clearly describe and document performance measures
– Keep logic model and performance measure documentation together as part of the history of your program
• Formal agreements for data collection
– Make sure that written agreements are clear
• Collect valid and reliable data
– Report accurate data
24
Data Collection Standards (cont.)
• Analyze Data
– Quantitative data (e.g., data from surveys) and qualitative data (e.g., from interviews) should be appropriately and systematically analyzed
– Obtain training and technical assistance for this if necessary
• Justify Conclusions
– Justify the conclusions you make from your data
• Protect Rights of Program Participants
– Design and conduct data collection to protect the rights and welfare of all participants
– Obtain training and technical assistance for this if necessary
25
Keeping Track of Data
• Use a data collection planning tool
• Identify a staff member to coordinate and monitor data collection
• Assemble data collection checklists
– Develop forms and instruments
– Develop procedures or policies for collecting needed data
• Collect accurate data in a systematic manner
• Develop a codebook to define the data you collect
• Policies and data collection codebooks can help keep the program on track even with staff turnover

Pilot-test your procedures!
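A codebook can be as simple as a machine-readable mapping from each field you collect to its definition and allowed values. This hypothetical sketch (the field names and specs are invented for illustration, not DCTAT fields) shows one way to keep definitions and basic validation together:

```python
# Hypothetical codebook: each entry defines one field the program collects.
CODEBOOK = {
    "youth_id":  {"type": str,  "definition": "Unique identifier assigned at intake"},
    "age":       {"type": int,  "definition": "Age in years at intake", "range": (10, 24)},
    "completed": {"type": bool, "definition": "Successfully completed the program"},
}

def validate_record(record):
    """Check one data record against the codebook; return a list of problems."""
    problems = []
    for field, spec in CODEBOOK.items():
        if field not in record:
            problems.append(f"missing field: {field}")
            continue
        value = record[field]
        if not isinstance(value, spec["type"]):
            problems.append(f"{field}: expected {spec['type'].__name__}")
        elif "range" in spec and not (spec["range"][0] <= value <= spec["range"][1]):
            problems.append(f"{field}: {value} outside allowed range {spec['range']}")
    return problems

print(validate_record({"youth_id": "Y-001", "age": 15, "completed": True}))  # []
```

Keeping the codebook in one shared file means the definitions survive staff turnover and every data entry pass is checked the same way.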
26
Plan for Performance Measurement in Ongoing Program Assessment
To assess your program, include plans for:
• Analysis/synthesis – How performance measurement data will be analyzed and summarized
• Interpretation – How the program will interpret what the data mean
• Dissemination – Which program stakeholders will receive the results of the performance measurement
• Recommendations – How the group will identify recommendations based on the results of the performance measurement
27
Reporting Performance Measurement Data to OJJDP
28
The Data Collection and Technical Assistance Tool (DCTAT)
• The OJJDP Data Collection Tool (DCTAT) is a resource for your program
– Lists data submission deadlines
– Includes a training PowerPoint on how to use the DCTAT
– Lists webinar-based training schedules and the phone number and e-mail for technical assistance
– Links to performance measure (indicator) grids
– Generates reports
– Generates documentation for your program (include with biannual CAPRs; for use in your program)
• Changes and improvements to the DCTAT are ongoing
29
The DCTAT
Steps to Complete Reporting in the DCTAT:
• Log in
• Profile (review, complete, or revise)
• Select a reporting period
• Step 1: Enter award information (includes target population information)
• Step 2: Select program categories
• Step 3: Select performance indicators
• Step 4: Enter data
• Step 5: Create a report to submit to OJJDP
• Complete the User Feedback Form
30
DCTAT Sign-in Screen
This screen contains information and resources for your program.

Website address: http://www.ojjdp-dctat.org

Grantee (Grantor) is defined as the primary recipient of funds from OJJDP.

The Grantee will be provided with a user ID and password from the System Administrator.
31
Profile Screen
Please update this page frequently to receive important e-mails from the DCTAT.

The Profile screen contains information received via a download from GMS.

If you are a first-time user, the system will take you to this screen first.

Most screens in the DCTAT have help desk contact info.
32
Grant Program Selection Screen
If you are a returning user, the system will take you to this screen first.

The purpose of this screen is for you to select the reporting period for which you need to enter data, or to view data entered previously. If you are not sure, please call the DCTAT help desk.
33
Designation Screen
The purpose of this screen is for you to inform the DCTAT how you, as the Grantee, administer your funds. There are 2 methods: 1) the Grantee spends funds and/or awards funds to subaward recipients (subgrantees); 2) the Grantee solely uses all funds.

NOTE: Subgrantees are secondary recipients of funds from the Grantor (not from OJJDP). Secondary awards were made from the primary award received from OJJDP.
34
Grantee Status Summary Screen
This screen provides the status of performance measures data entry at the grantee level.
35
Grantee Status Summary Screen with Subgrantees
This screen provides the status of performance measures data entry at the grantee level and subgrantee level (if applicable).

The system has red buttons that lead you to the next action or step. “Follow the red buttons!”
36
Step 1: Award Information Screen (1 of 3)
This is a view of the first data entry screen. It asks general questions about your award or subaward.
37
Step 1: Award Information Screen (2 of 3)
Target Population Information (continued)
• Tell OJJDP about the population that is served/funded by your award. This will differ at the grantee level and subgrantee level.
• Programs that directly provide services/programs to youth are asked to define the population by race/ethnicity, justice involvement, gender, age, and geographic location of the population served by the federal award.
• Grants that use funds for “system improvement” type projects should select the option “Youth population not directly served.”
38
Step 1: Award Information Screen (3 of 3)
Target Population Information (continued)
The “other” category is to define other factors that may define the population that you are serving. Are these additional factors that were proposed when you applied for funding?
39
Step 2: Program Category Selection Screen
Remember: activities funded by your award are organized into these 4 categories:
• Prevention – Youth has not had any involvement in the juvenile justice (JJ) system but may have risk factors for involvement.
• Intervention – Youth has had some involvement in the JJ system, and you would like to intervene to prevent further involvement.
• System Improvement – A program or project may need hiring of staff, staff training, new policies/procedures, or MIS development/enhancement.
• Research and Development – A project that is research or evaluation focused; related to a juvenile justice program or population; or development of materials that will be considered for use with a juvenile justice population or program.

The next step is to select Program Categories.
40
Step 3: Indicator Selection Screen
The next step is to select indicators (performance measures) that represent your grant-funded activities.

Indicators can be mandatory or optional. The indicators are presented as mandatory (those that OJJDP requires you to report to support its “core” measures), followed by optional indicators. The optional indicators are additional measures; you are encouraged to select as many as apply to your grant-funded activities. These data may help you maintain and manage your program activities.
41
Step 4: Data Entry Screen
This screen provides you with all of the mandatory and optional measures that were selected for data reporting.
• If a mandatory measure does not relate to your grant-funded activities, enter zero.
• If you do not have data this reporting period for a selected optional or mandatory measure, just enter zero.
• In the comments section of the Performance Data Report, you can explain the zero values that were reported.
42
Step 5: Reports Menu
There is 1 mandatory report type:

Once all data entry has been completed, you are ready to create the mandatory report that should be submitted to OJJDP through the Grants Management System (GMS).

Performance Data Report: Aggregates your data; submit this one to OJJDP through GMS.
43
Step 5: Reports Menu (cont.)
In addition to the mandatory report, the DCTAT provides other reports for your use:

1. Performance Data Summary Report – Provides a comparison of a grantee’s aggregated data to an aggregate of national data by federal program.
2. Subaward Detail Data Report – Contains performance measurement data for all active awards at the grantee and/or subgrantee level for the reporting period.
3. Performance Data Report by Subgrantee – An aggregate data report by subgrantee by federal award. (Only displays when applicable.)
4. Close Out Report – Provides, in aggregate form, data reported across all reporting periods during the life of the award. It should be submitted as part of the close-out package when the close-out process has been initiated in GMS.
44
User Feedback Form
Wait – before you go! Let us know about your experience using the DCTAT and how you would like to use your data!
45
Please Remember!
• Report accurate data!
• Prepare your data before entering the tool
• Follow the red buttons to get to the next step
• When data entry is complete, select “Mark data as complete and create final Performance Data report”
• Export the Performance Data Report (PDF or Word format) and save to your computer
• After saving to your computer, be SURE to upload this document to GMS as an attachment to get credit for reporting
46
DCTAT and GMS Reporting Schedule
Congressional Earmark and Discretionary Grantees
Activity Period  | DCTAT Due Date | Upload to GMS?
January – June   | July 30        | Yes, by July 30
July – December  | January 30     | Yes, by January 30
47
Questions/Comments
48
Contact Information
• To access the DCTAT website, please go to: http://www.ojjdp-dctat.org
Website
• E-mail: ojjdp-dctat@csrincorporated.com
• Toll-free: 1 (866) 487-0512
Technical Assistance