AGEP Evaluation Capacity Meeting 2008
Yolanda George, Deputy Director,
Education & Human Resources Programs
Objectives
Identifying methods and questions for evaluation studies of STEM graduate student progression to the PhD and the professoriate (admissions/selection, retention/attrition, PhD completion, and postdoctoral experiences), including the collection of quantitative and qualitative data.
Identifying methods and questions for Alliance evaluations, particularly in terms of progression to the PhD and the professoriate. What can AGEPs learn from cross-institutional studies?
As you listen to presentations…
What research informed the design of the study?
What types of data were collected? What was the rationale for deciding to collect these data?
What methods were used? What was the rationale for selecting methods used?
How were comparison groups constructed? What are the reporting limitations with regard to the construction of the comparison groups?
Another Objective for this AGEP Meeting
Developing and writing impact statements or highlights (nuggets) that include data for use in:
AGEP NSF Annual Reports Findings section
AGEP Supplemental Report Questions
NSF Highlights
Brochures and Web sites
The poster should include quantitative and qualitative data that provide evidence of:
Graduate student changes for selected STEM fields or all STEM fields
Infrastructure changes. This can include changes in institutional or departmental policies or practices.
Alliance impact. This can include changes in institutional or departmental policies or practices related to graduate school affairs, postdoctoral arrangements, or faculty hiring.
Stories and pictures are welcome but the major emphasis must be on quantitative and, as appropriate, qualitative data.
Program descriptions need to be kept to a minimum and put in the context of the data behind decisions to keep or eliminate strategies. A focus can be on what works and what doesn't, as long as the emphasis is on the data that showed whether different strategies worked or not.
Impact Evaluations and Statements
An impact evaluation measures the program's effects and the extent to which its goals were attained. Although evaluation designs may produce useful information about a program's effectiveness, some may produce more useful information than others.
For example, designs that track effects over extended time periods (time series designs) are generally superior to those that simply compare periods before and after intervention (pre-post designs);
Comparison group designs are superior to those that lack any basis for comparison; and
Designs that use true control groups (experimental designs) have the greatest potential for producing authoritative results.
http://www.ojp.usdoj.gov/BJA/evaluation/guide/documents/impact_eval_gangs.htm
http://www.ed.gov/about/inits/ed/competitiveness/acc-mathscience/report.pdf
Strategies that Matter for Graduate Student Retention & Progression to the PhD
Student admissions/selection criteria
Financial aid packages that reduce debt burden
Mentoring (Faculty and staff)
Supplementary academic support in writing, statistics, and other subjects
Social integration into department
Early intellectual integration into research projects
Research productivity (posters, papers, etc.)
Attention to PhD milestones
Attention to family/work balance
Institutional and departmental programs and practices
Given limited evaluation budgets:
Use evaluators to conceptualize and design evaluation instruments.
Don’t evaluate every component of the program each year.
Look for natural opportunities to conduct an evaluation. Make evaluation a part of the implementation project.
Use electronic student systems
Involve all faculty and staff in data collection and analysis
Work Groups
Two Groups
1. What types of studies and evaluations are you already doing to measure retention/attrition or progression to the PhD? What types of comparison groups are you using in these studies?
Two Groups
2. What types of studies and evaluations are you already doing to measure institutional impact? What types of comparison groups are you using in these studies?
Lead Alliance Leaders
3. What types of studies and evaluations are you already doing to measure Alliance impact? What types of comparison groups are you using in these studies?
Work Groups, Continued
One Group
4. What types of studies and evaluations are you already doing to measure progression and retention in the professoriate? What types of comparison groups are you using in these studies?
All Groups
5. What are other natural opportunities for collecting evaluation data? What types of comparison groups would you use in these studies?
6. What are some solutions to IRB challenges?
Homework
Write an impact statement about graduate student changes as a result of AGEP.
Write an impact statement about your institutional changes as a result of AGEP.
Write an impact statement about your Alliance as a result of AGEP.
Write an impact statement about progression and retention in the STEM professoriate as a result of AGEP.
In summary, evaluation is:
Examination of something in order to judge its value, quality, importance, extent, or condition
Part of the ongoing program implementation
Meaningful activity among the entire project team, including faculty and administrators