TRANSCRIPT
Evaluation of Michigan Child Care Expulsion Prevention
Program (CCEP), 2007-2010
Michigan State University, October 27, 2010
Rosalind H. Kirk (a)
John S. Carlson (a)
Laurie A. Van Egeren (a)
Holly Brophy-Herb (a)
Stacy L. Bender (a)
Betty Tableman (a)
Mary A. Mackrain (b)
Deb Marciniak (c)
Sheri Falvay (d)

(a) Michigan State University; (b) Michigan Child Care Enhancement Program; (c) Michigan Public Health Institute; (d) Michigan Department of Community Health
Agenda
- CCEP’s research questions (child, provider, program, family, CCEP process & fidelity)
- Evaluation approach
- Evaluation strategies
- Strategies: strengths and challenges
- Use of CCEP evaluation results
Child Care Expulsion Prevention Program (CCEP), Michigan
- Began in the late 1990s; initiated by MDCH, supported with funding from MDHS
- Plans for state-wide coverage; at the time of the evaluation, 16 programs covering 31 of 83 counties
- Approx. 500-600 children per year; programmatic consultation also provided
- After T1 data collection ended in 2009, CCEP’s focus changed to children aged 0-3 years
- Along with many other Michigan programs, funding ended on 30 September 2010
Research questions: Child outcomes (John)
1. Does the severity of children’s challenging behavior decrease from the onset of CCEP services to the conclusion of services?
2. Does children’s social and emotional health increase from the onset of CCEP services to the conclusion of services?
3. Does the impact of services on children’s behavior last beyond the end of services?
4. Do children receiving CCEP services successfully stay in child care rather than being expelled?
Research questions: Parent outcomes (Holly)
5. Do parents’ subjective feelings of competence in dealing with their child’s challenging behavior increase as a result of CCEP services?
6. Are families able to consistently attend work or school?
Research questions: Child care provider outcomes (Laurie)
7. Is the child care provider better able to recognize early warning signs of social and emotional challenges in infants, toddlers, and preschoolers?
8. Is the child care provider better able to manage challenging behavior in the child care setting, with all children?
Research questions: Child care program outcome (Ros)
9. Has the social and emotional quality of the child care setting receiving CCEP services improved?
Research questions: Program fidelity (Laurie)
10. What is the fidelity of the child and family consultation process among CCEP programs?
11. What is the fidelity of the programmatic consultation process among CCEP programs?
Evaluation approach
Collaborative and consultative
Built upon existing systems
Mixed method – mainly quantitative, some qualitative
Four overall strategies
1. Cross-sectional (formative): Consultant survey
2. Longitudinal study (mainly summative): Pre-post data plus a 6-month follow-up from the intervention group, using measures of child, parent, and provider outcomes
3. Quasi-experimental comparison study (summative): Comparison group with pre-post data matching the longitudinal intervention group
4. Case studies (formative): Perceptions of experiences with CCEP, based on interviews
1. Cross-sectional strategy: strengths
On-line survey of consultants on participation in CCEP and delivery of service, including compliance with six CCEP cornerstones
‘Snap-shot’ of program and processes based on perceptions of consultants and administrators
Electronic surveys are accessible, flexible, user-friendly, and quick to analyze
Very collaborative with CCEP in design, data collection, interpretation
Provided a wealth of information for program improvement
Collaboration provided opportunity to share expertise & help develop CCEP internal monitoring systems
Cross-sectional strategy: potential challenges
Potential factors affecting response rate: organizational change, personal views about evaluation, stress levels, vacations, sickness, staff turnover, workload, length of survey, etc.
Anonymity can make survey data more likely to be accurate, but non-respondents cannot be targeted to increase the response rate.
Cross-sectional strategy: survey of consultants, 2008 (N = 29)
Gender: Female 100%
Age: Mean 43 yrs (range 27-60)
Race/ethnicity: White 76%; African American 21%; Asian 3%
Endorsement (MI AIMH): Level 2, 24%; Level 3, 72%
Experience: Child mental health, mean 10 years; CCEP, mean 4 years
Status with CCEP: Full-time 59%; Part-time 41% (part-time mean 20 hrs)
Educational level: Master’s 83%; Bachelor’s 17%
Degree/major: Social work 59%; Psychology 17%; Other 24%
State licensure: Yes 83%; No 17%
Cross-sectional strategy: survey summaries/ research briefs
1. Informing Providers About CCEP Services
2. Child and Family Consultation Processes
3. Programmatic Consultation Processes
4. Reflective Supervision
5. Group Training and Individual Coaching of Providers and Parents
6. Consultants: Experience, Job Satisfaction, and Organizational Support
7. The Most Important Things Consultants Do
8. Collaboration with Michigan Child Care Coordinating Council, MSU Extension, and the Great Start Collaborative
9. State-Level Training and Technical Assistance
Cross-sectional strategy: other survey results
Preventing Children’s Expulsion from Childcare: Variations in Consultation Processes in a Statewide Program
Poster and survey summaries/research briefs presented at the SRCD conference (2009). View at: http://outreach.msu.edu/cerc/
2. Longitudinal strategy - strengths
Able to assess child, parent, provider, and program outcomes pre (T1) and post (T2) services, and whether gains were sustained over 6 months (T3).
Collaborative at state and local levels, e.g., consultation on the selection, organization, and use of measures; attendance at monthly meetings; electronic Q&A; personal contacts between consultants and the MSU team, especially with new staff; collaborative troubleshooting at the state level.
Built on existing systems, incorporating measures already used by consultants, e.g., the DECA.
Longitudinal study sample size

Time when cases received | Sample size
Child & family cases (T1) | 432
Child & family cases (T2) | 394
Child & family cases, follow-up (T3) | 177
Programmatic cases (all) | 55

Sample sizes included in analyses varied depending on the quality of the data collected.
Children & families intervention sample (N = 361)
Child’s age (months): Mean 43.2 (SD 13.2); 0-35 months, 25%; 36-60+ months, 75%
Gender: Male 75%
Race/ethnicity: African American 15%; White 77%; Other 8%; Hispanic 8%
Household income: Low 34%
Family: 2-parent 60%
Provider type: Center 86%; Family home 5%; Group home 7%; Relative 1%; In-home 1%
Previous expulsions: 10%
3. Quasi-experimental strategy
- Includes collection of matching data from a sample of children exhibiting challenging behaviors but resident in counties where CCEP was unavailable; a matched sample (N = 86) had to be created.
- Enables comparison with the CCEP intervention group beyond maturation changes.
- Ongoing challenges (resources: time, staff, organization, incentives) in recruiting and retaining participants for a comparison group not resident in counties with CCEP.
- Limitations: missing data, multiple raters, reliance on self-report measures and interviews. How representative was the intervention group that participated in the evaluation? Were comparison families enough like the CCEP group even with matching? What other services, if any, were comparison families receiving in their own counties? Did counties with CCEP differ from counties without?
Outcome measures

Child
1. Devereux Early Childhood Assessment (DECA; LeBuffe & Naglieri, 1999)
2. DECA Infant-Toddler Version (DECA-IT; Mackrain, LeBuffe, & Powell, 2007)
3. Problem Coding Grid developed by Michigan CMH
4. Subscales from the Behavior Assessment System for Children, Second Edition (BASC-2; Reynolds & Kamphaus, 2004)
5. Retention, placement, and expulsion

Parent
1. Parenting Stress Index/Short Form (PSI/SF; Abidin, 1990)
2. Skills and Knowledge subscale of the Psychological Empowerment Scale (PES; Akey, 1996)
3. Work productivity

Provider
1. Early Warning Signs (developed by the MSU team)
2. Goal Achievement Scale (GAS; Alkon, Ramler, & MacLennon, 2003)
3. Teacher Opinion Survey (TOS; Geller & Lynch, 1999)

Consultation process, effectiveness, and acceptability
Adaptations and/or subscales of various instruments, including:
1. Parent-Teacher Relationship Scale (PTRS; Vickers & Minke, 1995)
2. Consultation Evaluation Form (CEF; Erchul, 1987)
3. Behavioral Intervention Rating Scale (BIRS; Von Brock & Elliott, 1987)
4. Benefits of Consultation (Sheridan, 1998, 2000a, 2000b), including other subscales from the BIRS
5. Competence of Other (Sheridan, 1998, 2000a, 2000b)
Does consultation make a difference to parents?
Awaiting final results on child, parent, provider, and program outcomes and on perceptions of effectiveness and relationships.
With qualifications, trends prior to the final analyses have indicated that:
- Parental competence increased and stress decreased more among parents who used consultation services.
- Both parents and providers reported high levels of satisfaction with the consultation process, its effectiveness, and its acceptability.
Interim results (N = 129): change in child outcomes after early childhood mental health consultation (see link to poster)

Before taking dosage of CCEP into account, raw parent and provider data showed:
- Both CCEP and comparison children showed significant improvements in behavior problems and positive behaviors over the study period.
- By parent report, attention problems and functional communication in the CCEP group continued to improve 6 months after consultation; most other outcomes remained level.

Are higher doses of consultation linked to greater improvement in children’s challenging and positive behaviors compared to lower doses?
- After taking satisfaction with CCEP into account, more hours of consultation with providers (but not parents) predicted increases in provider reports of some positive behaviors.
- At 6-month follow-up, more hours of provider consultation were linked to continued improvements in parent-reported attention problems.
- Gains made in behavioral concerns and functional communication were not sustained.

Do children with challenging behavior who receive consultation show more behavior improvement than children with challenging behavior who do not?
- While children in the intervention (N = 129) and comparison (N = 59) groups both improved over time, probably due to maturation, the CCEP group showed greater improvements in behavior than the comparison group in almost all areas.
4. Case studies
Sample: 9 children, 2 programs, 3 consultants
Method: In-person or phone interviews with parent, provider(s), and consultant
Analyses: Coded, and content thematically organized around process and outcomes
CASE STUDY SAMPLE

Name | Sex | Age | Reason for referral | Household | #I | Outcome
Dylan | M | 60m | Listless, withdrawn | Mother, stepfather | 5 | Adjusted; kindergarten
Sophia | F | 40m | Defiant, aggressive | Mother, boyfriend, sibling | 2 | Mom lost job; withdrawn from child care
Jason | M | 71m | Head-banging, tantrums | Single mother | 3 | Reduced intensity
Ryan | M | 51m | Tantrums, screaming | 2 bio. parents, twins | 3 | Reduced intensity; moved on to school
Kayla | F | 41m | Defiant, hyperactive | 2 adoptive parents, sibling | 3 | Parent & provider behavior adapted
Nathan | M | 49m | Developmental delay, aggressive | 2 bio. parents, sibling | 3 | Parent & provider behavior adapted
Madison | F | 60m | Tantrums, disruptive | 2 bio. parents | 1 | Provider adapted; kindergarten
Hannah | F | 42m | Aggression | Single mother | 3 | Incomplete consultation; moved out of state
Daniel | M | 48m | Aggression, sexualized behavior | Single mother | 4 | Expelled

#I = number of interviewees
Case studies: strengths
- Combine quantitative and qualitative methods
- Illustrate the variation and unique relevance for individual children
- Add depth to the understanding of the processes that underpin consultation
- Highlight the importance of context and relationships for intervention
Case studies: challenges
- Balancing case study importance with a primarily outcome-focused evaluation
- Self-selection bias in the sample
- Combining meaningfully with quantitative data: using quotes in the body of the report (outcomes), a thematic table about process, and ‘stories’ about children with standardized scores compared to the mean
Program’s use of preliminary evaluation results
Accountability. Was the money being spent as agreed? Was it being spent wisely?
Planning (program and community). Where to focus limited resources? Was more needed? Helped others understand the consultants’ role and perspective and the contribution they can make to community planning. Grant preparations.
Quality improvement. How could CCEP build on its strengths? What could CCEP have done better? Ready access to evaluator expertise offered additional support, e.g., internal monitoring systems.
Advocacy & dissemination. Telling others about CCEP successes and challenges: politicians, potential funders, and academics (contribution to the ECMH knowledge base).
Closing comments from Daniel’s mom
“I think it’s (CCEP) an awesome program, I really do. There are a lot of daycares out there that if they come across just the littlest behavior, and the child becomes difficult to take care of, they just give up and say ‘okay, well we can’t have him in the daycare’. So someone like Julie (consultant) that could come out and talk to the caregivers and explain different ways of doing things, I mean, I think that’s awesome because then you know, the kid can stay in the daycare and the mother can continue working. I mean, I think it’s a really good program.”
Further information
Principal Investigators:
- John Carlson, PhD, NCSP; Associate Professor, School of Education; [email protected]
- Holly E. Brophy-Herb, PhD, Associate Professor, Human Development & Family Studies; [email protected]
- Laurie A. Van Egeren, PhD, Director, Community Evaluation and Research Center (CERC), University Outreach and Engagement; [email protected]
Project Manager: Rosalind Kirk, PhD, Dip. SW; [email protected]
Useful links
- MSU CCEP evaluation results referred to here (briefs and posters): http://outreach.msu.edu/cerc/research/ccep.aspx
- Technical Assistance Center for Social Emotional Intervention: http://www.challengingbehavior.org/
- University of Wisconsin – Extension: http://www.uwex.edu/ces/pdande/evaluation/index.html
- NSF Online Evaluation Resource Library: http://www.oerl.sri.com
- Trochim, W.M., The Research Methods Knowledge Base, 2nd edition: http://www.socialresearchmethods.net/kb/