Making a Difference in After School: Measuring and Improving After School Quality
Nicole Yohalem, Forum for Youth Investment
Sacramento, CA
March 16, 2009
Quality assessment tools
• Assessing Afterschool Program Practices Tool (APT), National Institute on Out-of-School Time and the MA Department of Education
• CORAL Observation Tool (CORAL), Public/Private Ventures
• Out-of-School Time Observation Instrument (OST), Policy Studies Associates
• Program Observation Tool (POT), National Afterschool Association
• Program Quality Observation (PQO), Deborah Vandell and Kim Pierce
• Promising Practices Rating Scale (PPRS), WI Center for Education Research and Policy Studies Associates, Inc.
• Quality Assurance System (QAS), Foundations Inc.
• Program Quality Self-Assessment Tool (QSA), New York State Afterschool Network
• School-Age Care Environment Rating Scale (SACERS), Frank Porter Graham Child Development Center, UNC
• Youth Program Quality Assessment (YPQA), High/Scope Educational Research Foundation
Measuring Youth Program Quality: A Guide to Quality Assessment Tools (updated January 2009)
Quality assessment tools
There is a lot of similarity in how quality practice is defined. All tools assess:
• Relationships
• Environment
• Engagement
• Social/Behavioral Norms
• Skill Building Opportunities
• Routine/Structure
Note: the CA self-assessment tool includes items that address these areas (see the sketch below).
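The overlap above can be made concrete. Below is a minimal, hypothetical sketch of scoring a program against the six shared domains. The domain names come from this slide; the 1-5 scale and the example ratings are illustrative assumptions, not taken from any of the tools listed.

```python
# Hypothetical illustration of the six domains shared across the tools above.
# Domain names come from this slide; the 1-5 scale and the scores below are
# assumptions for illustration, not part of any actual instrument.

SHARED_DOMAINS = [
    "Relationships",
    "Environment",
    "Engagement",
    "Social/Behavioral Norms",
    "Skill Building Opportunities",
    "Routine/Structure",
]

def summarize(ratings: dict[str, int]) -> float:
    """Average a program's 1-5 ratings across the shared domains."""
    missing = [d for d in SHARED_DOMAINS if d not in ratings]
    if missing:
        raise ValueError(f"Missing ratings for: {missing}")
    return sum(ratings[d] for d in SHARED_DOMAINS) / len(SHARED_DOMAINS)

# Example: a made-up observation of one program.
example = {
    "Relationships": 4,
    "Environment": 5,
    "Engagement": 3,
    "Social/Behavioral Norms": 4,
    "Skill Building Opportunities": 2,
    "Routine/Structure": 5,
}
print(f"Overall quality score: {summarize(example):.2f}")  # 3.83
```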
Measuring what matters
• Importance of the point-of-service.
• Good measures have clear, unambiguous items.
• The best measures also teach.
Emphasis on point-of-service
• CA Tool: 16 of 77 items focus on the point of service (POS)
• SACERS & NAA: fewer than half of items focus on POS
• APT & YPQA: more than half of items focus on POS
(The sketch below works out the arithmetic.)
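For reference, the CA figure works out to 16/77, roughly 21% of items. The snippet below does the arithmetic; the item counts shown for SACERS and APT are placeholders standing in for the slide's "fewer than half" and "more than half", not actual counts.

```python
# POS share = POS-focused items / total items.
# Only the CA Tool's counts (16 of 77) come from the slide; the other
# entries are placeholders standing in for "< half" and "> half".

tools = {
    "CA Tool": (16, 77),   # from the slide: 16 of 77 items
    "SACERS":  (20, 49),   # placeholder: fewer than half
    "APT":     (40, 70),   # placeholder: more than half
}

for name, (pos_items, total_items) in tools.items():
    share = pos_items / total_items
    print(f"{name}: {pos_items}/{total_items} = {share:.0%} POS-focused")
# CA Tool: 16/77 = 21% POS-focused
```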
Clear and unambiguous?
Examples from the CA tool:
High inference:
• Ensures staff & volunteers have respectful interactions with participants & families.
Low inference:
• Regularly provides families with program information in multiple languages and literacy levels.
Measures that teach?
Examples from the CA Tool:
Diagnostic:
• Provides opportunities & support for participants to take on leadership roles.
Diagnostic and prescriptive:
• Regularly provides collaborative partners with program information, such as program progress and evaluation reports and information about program events, in a variety of formats and in multiple languages if appropriate.
Quality improvement
Key components of quality improvement systems (a minimal sketch of the cycle follows this list):
• Quality standards that include what should happen at the point of service
• Ongoing assessment of how well services compare to the standards
• Targeted plans for how to improve
• Training and coaching that fits improvement plans
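Here is a minimal sketch of how the four components connect as a cycle: standards define targets, assessment measures the gaps against them, plans target the largest gaps first, and training/coaching is matched to the plan. The standard areas, scale, and scores below are illustrative assumptions, not drawn from any actual QIS.

```python
# Minimal sketch of the four-part quality improvement cycle described above:
# standards -> assessment -> improvement plan -> matched training/coaching.
# The standards, scale, and observed scores are illustrative assumptions.

STANDARDS = {  # what should happen at the point of service (1-5 scale)
    "safe environment": 5,
    "supportive relationships": 4,
    "youth engagement": 4,
}

def assess(observed: dict[str, int]) -> dict[str, int]:
    """Compare observed scores to the standards; return the gaps."""
    return {area: STANDARDS[area] - score
            for area, score in observed.items()
            if score < STANDARDS[area]}

def plan(gaps: dict[str, int]) -> list[str]:
    """Target the largest gaps first and match coaching to each."""
    ordered = sorted(gaps, key=gaps.get, reverse=True)
    return [f"coach staff on {area} (gap: {gaps[area]})" for area in ordered]

observed = {"safe environment": 5, "supportive relationships": 2, "youth engagement": 3}
for step in plan(assess(observed)):
    print(step)
# coach staff on supportive relationships (gap: 2)
# coach staff on youth engagement (gap: 1)
```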
Emerging examples and lessons
• Afterschool Program Assessment System (APAS), National Institute on Out-of-School Time
• Youth Program Quality Intervention (YPQI), Weikart Center for Youth Program Quality
APAS pilot
• Conducted by NIOST, Wellesley College
• October 2006-July 2008
• Atlanta, Boston, Charlotte, Middlesex County NJ
• 65 individuals, 28 programs, 3 intermediaries
• Well-established K-8 after-school programs
• Low stakes
• Emphasis on continuous improvement, flexibility
Core APAS tools and supports
Tools:
• Survey of Afterschool Youth Outcomes Tool (SAYO)
• Assessing Afterschool Program Practices Tool (APT)
• Web-based data management system

Supports:
• Training (2 days up front, online training ongoing)
• 1-day site visit
• Local coach
Findings from the APAS pilot
• APAS helped programs identify areas for improvement and staff development
• Most sites said they made program changes as a result
• Coaches are key to implementation and useful to sites
• Engagement across staff levels is important
• Engaging funders is important (even with low stakes)
(Based on follow-up phone interviews with sites and coaches.)
For more on APAS: www.niost.org/content/view/1654/282/
Youth Program Quality Intervention
[Map: states and cities developing YPQA-anchored quality improvement systems]
Systemic quality improvement systems (QIS) anchored by the YPQA are being developed in:
• Statewide strategies: MI, ME, MN, RI, NM, KY, IA, WA, AR, NY
• Cities and counties: Austin, Chicago, Rochester, Detroit, Grand Rapids, Palm Beach County, Baltimore, Nashville, St. Louis, Louisville, Georgetown Divide/Sacramento, Columbus IN, Indianapolis IN, Tulsa OK
YPQI Focus: POS quality in context
[Diagram: point-of-service quality nested within a professional learning community and a system accountability environment]
• POS (Point-of-Service): Engagement, Interaction, Support, Safety (Youth PQA Form A)
• PLC: Professional Learning Community
• SAE: System Accountability Environment, including org policies/practices, management values, performance feedback, continuity/staffing, standards and metrics, staff development (Youth PQA Form B)
Incentivizing participation
PASA “endorsed” programs must:
• Maintain certain enrollment and retention benchmarks
• Have a written curriculum
• Undergo self-assessment using the RIPQA annually

In exchange for:
• Streamlined grant application process
• Small administrative funding supplement
Requiring participation
Excerpt from Rhode Island 21st CCLC RFP
“Applicants must participate in the 21st CCLC Rhode Island Youth Program Quality Assessment Process (RIPQA), which includes the use of a self-assessment tool, outside observations, development and implementation of action plans to strengthen the program over time, working with a Technical Advisor, including designation of staff to coordinate the process.”
Rhode Island 21st CCLC pilot
Assessment & Planning:
1. Kick-off, 2-day training on RIPQA
2. Quality Advisor (QA) meets with programs individually to orient
3. Observation visits (3-8 programs per site)
4. QA develops progress report; teams meet with instructors to share reports and develop action plans
5. ED and other key staff complete Form B individually
6. QA summarizes, meets with team to discuss scores and improvement strategies
7. QA generates overall report on strengths and improvement steps

Training & Technical Assistance:
• Series of 2-hour workshops focused on RI-PQA content
• Additional training on behavior management
• AYD training (32 hours) offered twice annually
• 4-session supervisor training
• 5 hours of on-site coaching per site from the QA
RI 21st CCLC pilot – lessons
Lessons Learned:
• Programs liked the tool and found the process worthwhile
• Initial data collection model was time consuming
• Timing is important to ensure changes get implemented
• Needs across sites are very similar
• Strong desire for on-site TA/coaching

Adjustments for Cohort 2:
• Smaller observation teams, fewer observations per site
• One program report as opposed to individualized reports
• Additional TA/training
• Start with Form B, then observations (Form A)
For more information: www.mypasa.org/pasa-strategies
Palm Beach County QIS Pilot
• Centerpiece of the Prime Time Initiative
• 38 providers in pilot; now working with 90
• January 2006 – fall 2007
• Based on the PBC-PQA
• Financial incentives for programs
Findings from the Palm Beach pilot
• Most programs completed all phases of the QIS
• Quality improved
• Quality improvement is a long-term process
• On-site TA is a very important component
• Clarity of purpose is critical

Spielberger & Lockaby, 2008 (www.chapinhall.org)
Coaching
Characteristics:
• Willing to listen
• Experienced
• Accessible
• Flexible
• Responsive
• Creative
• Resourceful

Roles/functions:
• Keep programs engaged
• Deliver training
• Answer questions on tools, process
• Participate in observations
• Generate reports
• Facilitate improvement planning
• Provide on-site feedback, modeling

Key considerations:
• Program- vs. system-level coaching, role of intermediaries
• Dosage
Purposes and methods
Smith, Devaney, Akiva & Sugar, forthcoming in New Directions
Lessons for California
1. Have well-defined purposes for the system.
2. Focus on the point of service.
3. Anchor quality improvement efforts with data about the POS.
4. Create incentives for continuous improvement.
5. Build in on-site, ongoing technical assistance/coaching.
6. Be intentional about pilot participation.
7. Build learning communities.
8. Recognize that management is a key lever.
9. Worry about the quality of your measures and data.
For more information:
Nicole Yohalem, Program Director
Forum for Youth Investment
[email protected]
www.forumfyi.org