Aggregating Outcomes for Effort and Effect: What NTAC Learned from Its Site Review

Post on 13-Jan-2016


DESCRIPTION

Aggregating Outcomes for Effort and Effect: What NTAC Learned from Its Site Review. Ella L. Taylor, Ph.D., NTAC, Teaching Research Institute, Western Oregon University. Agenda: site review response to evaluation; revisions to the evaluation plan; collecting effort and effect data; aggregating data.

TRANSCRIPT

Aggregating Outcomes for Effort and Effect:

What NTAC Learned from its Site Review

Ella L. Taylor, Ph.D., NTAC

Teaching Research Institute, Western Oregon University

Agenda

Site review response to evaluation

Revisions to evaluation plan

Collecting effort and effect data

Aggregating data

Questions

Site Review Concerns about Evaluation

• Very complex
• Less likely to succeed than a simpler plan
• Needs to be simplified
• Needs to be made more realistic and appropriate

Constraints of Evaluation

1. Everyone wants evaluation to be seamless and transparent

2. Reluctance to see the direct benefit of evaluation for improving the project

3. Evaluation is seen as cumbersome and confusing

Primary purpose of evaluation

1. Did we do what we said we were going to do? (effort)

2. What was the impact of what we did? (effect)

Effort vs. Effect

Effort – actions carried out by the project

• Satisfaction data

• Numbers of participants, events, etc.

Effect – impact of the actions

• What outcome resulted from the activity?

• Change of awareness

• Change of knowledge

• Change in skill/implementation (service provider, family, systems)

• Change in child (child change data)

Alignment

What needs to be done? (Needs Assessment)

How to meet the need? (Activity) (Effort data)

Did we meet the need? (Evaluation) (Effort & Effect data)

Did we do what we said we were going to do? (EFFORT)

Grant objectives

• Met = explain how we met them

• Not met = explain why not

Data

• Number of events

• Number of participants

• Satisfaction with effort

What was the impact of what we did? (EFFECT)

What to measure? How to measure? How to report succinctly? How to aggregate?

What was the impact of what we did? (EFFECT) Outcome and Performance Indicators

(NTAC’s OPIs)

Outcome

• A statement of a measurable condition or an expected result or change (e.g., increase, improvement, progress toward).

Performance Indicator

• A statement that helps quantify the outcome and indicates whether the outcome has been achieved. Often, multiple indicators provide better evidence of the achievement of an outcome.

Kudos to…

John Killoran, Kathy McNulty, and Paddi Davies

Many, many hours of development and refinement of NTAC’s OPIs

OPIs

• Comprehensive outcomes for children (15), families (9), service providers (19), and systems (6)

• Embedded in all aspects of planning, delivery, and evaluation

• Align needs assessment, project activities, and measurement of impact

• On the web (http://www.tr.wou.edu/ntac/evalforms/)

How we are using OPIs

Planning

• Identify the stakeholder group (service provider, child, family and/or systems)

• Identify the outcomes you will target

• Identify the performance indicators that will help you determine attainment of targeted outcome(s)

Delivery of service

• Implement TA that is targeted to the outcomes selected

Evaluation

• Tailor assessment/evaluation measures to targeted outcomes and performance indicators

An Example

Webinar Example: Planning

Needs Assessment

• Comments at Project Directors’ Meeting during self-evaluation breakout sessions

• Conversations with state project directors, coordinators and staff

• NTAC Advisory Committee meeting

Outcome Goal: The use of formative and summative evaluation of the systems change and/or capacity building has increased (S4)

• Performance Indicator: Uses outcomes measures (S4g)

Webinar Example: Delivery of Service

Activity: Webinar

• Align the intensity of our evaluation with the intensity of the activity

• A one-time activity results in less intense evaluation than sustained professional development

• Align needs with activity with evaluation

Webinar Example Evaluation: Change of awareness

Rating scale: 5 = SA, 4 = A, 3 = neither agree nor disagree, 2 = D, 1 = SD, NA

Item                                                                        Rating
1. The presenter had the necessary expertise.                               5 4 3 2 1 NA
2. I increased my awareness of using outcome measures in evaluation (S4g).  5 4 3 2 1 NA
3. I felt this activity was a good use of my time.                          5 4 3 2 1 NA

A slightly more complicated example…

Example 2

Planning/Needs: Ongoing training and support

• Outcome: Use of formative and summative evaluation (Systems 4)

• Performance Indicators:
  • Uses satisfaction measures (S4d)
  • Uses awareness, knowledge or skills measures (S4e)
  • Uses outcomes measures (S4g)

Activity: Series of webinars for one region

Evaluation: Change of knowledge & skill

Evaluation: Change of knowledge & skill

I learned how to develop, implement and analyze…

Rating scale: 5 = SA, 4 = A, 3 = neither agree nor disagree, 2 = D, 1 = SD, NA

Item                                                Rating
1. Satisfaction measures (S4d)                      5 4 3 2 1 NA
2. Awareness, knowledge and skills measures (S4e)   5 4 3 2 1 NA
3. Outcomes measures (S4g)                          5 4 3 2 1 NA

Another more complicated example…

Example 3

Planning/Needs: Development, implementation, analysis & support in the use of formative and summative evaluation

Outcome: Use of formative and summative evaluation (Systems 4)

Performance indicators:

• Uses participant demographic data (S4c)
• Uses satisfaction measures (S4d)
• Uses change in awareness, knowledge or skills measures (S4e)
• Uses outcomes measures (S4g)
• Uses formative and summative evaluation information for ongoing feedback and continuous improvement (S4j)
• Disseminates evaluation results of the systems change or capacity building activities (S4k)

Sustained Professional Development

Activity: Multiple visits by TAS and evaluation specialist to 2 states

Evaluation:

• Change of knowledge & skill

• Follow-up

Follow-up

As a result of the training, my ability to develop, implement and analyze the following evaluation measures has seen …

Rating scale: 3 = substantial progress, 2 = some progress, 1 = no progress

Item                                  Rating
1. Use of demographic data (S4c)      3 2 1
2. Use of satisfaction data (S4d)     3 2 1
3. Use of A, K or S… (S4e)            3 2 1
4. The remainder of PI…               3 2 1

Aggregating the Data

Data Collection

What data did we collect?

• Number of events/activities (effort)
• Satisfaction (effort)
• Change of awareness (effect)
• Change of knowledge & skill (effect)
• Follow-up evaluation (effect)

On what did we collect data?

• Outcome: Use of formative & summative evaluation (Systems 4)
• Performance indicators:
  • demographic data (S4c)
  • satisfaction data (S4d)
  • awareness, etc. (S4e)
  • outcomes measures (S4g)
  • ongoing feedback (S4j)
  • dissemination (S4k)

Data (means by event)

                  Webinar    Series     Visits     Follow-up
                  (5 point)  (5 point)  (5 point)  (3 point)
Demo. data                              m = 4      m = 3
Satisfaction                 m = 4      m = 4      m = 3
A, K or S                    m = 4      m = 4      m = 3
Outcomes          m = 4      m = 3      m = 4      m = 3
Ongoing feedback                        m = 3      m = 2.5
Dissemination                           m = 3      m = 2

Aggregate the data

• Option 1: Each event carries equal weight

• Option 2: Assign a different weight to each event because events have different levels of importance
  • Follow-up evaluation should carry more weight because the intensity of the effort was greater

Conversion to a 4-point scale

Convert 5-point scale to 4-point:

• 5 (strongly agree) = 4 (achieved)
• 4 (agree) = 3 (nearly)
• 3 (neither) = 2 (emerging)
• 2 & 1 (disagree) = 1 (non-existent)

Convert 3-point scale to 4-point:

• 3 (substantial) = 4 (achieved)
• 2 (some) = 3 (nearly)
• 1 (no) = 1 (non-existent)
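A minimal sketch of these conversions in Python (the function and variable names are illustrative, not NTAC's); fractional means such as the 2.5 that appears in the follow-up data are interpolated between the mapped scale points:

```python
# Scale-point conversions from the slide above (illustrative sketch).
FIVE_TO_FOUR = {5: 4, 4: 3, 3: 2, 2: 1, 1: 1}   # SA->achieved ... D & SD->non-existent
THREE_TO_FOUR = {3: 4, 2: 3, 1: 1}              # substantial->achieved, some->nearly, no->non-existent

def convert(mean, mapping):
    """Map a mean rating onto the 4-point scale, interpolating
    linearly between mapped points for fractional means."""
    lo = int(mean)
    if mean == lo:
        return mapping[lo]
    frac = mean - lo
    return mapping[lo] + frac * (mapping[lo + 1] - mapping[lo])

print(convert(4, FIVE_TO_FOUR))     # 3 (agree -> nearly)
print(convert(2.5, THREE_TO_FOUR))  # 3.5, as in the Follow-up column
```

This reproduces the converted table values: for example, a 5-point mean of 4 becomes 3 (nearly), and the 3-point mean of 2.5 for ongoing feedback becomes 3.5.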

                  Webinar    Series     Visits     Follow-up
                  (4 point)  (4 point)  (4 point)  (4 point)
Demo. data                              m = 3      m = 4
Satisfaction                 m = 3      m = 3      m = 4
A, K or S                    m = 3      m = 3      m = 4
Outcomes          m = 3      m = 2      m = 3      m = 4
Ongoing                                 m = 2      m = 3.5
Dissem.                                 m = 2      m = 3
Subtotal          3          7          16         22.5
Weight            .25        .25        .25        .25
Total             .75        1.75       4          5.625

Mean of the means: .75 + 1.75 + 4 + 5.625 = 12.125

12.125 divided by 4 events = 3.03 (nearly achieved)
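The equal-weight computation above can be sketched in Python (subtotals copied from the table; the event names are illustrative):

```python
# Equal-weight aggregation of the converted 4-point subtotals.
subtotals = {"webinar": 3, "series": 7, "visits": 16, "follow_up": 22.5}
weights = {event: 0.25 for event in subtotals}   # each event carries equal weight

total = sum(subtotals[e] * weights[e] for e in subtotals)  # .75 + 1.75 + 4 + 5.625
mean_of_means = total / len(subtotals)                     # divide by 4 events
print(total, round(mean_of_means, 2))                      # 12.125 3.03
```

The result, 3.03 on the 4-point scale, is what the slide reads as "nearly achieved".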

What does this mean?

Effort

• NTAC conducted one national webinar, a series of webinars for one region, and several onsite consultations with two states to increase the states’ capacity to use formative and summative evaluation systems. Across the trainings, participants indicated 90% satisfaction with the skill of the consultants and the content of the activities.

What does this mean?

Effect

• Across the trainings and consultations, participants report that they are very near achieving the ability to develop, implement, and analyze formative and summative evaluation measures to increase capacity and systems change (m = 3.03 on a 4.0 scale).

• Could elaborate by listing performance indicators if needed.

Response to NTAC Site Review: 2004 – 2005 Field test

1) Embed Outcomes and Performance Indicators (OPIs) in planning and delivery of service.

2) Embed OPIs in all evaluation measures.

3) Share our evaluation systems and data with our state/multi-state partners.

Addressing the constraints

Aligning needs, delivery and evaluation through the OPIs yields a more seamless system (Constraint 1)

Sharing data facilitates the use of data (Constraint 2)

Consistency helps diminish confusion (Constraint 3)

Questions?

Contact

Region 1
• Shawn Barnard
• Paddi Davies

Region 2
• Jon Harding
• Barb Purvis

Region 3
• Nancy Donta
• Amy Parker

Region 4
• Kathy McNulty
• Therese Madden Rose

Additional Examples

The following information will not be shared during the discussion, but is being provided as additional material.

Weighting the events differently

Using the previous examples, let’s say that we believe the follow-up data should carry more weight since it indicates more long-term implementation and attainment of the outcome.

We want the follow-up evaluation to carry 40% of the weight.

                  Webinar    Series     Visits     Follow-up
                  (4 point)  (4 point)  (4 point)  (4 point)
Demo. data                              m = 3      m = 4
Satisfaction                 m = 3      m = 3      m = 4
A, K or S                    m = 3      m = 3      m = 4
Outcomes          m = 3      m = 2      m = 3      m = 4
Ongoing                                 m = 2      m = 3.5
Dissem.                                 m = 2      m = 3
Subtotal          3          7          16         22.5
Weight            .20        .20        .20        .40
Total             .60        1.40       3.2        9.0

Mean of the means: .60 + 1.40 + 3.2 + 9.0 = 14.20

14.20 divided by 4 events = 3.55 (with rounding – achieved!)
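The 40% weighting works the same way as the equal-weight case; only the weights change (a sketch with the same illustrative event names as the table):

```python
# Follow-up carries 40% of the weight; the other three events 20% each.
subtotals = {"webinar": 3, "series": 7, "visits": 16, "follow_up": 22.5}
weights = {"webinar": 0.20, "series": 0.20, "visits": 0.20, "follow_up": 0.40}

total = sum(subtotals[e] * weights[e] for e in subtotals)  # .60 + 1.40 + 3.2 + 9.0
print(round(total, 2), round(total / 4, 2))                # 14.2 3.55
```

Shifting weight to the follow-up data raises the aggregate from 3.03 to 3.55, which rounds to "achieved" on the 4-point scale.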

Embed in basic evaluation (Service Provider 1a)

Satisfaction data
• “I was satisfied with my opportunity to learn about the impact of deaf-blindness on an individual’s overall development (i.e. social, emotional, cognitive).”

Change of awareness
• “I have increased my awareness about the impact of deaf-blindness on an individual’s overall development (i.e. social, emotional, cognitive).”

Change of knowledge/skill
• “As a result of the training, I can use my knowledge about the impact of deaf-blindness on an individual’s overall development (i.e. social, emotional, cognitive) to plan instruction.”

Embed in Follow-up

Service Providers
• “Based on the recent training provided on understanding how a combined vision and hearing loss impacts learning and social/emotional development, please indicate your progress in performing the following tasks…”

Child change
• “Three months ago, you received technical assistance on understanding how a combined vision and hearing loss impacts learning and social/emotional development. As a result of that training, please indicate any progress the student has made in the following skills…”
