Performance Monitoring: Thoughts, Lessons, and Other Practical Considerations


slide 2

Objectives

Identify key concepts
Discuss applications
Address questions and concerns

slide 3

Key Concepts

Performance monitoring
  What you do
  How well you do it
  Do you accomplish something?
Process, quality, capacity, outcomes
The window
Baselines and standards
Risk or case mix adjustment

slide 4

Performance Monitoring

Part of a much larger cycle of program design and implementation
Performance - this is about definitions: inputs, outputs, or the relationship between inputs and outputs?
Monitoring - this is about data collection and analysis
Important with respect to investment - are you getting something back?
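To make the inputs/outputs distinction concrete, here is a minimal Python sketch; every figure in it is invented for illustration and does not come from the slides. Performance can be defined on inputs alone, on outputs alone, or on the relationship between them, and each definition answers a different "are you getting something back?" question.

    # Hypothetical program figures for one reporting period (invented numbers).
    funding_invested = 250_000      # input: dollars spent on the program
    families_served = 125           # output: units of service delivered
    successful_closures = 80        # outcome: cases closed successfully

    # Each ratio is a different definition of "performance".
    cost_per_family = funding_invested / families_served       # input per output
    cost_per_success = funding_invested / successful_closures  # input per outcome
    success_rate = successful_closures / families_served       # outcomes per output

    print(f"${cost_per_family:,.0f} per family served")
    print(f"${cost_per_success:,.0f} per successful closure")
    print(f"{success_rate:.0%} of served families reached a successful closure")

Cost per family says nothing about whether the investment produced results, while the success rate ignores cost entirely; the definition chosen determines what gets monitored.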

slide 5

The Framework

slide 6

Process of care

Referral, intake, and assessment Service planning, link to

interventions Reassessment, follow-up, case

closure

slide 7

Quality of Care

Human resources
Physical plant and equipment
Practice protocols - evidence base
Supervision
Consumer feedback
Agency management around practice model fidelity

slide 8

Capacity

Enough trained workers
Enough office space
Enough funding
Enough information
"Enough" is defined by the relationship between process, quality, and outcomes

slide 9

Outcomes

Depends on the program and intervention
  Well-being
  Safety
  Family provides stable nurturing
  Education
  Health
  Behavioral health

slide 10

Process, Quality, and Outcomes

Highly interdependent
Quality depends on a process
Process is different from quality

Quality without outcomes is ‘inefficient’

Agencies invest in process, quality, and capacity

slide 11

The Window

Performance happens in time
Improvement is change in performance over time
Sampling in time is difficult but critical

slide 12

Clinical Experience in Time
(Each line represents the start and end of service within the window)
[Figure: timeline of service episodes, with axis marks at Jan. 1, 2000, Jan. 1, 2001, and Jan. 1, 2002]

slide 13

Sampling

Inception
Process vs. child
How much time do you have to observe the process?
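As a rough illustration of how the window and the sampling choice interact, here is a hedged Python sketch; the episode dates are invented and pandas is assumed to be available. The same episode file yields a different sample depending on whether you take an inception (entry) cohort or everyone active at some point during the window.

    import pandas as pd

    # Hypothetical service episodes: start and end of service for each case.
    episodes = pd.DataFrame({
        "case_id": [1, 2, 3, 4, 5],
        "start": pd.to_datetime(["1999-06-15", "2000-03-01", "2000-11-20",
                                 "2001-02-10", "2001-12-05"]),
        "end":   pd.to_datetime(["2000-04-30", "2000-09-15", "2001-08-01",
                                 "2002-03-31", "2002-06-30"]),
    })

    # The observation window, e.g., Jan. 1, 2000 through Jan. 1, 2002.
    window_start = pd.Timestamp("2000-01-01")
    window_end = pd.Timestamp("2002-01-01")

    # Inception (entry) cohort: cases whose service started inside the window.
    entry_cohort = episodes[
        (episodes["start"] >= window_start) & (episodes["start"] < window_end)
    ]

    # Active-in-window sample: cases whose service overlapped the window at all.
    active_in_window = episodes[
        (episodes["start"] < window_end) & (episodes["end"] >= window_start)
    ]

    print(len(entry_cohort), "entered during the window;",
          len(active_in_window), "were active at some point in it")

The entry cohort gives every case the same starting point, but cases that enter late in the window leave little time in which the process can be observed, which is the "how much time do you have to observe the process?" problem.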

slide 14

Baselines and Standards

A baseline is a measure taken prior to intervention
Standards of practice and performance
  The usual, as in standard practice
  Fidelity or compliance

Standards are better suited to process and quality; baselines are better suited to outcomes
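A small sketch of that distinction follows, with invented rates and an invented threshold (none of these values come from the slides): a process measure is judged against a fixed practice standard, while an outcome measure is judged against its own pre-intervention baseline.

    # Process/quality: compare observed practice against a fixed standard.
    assessments_due = 120
    assessments_completed_on_time = 102
    process_rate = assessments_completed_on_time / assessments_due  # 0.85
    PRACTICE_STANDARD = 0.90          # hypothetical compliance threshold
    meets_standard = process_rate >= PRACTICE_STANDARD

    # Outcome: compare the current rate against the pre-intervention baseline.
    baseline_reentry_rate = 0.18      # measured before the intervention began
    current_reentry_rate = 0.14      # measured after implementation
    change_from_baseline = baseline_reentry_rate - current_reentry_rate

    print(f"on-time assessment rate {process_rate:.0%}; meets standard: {meets_standard}")
    print(f"re-entry rate improved by {change_from_baseline:.1%} relative to baseline")

The standard is fixed in advance and applies to every case; the baseline is whatever the outcome looked like before the intervention, which is why it has to be measured rather than declared.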

slide 15

Risk or Case Mix Adjustment

An important question when facing variation in performance: is the variation a function of performance, or the result of client differences?
Children/families have different outcomes for reasons that are intrinsic to them
  Baseline mortality rates differ by age

Adjustment for case mix refers to taking the intrinsic differences into account somehow when measuring outcomes
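Below is a minimal sketch of one common form of case mix adjustment, direct standardization by age group, using invented counts and an assumed 50/50 reference age mix; real adjustment models are usually richer than this.

    # Hypothetical counts: two agencies serve different age mixes, so their
    # crude adverse-outcome rates are not directly comparable.
    # (agency, age_group) -> (cases served, cases with the adverse outcome)
    counts = {
        ("A", "0-5"):  (200, 10),
        ("A", "6-17"): (100, 2),
        ("B", "0-5"):  (100, 5),
        ("B", "6-17"): (200, 4),
    }

    # Assumed reference case mix used to standardize both agencies.
    reference_mix = {"0-5": 0.5, "6-17": 0.5}

    def crude_rate(agency):
        served = sum(n for (a, _), (n, _) in counts.items() if a == agency)
        adverse = sum(k for (a, _), (_, k) in counts.items() if a == agency)
        return adverse / served

    def adjusted_rate(agency):
        # Weight each age-specific rate by the reference mix rather than the
        # agency's own mix, holding the intrinsic age differences constant.
        return sum(reference_mix[age] * (k / n)
                   for (a, age), (n, k) in counts.items() if a == agency)

    for agency in ("A", "B"):
        print(agency, f"crude {crude_rate(agency):.3f}",
              f"adjusted {adjusted_rate(agency):.3f}")

With these invented numbers the crude rates differ (0.040 vs. 0.030) while the adjusted rates are identical (0.035); the apparent performance gap is entirely a product of who the agencies serve, not how they perform.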

slide 16

Case Mix Adjustment Applied

Case mix adjustment makes more sense for outcomes, less so for process and quality

Process/quality standards apply to all children, provided the standard applies to the case in the first instance (differential diagnosis)

Baselines for outcomes should be adjusted

Standards don’t work as well for outcomes because of the random component.

slide 17

Comments, Questions, Concerns

Thank you!