1.11 Data and Performance Simplified

TRANSCRIPT

Page 1: 1.11 Data and Performance Simplified

National Alliance to End Homelessness Washington, DC

July 2011

Presented by Iain De Jong

Data & Performance Measurement Simplified

Page 2: 1.11 Data and Performance Simplified

To Get Started

•  If you want a copy of the presentation, send an email.

•  At some point the presentations will be posted on the NAEH website (www.naeh.org), along with oodles of other cool resources.

•  The OrgCode website (www.orgcode.com) will also contain the presentation and other information you may find useful and/or interesting.

Page 3: 1.11 Data and Performance Simplified

Interactive Options

@orgcode

[email protected]

Page 4: 1.11 Data and Performance Simplified

Overview

a)  About the Presenter
b)  Why Do We Need Good Data?
c)  You’re Better At This Than You Thought
d)  Key Concepts and Definitions
e)  How It All Works Together
f)  Case Study
g)  Funder Expectations
h)  Creating a Data Loving Culture
i)  Common Problems
j)  Q and A

Page 5: 1.11 Data and Performance Simplified

About Iain De Jong

•  Managing Partner, OrgCode Consulting Inc.

•  Performance measurement, standards and quality assurance program development for a $750 million housing, shelter and homelessness service system

•  Developed and provided leadership to North America’s largest integrated ICM Housing First program

•  Reduced street homelessness in North America’s fifth-largest city by 50% in three years using a data-driven approach

•  Worked with non-profits, academics, funders, elected officials and government in more than 100 jurisdictions from Seattle to NYC; Whitehorse to Chattanooga; Milan to Melbourne

Page 6: 1.11 Data and Performance Simplified

About Iain De Jong

•  Examples of current projects:
–  Optimizing data and performance, especially for HEARTH

–  Speaking series on ending homelessness

–  10 Year Plan Evaluations

–  Service User Surveys & Counts

–  Service Prioritization Decision Assistance Tool

–  Municipal Housing Strategies

–  Housing First, Rapid Re-housing and Intensive Case Management Training

•  Recipient of 16 national and international awards and accolades in housing and homelessness for innovation, quality assurance, leadership and policy

•  Part-time faculty member in Graduate Planning Programme at York University

Page 7: 1.11 Data and Performance Simplified

Math & The Human Services Professional

•  I know what you’re thinking

–  Pick a number between 1 and 25
–  Double it
–  Add 12
–  Divide by 2
–  Subtract the original number
–  And the answer is…
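(For anyone who wants to check the arithmetic behind the reveal: whatever number x you pick, (2x + 12) / 2 − x = x + 6 − x = 6, so the answer is always 6.)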

•  I know how you feel

Page 8: 1.11 Data and Performance Simplified

Why Do We Need Good Data?

Page 9: 1.11 Data and Performance Simplified

Good Data: Understanding What You’re Doing

“It is a capital mistake to theorize before one has data.”

Arthur Conan Doyle

Page 10: 1.11 Data and Performance Simplified

Good Data

•  The plural of “anecdote” is not data.

•  If you can’t measure it, you can’t manage it.

•  “Some” is not a number, “soon” is not a time.

•  A sample size of “one” is unlikely to be indicative of the population as a whole and is several data points short of a trend.

•  That which we think and that which we know can be two totally different things.

Page 11: 1.11 Data and Performance Simplified

Good Data

•  Demonstrate prudent use of funding.

•  Make program improvements.

•  Meet client needs.

•  Ensure delivery of service relative to mandate.

•  Inform hiring and staffing decisions.

Page 12: 1.11 Data and Performance Simplified

Good Data: A Client-Centred Approach

[Diagram: a client-centred data cycle connecting the following elements]

•  # of Different Clients Served
•  # of Clients Served > 1 Time
•  Quality of Life Changes As A Result of Service
•  Unmet Service Needs
•  System Gaps
•  Basic Demographics
•  What Happened As A Result of Your Service?
•  Advocate for System Change
•  Prioritize Who Gets Served Using Which Services

Page 13: 1.11 Data and Performance Simplified

Good Data: 5 Good Reasons To Collect It & Use It

1.  To understand whether current activities are working to achieve intended results.

2.  To drive program improvement and share information on effective practices with others.

3.  To ensure a common understanding among all partners, staff, and clients of what you intend to achieve and how you intend to do it.

4.  To communicate and advocate for community support, public interest, combating NIMBY, leveraging funding.

5.  To accomplish your goals. What gets measured, gets done.

Page 14: 1.11 Data and Performance Simplified

How Many?...How Much? ™

[Chart: a statistical measure of “badness”, on a scale from “Really Bad” to “Not so bad”]

Page 15: 1.11 Data and Performance Simplified

PAUSE

-  Someone (or seven) has come in late
-  If they sat near you, tell them how they can get the presentation

Page 16: 1.11 Data and Performance Simplified

You’re better at this than you thought!

Page 17: 1.11 Data and Performance Simplified
Page 18: 1.11 Data and Performance Simplified

Learning to ask the right questions.

I want to live in a world where a chicken can cross the road without its motives being questioned.

Page 19: 1.11 Data and Performance Simplified

Find x

[Diagram: a right triangle with legs labelled 3” and 6” and the hypotenuse labelled x]

Page 20: 1.11 Data and Performance Simplified

Find x

[The same triangle, now with an arrow pointing straight at the letter x]

Found it!

Page 21: 1.11 Data and Performance Simplified

Key Definitions & Core Concepts

“It depends on what your definition of ‘is’ is.”

- Bill Clinton

Page 22: 1.11 Data and Performance Simplified

Inputs

Inputs include resources dedicated to, or consumed by, the program: money, staff and staff time, volunteers and volunteer time, facilities, equipment, and supplies.

Page 23: 1.11 Data and Performance Simplified

Activities

Activities are what the program does with the inputs to fulfill its mission, such as providing shelter, managing housing subsidies, or providing case management.

Page 24: 1.11 Data and Performance Simplified

Outputs

Outputs are the direct products of program activities.

Outputs are usually presented in terms of the volume of work accomplished: the number of participants served, the percentage of participants who received rent subsidies and the average subsidy value, or the frequency and intensity of service engagements each participant received.

Outputs document what you delivered, so you can exactly replicate or adjust your approach in the future.

Page 25: 1.11 Data and Performance Simplified

Outcomes

Outcomes are benefits or changes among clients during or after participating in program activities.

Outcomes may relate to change in client knowledge, attitudes, values, skills, behaviors, conditions, or other attributes.

You can quantify a program’s outcomes by methodically mapping and describing its results.

Page 26: 1.11 Data and Performance Simplified

Let’s Make & Bake a Cake!

•  What resources (inputs) do we need?

•  What do we need to do (activities) with those inputs?

•  What do those activities result in (outputs)?

•  What is the benefit (outcomes) of those outputs?
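One possible way to fill this in, offered as an illustration rather than something from the slides: the inputs are flour, eggs, sugar, a pan, an oven and an hour of the baker’s time; the activities are mixing the batter and baking it; the output is one cake, cut into 12 slices and served; the outcome is a dozen guests who are fed and, ideally, happier than they were before dessert.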

Page 27: 1.11 Data and Performance Simplified

Outputs and Outcomes: Common Confusion

OUTPUTS…

•  Focused on what the client and/or program will do to achieve the outcome.

• Quantified in terms of the frequency and intensity of the activity from the client’s perspective.

• Specific to the activity described for the program.

•  Feasible.

• Attainable.

• Understandable to someone outside of the program.

OUTCOMES…

•  Focused on what the client will gain from that program.

• Quantified in terms of the client-level impact with clear targets and methods.

• Specific and attributable to (a result of) that program.

• Meaningful.

• Attainable.

• Understandable to someone outside of the program.

Page 28: 1.11 Data and Performance Simplified

Creating a Data Typology

•  Remember what you learned in 9th Grade Science

•  Categorize your data based upon such things as funding source, type of program, type of service, characteristics of client served (singles, families, veterans, substance users, consumer survivors), who offers the service, etc.

Page 29: 1.11 Data and Performance Simplified

Creating a Data Typology

•  Don’t lose sight of the client in the typology!

•  The six types of people you serve will always be the same:

1.  Someone’s father
2.  Someone’s mother
3.  Someone’s brother
4.  Someone’s sister
5.  Someone’s son
6.  Someone’s daughter

Page 30: 1.11 Data and Performance Simplified

Logic Models

•  Used to organize inputs, activities, outputs and outcomes for a project or program.

•  Clearly state the problem/need and a service to address that problem/need.

•  Should make the road map clear – not purposely obfuscate.

•  May indicate how the program will be evaluated, other measures to be considered, and/or the timeline.

Page 31: 1.11 Data and Performance Simplified

A project logic model may look like this…

Project Logic Model

Agency Name:
Problem Statement:
Service:

Input | Activity | Output | Outcome | Measurement Tool | Timeline

Page 32: 1.11 Data and Performance Simplified

A program logic model may look like this…

Program Logic Model

Agency Name:
Problem Statement:
Program:

Service | Input | Activity | Output | Outcome | Measurement Tool | Timeline
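To make the templates concrete, here is one hypothetical way a single row might be filled in (illustrative only, not taken from the slides): Service – rapid re-housing case management; Input – two case managers and a rent subsidy fund; Activity – housing search, landlord mediation and monthly home visits; Output – 40 households placed in housing per year; Outcome – 85% of placed households still housed at 12 months; Measurement Tool – HMIS entry/exit and follow-up records; Timeline – reviewed quarterly.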

Page 33: 1.11 Data and Performance Simplified

Interactive Case Study

It’s All About You…

Page 34: 1.11 Data and Performance Simplified

Your Reality

•  One volunteer raise your hand!

•  Congratulations – you now get to pick someone else at random!

•  Let’s walk through the inputs, activities, outputs and outcomes of a service you are doing or thinking about doing together.

Page 35: 1.11 Data and Performance Simplified

Your Reality: Everyone This Time!

•  Think of a service you provide or are thinking of providing

•  Jot down what you believe the inputs, activities, outputs and outcomes would be

•  Chat with the person on either side of you

Page 36: 1.11 Data and Performance Simplified

Analyzing Your Data

Page 37: 1.11 Data and Performance Simplified

Start with a Baseline

•  Baseline data is the basic information gathered before a program starts.

•  Baseline data is used later as the point of comparison for determining whether the program is having an impact.

•  Clear goals and objectives for a program make it easier to determine baseline data.

•  There are two types of baseline data:
–  Determinate – clearly indicated by the goals and objectives of the program.
–  Indeterminate – useful for understanding context, though not directly related to goals and objectives.
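As an illustration (not from the slides): if a rapid re-housing program’s goal is to cut the average length of homelessness, then the average number of days participants were homeless at intake, measured before the program starts, is determinate baseline data; the local rental vacancy rate is indeterminate – useful context, but not tied directly to the goal.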

Page 38: 1.11 Data and Performance Simplified

Turkeys & Mutual Funds – What Both Have to Tell Us About Community Responses to Homelessness


Page 39: 1.11 Data and Performance Simplified


Page 40: 1.11 Data and Performance Simplified

What the Turkey Story Tells Us


Page 41: 1.11 Data and Performance Simplified

Set Clear Targets & Milestones in Advance

•  A target is the future reference point towards which you aim your goal.

•  While some will set a target as an aspiration, it is better to be SMART: Specific, Measurable, Attainable, Realistic and Timed.

•  A milestone is a pre-determined or major project event, such as the first cohort through a program, the end of a funding year, the 100th graduate, etc.
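For example (an illustration, not from the slides): “house 120 chronically homeless adults by December 2012, with 85% still housed at 12 months” is a SMART target; “end street homelessness” on its own is an aspiration.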

Page 42: 1.11 Data and Performance Simplified

Create a Data Analysis Plan at the Beginning

•  Look at how you are doing compared to your baseline at predetermined times – quarterly, annually, etc.

•  Look at how you are doing relative to your targets at predetermined times – monthly, annually, etc.

•  Look at how you are doing each time a milestone is reached.

Page 43: 1.11 Data and Performance Simplified

Make Data Analysis Happen Effectively

•  Do it when you say you will do it.

•  Present your findings in different ways – visually, written, numerically, etc.

•  Avoid common mistakes: an incomplete data set; averaging averages (see the quick example after this list); using words like “sample” or “significant” in the wrong way; etc.

•  Don’t ignore the data if it tells you something you didn’t want to see/know.
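A quick illustration of why averaging averages misleads, using invented numbers rather than data from the presentation: if one program houses 9 of 10 clients and another houses 50 of 100, the mean of the two success rates is 70%, but the true pooled rate across everyone served is only about 54%.

```python
# Hypothetical numbers showing the "averaging averages" mistake.
programs = [
    {"name": "Program A", "housed": 9, "served": 10},    # 90% success
    {"name": "Program B", "housed": 50, "served": 100},  # 50% success
]

# Mean of the per-program rates ("average of averages")
mean_of_rates = sum(p["housed"] / p["served"] for p in programs) / len(programs)

# True pooled rate across everyone actually served
pooled_rate = sum(p["housed"] for p in programs) / sum(p["served"] for p in programs)

print(f"Average of averages: {mean_of_rates:.0%}")  # 70%
print(f"Pooled success rate: {pooled_rate:.0%}")    # 54%
```

The smaller program pulls the first number up even though it served a tenth as many clients; report the pooled figure (or both, clearly labelled).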

Page 44: 1.11 Data and Performance Simplified

"The only man I know who behaves sensibly is my tailor; he takes my measurements anew each time he

sees me. The rest go on with their old

measurements and expect me to fit them.”

- George Bernard Shaw


Page 45: 1.11 Data and Performance Simplified

Another Way of Thinking About How to Simplify Data

Page 46: 1.11 Data and Performance Simplified

A Vision to End Homelessness

Page 47: 1.11 Data and Performance Simplified

And if you are a funder – how to be a nicer, gentler and more organized requester of data…

Meeting Funder Expectations

Page 48: 1.11 Data and Performance Simplified

Grants Management Lifecycle

Page 49: 1.11 Data and Performance Simplified

HEARTH: Key Indicators

•  HEARTH identifies the following indicators to be used by HUD:
–  Length of time homeless
–  Recidivism (subsequent return to homelessness)
–  Access/coverage (thoroughness in reaching persons who are homeless)
–  Overall reduction in number of persons who experience homelessness
–  Job and income growth for persons who are homeless
–  Reduction in first time homeless
–  Other accomplishments related to reducing homelessness
–  Prevention/independent living for families with children and youth defined as homeless under other federal programs
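A minimal sketch of how two of these indicators might be calculated from program records, using invented field names rather than an actual HMIS export format:

```python
from datetime import date

# Hypothetical shelter-stay records; a real HMIS export has different fields.
stays = [
    {"client": "A", "entry": date(2010, 1, 4),  "exit": date(2010, 3, 1),  "to_permanent_housing": True},
    {"client": "A", "entry": date(2010, 9, 15), "exit": date(2010, 11, 2), "to_permanent_housing": True},
    {"client": "B", "entry": date(2010, 2, 1),  "exit": date(2010, 2, 20), "to_permanent_housing": False},
]

# Indicator: length of time homeless (average days per stay)
lengths = [(s["exit"] - s["entry"]).days for s in stays]
avg_length = sum(lengths) / len(lengths)

# Indicator: recidivism - clients who re-entered after an exit to permanent housing
exited_to_housing = {s["client"] for s in stays if s["to_permanent_housing"]}
returned = {
    s["client"]
    for s in stays
    if any(prev["to_permanent_housing"] and prev["exit"] < s["entry"]
           for prev in stays if prev["client"] == s["client"])
}

print(f"Average length of stay: {avg_length:.0f} days")
print(f"Returns to homelessness: {len(returned)} of {len(exited_to_housing)} clients housed")
```

The point is not the code itself but that each indicator reduces to a simple, repeatable calculation once entry dates, exit dates and destinations are recorded consistently.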

Page 50: 1.11 Data and Performance Simplified

Meeting Funder Expectations

•  Meet core requirements of funding approval.

•  Ensure data collection is in place at the time the program is started – not after!

•  Don’t be afraid to collect data to meet your own needs too!

Page 51: 1.11 Data and Performance Simplified

And if you are a Funder…

•  Have a longitudinal perspective even if the funding is for a shorter number of years at a time.

•  Be consistent.

•  Create business practices that ensure data collection is accurate and comparable.

•  No retroactive requests!

•  Respect that the landscape of community partners is already stressful…

Page 52: 1.11 Data and Performance Simplified

Under Pressure: Stress Landscape of Community Partners

Page 53: 1.11 Data and Performance Simplified

Some Data Requests Are Outside Your Control

•  Lift your right leg.
•  Turn your leg clockwise.
•  Extend your right index finger.
•  In the air, draw the number six.
•  What happened to your leg?
•  Now force your leg to go back to the original clockwise direction.

Page 54: 1.11 Data and Performance Simplified

Creating a Data Loving Culture

Helping you, your staff and peers find their inner nerd and love it.

Page 55: 1.11 Data and Performance Simplified

Align Metrics to Vision

Page 56: 1.11 Data and Performance Simplified

Creating a Data Loving Culture

•  Ask for input – what do others think is important to track.

•  Be consistent in what you want and how you want it.

•  Follow change management practices and realize the six stages of change apply to data behavior too!

•  Create time in the work day for data entry as part of the “real” work.

•  Use data to report back in staff meetings, newsletters, community meetings, press releases, fundraisers, etc.

Page 57: 1.11 Data and Performance Simplified

Creating a Data Loving Culture

•  Set realistic targets and incentivize!

•  Simplify the message.

•  Track only what you need to track to know if the mission is being accomplished.

•  Systematic review as easy as 1, 2, 3!

•  Use adult learning strategies to make your data “real” to all of your audiences.

•  Avoid junk science and distracters.

Page 58: 1.11 Data and Performance Simplified

Common Problems

And how to fix them.

Page 59: 1.11 Data and Performance Simplified

Problem: Difference between output and outcome
Solution: Output is what will be done. Outcome is what will be gained – the result.

Problem: Overshooting on targets
Solution: Examine targets on a regular cycle. Revise at start of funding cycle.

Problem: Lag in data entry
Solution: Ensure time is set aside every day for timely data entry.

Problem: Staff resistance
Solution: Fire them. Kidding. Try the ideas for a data loving culture.

Page 60: 1.11 Data and Performance Simplified

Problem: Just meeting funder needs
Solution: Focus on your own needs and becoming reflective practitioners.

Problem: Complex data system / HMIS
Solution: Consider moving to a friendly platform. Build in time for training.

Problem: Different approaches across multiple program areas in one agency
Solution: Establish an agency standard. Appoint someone to create a common process.

Problem: Multiple funders with different requirements
Solution: Slap them. Kidding. Try to find commonality in outputs and alter the logic model to best capture other areas.

Page 61: 1.11 Data and Performance Simplified

Problem: Confusing logic model
Solution: Keep it simple. Consider starting at a service level and working up to a program level.

Problem: Brainiacs mucking around
Solution: Design your data process, data collection and data analysis such that a PhD is not required.

Problem: Want to analyze the data in different ways
Solution: When choosing an approach, make sure you can export data for other analysis.

Page 62: 1.11 Data and Performance Simplified

Problem: Can’t tell if making a difference
Solution: Don’t collect data for data’s sake. Ensure what you collect tells you if you are getting results.

Problem: A gazillion pieces of data
Solution: KISS. Collect only what you need to know if you are achieving results.

Problem: Fruits and vegetables
Solution: Build consistency throughout your agency and entire CoC. Consider a data committee.

Problem: Forgot the good stuff from the presentation
Solution: Email me. You can have the whole thing.

Page 63: 1.11 Data and Performance Simplified

Questions

Page 64: 1.11 Data and Performance Simplified

Thank you.

twitter.com/orgcode

@orgcode