Program Evaluation, Research Design, and the Logic Model
Research Methods for Public Administrators
Dr. Gail Johnson
Adapted, with additional material, by Dr. Mario Rivera for PADM 552, Designing Applied Research


Page 1:

Program Evaluation, Research Design, and the Logic Model
Research Methods for Public Administrators
Dr. Gail Johnson
Adapted, with additional material, by Dr. Mario Rivera for PADM 552, Designing Applied Research

Page 2:

Multiple Methods—Mixed Methods

Combination and integration of quantitative and qualitative methods
Neither is inherently better; the realities of the situation rule
Each works well in some situations, less well in others
Arguably, all research is qualitative at its inception (choosing and defining the research question) and at its end (interpreting and evaluating findings); most program evaluations and other applied research involve multi-method triangulation (triangulation of both method and data)
Quantitative and qualitative data collection are often used together in mixed-methods approaches:
  Available data with surveys
  Surveys with observations
  Pre-post data analysis (for instance, using t-tests; see the sketch after this list)
  Surveys with focus groups
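As a concrete illustration of the pre-post analysis mentioned above, the sketch below runs a paired t-test on before/after scores for the same participants. It is a minimal example: the scores are invented, and the use of Python with scipy is an assumption for illustration, not something specified in the slides.

```python
from scipy import stats

# Hypothetical pre- and post-program scores for the same ten participants
# (illustrative numbers only; not from the slides).
pre  = [52, 60, 45, 70, 66, 58, 49, 63, 55, 61]
post = [58, 64, 50, 75, 69, 65, 52, 70, 57, 66]

# Paired (dependent-samples) t-test: did scores change from pre to post?
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the average pre-post change is unlikely to be
# zero; it says nothing by itself about what caused the change.
```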

Page 3:

The Hypothesis Dilemma

Hypotheses are designed to express relationships between or among variables. They are testable propositions that formally state theoretical expectations about those relationships (a simple example appears in the sketch after this slide). If this is the nature of your topic or question, a hypothesis can add to your research. However, if your topic is more descriptive, exploratory, or analytical, generating a hypothesis may not be appropriate.
A hypothesis may not be appropriate if:
  You have not identified and articulated a particular theoretical construct
  You do not have a set of defined variables
  Your question turns on phenomenological description
  Your question entails an analytical review of the problem posed
  Your question involves, in whole or in part, the study of a cultural group
  You will both engage in and research the program in question
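To make the idea of a testable proposition concrete, the minimal sketch below tests a hypothesized relationship between two defined variables. The variables, data, and Python/scipy tooling are hypothetical illustrations, not part of the original material.

```python
import numpy as np
from scipy import stats

# Hypothetical data: hours of job training received and weekly earnings for
# eight program participants (illustrative numbers only).
training_hours  = np.array([5, 10, 12, 20, 22, 30, 35, 40])
weekly_earnings = np.array([310, 340, 330, 380, 400, 420, 450, 470])

# H0: no linear association between training hours and earnings.
r, p_value = stats.pearsonr(training_hours, weekly_earnings)
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")
# A small p-value would lead us to reject H0 in favor of the hypothesized
# positive relationship; it does not by itself establish causation.
```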

Page 4:

What to Evaluate?

Projects: a single intervention in one location, or a single project implemented in several locations.
Programs: an intervention comprised of various activities or projects which are intended to contribute to a common goal.
Organizations: multiple intervention programs delivered by an organization. While seldom done, it is possible and sometimes desirable to evaluate an entire organization, or a complex of programs essentially defined by and defining an entire organization—e.g., the curriculum of a graduate program, its organizational core, without necessarily looking at all operational elements of the organization.

Page 5:

When to Evaluate?

Before the program starts:
  To improve design
During implementation:
  To improve implementation
  To identify barriers to be removed
  To capture lessons learned about implementation
  To assess a critical element of a program under review, for instance a risk-management training component of an equal opportunity office in a public agency

Page 6:

When to Evaluate

Mid-term evaluation
  Relevance, effectiveness, efficiency
  Lessons learned: a management tool
Impact evaluation (versus outcome evaluation)
  Either at the end of the project or a few years after the program has been operating: assessing a mature program
  Can also look at effectiveness, efficiency, early signs of impact, and sustainability
  Lessons learned for future projects or programs
  Distinction between short- and medium-term outcomes or results versus longer-term impacts. An impact is the net, long-term effect or complex of effects ensuing from a program. Gross impacts are affected by other causes, other programs, and secular changes (e.g., in attitudes toward smoking). If one can gauge gross impacts (the longer after the program ends, the greater the exogenous influences), can one isolate and evaluate net program impacts? (One illustrative approach is sketched below.)
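One common way to approximate a net impact, when a comparison group is available, is a simple difference-in-differences calculation: subtract the change that occurred anyway in the comparison group from the gross change observed among participants. The sketch below is illustrative only; the groups, numbers, and pandas-based approach are assumptions rather than anything prescribed in the slides.

```python
import pandas as pd

# Hypothetical pre/post outcome means for participants (treated) and a
# comparison group; the numbers are purely illustrative.
data = pd.DataFrame({
    "group":   ["treated", "treated", "comparison", "comparison"],
    "period":  ["pre", "post", "pre", "post"],
    "outcome": [42.0, 55.0, 41.0, 46.0],   # e.g., employment rate (%)
})

means = data.pivot(index="group", columns="period", values="outcome")

gross_change = means.loc["treated", "post"] - means.loc["treated", "pre"]
secular_change = means.loc["comparison", "post"] - means.loc["comparison", "pre"]

# Difference-in-differences: the change among participants minus the change
# that occurred anyway among non-participants approximates the net impact.
net_impact = gross_change - secular_change

print(f"Gross change: {gross_change:.1f}")
print(f"Net (difference-in-differences) impact: {net_impact:.1f}")
```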

Page 7:

Why Is Evaluation Useful?

Feedback
Accountability
Learning
Improvement
Results
Testing underlying assumptions or theory (change model or theory of change; action model or action theory)
Funding decisions—publicizing and disseminating program results

Page 8:

Evaluation Questions

Compliance/accountability questions: Did the promised activities actually take place as they were planned? (Implementation fidelity)
"How" questions: What sequence or processes led to successful (or unsuccessful) outcomes?
Impact questions: Did the program achieve the desired results? What about positive but unplanned ones? (Intended and unintended consequences)

Page 9:

Types of Evaluations

Auditing: accounting for money. Is the money being spent according to plan? Efficiency and effectiveness.
Monitoring: measuring implementation and results. Is the intervention producing the intended results?
Process: measuring operations and service delivery. Are there problems in service delivery?

Page 10:

Types of Program Evaluations

Feasibility evaluations
  Before the program begins
  Intended to improve program design
Evaluability assessments
  Assess the potential usefulness of the evaluation
  Used to test out different strategies for conducting an evaluation
  What is doable given the situation?

Page 11:

Evaluability Assessment

Helps to define the actual objectives, implementation, and management of a program. The actual objectives may differ from those initially planned.
Determines the coherence of the program: are goals, activities, and program infrastructure linked?

Page 12:

Evaluability Assessment

Key steps in the process:
  Interview key program staff to identify the actual program mission, goals, objectives, and activities.
  Conduct site visits to observe and get a sense of what is going on; these may include interviews with key stakeholders.
  Observe program delivery.

Page 13:

Evaluability Assessment

Reach agreement as to:
  Whether to conduct the evaluation.
  The scope and objectives of the evaluation.
The decision could be to not conduct the evaluation.

Page 14:

Evaluability Assessment: Challenges

Key components of the program may not be well defined:
  Lack of agreement on program objectives.
  Lack of clear, measurable indicators of performance and/or impact.
  The target group may not be clearly defined.
  The delivery system is poorly articulated.

Page 15:

Types of Program Evaluations

Formative evaluations
  During implementation
  Feedback about operations and processes
  Used to make mid-course corrections

Page 16:

Definition: Performance Monitoring

Performance monitoring: the continuous process of collecting and analyzing data to compare how well a project, program, or policy is being implemented against expected results.
Performance measurement tells you the "what" of what has occurred, while program evaluation explores the "why." Program evaluation requires performance data but brings it into an interpretive effort eventuating in judgments of value—evaluative decisions, or evaluation.
Traditional vs. newer forms of assessment: the traditional focus is on inputs, activities, and outputs; the contemporary emphasis is on if-then causal questions, incorporating change and action modeling.

Page 17:

Types of Evaluation: Monitoring

Ongoing review:
  On-time
  On-budget
  On-target
Linked with ongoing management
Measured against established baselines
Indicators of progress toward targets
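A minimal sketch of what monitoring against baselines and targets can look like in practice appears below. The indicator names, values, and the 50% mid-term threshold are hypothetical; the slides do not prescribe any particular tooling.

```python
# Illustrative monitoring check: compare current indicator values against
# baselines and targets. Indicator names and numbers are hypothetical.
indicators = {
    # name: (baseline, current, target)
    "participants_enrolled": (0, 180, 250),
    "budget_spent_share":    (0.0, 0.55, 1.0),
    "sessions_delivered":    (0, 24, 40),
}

for name, (baseline, current, target) in indicators.items():
    progress = (current - baseline) / (target - baseline)
    flag = "on-target" if progress >= 0.5 else "behind"   # mid-term check
    print(f"{name}: {progress:.0%} of target ({flag})")
```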

Page 18:

Types of Program Evaluations

Summative evaluations
  At the end of the program, or after the program has been running long enough to achieve its goals (with "mature" programs)
  Identify lessons learned
  Other issues: unintended outcomes, program sustainability, program efficiency, costs and benefits
  Sometimes called impact evaluations or ex-post evaluations

Page 19:

Program Evaluation

Summative evaluation question: Do public programs work?
  Implied cause-effect relationship: Did the program cause a desired outcome?
  Performance-based: focus on outcomes, results, impacts, goal achievement.

Page 20:

Differences

Formative evaluations
  Project monitoring
  Best suited to the early years of implementation
  Key question: Are we doing things right?
    Have we hired the right people with the right skills?
    Have we marketed the program effectively?
    Have we met our strategic objectives?
    Have we spent our money according to our plan?

Page 21:

Differences

Summative evaluations
  Measuring results or impacts
  A longer time before results or impacts are visible
  Key question: Are we doing the right thing?
    This gets back to the theory or underlying assumptions of the program: we can do an excellent job at training people, but if the problem derives from larger structural economic issues, a training program, no matter how well implemented, may show little result.

Page 22:

Participatory Evaluation

Responsibility for planning, implementing, evaluating and reporting is shared with all stakeholders.

A partnership based on dialogue and negotiation.

Page 23:

Participatory vs. Traditional Evaluation

Participatory                       Traditional
Participant focus and ownership     Donor focus and ownership
Focus on learning                   Focus on accountability and judgment
Flexible design                     Predetermined design
Rapid appraisal methods             Formal methods
Outsiders are facilitators          Outsiders are evaluators

Page 24:

Participatory Evaluation

Participants:
  Engage in group discussions
  Conduct interviews
  Conduct field workshops
  Analyze data and develop conclusions
  Write the report
  Disseminate the evaluation results

Page 25:

Participatory Evaluation

How is it done?
  There is no single right way.
  Commitment to the principles of participation and inclusion.
  Core assumption: those closest to the situation have valuable and necessary information.
  Develop strategies to build trust and honest communication.
  Share information and decision-making; create "even tables."

Page 26:

Participatory Evaluation

Benefits:
  Increased credibility of results
  Results are more likely to be used
  Increased buy-in, less resistance
  Increased sustainability
  A tool for empowering the affected populations
  More flexibility in approaches

Page 27:

Participatory Evaluation

Challenges:
  Time consuming
  Clarifying roles, responsibilities, and process
  Skilled facilitation
  Just-in-time training
  No predetermined evaluation plan
  May be seen as less objective

Page 28:

Participatory Evaluation

Is it the right approach for you?
  Is there a need for an independent outside judgment? For technical information?
  Will stakeholders want to participate?
  Is there sufficient agreement among the stakeholders so that they can work together?

Page 29:

Rapid Assessments

Described as "fairly quick and fairly clean," as opposed to "quick and dirty" or "long and clean" studies.

Page 30:

Rapid Assessments: Uses

Diagnostic evaluations: processes, problems, and causes of problems.
Trouble-shooting: Why isn't a program working as expected? What can be done to improve the program?

Page 31:

Rapid Assessments: Data Collection

Observe: patterns, land use, behavior.
Converse: talk with people; listen to their concerns and views; interviews, meetings.
Record: write everything down.
Locate existing data: reports, records, maps, prior studies.

Page 32:

Principles of Rapid Assessments

Process:
  Don't rush when gathering information
  Probe and explore
  Listen rather than speak
  Be unimposing, open, non-judgmental
  Seek out the people most likely to be overlooked and find out what concerns them

Page 33:

Rapid Assessments: No Fixed Definition

Intended to do evaluations quickly while obtaining reasonably accurate and useful information
Uses a systematic strategy to obtain just the essential information
Focus is on practical issues

Source: FAO.Org/docrep/

Page 34:

Principles of Rapid Assessments

Optimize trade-offs among quantity, relevance, accuracy, and timeliness.
Planned, but with the ability to pursue serendipity.
Triangulation: use more than one technique and source of information.
Face-to-face, on-site learning.
Learn with a general blueprint that is adapted as more information is obtained: use multiple methods, improvisation, cross-checking.

Source: FAO.Org/docrep/

Page 35:

Rapid Assessments

Is it the right approach for you?
  Important, but no major decisions will be based on the study.
  Not controversial, so face validity is sufficient.
  Limited time and resources are available.
  Information is already available or requires in-depth understanding of process.

Page 36:

Social Assessment

Social assessment is a process that brings relevant social information into the decision-making process for program design, implementation, monitoring and evaluation.

It assists in forming key outcome measures to be used in evaluation.

Page 37:

Social Assessment

Tools and approaches include:
  Stakeholder analysis
  Gender analysis
  Participatory Rapid Appraisal
  Observation, interviews, focus groups
  Mapping, analysis of tasks, wealth ranking
  Workshops
  Objective-oriented project planning

Page 38:

Working with Logic Models

Visualize a program in context: a systems approach, within an environment
Identify the relationships between various components
Identify cause and effect
Identify key assumptions

Page 39:

Models: Cause and Effect

Did the program cause something to happen?

Education → Employment

Page 40:

Hierarchy of Objectives (Sequencing)

Unemployed → Job Training → Increased Income → Improved Quality of Life → Reduced Poverty

Page 41:

Logic Models

The focus is on results or impacts rather than inputs and activities.
  We are not training people just for the sake of training people.
  We believe that if we train the chronically unemployed, then their quality of life will be improved and poverty will decrease.
  Our goal is to reduce poverty.
Also called the Program Outcome Model or Measuring for Results.
Remember, models are not reality; avoid reification. What makes a good model?

Page 42:

Elements of the Logic Model

Inputs: what resources are used
  University inputs: budget, number of faculty, number of staff, number of buildings, number of classrooms
Activities: what the program does
  University activities: teaching, research, and service

Page 43:

Elements of the Logic Model

Outputs: the services or products produced
  University outputs: number of students that graduate, number of articles and books published by faculty
Outcomes: what happened; immediate results
  Graduates are sought after, get good jobs, and become active alumni who donate big bucks
  Faculty are well known, obtain big grants, and enhance the rating of the university

Page 44:

Elements of the Logic Model

Impacts: the "so what." Larger, long-term results, usually tied to program goals.
  A more informed and engaged citizenry that preserves democratic institutions; future leaders.
  Faculty research contributes to knowledge.

Page 45:

Logic Model

Logical connections:
  Inputs are used to carry out activities
  Activities lead to outputs
  Outputs lead to one or more outcomes
  Outcomes lead to impacts

Page 46:

Logic Model: Training Program

Inputs (resources):
  Money
  Staff
  Supplies
  Mentors
Activities:
  Training programs
  Dress-for-success coaching
  Interview coaching
  Resume assistance
Outputs (products):
  Number of graduates per training session
  Percent graduation rate
Outcomes (benefits, changes):
  Increased skills
  Percent obtaining jobs
  Percent obtaining high-paying, quality jobs
  Increased self-esteem
Impacts (goals):
  Increased income
  Self-sufficiency
  Family stability
  Reduction in poverty
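To make the chain above concrete, a logic model can also be captured as a simple data structure and reused for planning, monitoring, or reporting templates. The sketch below encodes the training-program example; the field names and the choice of Python are illustrative assumptions, not part of the original slides.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal representation of the inputs -> activities -> outputs ->
    outcomes -> impacts chain described in the slides."""
    inputs: list = field(default_factory=list)
    activities: list = field(default_factory=list)
    outputs: list = field(default_factory=list)
    outcomes: list = field(default_factory=list)
    impacts: list = field(default_factory=list)

training_program = LogicModel(
    inputs=["money", "staff", "supplies", "mentors"],
    activities=["training sessions", "dress-for-success coaching",
                "interview coaching", "resume assistance"],
    outputs=["number of graduates per session", "graduation rate"],
    outcomes=["increased skills", "% obtaining jobs",
              "% obtaining high-paying, quality jobs", "increased self-esteem"],
    impacts=["increased income", "self-sufficiency",
             "family stability", "reduction in poverty"],
)

# Walk the chain in order, e.g., to build an evaluation or reporting template.
for stage in ("inputs", "activities", "outputs", "outcomes", "impacts"):
    print(f"{stage}: {', '.join(getattr(training_program, stage))}")
```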

Page 47:

Complex effects chains in partnered programs—network logic models

Partners 1, 2, 3, etc. → shared common outcomes
Attribution difficulties; transparency and accountability challenges

Page 48:

Participatory Evaluation

Participatory evaluation is complex, requiring a correspondingly complex evaluative approach that can adequately deal with complex causality and interaction.
The case study is one approach to evaluation that can capture such complexity, through "thick description."

Page 49:

Logic Models

The focus is on results or impacts rather than inputs and activities, although all of these are specified, along with indicators and measures. For example, you are in effect saying:
  We are not training people just for the sake of training people.
  We believe that if we train the chronically unemployed, then they might gain meaningful and sustainable employment. Their quality of life will be improved thereby, and with enough such results from this and other efforts, poverty will decrease.
  Our strategic goal is to help improve that quality of life and reduce poverty—these are the anticipated or hoped-for program impacts.
Also called the Program Outcome Model, Measuring for Results, etc.

Page 50:

Logic Models

A logic model is a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, the activities you plan to undertake, and the changes or results you hope to achieve.
It provides stakeholders with a road map describing the sequence of related events connecting the need for the planned program with the program's desired results.
A program design and planning tool.
A program implementation tool, as the core of a focused management plan.
A program evaluation and strategic reporting tool: it presents program information and progress toward goals, telling the story of the program's commitments and successes.

Page 51:

Schematic Logic Model

Inputs → Activities → Outputs → Outcomes → Impacts

Page 52:

Logic Models and Planning

A well-established, easy-to-use tool for planning programs and for eliciting and using stakeholder input.
Provides a common language among managers, program or project principals, stakeholders including funders, and impacted communities.
A graphic way to articulate—make explicit—and communicate program theory to internal and external audiences.
Provides planners with a road map, asking them to determine where they want to end up (goals) and then chart their course of action. Logic modeling may help program managers in program planning as such, by bringing them to a fuller articulation of program goals and of the connections among program aims, activities, and desired outcomes or impacts.

Page 53:

Logic Model Example: Preventive Health Education for an Immigrant Community

Population (characteristics, needs):
  Low-income, limited-English-speaking community
  Low use of health care coverage
  Low use of preventive health services
  Mostly employed in temporary and/or part-time positions
  Community without an efficacious concept or custom of preventive health care in the way defined by the program (e.g., mammograms)
Inputs (resources):
  Program and agency staffing, other resources
  Funding
  Existing curriculum and volunteer health educators
  Prevention media
  Verbal and written translation and interpreting resources
Activities (strategies, services):
  Health care coverage review
  Education about other available coverage
  Prevention education sessions
  Preventive health services in non-traditional locations
  Focus groups
  Regular tracking of health care coverage and preventive service use
Outputs (program participation):
  Number of new families signed up for coverage
  Number of lapsed coverages renewed
  Number attending education sessions about available resources
  Number of contacts in non-traditional settings
  Number of focus groups
Outcomes (desired changes in the population; impacts):
  Immigrant families will better understand the importance of preventive health services
  Participating families will schedule and complete an increased number of well-child visits, cancer screenings, prenatal checkups, etc.
  Immunization rates will increase among children in the target population
  The number of workdays or school days missed due to illness will decrease

Page 54:

Another Template: If-Then Sequencing

Inputs → Assumption, Needs, or Underlying Condition → Activities → Immediate Outcomes → Intermediate Outcomes → Long-Term Outcomes, Results, or Impacts

Page 55:

Cascading Outcome-Focused Logic Model

Organizational-level systemic model (e.g., human capital, technology)
Workforce Development Outcome Model
Youth Development Outcome Model
Financial Literacy Training Outcome Model

Systemic-model outputs and outcomes cascade down to the next program level, for example, as inputs and resources or as program context.

Page 56:

Logic Model for the New Mexico SPF SIG

The Strategic Prevention Framework State Incentive Grant (SPF SIG) is the federal Substance Abuse and Mental Health Services Administration's (SAMHSA) major demonstration project of its Strategic Prevention Framework (SPF) and a flagship initiative of the Center for Substance Abuse Prevention (CSAP). The SIG is a five-year cooperative agreement from CSAP to states. States receive up to $2.35 million per year for five years, of which 85% must go to communities and 15% to state administration activities, including a statewide needs assessment and evaluation.
The required components of the SPF SIG are as follows:
  Create a state epidemiological workgroup and state advisory board
  Have data-driven planning set state and local priorities
  Have a funding mechanism for targeting communities
  Address underage drinking in the needs assessment, with a focus on prevention
In 2005, New Mexico was in the first cohort of states to receive a SPF SIG grant. In FY 2006, the state began work in local communities. New Mexico had five years of funding and an additional sixth year as an unfunded extension.
National and state SPF SIG goals—the overarching national goals are to: prevent onset and reduce progression of substance abuse, including underage drinking; reduce substance-related problems in communities; and build prevention capacities and infrastructure at state and community levels.

Page 57:

SPF SIG New Mexico Community Logic Model: Reducing Alcohol-Related Youth Traffic Fatalities (change modeling)

Substance-related consequence: high rate of alcohol-related crash mortality among 15 to 24 year olds
Substance use: underage drinking and driving; underage binge drinking; young adult drinking and driving; young adult binge drinking
Intervening variables:
  Low or discount pricing of alcohol
  Easy retail access to alcohol for youth
  Easy social access to alcohol
  Social norms accepting and/or encouraging youth drinking
  Promotion of alcohol use (advertising, movies, music, etc.)
  Low perceived risk of alcohol use
  Low enforcement of alcohol laws
Strategies (examples):
  Media advocacy to increase community concern about underage drinking
  Restrictions on alcohol advertising in youth markets
  Social event monitoring and enforcement
  Bans on alcohol price promotions and happy hours
  Enforcement of underage retail sales laws

Page 58:

Multiplicity of Contextual and Causative Factors That Influence a Program's Community Impact

Multi-system, cross-sector collaboration
Factors influencing community impact include: project outcomes, the economy, grant funding, other funding, inter-agency community partnership, policy changes, citizen efforts, and time.

Page 59:

Creative Commons

This PowerPoint is meant to be used and shared with attribution.
Please provide feedback. If you make changes, please share freely and send me a copy of the changes: [email protected]
Visit www.creativecommons.org for more information.