
Page 1

An FRA R&D Evaluation Framework for Grade Crossing and Trespass Prevention Evaluation Projects

2014 Global Level Crossing Safety & Trespass Prevention Symposium

August 3–8, 2014, Urbana, IL

MICHAEL COPLEN, Senior Evaluator

Office of Research and Development, Office of Railroad Policy and Development

Federal Railroad Administration

Page 2

• Logic of Federal R&D Programs

• Roles of Evaluation

• Evaluation Standards

• CIPP Evaluation Framework

• Evaluation Framework for Suicide R&D

– Context

– Input

– Implementation

– Impact

• Evaluation as a key strategy tool

EVALUATION OVERVIEW

Page 3

Logic model (flow from activities to impacts):

ACTIVITIES: Funded activity "family", e.g., scientific research, technology development

OUTPUTS: Deliverables/products, e.g., technical report(s), forecasting model(s)

OUTCOMES: Application of research – data use; adoption of guidelines, standards, or regulations; changing practices; emergent outcomes

IMPACTS: Reduced accidents and injuries; positive knowledge gains; negative environmental effects

Why Evaluation? Assessing the logic of R&D Programs

Page 4

FORMATIVE
• When: before or during R&D projects/programs
• Purpose: to guide program planning, program design, and implementation strategies
• Primary focus: to improve programs

SUMMATIVE
• When: after R&D projects/programs
• Purpose: to assess completed projects or project lifecycles, accomplishments, and impacts; to meet accountability requirements
• Primary focus: to prove program merit or worth

Roles of Evaluation

Page 5

Program Evaluation Standards: Guiding Principles for Conducting Evaluations

• Utility (useful)

• Feasibility (practical)

• Propriety (ethical)

• Accuracy (valid)

• Accountability (professional)

Note: The Program Evaluation Standards were developed by the Joint Committee on Standards for Educational Evaluation and have been accredited by the American National Standards Institute (ANSI).

Page 6

CIPP Evaluation Model: (Context, Input, Process, Product)

• Context

• Input

• Implementation

• Impact

This framework is Daniel L. Stufflebeam's adaptation of his CIPP Evaluation Model for use in guiding program evaluations by the Federal Railroad Administration's Office of Research and Development. For additional information, see Stufflebeam, D.L. (2000). The CIPP model for evaluation. In D.L. Stufflebeam, G.F. Madaus, & T. Kellaghan (Eds.), Evaluation models (2nd ed., Chapter 16). Boston: Kluwer Academic Publishers.

Stakeholder engagement is key

Types of Evaluation

Page 7

Formative Evaluation (proactive):
• Context: Identifies needs, problems, and assets; helps set goals and priorities
• Inputs: Assesses alternative approaches; develops program plans, designs, and budgets
• Implementation: Monitors implementation; documents issues; guides execution
• Impact: Assesses positive and negative outcomes; reassesses project and program plans; informs policy development and strategic planning

Summative Evaluation (retroactive):
• Context: Assesses original program goals and priorities
• Inputs: Assesses original procedural plans and budget
• Implementation: Assesses execution
• Impact: Assesses outcomes, impacts, side effects, and cost-effectiveness

Evaluation Framework: Roles and Types of Evaluation

Page 8

Example Evaluation Questions – Railway Suicides

Formative Evaluation:
• Context: Suicide prevalence? Demographics? Suicide trends? Why rail? Highest priority needs for reducing railway suicides?
• Inputs: Given rail suicide prevalence and trends, what potential countermeasures exist to reduce the frequency? What are the strengths and weaknesses of each potential countermeasure? What are the data gaps and how can they be filled?
• Implementation: What aspects of the original countermeasures were implemented as planned, and what had to be changed? To what extent is the cost of a countermeasure a barrier to implementation? To what extent are railroad carriers using effective suicide countermeasures?
• Impact: What are the emerging outcomes (positive and negative) of using one or multiple types of countermeasures?

Summative Evaluation:
• Context: To what extent did research efforts address the highest priority needs?
• Inputs: What countermeasures were selected and why? How did they compare to other possible alternatives?
• Implementation: To what extent were suicide countermeasures carried out as planned? Were they revised with an improved approach?
• Impact: What are the effects and impacts of the countermeasures on suicide prevention and on other areas (e.g., community involvement)? Were there any unintended consequences? What are the lessons learned?

Page 9

CONTEXT EVALUATION

Page 10

Context Evaluation: Preliminary Studies

• No reporting requirements (prior to June 2011)
• Unknown prevalence
• Root causes unknown
• Prevalence studies
  – Coroner records, railroad company records, media reports
• Psychological autopsies (retrospective and prospective)
  – Causal analysis

Page 11

Context Evaluation: Key Findings

• Prevalence
  – Trespass, suicide, and grade crossing fatalities make up over 96% of all rail-related fatalities
  – Minimum of 175 suicides per year on railway rights-of-way
  – 25–50% of trespasser fatalities are likely to be suicides
  – Railway suicides are underreported
• Poor-quality data
  – Inconsistent procedures for cause-of-death determination
  – Railway suicide rates are unreliable
• Clusters or possible "hot spots" of suicides
  – Geographic influences unclear (nearby mental health facilities?)
  – Possible media influences?
• Demographics of decedents similar to non-rail suicides

Page 12

Context Evaluation: Key Findings

• High cost to railroad companies
  – Time and schedule delays/costs
  – Impact on crews
• High-need area for many railroad companies
  – New interventions underway
• Untested countermeasures
• Uncertain why rail is chosen

Page 13

Context Evaluation: Key Need Areas

• Data quality
  – "You can't manage what you can't measure."
  – Need for a more consistent and reliable system of suicide determination, nationally and internationally
  – Need to better understand the location and demographics of "hot spots"
• Media influence
  – Need for media reporting guidelines
• Impacts of suicides
  – Need to better understand the impact of railway suicides on individuals and railway companies (schedule and time delays, impacts on crews exposed)
• Development and use of effective countermeasures
  – Need to evaluate the utilization, effectiveness, and impact of suicide countermeasures

Page 14

INPUT EVALUATION

Page 15

Input Evaluation: FRA R&D Suicide Prevention Program Areas

• Data collection improvements
  – GIS mapping study
  – Common criteria for cause-of-death determination
• Media reporting study
  – Current reporting of railway fatalities, esp. suicides
  – Consistency with known guidelines
  – Outcome: recommendations for improvement

• Potential Countermeasures (under review)

• Global Railway Alliance for Suicide Prevention (GRASP)

Page 16

• International
  – Rail Safety Standards Board (RSSB), UK
  – TrackSAFE, Australia
  – Trafikverket (Swedish Transport Administration), Sweden
  – Transport Canada
  – VTT Traffic Safety, Finland
  – University of Quebec at Montreal
  – Community Safety Partnerships (UK)
  – Network Rail (UK)
• US Government
  – Centers for Disease Control and Prevention (CDC)
  – Federal Railroad Administration (FRA)
  – Federal Transit Administration (FTA)
  – National Institute of Mental Health (NIMH)
  – Substance Abuse and Mental Health Services Administration (SAMHSA)
  – Volpe Center (DOT)
  – Federal Working Group on Suicide Prevention
• Academic
  – George Washington University
  – Harvard School of Public Health
  – Kansas City University of Medicine
• Non-Profit
  – American Association of Suicidology
  – Suicide Prevention Resource Center (SPRC)
• Railroad Industry
  – Amtrak
  – Association of American Railroads (AAR)
  – Caltrain
  – Long Island Rail Road
  – Massachusetts Bay Commuter Railroad (MBCR)
  – Metra
  – Metrolink
  – New Jersey Transit
  – Norfolk Southern

Input Evaluation: GRASP – Stakeholders Contacted

Page 17

IMPLEMENTATION EVALUATION

Page 18

Implementation Evaluation: FRA R&D Suicide Prevention Program Areas

• Pilot project(s) for cause-of-death determination
  – Review of fatality reporting for improved cause-of-death determinations

– Possible application of Ovenstone Criteria

• Evaluation(s) of on-going suicide countermeasure implementations

• Community based interventions

Page 19

IMPACT EVALUATION (TBD)

Page 20

Conclusion: Evaluation as a Key Strategy Tool

• Quality evaluation asks questions that matter about processes, products, programs, policies, and impacts, and then develops appropriate and rigorous methods to answer them.

• Evaluation measures the extent to which, and the ways in which, program goals are being met: what is working, and why or why not?

• Evaluations help refine program strategy, design, and implementation, and inform others about lessons learned, progress, and program impacts.

• Evaluation improves the likelihood of program success by:
  – Identifying and involving intended users

– Clarifying intended uses and potential misuses

– Measuring outcomes and impacts

– Anticipating potential outcomes (+ and -)

Page 21

QUESTIONS?

[email protected]

Page 22

Evaluation Standards: Guiding Principles for Conducting Evaluations

Utility (useful):
• Evaluator Credibility
• Attention to Stakeholders
• Negotiated Purposes
• Explicit Values
• Relevant Information
• Meaningful Processes & Products
• Timely & Appropriate Reporting
• Concern for Consequences & Influence

Feasibility (practical):
• Project Management
• Practical Procedures
• Contextual Viability
• Resource Use

Propriety (ethical):
• Responsive & Inclusive Orientation
• Formal Agreements
• Human Rights & Respect
• Clarity & Fairness
• Transparency & Disclosure
• Conflicts of Interest
• Fiscal Responsibility

Accuracy (valid):
• Justified Conclusions & Decisions
• Valid Information
• Reliable Information
• Explicit Program & Context Descriptions
• Information Management
• Sound Designs & Analyses
• Explicit Evaluation Reasoning
• Communication & Reporting

Evaluation Accountability (professional):
• Evaluation Documentation
• Internal Metaevaluation
• External Metaevaluation

Note: The Program Evaluation Standards were developed by the Joint Committee on Standards for Educational Evaluation and have been accredited by the American National Standards Institute (ANSI).

Page 23

Extra Slides


Page 24


• Evaluation Implementation Plan for FRA Office of Research and Development – http://www.fra.dot.gov/eLib/details/L04865#p3_z5_gD_kevaluation

• Demographic Profile of Intentional Fatalities on Railroad Rights-of-Way in the United States
  – An estimation of the yearly number of suicides on the railway and basic demographics of those individuals
  – Report: DOT/FRA/ORD-13/36
  – https://www.fra.dot.gov/eLib/Details/L04734

• Defining Characteristics of Intentional Fatalities on Railway Rights-of-Way in the United States, 2007–2010
  – A better understanding of the characteristics that make railway suicide victims unique from other suicide victims
  – Report: DOT/FRA/ORD-13/25
  – http://www.fra.dot.gov/eLib/Details/L04566

FRA Publications

Page 25

Primary purpose:
  – Research: contribute to knowledge; improve understanding
  – Evaluation: program improvement; decision-making

Primary audience:
  – Research: scholars; researchers; academicians
  – Evaluation: program funders; administrators; decision makers

Types of questions:
  – Research: hypotheses; theory-driven; preordinate
  – Evaluation: practical; applied; open-ended, flexible

Sources of data:
  – Research: surveys; tests; experiments; pre-ordinate
  – Evaluation: interviews; field observations; documents; mixed sources; open-ended, flexible

Criteria:
  – Research: validity; reliability; generalizability
  – Evaluation: utility; feasibility; propriety; accuracy; accountability

The Research–Evaluation Continuum

Page 26

American Evaluation Association (http://www.eval.org)
• 3,000 members in 2001
• Over 7,700 members today
• All 50 states
• Over 60 countries
• $95/year membership, which includes:
  – American Journal of Evaluation
  – New Directions for Evaluation
  – Online access to full journal articles

Evaluation Resources

Page 27

• Affiliate Evaluation Associations
  – Washington Research and Evaluation Network (WREN)
  – Federal Evaluator's Network
• Evaluation Journals
  – American Journal of Evaluation (AJE)
  – New Directions for Evaluation (NDE)
  – Evaluation Review
  – Evaluation and the Health Professions
• The Evaluator's Institute (http://tei.gwu.edu/courses_dc.htm) – George Washington University
• The Evaluation Center (http://www.wmich.edu/evalctr/) – Western Michigan University

Evaluation Resources

Page 28

Evaluation Resources

• Evaluation Theory, Models, and Applications. Daniel Stufflebeam and Chris Coryn. 2nd edition. Jossey-Bass, 2014 (in press).

• Research on Evaluation Use: A Review of the Empirical Literature From 1986 to 2005. Johnson, K., Greenseid, L., et al. American Journal of Evaluation, 2009, vol. 30, no. 3, pp. 411–425.

• A Practical Guide for Engaging Stakeholders in Developing Evaluation Questions. Hallie Preskill and Nathalie Jones. Robert Wood Johnson Foundation Evaluation Series (www.rwjf.org), 2009.

• Utilization-Focused Evaluation. Michael Quinn Patton. 4th edition. Sage Publications, 2008.

• Theory-Based Stakeholder Evaluation. Hansen, M. and Vedung, E. American Journal of Evaluation, September 2010, vol. 31, no. 3, pp. 295–313.

• The Logic Model Guidebook: Better Strategies for Great Results. Phillips Wyatt Knowlton, Inc. 2nd edition, 2013.

Page 29

"Intended use for intended users"

– Utilization-Focused Evaluation, Michael Quinn Patton, 4th edition, 2008

Page 30

Program Evaluation Standards: Guiding Principles for Conducting Evaluations

• Utility (useful): to ensure evaluations serve the information needs of the intended users.

• Feasibility (practical): to ensure evaluations are realistic, prudent, diplomatic, and frugal.

• Propriety (ethical): to ensure evaluations will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results.

• Accuracy (valid): to ensure that an evaluation will reveal and convey valid and reliable information about all important features of the subject program.

• Accountability (professional): to ensure that those responsible for conducting the evaluation document and make available for inspection all aspects of the evaluation that are needed for independent assessments of its utility, feasibility, propriety, accuracy, and accountability.

Note: The Program Evaluation Standards were developed by the Joint Committee on Standards for Educational Evaluation and have been accredited by the American National Standards Institute (ANSI).

Page 31

Stakeholder Involvement: Ethical Guidelines

“When planning and reporting evaluations, evaluators should include relevant perspectives and interests of the full range of stakeholders.”

Guiding Principles from the American Evaluation Association

"Persons involved in or affected by the evaluation should be identified, so that their needs can be addressed" (Utility 1).

Program Evaluation Standards. Joint Committee on Standards for Educational Evaluation (1994).

Page 32

From Research to Impact: Knowledge for Action Theories in Evaluation

• Knowledge Utilization
  – How can the program be used?
• Diffusion
  – What methods should we use to communicate these programs?
• Implementation
  – What factors best support the implementation? Challenges/barriers?
• Transfer
  – How will knowledge transfer occur from the pilot site to other work sites?
• Translation
  – How can we shape our communications to make them more accessible to our target audiences?

Ottoson, J. & Hawe, P., Eds. (2009). Knowledge utilization, diffusion, implementation, transfer, and translation: Implications for evaluation. New Directions for Evaluation, 124.

Page 33


(Mapped incident example: suicide, 21 years old, 9:45 AM, March 2013; census data shown for this specific census tract.)

Input Evaluation: GIS Mapping
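The mapping work ties each incident location to the census tract that contains it, so tract-level census data can be examined for that point. Below is a minimal sketch of such a point-in-tract join using geopandas; the file name, column names, and sample coordinates are illustrative assumptions, not FRA data or the study's actual implementation.

```python
# Sketch: attach the containing census tract (and its attributes) to incident points.
# File name, columns, and coordinates below are hypothetical placeholders.
import geopandas as gpd
import pandas as pd

# Hypothetical incident records (WGS84 latitude/longitude).
incidents = pd.DataFrame(
    {"incident_id": [1], "type": ["suicide"], "lat": [41.88], "lon": [-87.63]}
)
points = gpd.GeoDataFrame(
    incidents,
    geometry=gpd.points_from_xy(incidents["lon"], incidents["lat"]),
    crs="EPSG:4326",
)

# Hypothetical tract boundaries carrying census attributes (e.g., a GEOID column).
tracts = gpd.read_file("census_tracts.shp").to_crs("EPSG:4326")

# Spatial join: each incident gains the attributes of the tract it falls within.
joined = gpd.sjoin(points, tracts, how="left", predicate="within")
print(joined.head())
```

Once incidents carry tract identifiers, tract-level demographics can be summarized or compared across locations, which is the kind of "hot spot" analysis the context evaluation called for.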

Page 34

• Goal: Development and use of common media guidelines to minimize copycat incidents

• Current Use: Guidelines exist in the US, but are often not followed (see www.sprc.org/sites/sprc.org/files/library/sreporting.pdf)

• Considerations: Identify types of reporting that result in copycat activity; identify ways to encourage use of guidelines; train rail staff to discuss incidents with media in a way that encourages better reporting practices


Example headlines:

• Metra Train Suicide AKA "Trespasser Fatality" or Metracide – chevanstonrogerspark.blogspot.com, 12/16/2012
• NYC Subway Suicide Pact: New York Romeo and Juliet Leap in Front of Train Rather than Separate – Latin Post, 7/31/2013
• Bluffton couple killed by train in suicide pact – The Journal Gazette, 8/8/2013

Input Evaluation: Effects of Media Coverage

Page 35


Country | Railway suicide fatalities per year | Railway trespass fatalities per year | Suicides by all means per year, by gender³
Germany¹ | 955 (11.8 per million) | Not available | Male: 179 per million; Female: 60 per million
Sweden¹ | 48 (5.1 per million) | Not available | Male: 187 per million; Female: 68 per million
United Kingdom² | 220 (3.5 per million) | 40 (0.6 per million) | Male: 109 per million; Female: 30 per million
Canada¹ | 42 (1.3 per million) | 46 (1.2 per million) | Male: 173 per million; Female: 54 per million
United States | 246 (0.8 per million) | 501 (1.6 per million) | Male: 177 per million; Female: 45 per million

Sources: ¹ http://wwd.cdf.uqam.ca/fr/  ² http://www.rssb.co.uk/SPR/Documents/ASPR_2011-12_Keyfactsandfigures.pdf  ³ http://www.who.int/mental_health/prevention/suicide_rates/en/

Input Evaluation: GRASP – Suicide Prevalence by Country
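The per-million figures in the table are simply the annual count divided by the national population and scaled to one million. A minimal sketch of that arithmetic follows; the population figures used here are rough assumptions for illustration, not values from the slide.

```python
# Sketch of the per-million rate used in the table above.
# Population figures are rough assumptions for illustration only.
def rate_per_million(annual_count: float, population: float) -> float:
    """Annual events per one million inhabitants."""
    return annual_count / population * 1_000_000

# United Kingdom: ~220 railway suicides/year, assumed ~63 million people -> ~3.5 per million
print(round(rate_per_million(220, 63_000_000), 1))
# United States: ~246 railway suicides/year, assumed ~310 million people -> ~0.8 per million
print(round(rate_per_million(246, 310_000_000), 1))
```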