
Page 1: Evaluation Tools for Student Assistance Programs

Evaluation Tools for Student Assistance Programs

M. Amos Clifford, M.A.
Mary Davis, M.P.H., Ph.D.

National Organization of Student AssistancePrograms and Professionals

NOSAPP is supported by Partners in Prevention, Solvent Abuse Foundation for Education, Beer Institute, and others.

SOLVENT ABUSE FOUNDATION for EDUCATION

Page 2: Evaluation Tools for Student Assistance Programs

© 1991

National Organization of Student Assistance Programs and Professionals

All rights reserved. Reproduction of any part of this manual is allowed only with written permission from NOSAPP, except for reproduction of the evaluation tools in chapter four and the feedback and evaluation forms in the appendices. These forms may be reproduced by the original purchaser for the purpose of conducting evaluations in settings where the purchaser is legitimately employed to do so. Outcome Evaluation Form Three, Student Survey of Alcohol, Drug, and Solvent Use, is adapted from materials in the public domain and may therefore be copied and used without restriction. Any reproduction of these materials other than that described is an infringement of copyright.

National Organization of Student Assistance Programs and Professionals
4760 Walnut, Suite 106
Boulder, CO 80301
(303) 443-5696
(800) 972-4636

Page 3: Evaluation Tools for Student Assistance Programs

Table of Contents (Continued)

4. Evaluation Tools 78

Tool Selection Guides 79

Initial Contact Information Form 81

Program Completion Information Form 83

Identifying Stakeholders 85

Stakeholder Questionnaire 87

Choosing Priorities for the SAP 89

Characteristics of a Successful SAP 91

SAP Participant Satisfaction Survey 93

SAP Staff Satisfaction Survey 95

School Climate Survey 99

Student Survey: What do Students Need? 102

Participant's Rating Sheet for Group Activity 105

Survey of Risk Factors and Protective Factors 108

Attitudes About Alcohol, Drugs, and Solvents 112

Student Survey: Alcohol, Drug, and Solvent Use 116

Survey of Opinions About School 121

Student Survey: Satisfaction with Academic Performance 126

Changes and Stress in Your Life 130

Beliefs, Activities, and Risks of Sexual Behavior 135

Perceptions About Appearance (Eating Disorders) 139

Student Survey: Self-Efficacy (Self-Esteem) 143

Weekly Review of Relapse Prevention Activities 147

Appendices

Random Numbers Table 148

Bibliography 149

National Evaluation Project Data Submission Guidelines 153

1991 National Survey of Student Assistance Programs: Selected Findings 159

User's Evaluation Form for Rating this Manual 163

Page 4: Evaluation Tools for Student Assistance Programs

Preface

Student assistance programs have proliferated at an astonishing rate. The National Organization of Student Assistance Programs and Professionals (NOSAPP) has played a leading role in the success of this exciting approach to helping troubled students attain their full educational potential. However, while few doubt the effectiveness of student assistance programs, much more can be done towards systematically evaluating them.

This manual represents part of the commitment of NOSAPP and its supporters, including Partners in Prevention, the Solvent Abuse Foundation for Education, the Beer Institute, and many others, to continue developing effective student assistance programs. The tools, ideas, and examples it contains will help you to answer the most frequently asked questions about student assistance programs: Is the program working? How can we make it work better? And how can we make it work for more students?

A manual like this grows from the efforts of many individuals. From consulting with and training those who are developing student assistance programs across the country, we have gained an understanding of the needs of the practitioner in the field. Frequently, these dedicated professionals have given us helpful suggestions and connected us with new resources. This manual is therefore in many ways a collaborative work in an emerging field.

The development of the manual took a number of unexpected turns along the way to final completion. A team of evaluation experts and SAP professionals developed the initial outline, and staff writers began fleshing it out. However, writing the manual has been more like watching something grow than simply expanding an outline. To ensure that the approach we chose was relevant to


Page 5: Evaluation Tools for Student Assistance Programs

Acknowledgments

Many people helped develop this manual. Pam Sekaros, M.A., and Linda Fredericks Chatfield, M.A., contributed greatly to the outline for the manual. Linda reviewed each draft of the manual and provided many helpful suggestions and much needed encouragement.

The evaluation tools were reviewed by professionals in many different locations; in many cases they were also tested by students at these locations. To the following individuals we extend our special appreciation for outstanding contributions: Michael Barfield, M.S.W., of the Allegheny Intermediate Unit in Pittsburgh, Pennsylvania; Ann Bennett and the students of Puyallup School District in Puyallup, Washington; Robin Nelson, Ph.D., of the Texas Commission on Alcohol and Drug Abuse in Austin, Texas; Susan Wilger of the Worker's Assistance Program in Austin, Texas; Trey Anderson, the SAFE Team members, and the students of Yuba City Unified School District, Yuba City, California; and Nicki Wolman of Adams County School District Number 12 in Northglenn, Colorado. The comments of those who have attended our evaluation workshops and seminars have deepened our insights and helped us to refine our framework for evaluation.

NOSAPP is also deeply grateful to Partners in Prevention, the Solvent Abuse Foundation for Education, the Beer Institute, and others whose generous support made development of this manual possible.


Page 6: Evaluation Tools for Student Assistance Programs

1. Introduction

What is a Student Assistance Program?

"When l was young, I never had anyone to talk with or whowould listen to me. My life got in a big hole; I couldn't get out. Iheard about the student assistance program and went. Thingsaren't perfect now, but/feel Wee I have some support. I can lookforward to the future again. "

In the early 1980s a new approach to helping youth achieve their full educational potential emerged. Increasing numbers of young people were adversely affected by abuse of alcohol, solvents, and drugs, by depression, by broken homes, and by other problems. For many, the natural result was failure in school. Based on the success of employee assistance programs, schools started to use a similar approach: student assistance programs.

Just as an employee assistance program seeks to increase productivity and to help employees stay on the job, the student assistance program (SAP) is designed to help students succeed in the school setting and improve the quality of their lives. Student assistance programs bring together school, community, family, and youth in working partnerships which help students overcome problems that are interfering with education or with normal, healthy development.

Student assistance programs have proliferated at an astonishing rate. In 1980, there were perhaps a total of 300 programs; a decade later, there were over 1,800. Few educational movements have matched this expansion rate. The reasons for the increasing numbers of student assistance programs are many.


Page 7: Evaluation Tools for Student Assistance Programs


conditions, and resources. A key feature of a successful student assistance program is that it is built by a process of partnerships among key individuals and organizations within the school and community. These partnerships pull together existing resources and align them so that they work together synergistically. These aligned resources form the core of the program. This is often a great improvement over prior circumstances, especially when the existing resources were operating in a fragmented and uncoordinated fashion.

Successful SAPs tend to be goal-driven. Strategies are selected on the basis of clearly defined goals. A common error is to select a strategy for overcoming a problem without first precisely defining what the program is expected to accomplish. One way to avoid this mistake is to ask at the outset, "How will we know if we have been successful? What will have changed, and how will it have changed?" A succinct answer to these questions will almost always serve as a useful goal statement.

SAP Management Structures

There are as many management structures for SAPs as there are programs. However, three basic management approaches are commonly seen in successful SAPs. None of them is more "correct" than the others: each has its strong points, and each must be adapted to the particular setting in which it is used.

There are a great many variations in how these structures are implemented: many SAPs use combinations of these approaches or invent their own. Typically, the actual structure of the program is invented by each program, through an evolutionary process of trial and error.

The three "typical" structures are: (1) the Core Team structure, which pulls together a team of representatives including administrators, guidance staff, teachers, and community resources. The core team helps to plan SAP services, and core team members work to identify and screen troubled or at-risk students. A designated staff member, usually a psychologist or


Page 8: Evaluation Tools for Student Assistance Programs


Phoenix Union High School District (Arizona) has formed a public/private sector partnership which supports an SAP that has demonstrated its effectiveness for reducing alcohol and drug use by students, improving communication skills, increasing self-esteem, and improving student retention. The program has also motivated many staff to address their own problem behaviors.

The Westchester County (New York) Student Assistance Program is built on strong partnerships between national, state, and local agencies, schools, and businesses. This has resulted in a diverse funding base and broad enthusiasm for the program. Program evaluations have shown that the program has reduced alcohol and drug abuse. There is a strong component to support parents whose children are having problems at school. The evaluation design is strong, and the level of staff expertise is outstanding.

Lawton (Oklahoma) Public Schools developed a program in response to community concern about alcohol and drug use by students. The program development process involved parents and other community members. During planning, the decision was made to address a much broader range of student concerns. Evaluation of the program has shown improvement in student attitudes, academic performance, and self-esteem. Substance abuse has been reduced.

The Banks (Oregon) School District Student Assistance Program has succeeded in reducing alcohol and drug use and suspensions related to alcohol and drug use. It has a comprehensive K-12 focus, and is complemented by an employee assistance program for staff. A strong commitment to staff training is another feature of the program.

Deerfield (Illinois) High School's Reaching Out Program is very effective at easing the transitions that students face when coming back into the school


Page 9: Evaluation Tools for Student Assistance Programs


• In a pre-test, 83% of student participants who reported use of marijuana, and 55% who reported use of alcohol, reported no use of either substance in the post-test (Westchester).

• Alcohol and drug use decreased from 87% of students during the 1985-86 school year to 81% during the 1987-88 school year (Westchester).

• The percentage of students who used alcohol or drugs during the school day decreased from 12% to 7% (Westchester).

• 81% of students reported using alcohol or drugs prior to involvement in the SAP group. By the end of the school year, 21% reported they had stopped using, and 22% reported they had decreased substance use (Lawton).

• Lowest rate of drug users compared to other campuses in the area (Pajaro Valley).

Outcome: Improved Problem-Solving, Communication Skills

• 60-70% of students indicated that groups had a positive effect on self-worth, ability to deal with problems, communicate, express feelings, and build healthy relationships (Phoenix).

• 73% of students believed the program helped them to develop positive ways of dealing with problems (Phoenix).

• Students reported positive effects on: ability to communicate and express feelings in a positive way (43%); ability to cope with stress (46%); feelings of self-worth (49%); mental health (53%); relationships with family (51%); and relationships with other students (55%) (Lawton).

• In a statewide survey, students scored significantly lower than the state average in use of alcohol, marijuana, cocaine, LSD, amphetamine, and all forms of narcotics (Banks).

Outcome: Increased Student Retention

• 49% of students indicated that support groups helped them stay in school (Phoenix).

• 49% of student participants reported a positive effect on school attendance (Lawton).

• 43% of student participants reported a positive effect on their general attitude towards school (Lawton).


Page 10: Evaluation Tools for Student Assistance Programs


seen by the Department of Education as a component of successful prevention programs, and their development is actively encouraged by the Department.

• In 1986, few key opinion leaders (politicians, business leaders, and government department heads) knew about SAPs. Now agencies including the U.S. Treasury Department, U.S. Department of Housing and Urban Development, U.S. Bureau of Justice Assistance, U.S. Office of Substance Abuse Prevention, U.S. Department of Education, and many state officials have become active participants and are supporting SAPs nationally, statewide, and locally.

NOSAPP continues to expand its efforts to develop, promote, and support student assistance programs. This manual is part of NOSAPP's strategy for investigating the effectiveness of SAPs and providing SAP staff with sound information that can be used to support program funding proposals and help maintain the political will to keep the SAP going.

Need for Better Evaluation of Student Assistance Programs

It has become common to hear statements such as, "Student assistance programs are one of the best approaches for reducing demand for drugs among young people" and "Student assistance programs are a very cost-effective way to help high-risk students achieve their full educational potential." The National Organization of Student Assistance Programs and Professionals (NOSAPP) certainly supports these statements, as do the data we have gathered from SAPs all across the country.

However, there is still much that we don't know. While programs have been vigorously and successfully implemented, evaluation and other data collection efforts related to program outcomes have been sporadic. In the past, every program has invented its own evaluation scheme. Results of evaluations have been encouraging but are difficult to compare from site to site. If trends exist, they are difficult to detect.

There are four key benefits to be gained from improved evaluation:

• Improved Program Effectiveness: Evaluation will provide program managers with the feedback they need to assess how effective the programs are, and what opportunities exist to improve effectiveness.

• Increased Funding: As evaluation data accumulate, they will almost certainly provide mounting evidence for the cost-effectiveness of student assistance programs. This evidence will provide an eloquent argument for continued and increased funding.


Page 11: Evaluation Tools for Student Assistance Programs


more on its common-sense appeal and the positive experiences of staff and participants than on solid evaluation findings. Not to belittle the value of the former, we aim to strengthen the position of SAPs by gathering data from programs around the country.

3. To provide SAP program managers with tools for decision making. Each tool provided in this manual is intended to help with understanding what makes a program work well, and discovering possibilities for program improvement. We know from our work with thousands of SAP professionals from across the country that it is safe to assume a profound dedication to quality. These tools will, we believe, support efforts to steadily improve program quality.

4. To provide a framework for a national SAP evaluation project.

The NOSAPP National Evaluation Project is a long-term effort to gather consistent information about effective SAP processes and what outcomes SAPs are accomplishing. One of the strategies used by the project is to encourage all SAPs to use the same set of evaluation instruments and procedures, and to share their findings with NOSAPP. In this way NOSAPP can systematically track SAP trends, needs, and outcomes. This information will be shared with the field. This manual contains those instruments and procedures.

Participating in the National Evaluation Project

To participate in the national evaluation project, SAP professionals simply need to use tools and procedures from this manual, choosing those which best match their program goals. NOSAPP has identified, through its annual national survey, the most common goals of student assistance programs and has developed tools specifically for assessing how well these goals are being met. Rather than creating a single instrument, we have recognized that each SAP is unique and has unique evaluation needs. Therefore, an array of different tools is needed and is provided in this manual. Programs may use one, two, or several of them, depending upon their needs and resources. Whether you share your data with NOSAPP or not, you will find useful and practical evaluation tools in this manual.

A summary of how to conduct an evaluation that meets your needs and that provides data that fit with the national evaluation project is as follows:

1. Notify NOSAPP of your intent to conduct an evaluation. Our resource specialists will be able to help you clarify your evaluation questions, select appropriate evaluation strategies, and develop an evaluation plan.


Page 12: Evaluation Tools for Student Assistance Programs

2. Preliminaries: What to Consider Before Starting

This chapter covers what needs to be considered before you get started with your evaluation. It discusses various evaluation designs, cultural issues, ethical considerations, and a few technical points to consider when planning an evaluation.

Definitions of Evaluation

Perhaps the simplest definition of evaluation is that it is systematic curiosity. Most SAP workers are genuinely curious about how much impact the program has. This curiosity can be satisfied by a systematic approach to asking the right questions in the right way. That is when effective evaluation occurs.

To evaluate is to make an explicit judgment about the worth of all, or part of, the student assistance program. Evaluation entails collecting evidence to determine if certain acceptable standards have been met. Evaluation is the systematic process of collecting, analyzing, and interpreting information about program effectiveness and efficiency. Evaluation is best viewed as an integral part of the SAP, inseparable from program management.


Page 13: Evaluation Tools for Student Assistance Programs


a condition which existed prior to exposure to the program either no longer exists or has changed significantly after exposure to the program. In some cases, cost data and cost-effectiveness analyses are also important.

When an SAP is being started in a school district where there is substantial skepticism about what the program can do, a more complex evaluation that accomplishes some of the goals of a demonstration project may be warranted.

In most cases, however, the key to a successful evaluation is to keep it simple. Do not undertake a research project, and remember to educate those who are confused about the differences between research and evaluation so they will understand that your project is appropriately limited.

On the Job Learning

Through a process of trying new ideas and tactics, refining those which are promising and abandoning those which are not, effective student assistance programs remain responsive to shifting needs and circumstances. Evaluation activities are those by which the program remains systematically alert to these shifts of circumstance. The key questions are how program activities influence students, the school, and the community, and how students, the school, and the community in turn influence the activities of the program. Well-designed evaluation strategies provide a steady stream of information by which these influences can be tracked.³

Each school and community is unique. Programs that attempt to influence a community are influenced by this fact, and each program is also unique. While certain standard components of an SAP have emerged, the manner in which these are organized varies considerably from district to district and community to community.

An important reason for this variation is that each SAP constantly adjusts its design in an attempt to better fit its particular setting. This happens because some type of evaluation is being conducted, be it formal, informal, or even unconscious, and SAP staff and planners are responding to information obtained through the evaluation process. The result is continual "tinkering" with the program to improve its services.

Implicit in this ongoing process of "tinkering" with program design is the recognition that student assistance programs are dynamic processes, characterized by an ongoing investigation of opportunities to provide support for troubled


Page 14: Evaluation Tools for Student Assistance Programs


B. To Improve the Program

Programs benefit from constant improvement. It is important to base efforts to improve the program on reliable evaluation data. If this is the primary purpose for evaluation, you will choose measures that let you know what activities or strategies were well liked, and which activities worked or did not work with which groups. You will also try to determine which staff are effective, and in what areas they are most effective. Just as important, you will identify activities which do not work, and determine whether efforts should be made to improve them or if you should simply eliminate them.

C. To Justify Program Continuation/Expansion

Funding sources want to know that a program is working. So do school administrators and others whose political and professional aspirations are connected to the program. Effective evaluation can serve to assure them that the program is worth continuing. It can demonstrate the popularity of the program or its effectiveness for reducing problems, or it can demonstrate cost-effectiveness.

If this is your primary purpose for evaluation, it is important to determine exactly who makes funding decisions. Know them by name, and understand what they value in a program. Know what it takes to make them say "yes."

D. To Add to Knowledge in the Field

The field is still developing, and every program evaluation can be an opportunity to learn more about what makes student assistance efforts effective. A good evaluation can provide a basis for articles in professional journals and for presentations at seminars and conferences. It is important to publish and present results, even when an evaluation shows that a strategy did not work out quite as planned. Evaluation is about learning, and others will benefit from what you have learned.

Besides the four reasons for evaluation listed above, it is a good idea to reflect upon your personal motivations for conducting an evaluation. What's in it for you? Allow yourself to conduct an evaluation that meets your needs, as well as the needs of funding sources. For example, you may wish to exercise your


Page 15: Evaluation Tools for Student Assistance Programs


ed because they provide funding or because the SAP affects how they work or live. Anyone whose life is significantly affected by the SAP is a stakeholder. Some are more interested or more strongly affected than others. These stakeholders are generally more interested in the SAP, and in evaluation findings related to the SAP, than are those who are less affected by the SAP.

Stakeholders are important to consider when planning an evaluation because it is their questions that the evaluation should seek to answer. We have provided worksheets on the following pages to help you anticipate these questions. Some of the most common categories of stakeholders, and common interests of each group, are:

SAP Staff: Staff are most likely to be interested in how effective their program is, how favorably they are perceived by others, and how secure funding is for the program and for their positions.

Administrators: School administrators are most likely to be interested in cost-effectiveness, political security, management effectiveness, liability, impact of the SAP on overall school operations, and program effectiveness.

School Board: The school board is most likely to be interested in the impact of the SAP on education, cost-effectiveness, the relationship of the SAP to district priorities, and its effect on community relations.

Teachers: Teachers are most likely to be interested in program outcomes, and in how effective referral and other operational procedures are. They are alert to specific tasks that the SAP may request of them and want to know what the concrete benefits of these tasks are.

Students: Students are likely to be curious about the opinions of their peers regarding the program. They like to know if participants in the SAP have a favorable opinion. They are generally aware of behavior changes that occur when a peer participates in the SAP and may be interested in what impact these changes have on the overall situation of the peer.

Parents: Parents generally become stakeholders when their child is referred to the SAP. They are interested in the processes by


Page 16: Evaluation Tools for Student Assistance Programs


which the SAP is run and in the benefits that past participants have gained.

Public Agencies: These include public agencies such as county and state alcohol and drug commissions, family planning offices, and so on. Government agencies are often interested in how effective a program is in reducing a specific problem in society and resulting reductions in costs to the community. They also seek information on how effective programs can be replicated for other groups or in other communities. At a local level they may be concerned about how the SAP is integrated with other services.

Private Treatment: Private alcohol and drug agencies may feel a sense of competition with the SAP, or they may see the SAP as a major market for consulting services and referrals. They are interested in changes in drug and alcohol use patterns among students who participate in the SAP. They may also be curious about SAP staff qualifications. Of particular interest will be the type of health insurance coverage of prospective referrals.

Funding Sources: When private industry or foundations are involved in funding the SAP, they may want to know how well the program is achieving the objectives stated in the funding proposal. If the SAP can help corporate or foundation decision makers connect program outcomes with a particular interest or priority of theirs, prospects for continued funding will be enhanced.

Law Enforcement: Police agencies are interested in what impact the SAP has on crime. They may wish to test referral to SAP services as a viable alternative to prosecution of first offenders.

We have discussed these stakeholders in terms of groups. In reality, of course, stakeholders are individuals, each of whom has unique interests. Often individual interests are different from those of the group they represent.

For example, an officer who represents the police department on a community SAP advisory board may be a recovering alcoholic. Her major interest may therefore be in recovery rates for alcoholic students who participate in the SAP. Or a student leader may be interested in promoting projects that help students


Page 17: Evaluation Tools for Student Assistance Programs


can prevent a program from basing decisions on faulty data that might have been detected had other types of data been available. Divergent data often indicate that more evaluation is needed before clear conclusions can be drawn.

Process and Outcome: Process data focus on what things got done; outcome data focus on changes.

For example, when evaluating the referral process of an SAP, process-oriented questions would include, "How many students were referred?" "What were the reasons for referral?" "Who made the most referrals?" and "Who made the fewest referrals?"

In contrast, related outcome questions might include, "How many of those students who were referred became significantly involved in the SAP or in community services to which they were referred by the SAP?" or, even more outcome-oriented, "How did referral to the SAP affect the beliefs, attitudes, skills, or behaviors of students?"

These two types of data are most powerful when combined. For example. anoutcome evaluation might show that 50% of students made desirable changes


Page 18: Evaluation Tools for Student Assistance Programs


affected them, influences that personal stories capture much better than average scores and standard deviations.

In-Depth and Broad-Based: With limited resources for evaluation, a choice must be made between getting a little bit of information from a large group or getting a lot of information from a small group. Brief surveys of the entire student body are an example of an approach that favors breadth; pre- and post-tests of a group of program participants favor gathering more in-depth information.

To determine what general changes are occurring, it is typically a good strategy to gather information from a large group; for example, a survey of the entire student body may help to track changes in school climate. However, in-depth interviews with a small sample of students will help with understanding why changes are occurring. Thus, a balance of in-depth and broad-based evaluation strategies is perhaps the best way to gain a useful perspective.
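If interviewees are drawn at random, the Random Numbers Table in the appendices can be used to pick them; the short sketch below does the same job in software. This is a minimal illustration only: the roster contents, seed, and sample size are hypothetical.

    # Minimal sketch: drawing a random interview sample from a student roster.
    # The roster names, seed, and sample size are hypothetical.
    import random

    roster = ["student_%04d" % i for i in range(1, 1201)]  # stand-in for a real roster
    random.seed(1991)  # fixed seed so the draw can be repeated and audited
    interviewees = random.sample(roster, 25)  # 25 students, drawn without replacement
    print(interviewees[:5])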

The Evaluation Team

Whether to have the program evaluated by staff or by an independent consultant is an important decision. While this manual has been developed specifically to give staff the tools they need to conduct a credible evaluation, some programs may still want to consider working with a consultant. But it doesn't have to be an either/or proposition. An excellent alternative is to form an evaluation team.

The evaluation team can include both staff and consultants. In addition, others can be included whose expertise would be particularly helpful for one aspect of the project or another. Here are some of the qualities for which team members might be selected:

• Someone who understands evaluation design;

• Someone who is able to design or modify instruments, such as surveys and interview protocols;

• A person to manage the overall evaluation project;

• A detail-oriented person;

• Someone who can see the "big picture";

• An individual with some knowledge of statistics;

• People who have expertise regarding the program component that is to be evaluated;

• People who understand and can represent the needs and interests


Page 19: Evaluation Tools for Student Assistance Programs


3. Compare different kinds of information and identify areas of convergence. Convergence occurs when different indicators are congruent and point to a single conclusion; for example, kids are smiling in the hallways, and surveys indicate that they feel happy.

4. Inform evaluation planners and those to whom the evaluation will be presented of the credibility dilemma: work with others to develop an evaluation plan that strikes a reasonable balance between reliable but impossible and possible but of questionable worth.

5. Tie evaluation findings to concrete, sensible, and practical recommendations for program improvements. Show how these recommendations relate to the overall goals of the program.

What You Can Learn From Evaluation

Not Causes, but Conjectures: If a survey indicates that student drug use is significantly reduced two years after the SAP was started, is it reasonable to say that the SAP caused the reduction? It would in most cases be a mistake to make such a claim. There are too many other circumstances that might affect short-term trends in student reports of drug use. What if, two weeks before the survey, an undercover operation resulted in the arrest of 35 students for selling and/or using illegal drugs? Might this not affect how secure students felt about responding to an "anonymous" survey of their own drug use patterns?

This example is obvious. Many other, more subtle factors can influence behaviors and attitudes. Consequently, any claim that evaluation data prove that an outcome was caused by a specific action is suspect. The best that can be shown from statistical data is correlation: that two or more events occur in some relationship to each other. Correlation is often mistaken for causality, and those who read the evaluation findings may be tempted to make this mistake.

However, it is often reasonable to make conjectures based upon the data. The SAP may have contributed to the decline in student drug use. The data suggest that a relationship exists between SAP activity and reduced student drug use rates. The results of the surveys, combined with statements by several students who participated in evaluation interviews, tend to support the evaluator's opinion that the SAP is an important positive influence on student drug use choices.
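To make the distinction concrete, the sketch below computes a Pearson correlation between yearly SAP contact hours and reported use rates. All figures are invented for illustration; even an r near -1 shows only that the two series move together, not that the SAP caused the decline.

    # Hypothetical yearly figures: SAP group-contact hours delivered, and the
    # percentage of surveyed students reporting alcohol or drug use.
    hours = [120.0, 200.0, 310.0, 400.0, 520.0]
    use_rate = [87.0, 84.0, 81.0, 76.0, 74.0]

    def pearson(xs, ys):
        # Standard Pearson product-moment correlation coefficient.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    # Prints a strong negative r: an association, not proof of causation.
    print("r = %.2f" % pearson(hours, use_rate))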


Page 20: Evaluation Tools for Student Assistance Programs


wishes to investigate correlation or to establish margin of error, it is often sufficient to consult with a statistician or evaluation consultant when designing the evaluation project, and again after data have been collected and are being prepared for analysis.
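For the common case of a margin of error on a survey percentage, the arithmetic a statistician would verify is small. A minimal sketch, assuming a simple random sample and a 95% confidence level (z of about 1.96); the proportion and sample size are illustrative only:

    # Approximate 95% margin of error for a reported survey proportion,
    # assuming a simple random sample. The example figures are hypothetical.
    def margin_of_error(p, n, z=1.96):
        return z * (p * (1.0 - p) / n) ** 0.5

    # e.g., 81% of a 105-student sample reporting some behavior
    print("+/- %.1f percentage points" % (100 * margin_of_error(0.81, 105)))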

Misconception: Being subjected to the scrutiny of an evaluation might damage program credibility.

An Observation: Those who evaluate their efforts are generally perceived as more credible and capable than those who do not. This is true even when the findings suggest that the program is not achieving the outcomes it seeks. The catch is this: credibility is maintained only when the findings are responded to in an open and flexible manner. Credibility will indeed suffer if a manager dismisses evaluation findings because their results did not match the hopes of program staff.

Faulty Thinking: Our program design is based on a strategy with sound research support. We therefore don't need to evaluate.

Clear Thinking: It is an excellent idea to incorporate into your program those features and strategies that have the strongest research support. However, ongoing evaluation is still desirable to provide a check on how well the model strategies are functioning outside the setting in which the research was conducted. Real-world application of principles that have been tested in highly controlled settings is often problematic. Besides, those who understand the nature of science always treat current views as tentative and subject to revision as new evidence emerges.

Convenient Excuse: We don't have time to evaluate.

Convenient Fact: If evaluation is well designed and incorporated into project management, information collection is generally continuous and does not require time-intensive special efforts. Tallying data collected over time can often be accomplished by volunteers or secretarial staff. A lack of evaluation may involve a huge loss of productive time and resources, if the services you provide do not in fact accomplish their intended


Page 21: Evaluation Tools for Student Assistance Programs


serves no practical purpose. This could reduce organizational commitment to further evaluation. This manual includes several worksheets that will help you carefully develop relevant questions.

Integrity: Survey forms, observation checklists, and interview protocols should be carefully designed to avoid gathering only data that supports a particular position. The instrumentation section of this manual explains some important and subtle design considerations.

Analyzing the information gathered during an evaluation must also be done with an open mind, or the "findings" are likely to be biased. Become aware of, and master, any motivation you may have to slant the analysis.

Evaluation findings may be taken out of context or presented piecemeal to support a conclusion that would be contradicted if the entire set of findings were considered. This is most likely to happen in an attempt to sustain enthusiasm while giving a project more time to "prove itself." Rationalizations such as this undermine the very purpose of evaluation, not to mention the credibility of their perpetrators.

Confidentiality: Sometimes questions arise about the confidentiality of information, disclosure of sources, consent to participate, and how findings might affect the personal or professional reputation of individuals. The evaluator has a responsibility to be aware of these issues and to be sure that they are managed ethically.

Reporting: Finally, findings should be effectively reported to those who have an interest in them. There are a variety of methods that can be used for reporting; they are discussed in chapter three, under step twelve.

Cultural Sensitivity and Cultural Appropriateness

It is important to understand how the ethnic and cultural background of respondents may affect the way they respond to an item. For example, a


Page 22: Evaluation Tools for Student Assistance Programs


priate. Interviews (face-to-face, friendly conversations) in a comfortable setting may be better.

• Avoid collecting data that might be used by others to fabricate rationales for reducing services or imposing new restrictions. This requires sensitivity to the political context in which the evaluation is conducted.

• Incorporate representatives of the target populations in the evaluation team. When interviewing, for example, gang members, get input from gang members on workable methods for gaining the cooperation of the sample. Consider training gang members to be interviewers.

• When extracting data from written records, be alert for ways in which these records might systematically, although unintentionally, skew data to misrepresent various groups. For example, according to arrest records one might conclude that many more boys use illegal drugs than do girls. Researchers have found, however, that this is not necessarily the case; a more accurate finding (one that would not be discovered through reviewing arrest records alone) would be that girls are suspected less frequently and, when apprehended, are more likely to be released with an unofficial warning.

Additional Principles for Program Evaluators²

• Questions investigated are those of interest to stakeholders; that is, any person or group that has an interest in the evaluation.

• Evaluation is a partnership between evaluators, program staff, school staff, students, and community members, who engage in an active exchange during the planning and implementation of the evaluation.

• Evaluation is specific to a program, its purpose, and its environment.

• Evaluation is an integral and continuous part of program planning, and it influences program activities.

• Evaluation design balances what is practical with what is ideal.

• Evaluation does not have to be difficult or complicated, but should be well planned.


Page 23: Evaluation Tools for Student Assistance Programs


A needs assessment tool, Planning Form Eight, can help to determine student needs. Here page one has been scored to show how a sample of 105 students described their needs.

Student Survey: What Do Students Need? (Planning Form Eight)

Purpose: This survey is to help school staff determine what kinds of services students are most interested in receiving.

Data gathered October 11 & 12, 1990; N = 105 students.
Grade: 10th n=36 (34%); 11th n=45 (43%); 12th n=24 (23%)
Sex: Male 49%; Female 51%
Ethnic Group: White 37%; Hispanic 26%; African American 21%; Native American 5%; Asian 9%; Other 2%

Instructions: Check the box to indicate how important you think each service is (Not Important, Somewhat Important, Very Important).

1. Individual counseling with a trained adult for personal problems.
2. Group counseling, led by a trained adult, for personal problems.
3. Training for students in how to help each other with personal problems.
4. Tutoring and help with understanding school work.
5. Help for problems related to alcohol, drugs, and solvents.
6. Help for problems related to stress and anxiety.
7. Help for problems related to making a career or college choice.
8. Help for problems with self-esteem.
9. Help for family problems and getting along with parents.
10. Help for problems related to getting along with friends.
11. Help for problems related to eating and weight control.
12. Help to reduce depression.
13. Help for problems related to sex, birth control, and sexually transmitted diseases such as herpes and AIDS.

[On the scored sample, each item shows the percentage of the 105 students who checked Not, Somewhat, and Very Important; the percentage columns are too garbled in this reproduction to transcribe reliably.]

ing, most stressful, and so on. They could be asked for suggestions about what would make it easier for them to become enthusiastically involved in the school or community. They could be asked, "What do you feel are your most important needs?" There are several instruments in this manual specifically for this purpose: Stakeholder Questionnaire (Planning Form Two), Choosing Priorities for the Student Assistance Program (Planning Form Three), School Climate Survey (Planning Form Seven), Student Survey: What do Students Need? (Planning Form Eight, example shown above), and Student Survey: Changes and Stress in Your Life (Outcome Evaluation Form Six).
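Scoring a form like the one shown above is a straightforward tally. The sketch below shows the arithmetic behind the printed percentages, using invented response codes standing in for 105 returned forms:

    # Minimal sketch: tallying one survey item into percentages.
    # The responses are hypothetical stand-ins for 105 returned forms:
    # N = Not Important, S = Somewhat Important, V = Very Important.
    from collections import Counter

    responses = ["V"] * 47 + ["S"] * 30 + ["N"] * 28  # invented data, n = 105
    counts = Counter(responses)
    n = len(responses)
    for code, label in (("N", "Not Important"), ("S", "Somewhat Important"), ("V", "Very Important")):
        print("%s: %.0f%%" % (label, 100.0 * counts[code] / n))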


Page 24: Evaluation Tools for Student Assistance Programs


These data are typical of a pre-test/post-test design. The form is page two of Outcome Evaluation Form Five, Student Attitudes About School. Large changes between pre-test and post-test scores suggest that the intervention provided between the two tests (attendance of 10 Insight classes) had an impact.

change an environmental condition (such as school climate) that may affect the group.

4. The group is given a pre-test. The pre-test contains items that are indicators related to the measurable objective for change. For the example given in step two, an item that indicates increased knowledge might be, "What is the name of the virus that causes AIDS?" Outcome Evaluation Form Seven, Beliefs, Activities, and Risks of Sexual Behavior, would be appropriate for this particular evaluation question.

5. The group is exposed to the intervention. It is important to conduct the intervention according to plan, or, if the

Average pre-test and post-test scores: 17 students who attended at least 10 Insight class sessions between tests, February through April, 1990.

Instructions: For each statement, circle a number from 1 to 5 to express your opinion (1 = Strong No, 2 = Weak No, 3 = Not Sure).

26. I have difficulty understanding much of what the teachers cover in class. Pre: 3.6; Post: 3.1

27. Comments about understanding classwork: See attachment.

[A second pre/post pair (Pre: 2.8; Post: 2.8) is visible on the scanned form but cannot be matched to its item in this reproduction.]
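The arithmetic behind such a form is a per-item mean before and after the intervention. A minimal sketch, with invented scores chosen to reproduce the printed averages for item 26:

    # Minimal sketch: mean pre- and post-test scores for one survey item.
    # The 17 scores (5-point scale) are invented for illustration.
    pre = [4, 4, 3, 4, 4, 3, 4, 4, 3, 4, 4, 3, 4, 4, 3, 3, 3]
    post = [3, 3, 3, 3, 4, 3, 3, 3, 3, 3, 3, 3, 4, 3, 3, 3, 3]

    mean = lambda xs: sum(xs) / float(len(xs))
    print("Pre: %.1f  Post: %.1f  Change: %+.1f"
          % (mean(pre), mean(post), mean(post) - mean(pre)))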


Page 25: Evaluation Tools for Student Assistance Programs


• Crime and rule violation reports

• Occurrence and content of graffiti

• Parent attendance at school/community meetings

Many other indicators exist. With practice, you will become more proficient at choosing and measuring them.

Implementing the Evaluation

The next chapter contains step-by-step instructions for implementing an evaluation. It builds upon the principles covered in this chapter, and will help you more precisely understand how to use the evaluation tools in chapter four.

Chapter Two Footnotes

1. Adapted from: Proposal to Colorado Department of Health, Alcohol and Drug Abuse Division, for an Evaluation Specialist. Evaluation Training Consortium, Denver, CO, April 1991.

2. op. cit.
