© Journal of Higher Education Outreach and Engagement, Volume 15, Number 4, p. 57, (2011)
Copyright © 2011 by the University of Georgia. All rights reserved. ISSN 1534-6104

Using the Context, Input, Process, and Product Evaluation Model (CIPP) as a Comprehensive Framework to Guide the Planning, Implementation, and Assessment of Service-learning Programs

Guili Zhang, Nancy Zeller, Robin Griffith, Debbie Metcalf, Jennifer Williams, Christine Shea, and Katherine Misulis

Abstract

Planning, implementing, and assessing a service-learning project can be a complex task because service-learning projects often involve multiple constituencies and aim to meet both the needs of service providers and community partners. In this article, Stufflebeam's Context, Input, Process, and Product (CIPP) evaluation model is recommended as a framework to systematically guide the conception, design, implementation, and assessment of service-learning projects, and to provide feedback and judgment of the project's effectiveness for continuous improvement. This article (1) explores the CIPP evaluation model's theoretical roots and applications, (2) delineates its four components, (3) analyzes each component's role in a service-learning project's success, and (4) discusses how the model effectively addresses the Service-Learning Standards for Quality Practice. This article illustrates the application and evaluation of the model in a teacher-education service-learning tutoring project.

Introduction and Review of Literature

Service-learning in educational settings involves the integration of community service into the academic curriculum (Koliba, Campbell, & Shapiro, 2006). Service providers achieve curricular goals while providing services that meet a community's needs (Zhang, Griffith, et al., 2009; Zhang, Zeller, et al., 2008). A successful service-learning project thus requires that a faculty member identify the needs of service providers and community partners, design a project that can effectively address both needs, and implement the project in a manner that generates the desired outcome. Each of these steps is vital to the success of the service-learning project; each requires, therefore, careful monitoring to ensure its effective execution. Moreover, service-learning projects often involve multiple stakeholders. They can generate unanticipated outcomes as well as intended outcomes.



Although assessments aiming at one or several impacts, and focusing on a single stage of a service-learning project, can be informative in answering specific questions on the value of service-learning, they cannot systematically guide the planning and implementation of service-learning projects. To date, no evaluation model appears to be widely adopted by faculty members to guide service-learning projects. The authors, therefore, posit in this article that Stufflebeam's (2003) Context, Input, Process, and Product (CIPP) model can serve as a guiding framework for service-learning projects.

Approaches for Evaluating Projects

There appear to be some 26 approaches often employed to evaluate projects. These 26 may be grouped into five categories: pseudoevaluations, quasi-evaluation studies, improvement- and accountability-oriented evaluation, social agenda and advocacy, and eclectic evaluation. The first category, pseudoevaluations, includes five approaches that are often motivated by political objectives: public relations–inspired studies, politically controlled studies, pandering evaluations, evaluation by pretext, and empowerment under the guise of evaluation. The other 21 approaches are typically used legitimately to judge projects. The quasi-evaluations include 14 approaches that either focus on answering one or several questions or use a single methodological approach: objectives-based studies; accountability, particularly payment-by-results studies; success case method; objective testing programs; outcome evaluation as value-added assessment; performance testing; experimental studies; management information systems; benefit-cost analysis; clarification hearing; case study evaluations; criticism and connoisseurship; program theory–based evaluation; and mixed-methods studies. The improvement/accountability category is oriented toward determining the merit and worth of the project or entity being evaluated, and encompasses three approaches: decision- and accountability-oriented studies, consumer-oriented studies, and accreditation and certification. The social agenda/advocacy category dedicates evaluation efforts to pursuing social justice and includes three approaches: responsive evaluation or client-centered studies, constructivist evaluation, and deliberative democratic evaluation. Finally, the eclectic evaluation category includes the utilization-focused evaluation approach, and draws selectively from all available evaluation concepts and methods to serve the needs of a particular user group (Stufflebeam & Shinkfield, 2007).

When compared with professional standards for project evaluation, and after being rated on utility, feasibility, propriety, and accuracy, the best approach that has surfaced is the Context, Input, Process, and Product evaluation model.


The CIPP evaluation model belongs in the improvement/accountability category, and is one of the most widely applied evaluation models. Unlike more traditional evaluation approaches such as the Tylerian Evaluation Rationale (Tyler, 1942), which is an objectives-based approach in the quasi-evaluation category mainly concerned with the final retroactive evaluation of whether a set of objectives has been met, the CIPP evaluation model is designed to systematically guide both evaluators and stakeholders in posing relevant questions and conducting assessments at the beginning of a project (context and input evaluation), while it is in progress (input and process evaluation), and at its end (product evaluation). A survey of American Society for Training and Development members found that the CIPP model was preferred over other evaluation models (Galvin, 1983).

What the CIPP Evaluation Model Can Do

Specifically, the context evaluation component of the Context, Input, Process, and Product evaluation model can help identify service providers' learning needs and the community's needs. The input evaluation component can then help prescribe a responsive project that can best address the identified needs. Next, the process evaluation component monitors the project process and potential procedural barriers, and identifies needs for project adjustments. Finally, the product evaluation component measures, interprets, and judges project outcomes, assessing their merit, worth, significance, and probity.

Planning, Implementing, and Assessing Service-Learning Projects: A Multifaceted Task in Need of a Guiding Framework

The challenge of carrying out service-learning projects lies in the complexity resulting from multiple project objectives and multiple participating groups (e.g., faculty and community members, and students). The challenges are intensified by the lack of a reliable evaluation model that systematically guides service-learning projects (Zhang, Zeller, et al., 2008; Zhang, Griffith, et al., 2009).

The need for rigorous and authentic assessment of service-learning outcomes has been increasingly recognized, and the many challenges in assessing service-learning have been enumerated (Butin, 2003; Gelmon, 2000a; Holland, 2001).


Service-learning is a complex approach to teaching and learning; it needs and deserves approaches to assessment, evaluation, and reporting that are capable of capturing that complexity (Eyler & Giles, 1999; Karayan & Gathercoal, 2005; Mabry, 1998; Moore, 1999; Pritchard, 2002; Steinke & Buresh, 2002; Troppe, 1995).

A number of effective service-learning project assessments focusing on specific aspects of service-learning outcomes have been conducted (Bringle & Kremer, 1993; Hamm, Dowell, & Houck, 1998). These assessments represent the performance testing approach in the quasi-evaluation category. For example, Furco (2002) developed the Self-Assessment Rubric for Institutionalizing Service-Learning in Higher Education, a tool that enables colleges and universities to measure the degree to which service-learning is part of the institution's culture. Marchel (2004) discussed evaluating reflection and sociocultural awareness, and Peterson (2004) discussed assessing performance in problem-based service-learning projects. The predominantly used data collection mechanisms include survey methodology (Kezar, 2002) and reflection (Ash & Clayton, 2004). Portfolios have also been recommended (Banta, 1999).

A few studies have examined the complexity of service-learning by focusing on the various groups of people involved, a method resembling the client-centered evaluation approach. A case study model of assessment was developed at Portland State University to measure the impact of service-learning on four constituencies: students, faculty, community, and the institution (Driscoll, Holland, Gelmon, & Kerrigan, 1996). Subsequent work by Driscoll, Gelmon, et al. (1998) provided an assessment model for service-learning projects specifically in education that focuses on the four constituencies of service-learning. The model provides both quantitative and qualitative measures at three levels of assessment: diagnostic, formative, and summative. Additionally, Gelmon, Holland, et al. (2001) have offered useful principles, techniques, and tools for assessing service-learning and civic engagement.

Holland (2001) suggested a more comprehensive evaluation model for assessing service-learning based on a goal-variable-indicator-method design, which can best be characterized as an objectives-based evaluation approach.


Its strength is its attention to the complex dynamics behind service-learning: the collaborative work of students and faculty members (within their institutional context) with their community partners. Holland's work serves as a first step toward providing an evaluation model for assessing service-learning. However, its lack of a clear sequence and the intertwined nature of its elements may limit its usefulness. Currently, it appears that no specific evaluation model has emerged as an easy-to-use, systematic guide to a service-learning project's planning and implementation (e.g., Baker-Boosmara, Guevara, & Balfour, 2006; Bordelon, 2006; Borges & Hartung, 2007).

Stufflebeam's Context, Input, Process, and Product Evaluation Model: An Improvement/Accountability Approach

Stufflebeam's Context, Input, Process, and Product evaluation model is "a comprehensive framework for conducting formative and summative evaluations of projects, personnel, products, organizations, and evaluation systems" (Stufflebeam & Shinkfield, 2007, p. 325). The model originated in the late 1960s to provide greater accountability for the U.S. inner-city school district reform project and to address the limitations of traditional evaluation approaches (Stufflebeam, 1971). The CIPP evaluation model "is configured especially to enable and guide comprehensive, systematic examination of social and educational projects that occur in the dynamic, septic conditions of the real world . . ." (Stufflebeam & Shinkfield, 2007, p. 351). Over the years, the model has been refined (Alkin, 2004) and used by a wide range of disciplines (Stufflebeam & Shinkfield, 2007).

In education settings, the CIPP evaluation model has been used to evaluate numerous educational projects and entities (Zhang, Griffith, et al., 2009; Zhang, Zeller, et al., 2008). For example, Felix (1979) adopted the model to evaluate and improve instruction in Cincinnati, Ohio, school systems. Nicholson (1989) recommended the CIPP evaluation model for evaluating reading instruction. Matthews and Hudson (2001) developed guidelines for the evaluation of parent training projects within the framework of the CIPP evaluation model. A faculty development project designed to support the teaching and evaluation of professionalism of medical students and residents was examined using the CIPP evaluation model (Steinert, Cruess, Cruess, & Snell, 2005). The model was used to construct Taiwan's national educational indicator systems (Chien, Lee, & Cheng, 2007).


The model also served as the evaluation model for Osokoya and Adekunle (2007) to assess the trainability of enrollees in the Leventis Foundation (Nigeria) Agricultural Schools' projects. Moreover, Combs, Gibson, et al. (2008) derived a course assessment and enhancement model based on the CIPP evaluation model because of its flexibility in providing formative and summative results.

Over the years, exemplary applications of the model within education occurred in numerous evaluations by Bill Webster of the Dallas, Texas, Independent School District; Howard Merriman of the Columbus, Ohio, school district; Gary Wegenke and his evaluators of the Des Moines, Iowa, school district; Jerry Baker of the Saginaw, Michigan, school district; Jerry Walker of The Ohio State University National Center for Research on Vocational Education; Bob Randall of the Southwest Regional Educational Research Laboratory; Carl Candoli and his evaluators of the Lansing, Michigan, school district; Stufflebeam and colleagues of the Evaluation Centers (first at The Ohio State University and subsequently at Western Michigan University); and others. Many of the reports from these applications of CIPP were archived in ERIC centers, and some appeared in dissertations (D. L. Stufflebeam, personal communication, April 16, 2010).

The CIPP evaluation model emphasizes "learning-by-doing" to identify corrections for problematic project features. It is thus uniquely suited for evaluating emergent projects in a dynamic social context (Alkin, 2004). As Stufflebeam has pointed out, the most fundamental tenet of the model is "not to prove, but to improve" (Stufflebeam & Shinkfield, 2007, p. 331). The proactive application of the model can facilitate decision making and quality assurance, and its retrospective use allows the faculty member to continually reframe and "sum up the project's merit, worth, probity, and significance" (Stufflebeam & Shinkfield, 2007, p. 329).

The link between the unique features of the CIPP evaluation model and the need for a systematic, comprehensive guiding framework for service-learning projects is strong. Stufflebeam and Shinkfield illustrate this link with this observation:


The Context, Input, Process, and Product evaluation model has a strong orientation to service and the principles of a free society. It calls for evaluators and clients to identify and involve rightful beneficiaries, clarify their needs for service, obtain information of use in designing responsive projects and other services, assess and help guide effective implementation of service, and ultimately assess the services' merit, worth, significance, and probity. The thrust of CIPP evaluations is to provide sound information that will help service providers regularly assess and improve services and make effective and efficient use of resources, time, and technology in order to serve the well-being and targeted needs of rightful beneficiaries appropriately and equitably. (2007, p. 330)

In summary, the authors believe that the model can help guide service-learning project needs assessment and planning, monitor the process of implementation, and provide feedback and judgment of the project's effectiveness for continuous improvement.

Understanding the Model

The four components of the Context, Input, Process, and Product evaluation model are useful in guiding the stages of a service-learning project. This section delineates the four components of the model and demonstrates each component's role as applied to a project. The authors discuss how the model addresses the Service-Learning Standards for Quality Practice (National Youth Leadership Council, 2008). Finally, the authors describe the application of the model using a teacher-education service-learning tutoring program.

The Four Components

All four components of Stufflebeam's CIPP evaluation model play important and necessary roles in the planning, implementation, and assessment of a project. According to Stufflebeam (2003), the objective of context evaluation is to assess the overall environmental readiness of the project, examine whether existing goals and priorities are attuned to needs, and assess whether proposed objectives are sufficiently responsive to assessed needs. The purpose of an input evaluation is to help prescribe a program by which to make needed changes. During input evaluation, experts, evaluators, and stakeholders identify or create potentially relevant approaches. They then assess the potential approaches and help formulate a responsive plan. Process evaluation affords opportunities to assess periodically the extent to which the project is being carried out appropriately and effectively. Product evaluation identifies and assesses project outcomes, both intended and unintended.
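For readers who track evaluations programmatically, the four components can be viewed as sequential stages, each posing a guiding question and accumulating findings. The Python sketch below is purely illustrative: the stage names and guiding questions come from the component descriptions in this article, while the class and field names are hypothetical choices of ours, not part of Stufflebeam's model.

```python
from dataclasses import dataclass, field

@dataclass
class CIPPStage:
    """One component of the CIPP cycle: a guiding question plus findings."""
    name: str
    guiding_question: str
    findings: list = field(default_factory=list)

    def record(self, finding: str) -> None:
        """Log an observation or decision made during this stage."""
        self.findings.append(finding)

# The four components and the question each one asks (per the text below).
cipp_cycle = [
    CIPPStage("Context", "What needs to be done?"),
    CIPPStage("Input", "How should it be done?"),
    CIPPStage("Process", "Is it being done?"),
    CIPPStage("Product", "Did the project succeed?"),
]

# Example use: record a context-evaluation finding, then review the cycle.
cipp_cycle[0].record("At-risk readers need individual tutoring in reading.")
for stage in cipp_cycle:
    print(f"{stage.name}: {stage.guiding_question} ({len(stage.findings)} finding(s))")
```

Keeping each stage's findings attached to its guiding question mirrors the model's emphasis on feeding evaluation results back into decisions at every stage, not only at the end.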


Component I.

Context evaluation is often referred to as needs assessment. It asks, "What needs to be done?" and helps assess problems, assets, and opportunities within a defined community and environmental context (Stufflebeam & Shinkfield, 2007). According to the authors, the objective of context evaluation is to define the relevant context, identify the target population and assess its needs, identify opportunities for addressing the needs, diagnose problems underlying the needs, and judge whether project goals are sufficiently responsive to the assessed needs. Methods for context evaluation include system analyses, surveys, document reviews, secondary data analyses, hearings, interviews, diagnostic tests, and the Delphi technique (Dalkey & Helmer, 1963).

The context evaluation component addresses the goal identification stage of a service-learning project. An effective service-learning project starts with identifying the needs of service providers (students) and the needs of the community. Many pitfalls are associated with needs assessments. Most can be attributed to the failure to adequately identify and articulate, in advance, crucial indicators (e.g., purpose, audience, resources, and dissemination strategies). Application of the context evaluation component of the CIPP evaluation model could potentially prevent these pitfalls.

Component II.

Input evaluation helps prescribe a project to address the identified needs. It asks, "How should it be done?" and identifies procedural designs and educational strategies that will most likely achieve the desired results. Consequently, its main orientation is to identify and assess current system capabilities, to search out and critically examine potentially relevant approaches, and to recommend alternative project strategies. The result of the input evaluation step is a project designed to meet the identified needs.


The success of a service-learning project requires a good project plan that, if implemented correctly, will benefit both service providers (students) and service recipients (community members). Methods used to execute an input evaluation include inventorying and analyzing available human and material resources, proposed budgets and schedules, and recommended solution strategies and procedural designs. Key input evaluation criteria include a proposed plan's relevance, feasibility, superiority to other approaches, cost, and projected cost-effectiveness. Literature searches, visits to exemplary projects, employment of advocate teams, and pilot trials are all appropriate tools to identify and assess alternative project approaches. Once a project plan is developed, it can be evaluated (using techniques such as cost analyses, logic models, Program Evaluation and Review Technique [PERT], and various scales) according to the criteria that were identified in the input evaluation step (Stufflebeam & Shinkfield, 2007).

Component III.

Process evaluation monitors the project implementation process. It asks, "Is it being done?" and provides an ongoing check on the project's implementation process. Important objectives of process evaluation include documenting the process and providing feedback regarding (a) the extent to which the planned activities are carried out and (b) whether adjustments or revisions of the plan are necessary. An additional purpose of process evaluation is to assess the extent to which participants accept and carry out their roles.

Process evaluation methods include monitoring the project's procedural barriers and unanticipated defects, identifying needed in-process project adjustments, obtaining additional information for corrective programmatic changes, documenting the project implementation process, and regularly interacting with and observing the activities of project participants (Stufflebeam & Shinkfield, 2007). Process evaluation techniques include on-site observation, participant interviews, rating scales, questionnaires, records analysis, photographic records, case studies of participants, focus groups, self-reflection sessions with staff members, and tracking of expenditures.

Process evaluation can be especially valuable for service-learning projects because (a) it provides information for making on-site adjustments to the projects, and (b) it fosters the development of relationships between the evaluators (in this case, the two task force members in research and evaluation methodology) and the clients/stakeholders, based on a growing collaborative understanding and professional skill competencies, which can promote the project's long-term sustainability.



Component IV.

Product evaluation identifies and assesses project outcomes. It asks, "Did the project succeed?" and is similar to outcome evaluation. The purpose of a product evaluation is to measure, interpret, and judge a project's outcomes by assessing their merit, worth, significance, and probity. Its main purpose is to ascertain the extent to which the needs of all the participants were met.

Stufflebeam and Shinkfield (2007) suggest that a combination of techniques should be used to assess a comprehensive set of outcomes. Doing so helps cross-check the various findings. A wide range of techniques are applicable in product evaluations, including logs and diaries of outcomes, interviews of beneficiaries and other stakeholders, case studies, hearings, focus groups, document/records retrieval and analysis, analysis of photographic records, achievement tests, rating scales, trend analysis of longitudinal data, longitudinal or cross-sectional cohort comparisons, and comparison of project costs and outcomes.

Providing feedback is of high importance during all phases of the project, including its conclusion. Stufflebeam and Shinkfield (2007) suggest the employment of stakeholder review panels and regularly structured feedback workshops. They stress that the communication component of the evaluation process is absolutely essential to assure that evaluation findings are appropriately used. Success in this part of the evaluation requires the meaningful and appropriate involvement of at least a representative sample of stakeholders throughout the entire evaluation process.

Product evaluation used in service-learning projects can serve at least three important purposes. First, it provides summative information that can be used to judge the merits and impacts of the service-learning project. Second, it provides formative information that can be used to make adjustments and improvements to the project for future implementation. Third, it offers insights on the project's sustainability and transportability, that is, whether the project can be sustained long-term and whether its methods can be transferred to different settings.


The Context, Input, Process, and Product Evaluation Model's Linkage to Service-Learning Standards for Quality Practice

In 2008, the National Youth Leadership Council devised the K-12 Service-Learning Standards for Quality Practice (National Youth Leadership Council, 2008). The standards, which were vetted through a series of "reactor panels" convened nationwide by the National Youth Leadership Council and RMC Research Corporation, serve as a yardstick for judging the quality of K-12 service-learning practices. Table 1 outlines how the CIPP evaluation model can serve as an organizing framework for systematic evaluation of service-learning projects to help meet the Standards for Quality Practice.

Table 1. Applying the Context, Input, Process, and Product Evaluation Model to the Service-Learning Standards for Quality Practice

Standard: Service-learning actively engages participants in meaningful and personally relevant service activities.
CIPP framework:
• Context evaluation: Identify participants' needs.
• Input evaluation: Design a project that is engaging and targets participants' needs.

Standard: Service-learning is intentionally used as an instructional strategy to meet learning goals and/or content standards.
CIPP framework:
• Context evaluation: Identify learning goals.
• Input evaluation: Design the project as an effective instructional strategy to meet learning goals.

Standard: Service-learning incorporates multiple challenging reflection activities that are ongoing and that prompt deep thinking and analysis about oneself and one's relationship to society.
CIPP framework:
• Input evaluation: Design a project that includes multiple challenging reflection activities.
• Process evaluation: Assess reflection activities through reflective journals, focus group interviews, and surveys on self-perceptions.

Standard: Service-learning promotes understanding of diversity and mutual respect among all participants.
CIPP framework:
• Input evaluation: Design a project that will promote understanding of diversity and mutual respect among all participants.
• Process evaluation: Formatively and summatively assess whether the project promoted understanding of diversity and mutual respect among all participants.

Standard: Service-learning provides youth with a strong voice in planning, implementing, and evaluating service-learning experiences with guidance from adults.
CIPP framework:
• Context, input, process, and product evaluation: Involve participants in planning, implementing, and evaluating the service-learning project.

Standard: Service-learning partnerships are collaborative, mutually beneficial, and address community needs.
CIPP framework:
• Context evaluation: Identify participants' and community needs.
• Input evaluation: Design a project that is mutually beneficial and allows participants to work collaboratively to address community needs.

Standard: Service-learning engages participants in an ongoing process to assess the quality of implementation and progress toward meeting specified goals, and uses results for improvement and sustainability.
CIPP framework:
• Process and product evaluation: Engage participants in an ongoing process to assess the quality of implementation and progress toward meeting specified goals, and use results for improvement and sustainability.

Standard: Service-learning has sufficient duration and intensity to address community needs and meet specified outcomes.
CIPP framework:
• Context evaluation: Identify community needs, and specify intended outcomes.
• Input evaluation: Design a project with sufficient duration and intensity.
• Process and product evaluation: Assess whether community needs and specified outcomes are met.


It is evident that the CIPP evaluation model has unique features that can help effectively address the K-12 Service-Learning Standards for Quality Practice. These unique features include context evaluation, ongoing process evaluation, and the model's emphasis on engaging participants in the evaluation process. The CIPP evaluation model can help provide youth with a strong voice in planning, implementing, and evaluating service-learning experiences; engage participants in an ongoing process to assess the quality of implementation and progress toward meeting specified goals; and use evaluation results for improvement and sustainability.

Applying the Model to a Teacher-Education Service-Learning Tutoring Project

During spring semester 2008, a service-learning tutoring project was initiated to address the learning needs of pre-service teachers in the Elementary Education program at a public research university in the southeastern United States, as well as the needs of at-risk readers in the local school system. Twenty-six pre-service teachers taking a course in diagnostic/prescriptive teaching of reading completed a service-learning assignment by tutoring 26 Response-to-Intervention students (RTI at-risk readers) in kindergarten, first grade, and second grade. University Institutional Review Board (IRB) human subjects approval was secured for this project. The CIPP evaluation model was used to guide the conception, design, implementation, and assessment of the tutoring project. The authors believe that its use led to the project's achieving the desired outcomes. The CIPP components were implemented as shown in Table 2 and in the narrative below.


Table 2. Using the Context, Input, Process, and Product Evaluation Model to Guide a Service-Learning Tutoring Project

Component I: Context evaluation. Identify the needs, and the assets and opportunities for addressing the needs.
• Assessed the setting for the intended service.
• Interviewed school principal, teachers, and reading specialists.
• Reviewed school records.
• Identified at-risk readers and their needs.
• Administered diagnostic tests to at-risk readers.
• Conducted initial quantitative assessment of at-risk readers.
• Conducted pre-service teacher focus group interviews.
• Conducted initial quantitative assessments of pre-service teachers.

Component II: Input evaluation. Prescribe a project to meet the identified needs, and identify and assess project strategies and procedural designs.
• Reviewed relevant literature.
• Interviewed school principal, teachers, and reading specialists.
• Consulted university reading faculty members and other experts.
• Viewed exemplary projects.
• Consulted Learn and Serve America.
• Formed advocate teams.
• Service-learning task force members met biweekly.
• Conducted pre-service teacher focus group interviews.

Component III: Process evaluation. Monitor the project's process and potential procedural barriers, and identify needs for project adjustments.
• Identified what activities should be monitored.
• Received biweekly updates from the service-learning task force.
• Observed service-learning activities.
• Kept a log of the activities.
• Interviewed at-risk readers.
• Interviewed pre-service teachers.
• Interviewed school principal, teachers, and reading specialists.
• Reviewed pre-service teachers' self-reflections.
• Reviewed students' work samples.
• Conducted debriefings with pre-service teachers.

Component IV: Product evaluation. Measure, interpret, and judge project outcomes, and interpret their merit, worth, significance, and probity.
• Conducted post-project quantitative assessments of pre-service teachers.
• Conducted post-project focus group interviews of pre-service teachers.
• Conducted post-project quantitative assessment of at-risk readers.
• Administered at-risk readers' survey.
• Interviewed or surveyed other stakeholders, including the faculty instructor, principal, teachers, reading specialists, and parents of at-risk readers.


Assessing Community Needs Using Context Evaluation

The community involved in the teacher-education service-learning project included the university initiating the project and the local school system. The university was assessed as to its need for, and capability of carrying out, the service-learning project. The university's college of education offered both undergraduate and graduate programs of study in teacher preparation. All of its programs incorporated internship opportunities or capstone projects in collaboration with K-12 school professionals and other community partners. Service-learning had become part of the culture at this university. The creation of a new position, the vice chancellor for service-learning, generated a surge of interest in service-learning projects and service-learning opportunities on and off campus. The university had a well-established infrastructure for service-learning research activities. University curricula included well-integrated projects that enabled students to spend a semester or more in a connected series of courses linked to service-learning projects outside the university.

The needs within the university.

Following a review of curriculum vitae and discussions among faculty members, a service-learning faculty task force was created for this project; it consisted of one university administrator, eight expert content area faculty members, and two faculty members in research and evaluation methodology who served as evaluators of the project. The service-learning task force examined the university's mission, curriculum, professional teaching standards, class experiences, literature, and feedback from school systems, and then identified the need within the college to work toward improving teacher retention. It was further ascertained that pre-exposure to the school environment decreases teacher attrition. That is, if pre-service teachers have a better understanding of the teaching profession and the reality of working with diverse student populations through hands-on practice, they are more likely to acquire needed professional pedagogical skills and to stay in the teaching field once they enter it.

The needs within the local school system.

To identify community needs within the local school system, the task force communicated with and interviewed adjunct faculty members who were also teachers in the school system.


Assistance to elementary-level at-risk readers topped the needs list. Following these meetings, the task force asked, "What kind of service-learning project will best meet both the needs of the elementary-level at-risk readers and the need of our pre-service teachers to gain firsthand experience with diverse student populations?" and "Which school has the greatest needs?" The task force looked for a school situation that provided the best fit between the pre-service teachers' needs and the needs of children. Once a potential site was identified, the researchers met with the principal and discussed their proposal. Students in the Response-to-Intervention process (RTI at-risk readers) were selected to be the service-learning recipients because they were working below grade level in reading and writing. They were considered at-risk readers and writers, but were not receiving special education services. This target population of at-risk readers needed, but had not been receiving, individual assistance in reading.

The service site was an elementary school representative of a typical elementary school in the county; the school's racial and socioeconomic balance was the result of countywide district school equity policies. The principal was very receptive to the proposed service-learning project. The elementary teachers involved were positive about the potential benefits of the project for their students and were pleased to work with the pre-service teachers and the university faculty.

Assessment of pre-service teachers' readiness.

Initial assessments of the pre-service teachers were conducted in January, before the service-learning intervention, in five sessions of focus group interviews to explore their initial attitudes and dispositions about this project. The interviews revealed that the pre-service teachers were equipped with the knowledge and skills needed to provide the service. More importantly, they expressed curiosity and a strong desire to participate in this project. Quantitative instruments were also used prior to project implementation to assess the pre-service teachers' initial levels regarding the following constructs: community service self-efficacy, motivations regarding volunteer activity, self-esteem, and confidence in making a clinically significant contribution to the community through service. These quantitative research instruments included the following:

• The Self-esteem Scale (Rosenberg, 1965)


• The Community Service Self-efficacy Scale (Reeb, Katsuyama, Sammon, & Yoder, 1998)

• The Volunteer Functions Inventory (Clary, et al., 1998)

• The Personal Social Values Scale (Mabry, 1998)

Assessment of at-risk readers' needs.

Similarly, additional context evaluation–related assessments were completed on the initial status of elementary school students' self-esteem, their steps toward independence, and their academic achievement in reading and oral and written language skills. The pre-service teachers administered literacy assessments, including running records (Clay, 1993), the Qualitative Reading Inventory-4 (Leslie & Caldwell, 2006), the Elementary Reading Attitude Survey (McKenna & Kear, 1990), and the Burke Reading Interview (Burke, 1980). Based upon these assessments, the pre-service teachers designed and taught lessons that targeted the students' needs while building upon their strengths.

Formulating Plans Using Input Evaluation

Input evaluation was completed in order to prescribe a sound service-learning project. Meetings were conducted with university reading faculty members, reading specialists in the school, and potential collaborating classroom teachers to discuss what kind of service-learning project would best meet the students' needs. Based on information gathered from the input evaluation process, a tutoring project that joined pre-service teachers in a reading methods course with a selected cohort of RTI at-risk readers was prescribed. Each week during the 15-week semester course, the pre-service teachers were assigned to spend 3 hours and 30 minutes preparing for their tutoring experience and 1 hour and 30 minutes providing direct tutoring service to an identified RTI at-risk reader.

Next, an extensive literature review on best practices for working with at-risk readers was conducted as part of the initiated service-learning project. The task force members spoke with faculty members in the area of reading education at this university and other universities. They also discussed the plan with, and sought feedback from, reading specialists in the targeted school. Finally, the task force watched videos of exemplary service-learning projects, visited leading service-learning websites, and discussed elements that would be important to include in this project.


As an important part of the input evaluation, expert input was sought to judge the feasibility of the service-learning tutoring project before its implementation, and adjustments were made to improve the project. Contributing experts included members of Learn and Serve America (the funding agency of the project), several nationally recognized experts in service-learning, the university's vice chancellor for service-learning, and the Department of Curriculum and Instruction chairperson. Based on the input received, the task force held face-to-face discussions as well as Delphi studies, a structured communication technique (Dalkey & Helmer, 1963), to refine the plan. The improved plans were then shared with the principal and cooperating teachers for their input.

Monitoring Progress Using Process Evaluation

To assess the process, the task force members held biweekly meetings to give updates on the project's implementation. They also shared positive stories and discussed any potential problems that needed to be addressed. The task force members held regular discussions with the collaborating teachers, the principal, and the reading specialists.

The university faculty member who taught the reading course and her graduate assistant observed the service-learning activities regularly and kept a detailed log. Feedback regarding needed adjustments was gathered and acted upon. For example, the needs for guidance on administering an assessment and for modifying instruction for English learners were identified and promptly addressed. Assessment guidance was provided by reviewing with the pre-service teachers step-by-step instructions on how to properly administer the assessment. Instruction for English learners was modified by slowing the pace and explaining unusual words. The faculty member also held weekly in-class debriefings with the pre-service teachers on the service-learning project. The pre-service teachers verbally reported on the activities, progress, and challenges in the service-learning project, and the instructor led class discussions to address issues that arose.

Pre-service teachers' self-reflection served as an important window into the operation of the project and its impact on them. Self-reflection has been recognized as an essential link between community experience and academic learning (Ash & Clayton, 2004; Felten, Gilchrist, & Darby, 2006).


Reflection can also serve as a mirror that makes pre-service teachers' inner changes visible to the faculty instructor and the project evaluators. Following each of the tutoring sessions, the pre-service teachers spent 15-20 minutes reflecting on what they had gained from the tutoring session and what they had contributed to the students and the cooperating teacher.

To monitor the ongoing impacts of the tutoring sessions on the RTI at-risk readers, formal and informal academic assessments, structured observations, and curriculum-based measurements were employed during the project. Formal assessments are more "standardized" assessments: tests are uniform in administration procedures, the amount of time allowed for completion, and the method of grading. Informal assessments are more flexible in their usage. They are designed by teachers specifically for their classrooms/students and/or for a certain lesson or topic. Informal assessments can also be part of the daily classroom routine. Structured observation is an observation in which the observer completes a questionnaire or counts the number of times an activity occurs. The curriculum-based measure is a method of monitoring student educational progress through direct assessment of academic skills (Marston, Mirkin, & Deno, 1984). When using curriculum-based measures, the instructor gives the student brief samples, or "probes," made up of academic material taken from the child's school curriculum. These curriculum-based measure probes are timed and may last from 1 to 5 minutes. The child's performance on a curriculum-based measure probe is scored for speed, or fluency, and for accuracy of performance. The results are then charted to offer the instructor a visual record of a targeted child's rate of academic progress. The cooperating teachers as well as the faculty instructor regularly observed the service-learning tutoring activities and provided oral and written feedback to the pre-service teachers.
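To make the probe scoring concrete, here is a small illustrative Python sketch of the two scores the text describes: speed (words correct per minute) and accuracy. The numbers, class name, and field names are hypothetical, invented only to show how charted weekly scores reveal a rate of progress.

```python
from dataclasses import dataclass

@dataclass
class ProbeResult:
    """One timed curriculum-based measurement (CBM) probe."""
    minutes: float        # probe length; the text says 1 to 5 minutes
    words_attempted: int  # words the child read during the probe
    words_correct: int    # words the child read correctly

    def fluency(self) -> float:
        """Speed score: words correct per minute (WCPM)."""
        return self.words_correct / self.minutes

    def accuracy(self) -> float:
        """Accuracy score: share of attempted words read correctly."""
        return self.words_correct / self.words_attempted

# Hypothetical weekly probes for one child (invented numbers).
probes = [ProbeResult(1.0, 40, 32), ProbeResult(1.0, 45, 39), ProbeResult(1.0, 50, 46)]
for week, p in enumerate(probes, start=1):
    print(f"Week {week}: {p.fluency():.0f} WCPM, {p.accuracy():.0%} accurate")
```

Charting scores like these over time gives the instructor the visual record of academic progress described above.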

Assessing Impact Using Product Evaluation

The product evaluation was centered on two overarching questions: (a) Did the service-learning experience meet the pre-service teachers' learning needs? (b) What impacts did the service-learning reading skills tutoring project have on the at-risk readers? The impact of the program on pre-service teachers was assessed using various data, including pre-service teachers' own reflections; direct quantitative assessments using survey research scales; focus group interviews of pre-service teachers; performance on the assignments for the reading methods course in which the pre-service teachers were enrolled; faculty observations of tutoring sessions; and input, such as interview responses and other verbal feedback received informally, from university faculty advisors, the elementary school principal, participating teachers, reading specialists, and RTI at-risk learners.



Assessing pre-service teachers' learning

Reflection.

The pre-service teachers' reflective journal entries, which were part of the process evaluation component of the Context, Input, Process, and Product model, were collected, entered into QSR International's NVivo 8 qualitative data analysis software (NVivo Qualitative Data Analysis Software, 2008), and analyzed qualitatively by three experienced researchers: the two task force members in research methodology and another task force member whose area of expertise is reading education. The researchers were not involved in the project's implementation process. Following the work of MacQueen, McLellan, Kay, and Milstein (1998); Fernald and Duclos (2005); and Fonteyn, Vettese, Lancaster, and Bauer-Wu (2008), the three researchers adopted a codebook structure and the iterative process of discussing each code until agreement was reached. They then worked independently and collaboratively, gave each code a definition, set inclusion and exclusion criteria, and identified sample text references from the transcripts. Each reflective journal entry was independently coded by the three members of the team. Disagreements were resolved through discussion at four weekly meetings so that the codes were further refined (Fonteyn, et al., 2008). The authors believed that this process enhanced intercoder consistency. Wherever possible, team members attempted to use participants' own words "to guide the construction of codes and their definitions for in-depth analysis," a process referred to by MacQueen, et al. (1998) as emic or nonstructural coding (p. 33). Findings from the reflection logs revealed an increase over the course of the semester in the pre-service teachers' understanding of the reading process and, more importantly, of their roles in helping students develop effective reading processes.
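The article reports consensus reached through discussion rather than a numerical agreement statistic. For readers who want a quick numerical check of intercoder consistency alongside such discussions, average pairwise percent agreement is one common option; the following Python sketch uses invented coder labels and entries purely for illustration.

```python
from itertools import combinations

# Hypothetical codes assigned by three coders to five journal entries.
coder_a = ["growth", "efficacy", "growth", "diversity", "efficacy"]
coder_b = ["growth", "efficacy", "diversity", "diversity", "efficacy"]
coder_c = ["growth", "growth", "growth", "diversity", "efficacy"]

def percent_agreement(x, y):
    """Share of entries that two coders labeled identically."""
    return sum(a == b for a, b in zip(x, y)) / len(x)

# Average agreement across all coder pairs.
pairs = list(combinations([coder_a, coder_b, coder_c], 2))
avg = sum(percent_agreement(x, y) for x, y in pairs) / len(pairs)
print(f"Average pairwise agreement: {avg:.0%}")
```

Chance-corrected statistics such as Cohen's kappa are stricter alternatives when codes are unevenly distributed.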

Focus group interviews.

Five sessions of focus group interviews were conducted before the service-learning intervention (as part of the context evaluation) and again after the service-learning intervention (as part of the product evaluation).


These interviews with the pre-service teachers explored whether the service-learning tutoring experience had changed their level of confidence in "making a difference," any of their personal social values in regard to service-learning and community service, or their attitudes and dispositions toward working with students from diverse backgrounds. The interviews were video-recorded directly to DVD, transcribed, and analyzed using NVivo 8 (NVivo Qualitative Data Analysis Software, 2008). The results revealed that the service-learning group showed positive changes over the course of the semester in their levels of confidence in "making a difference," their personal social values in regard to service-learning and community service, and their attitudes and dispositions toward working with students from diverse backgrounds.

Quantitative assessments of affective learning using standardized research scales.

Quantitative instruments were used before the service-learning project (as part of the context evaluation) and after the service-learning project (as part of the product evaluation) to assess changes in pre-service teachers regarding their community service self-efficacy, motivations for volunteer activity, self-esteem, and confidence in making a considerable contribution to a community through service. The following data collection instruments were used before and after the project: the Self-esteem Scale (Rosenberg, 1965), the Community Service Self-efficacy Scale (Reeb, et al., 1998), the Volunteer Functions Inventory (Clary, et al., 1998), and the Personal Social Values Scale (Mabry, 1998). The pre-service teachers' responses on these research scales were statistically analyzed, and positive changes were found after the service-learning experience regarding self-esteem, community service self-efficacy, motivation to volunteer, and personal social values as they relate to community service.
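The article notes that the pre/post scale responses were statistically analyzed without naming the specific test. For matched pre/post scores of this kind, a paired t-test is one standard choice; the sketch below uses SciPy and wholly invented numbers to show the mechanics, not the study's actual data.

```python
from scipy import stats

# Invented pre- and post-project scale means for eight participants.
pre  = [2.8, 3.1, 2.5, 3.0, 2.7, 3.2, 2.9, 2.6]
post = [3.4, 3.3, 3.0, 3.5, 3.1, 3.6, 3.2, 3.0]

# Paired t-test on the matched scores; a positive t indicates a rise.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

With 26 pre-service teachers, a nonparametric alternative such as the Wilcoxon signed-rank test would also be reasonable if scores departed from normality.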

Other assessments of pre-service teachers' learning.

Using process and product evaluation, pre-service teachers' academic performances were monitored throughout the service-learning experience. The university faculty instructor regularly conducted observations of the tutoring sessions. Samples of pre-service teachers' coursework, faculty observation field notes, curriculum-based measures, and reflective journals were collected and assessed by the university faculty instructor to explore the pre-service teachers' understandings and mastery of the reading process and reading instruction.


These pre-service teachers demonstrated more diligence and better quality in their academic work than the pre-service teachers in another section of the course the same instructor was teaching, who were not involved in service-learning. Community members who worked closely with them (e.g., the principal, cooperating teachers, and reading specialists) were interviewed regarding their perceptions of these pre-service teachers. Their responses included descriptions such as "responsible," "caring," "skilled," "patient," and "the children love them."

Assessing Impact on At-Risk Readers

The effect of the service-learning project on the at-risk readers' self-esteem, steps toward independence, and academic achievement in reading and oral and written language skills was assessed through formal and informal academic assessment, structured observations, curriculum-based measures, and students' reflective journals. The assessment measures were employed during the project (during process evaluation) and at the end of the project (during product evaluation). Elementary students' perceptions of themselves as readers, oral communicators, and writers were assessed prior to their participation in the project (context evaluation) and after the conclusion of the project (product evaluation) by using the Elementary Reading Attitude Scale (McKenna & Kear, 1990). A survey of the participating elementary school teachers regarding their assessment of the impact of the project on the at-risk readers was also administered.

These results indicated that the at-risk readers benefited from the project through increases in reading ability, self-esteem, and self-perception of themselves as readers, as well as improved attitudes toward reading.

The CIPP evaluation model served as a guiding framework and systematically guided the conception, design, implementation, and assessment of this service-learning project in teacher education.



teachers’ need to gain firsthand experience working with studentsrom diverse backgrounds and the elementary school RI at-riskreaders’ need or individualized assistance in reading. Next, InputEvaluation incorporated input rom experts, practitioners, and

 various stakeholders, and prescribed an effective tutoring project.Ten Process Evaluation helped monitor the project implementa-tion process and provided ongoing eedback or needed adjust-ments. Finally, Product Evaluation assessed the service-learningproject’s impacts and provided eedback and judgment o the proj-ect’s effectiveness.
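
To make this four-component flow concrete, one might operationalize the CIPP stages as a simple checklist structure, as in the minimal sketch below; the guiding questions paraphrase the description above, while the class name, fields, and data-source lists are illustrative assumptions rather than anything prescribed by the model:

```python
# Hedged sketch: a minimal way to cast the four CIPP components as a
# reusable evaluation checklist. Field names and data sources are invented.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CIPPStage:
    name: str
    guiding_question: str
    data_sources: List[str] = field(default_factory=list)

evaluation_plan = [
    CIPPStage("Context", "What needs should the project address?",
              ["needs assessments", "stakeholder interviews"]),
    CIPPStage("Input", "Which project design best meets the identified needs?",
              ["expert consultation", "practitioner and stakeholder input"]),
    CIPPStage("Process", "Is the project being implemented as planned?",
              ["session observations", "curriculum-based measures", "journals"]),
    CIPPStage("Product", "Did the project meet needs and produce impact?",
              ["pre/post scales", "teacher surveys", "partner interviews"]),
]

for stage in evaluation_plan:
    sources = ", ".join(stage.data_sources)
    print(f"{stage.name}: {stage.guiding_question} [{sources}]")
```

Recording findings against each stage in this way mirrors the model's step-by-step guidance from needs identification through judgment of impact.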

Conclusion

University-based service-learning projects involve multiple stakeholders and aim to meet the needs of service providers and community partners. Their complex and dynamic nature calls for an evaluation model that can operationalize the process and provide step-by-step systematic guidance. Demonstrated effectiveness is essential to the continued viability and growth of service-learning projects throughout the United States.

The issue of multiple goals and multiple constituencies is a major challenge in evaluating service-learning. Without a guiding evaluation model that is well aligned with the unique features of a service-learning project, assessing the project can be difficult. Service-learning projects are too often subject to suboptimal assessments despite the collection of massive amounts of data, because researchers lack both knowledge of the key elements to assess and access to a reliable evaluation model with which to organize the data and present the findings to various stakeholders in meaningful ways.

Without effective evaluation, service providers cannot make their projects and services better (Stufflebeam & Shinkfield, 2007). For example, service providers cannot be sure their goals are worthy unless they validate the goals' consistency with sound values and responsiveness to service recipients' assessed needs. Service providers cannot plan effectively and invest their time and resources wisely if they do not identify and assess options. It may be difficult to sustain university-community partnerships if the leaders of service-learning projects cannot show that they have responsibly carried out the project plan, produced beneficial results, and met the needs of the community partner.

Though not a new model, the CIPP evaluation model introduced in this article can be especially useful for guiding the planning, implementation, and assessment of service-learning projects.

The model employs multiple methods, has been tested in a wide range of contexts, has evolved and strengthened over time, and is supported by theoretical and pragmatic literature (Stufflebeam & Shinkfield, 2007). The model not only assesses the impact of a service-learning activity, but also helps to work with the community to identify the needs and goals to be addressed, formulate a project targeted to best meet those identified needs, monitor project implementation, evaluate project outcomes, and provide recommendations for project improvement.

Without the guidance of the Context, Input, Process, and Product Evaluation Model, oversight or failure can easily occur in any part of the process, which could seriously hinder the service-learning project's operation and diminish its effectiveness. For example, without the model's regulation, needs may not be as carefully identified, the match between the participants' needs and the project's design may not be as meticulously ensured, problems in the implementation process may not be identified and corrected in a timely manner, and the necessary multiple assessment methods may not be designed into the assessment. Each of these elements plays an important role in the service-learning project's success.

It is particularly important to note that because the Context, Input, Process, and Product evaluation model is a social systems approach to evaluation, all participants in a service-learning project help design the activities that they agree will meet the articulated needs of both the service providers (university members) and service recipients (community members). Shared decision making is essential, because “sustained, consequential involvement positions them to contribute information and valuable insights and inclines them to study, accept, value, and act on evaluation reports” (Stufflebeam & Shinkfield, 2007, p. 330). This social systems approach fosters understanding and connection among service providers, community partners, and other stakeholders and can effectively promote the long-term sustainability of a service-learning project.

Moving from conceptualizing a service-learning project to institutionalizing it in the curriculum requires informed planning, guided implementation, and evidence of impact. A snapshot-type assessment from a single lens, using mono-method constructions, is unlikely to provide the kind of comprehensive and multifaceted data needed by educational policymakers. Stufflebeam's CIPP evaluation model has the potential to guide faculty members who use service-learning as a teaching tool to systematically gather assessment data at each stage of a service-learning project, so that they can make informed judgments to sustain or improve the project.

Acknowledgment

The authors wish to express their appreciation to Daniel L. Stufflebeam (Distinguished University Professor, retired from Western Michigan University) for his helpful comments and important theoretical insights provided during the development of this article.

References

Alkin, M. C. (2004). Evaluation roots: Tracking theorists' views and influences. Thousand Oaks, CA: Sage.

Arzoumanian, L. (1994). Increasing community college child development associate (CDA) advisor skills in recording observations as a component of a competency-based assessment. (ERIC Document Reproduction Service No. ED367506).

Ash, S., & Clayton, P. (2004). The articulated learning: An approach to guided reflection and assessment. Innovative Higher Education, 29(2), 137–154.

Baker-Boosamra, M., Guevara, J., & Balfour, D. (2006). From service to solidarity: Evaluation and recommendations for international service-learning. Journal of Public Affairs Education, 12(4), 479–500.

Banta, T. W. (1999). What's new in assessment? Assessment Update, 11(5), 3–11.

Bordelon, T. (2006). A qualitative approach to developing an instrument for assessing MSW students' group work performance. Social Work with Groups, 29(4), 75–91.

Borges, N., & Hartung, P. (2007). Service-learning in medical education: Project description and evaluation. International Journal of Teaching & Learning in Higher Education, 19(1), 1–7.

Bringle, R., & Kremer, J. (1993). Evaluation of an intergenerational service-learning project for undergraduates. Educational Gerontology, 19(5), 407–416.

Burke, C. (1980). The reading interview. In B. P. Farr & D. J. Strickler (Eds.), Reading comprehension: Resource guide (pp. 82–108). Bloomington, IN: School of Education, Indiana University.

Butin, D. (2003). Of what use is it? Multiple conceptualizations of service-learning within education. Teachers College Record, 105(9), 1674–1692.

Chien, M., Lee, C., & Cheng, Y. (2007). The construction of Taiwan's educational indicator systems: Experiences and implications. Educational Research for Policy & Practice, 6(3), 249–259.

Clary, E. G., Snyder, M., Ridge, R. D., Copeland, J., Stukas, A. A., Haugen, J., & Miene, P. (1998). Understanding and assessing the motivation of volunteers: A functional approach. Journal of Personality and Social Psychology, 74, 1516–1530.

Clay, M. M. (1993). An observation survey of early literacy achievement. Portsmouth, NH: Heinemann.

Combs, K. L., Gibson, S. K., Hays, J. M., Saly, J., & Wendt, J. T. (2008). Enhancing curriculum and delivery: Linking assessment to learning objectives. Assessment and Evaluation in Higher Education, 33(1), 87–102.

Dalkey, N. C., & Helmer, O. (1963). An experimental application of the Delphi method to the use of experts. Management Science, 9(3), 458–467.

Driscoll, A., Gelmon, S., Holland, B., Kerrigan, S., Spring, A., Grosvold, K., & Longley, M. (1998). Assessing the impact of service-learning: A workbook of strategies and methods (2nd ed.). Portland: Portland State University, Center for Academic Excellence.

Driscoll, A., Holland, B., Gelmon, S., & Kerrigan, S. (1996). An assessment model for service-learning: Comprehensive case studies of impact on faculty, students, community, and institution. Michigan Journal of Community Service Learning, 3, 66–71.

Eyler, J., & Giles, D. E. (1999). Where's the learning in service-learning? San Francisco, CA: Jossey-Bass.

Felix, J. L. (1979). Research and evaluation to improve instruction: The Cincinnati strategy. Educational Evaluation & Policy Analysis, 1(2), 57–62.

Felten, P., Gilchrist, L. Z., & Darby, A. (2006). Emotion and learning: Feeling our way toward a new theory of reflection in service-learning. Michigan Journal of Community Service Learning, 12(2), 38–46.

Fernald, D., & Duclos, C. (2005). Enhance your team-based qualitative research. Annals of Family Medicine, 3(4), 360–364.

Fonteyn, M., Vettese, M., Lancaster, D., & Bauer-Wu, S. (2008). Developing a codebook to guide content analysis of expressive writing transcripts. Applied Nursing Research, 21, 165–168.

Furco, A. (2002). Self-assessment rubric for the institutionalization of service-learning in higher education. Berkeley, CA: University of California.

Galvin, J. C. (1983). What can trainers learn from educators about evaluating management training? Training and Development Journal, 37(8), 52–57.

Gelmon, S. B. (2000a). Challenges in assessing service-learning. Michigan Journal of Community Service Learning, Special Issue, 84–90.

Gelmon, S. B. (2000b). How do we know that our work makes a difference? Assessment strategies for service-learning and civic engagement. Metropolitan Universities: An International Forum, 11(2), 28–39.

Gelmon, S., Holland, B., Driscoll, A., Spring, A., & Kerrigan, S. (2001). Assessing service-learning and civic engagement: Principles and techniques. Providence, RI: Campus Compact.

Hamm, D., Dowell, D., & Houck, J. (1998). Service learning as a strategy to prepare teacher candidates for contemporary diverse classrooms. Education, 119(2), 196.

Holland, B. (2001). A comprehensive model for assessing service-learning and community-university partnerships. New Directions for Higher Education, 114, 51–60.

Karayan, S., & Gathercoal, P. (2005). Assessing service-learning in teacher education. Teacher Education Quarterly, 32(3), 79–92.

Kezar, A. (2002). Assessing community service-learning. About Campus, 7(2), 14–20.

Koliba, C. J., Campbell, E. K., & Shapiro, C. (2006). The practice of service-learning in local school–community contexts. Educational Policy, 20(5), 683–717.

Kraft, R., & Krug, J. (1994). Building community: Service-learning in the academic disciplines. Denver, CO: Colorado Campus Compact.

Leslie, L., & Caldwell, J. (2006). Qualitative reading inventory–4. Boston, MA: Pearson.

Mabry, J. B. (1998). Pedagogical variations in service-learning and student outcomes: How time, contact, and reflection matter. Michigan Journal of Community Service Learning, 5, 32–47.

MacQueen, K., McLellan, E., Kay, K., & Milstein, B. (1998). Codebook development for team-based qualitative analysis. Cultural Anthropology Methods, 10(2), 31–36.

Marchel, C. (2004). Evaluating reflection and sociocultural awareness in service-learning classes. Teaching of Psychology, 31(2), 120–123.

Marston, D., Mirkin, P., & Deno, S. (1984). Curriculum-based measurement: An alternative to traditional screening, referral, and identification. The Journal of Special Education, 18, 109–117.

Matthews, J. M., & Hudson, A. M. (2001). Guidelines for evaluating parent training projects. Family Relations, 50(1), 77–86.

McKenna, M. C., & Kear, D. J. (1990). Measuring attitude toward reading: A new tool for teachers. The Reading Teacher, 43(8), 626–639.

Moore, D. T. (1999). Behind the wizard's curtain: A challenge to the true believer. National Society for Experiential Education Quarterly, 25(1), 23–27.

National Youth Leadership Council. (2008). K-12 service-learning standards for quality practice. Saint Paul, MN. Retrieved April 30, 2010, from http://www.nylc.org/pages-resourcecenter-downloads-K_12_Service_Learning_Standards_or_Quality_Practice?emoid=14:803

Nicholson, T. (1989). Using the CIPP model to evaluate reading instruction. Journal of Reading, 32(4), 312–318.

NVivo Qualitative Data Analysis Software (Version 8). (2008). n.p.: QSR International Pty Ltd.

Osokoya, M., & Adekunle, A. (2007). Evaluating the trainability of enrollees of the Leventis Foundation (Nigeria) Agricultural Schools' Programs. Australian Journal of Adult Learning, 47(1), 111–135.

Peterson, T. (2004). Assessing performance in problem-based service-learning projects. New Directions for Teaching & Learning, 100, 55–63.

Pritchard, I. (2002). Community service and service-learning in America: The state of the art. In A. Furco & S. H. Billig (Eds.), Service-learning: The essence of the pedagogy (pp. 3–19). Stamford, CT: Information Age Publishing.

Reeb, R. N., Katsuyama, R. M., Sammon, J. A., & Yoder, D. S. (1998). The Community Service Self-Efficacy Scale: Evidence of reliability, construct validity, and pragmatic utility. Michigan Journal of Community Service Learning, 5, 48–57.

Rosenberg, M. (1965). Society and the adolescent self-image. Princeton, NJ: Princeton University Press.

Shaffer, J. P. (1980). Control of directional errors with stagewise multiple test procedures. Annals of Statistics, 8, 1342–1347.

Steinert, Y., Cruess, S., Cruess, R., & Snell, L. (2005). Faculty development for teaching and evaluating professionalism: From project design to curriculum change. Medical Education, 39(2), 127–136.

Steinke, P., & Buresh, S. (2002). Cognitive outcomes of service-learning: Reviewing the past and glimpsing the future. Michigan Journal of Community Service Learning, 8(2), 5–14.

Stufflebeam, D. L. (1971). The use of experimental design in educational evaluation. Journal of Educational Measurement, 8(4), 267–274.

Stufflebeam, D. L. (2003). The CIPP model for evaluation. In D. L. Stufflebeam & T. Kellaghan (Eds.), The international handbook of educational evaluation (Chapter 2). Boston, MA: Kluwer Academic Publishers.

Stufflebeam, D. L., & Shinkfield, A. J. (2007). Evaluation theory, models, & applications. San Francisco, CA: Jossey-Bass.

Troppe, M. (Ed.). (1995). Foreword. In M. Troppe (Ed.), Connecting cognition and action: Evaluation of student performance in service-learning courses. Providence, RI: Campus Compact.

Tyler, R. W. (1942). General statement on evaluation. Journal of Educational Research, 35, 492–501.

Zhang, G., Griffith, R., Metcalf, D., Zeller, N., Misulis, K., Shea, C., & Williams, J. (2009, April). Assessing service and learning of a service-learning program in teacher education using mixed-methods research. Paper presented at the American Educational Research Association Annual Conference, San Diego, CA.

Zhang, G., Zeller, N., Shea, C., Griffith, R., Metcalf, D., Misulis, K., Williams, J., & Knight, S. (2008, October). A 360° assessment of the multidimensional effects of a service-learning program in teacher education using mixed-methods research. Paper presented at the 8th International Research Conference on Service-Learning and Community Engagement, New Orleans, LA.

About the Authors

Guili Zhang is an assistant professor of research and evaluation methodology in the Department of Curriculum and Instruction at East Carolina University. She earned her bachelor's degree in British and American Language and Literature from Shandong University, her master's degree in English Education from Georgia Southern University, and her Ph.D. in research and evaluation methodology from the University of Florida.

Nancy Zeller is a professor of research methodology in the Department of Curriculum and Instruction at East Carolina University. She earned her bachelor's degree in speech and theater, her master's degree in English language and literature, and her Ph.D. in educational inquiry methodology, all from Indiana University–Bloomington.

Robin Griffith is an assistant professor of reading and English/language arts in the College of Education at Texas Christian University. She earned her bachelor's degree in human development and family studies with a concentration in early childhood education, her master's degree in language and literacy education, and her Ph.D. in curriculum and instruction, all from Texas Tech University.

Debbie Metcalf is a National Board Certified teacher and Teacher-in-Residence in the Department of Curriculum and Instruction at East Carolina University. She earned her bachelor's degree in education and social sciences and her master's degree in special education, both from San Diego State University.

Jennifer Williams is an assistant professor of special education in the Department of Curriculum and Instruction at East Carolina University. She earned her Ph.D. in special education from North Carolina State University.

Christine M. Shea is a professor in the Social Foundations of Education program, Department of Curriculum and Instruction at East Carolina University, Greenville, North Carolina. She earned her bachelor's degree in history and English literature from Nazareth College of Rochester, New York, her master's degree in elementary education from the State University of New York at Geneseo, her M.Ed. in comparative education from the University of Illinois at Urbana-Champaign, and her Ph.D. in educational policy studies from the University of Illinois at Urbana-Champaign.

Katherine Misulis is an associate professor and assistant chairperson in the Department of Curriculum and Instruction at East Carolina University. She earned her bachelor's degree in English and her master's degree in education from the State University of New York at Potsdam, her C.A.S. in educational administration from the State University of New York at Brockport, and her Ph.D. in reading education from Syracuse University.