Documenting the Results of Good Intentions: Applying Outcomes Evaluation to Library Services for Children

Virginia A. Walter
Information Studies Department, University of California, Los Angeles, CA, USA

Contexts for Assessment and Outcome Evaluation in Librarianship
Advances in Librarianship, Vol. 35
© 2012 by Emerald Group Publishing Limited
ISSN: 0065-2830; DOI: 10.1108/S0065-2830(2012)0000035006

Abstract

This chapter documents the evolution of the application of evaluation methods to public library services for children and teens in the United States. It describes the development of age-specific output measures and the subsequent requirement by funding agencies for outcome evaluations that measure changes in skills, attitudes, behavior, knowledge, or status as a result of an individual's participation in a service or program. Some early outcomes research studies are cited, and California's initiative to implement statewide outcome evaluation of its Summer Reading Program is presented as a case study. Training and education are suggested as ways to counter the major challenges for wider implementation of outcome evaluation of youth services programs in public libraries.

Keywords: Public libraries; library services for children; library services for young adults; summer reading programs; outcomes evaluation

I. Introduction

For many librarians charged with the management of library services to children, interest in evaluation issues started with the publication of Planning and Role Setting for Public Libraries: A Manual of Options and Procedures by McClure, Owen, Zweizig, Lynch, and Van House (1987). This much-anticipated guide to strategic planning in public libraries, produced under the auspices of the Public Library Association, proposed that librarians engage in systematic community analysis efforts and then select the roles that would be the best fit between community needs and a library's mission and resources. The eight roles given by McClure et al. were:

• Community Activities Center,
• Community Information Center,
• Formal Education Support Center,
• Independent Learning Center,
• Popular Materials Library,
• Preschoolers' Door to Learning,
• Reference Library, and
• Research Center.

A companion volume, Output Measures for Public Libraries (Van House, Lynch, McClure, Zweizig, & Rodger, 1987) proposed methods for quantifying the library services that were implemented in response to the role-setting process.

Children's library services managers and advocates found these planning and evaluation tools to be flawed in their failure to take into account the full range of services, programs, and activities offered to children of all ages in public libraries. The menu of roles above seemed to suggest that library services to children were limited to preschool children. Clara Bohrer and Kathleen Reif approached both the Public Library Association (PLA) and the Association for Library Service to Children (ALSC) and persuaded these two divisions of the American Library Association (ALA) to apply for a grant from the US Department of Education, Library Research and Demonstration Program to develop and field test both quantitative and qualitative evaluation measures for public library services to children under the age of 14 and their caregivers. The resulting publication, Output Measures for Public Library Services to Children: A Manual of Standardized Procedures (Walter, 1992), was the first effort to provide a national audience of children's librarians and library directors with tools for both evaluating their full range of services and communicating the results to various stakeholders. The federal agency responsible for funding that effort was sufficiently impressed that it asked the Young Adult Library Services Association (YALSA) to submit a proposal for the development of similar evaluation techniques for that aspect of library services. That grant produced another manual, Output Measures and More: Planning and Evaluating Public Library Services for Young Adults (Walter, 1995). Work on those two manuals, involving extensive field testing and feedback from youth services librarians and public library administrators, helped to make evaluation a more commonly accepted aspect of managing library services to children and teens.

This chapter examines the next stage in evaluating children's and young adult services in public libraries: the search for methods that would produce evidence of the outcomes of library services for children. It provides an overview of several evaluation studies of children's and young adult services that were done using outcomes measures and looks in some detail at California's statewide initiative to apply outcomes evaluation to its Summer Reading Program.

II. The Rationale for Outcome Evaluation

A. Output Measures

Output measures, as set forth in the various manuals developed under the Public Library Development Program of PLA, were designed to enable librarians to quantify and document the work they did. They counted, for example, how many reference questions were answered or how many toddlers attended story time. They made these statistics more meaningful by adding per capita measures. A small public library, for example, might report an "annual library visits by children" count of 5000. This statistic acquires more salience when compared with the number of children in the library's service area. If there were 1000 children in the service area, the output measure "children's library visits per child" is calculated to be 5, indicating that, on average, every child in that community visited the library 5 times.
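
To make the arithmetic concrete, the brief sketch below computes a per capita output measure. It is illustrative only; the function name and figures are hypothetical and are not drawn from the output measures manuals.

```python
# Illustrative sketch: a per-capita output-measure calculation of the kind
# described above. The function name and figures are hypothetical.

def per_capita_measure(annual_count: int, population: int) -> float:
    """Divide an annual output count by the relevant service-area population."""
    if population <= 0:
        raise ValueError("Service-area population must be positive")
    return annual_count / population

# Example from the text: 5000 annual children's library visits and
# 1000 children in the service area yield 5 visits per child.
visits_per_child = per_capita_measure(annual_count=5000, population=1000)
print(f"Children's library visits per child: {visits_per_child:.1f}")
```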

Not every public library implemented the use of the PLA planning process or output measures. However, the basic concepts involved in this approach to evaluation were widely disseminated through workshops sponsored by state libraries and programs at various library conferences. The next round of planning manuals from the PLA involved the publication of three editions of Planning for Results (Himmel & Wilson, 1998; Nelson, 2001, 2008). The biggest conceptual shift in the "planning for results" approach was the expansion of the library roles identified by McClure et al. (1987) into a menu of 14 Service Responses, which included 2 for youth services:

• Create Young Readers: Early Literacy, and
• Succeed in School: Homework Help.

Although PLA has published a number of companion volumes to the basic planning manuals, these have focused on issues such as managing and marketing rather than evaluating. Youth services librarians whose organizations have chosen to adopt this planning process have evidently chosen to work within the framework of the designated service responses. There has been no organized initiative to create a "planning for results for youth services."

B. Questioning Output Measures

Just when public librarians were becoming comfortable with the rationale and methodology for using output measures to quantify their services, funding agencies began to ask for the answers to a new question: what difference did it make? Five thousand children visited your library last year. So what? One thousand children attended preschool story hour. So what? What difference did the library's intervention make in these children's behavior, attitudes, knowledge, or status? In other words, what were the desired outcomes of the library activity? Were those outcomes reached?

C. The Shift to Outcomes Measures

The Institute of Museum and Library Services (IMLS) replaced the Department of Education as the primary source of federal funds to libraries for demonstration projects. In the late 1990s, it began to require that grant proposals specify the desired outcomes of their projects and to budget for an evaluation of those outcomes. It even offered workshops for grant recipients to learn more about methods for evaluating outcomes, and a new term entered the vocabulary of children's librarians seeking federal grants: the logic model.

The outcomes logic model is a systems approach to evaluation. It asks the librarian to first consider the mission of the library and the considerations of various "influences," including funding agencies or potential program partners. The program purpose should be derived from those considerations and should be specific about what the library intends to do, for whom, and for what outcome or benefit. The outcome is presumed to be a change in the participants' skills, attitudes, behavior, or status as a result of the library's program. The program purpose in turn determines the inputs (resources needed to implement the program), activities designed to achieve the program outcomes, outputs (quantitative measures of those activities), and the outcomes themselves. Program planners are further instructed to identify the indicators that would provide data about observable behaviors and conditions, sources for those data, intervals at which data would be collected, and the targeted amount of desired change.
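
As a rough illustration of the elements a logic model asks planners to specify, the following sketch records them as a simple data structure. The field names and the storytime example are hypothetical; this is not the IMLS template, only one way such a plan might be written down.

```python
# Illustrative sketch: recording the elements of an outcomes logic model as
# described above. Field names and example values are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class OutcomeIndicator:
    description: str   # observable behavior or condition
    data_source: str   # where the evidence comes from (e.g., a survey)
    interval: str      # how often data are collected
    target: str        # targeted amount of desired change

@dataclass
class LogicModel:
    program_purpose: str          # what the library intends to do, for whom, and to what benefit
    inputs: List[str]             # resources needed to implement the program
    activities: List[str]         # what the program actually does
    outputs: List[str]            # quantitative measures of those activities
    outcomes: List[str]           # changes in skills, attitudes, behavior, or status
    indicators: List[OutcomeIndicator] = field(default_factory=list)

# Hypothetical example loosely modeled on a toddler storytime program.
storytime = LogicModel(
    program_purpose="Help toddlers in the service area develop early literacy skills",
    inputs=["children's librarian time", "picture books", "program room"],
    activities=["weekly toddler storytime sessions"],
    outputs=["number of sessions held", "toddlers attending per session"],
    outcomes=["caregivers report reading aloud at home more often"],
    indicators=[OutcomeIndicator(
        description="caregiver reports reading aloud at home at least three times a week",
        data_source="caregiver survey at the final session",
        interval="end of each program cycle",
        target="60% of surveyed caregivers",
    )],
)
print(storytime.program_purpose)
```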

Clearly, implementing outcome evaluation was going to be a more complex and sophisticated process than even the most challenging output measures. Given minimal training and sufficient motivation, any librarian could collect the data needed for any of the earlier output measures and do the calculations required to analyze and understand the results. Outcome evaluations presented new challenges. In a survey of 139 California libraries conducted in 2010, respondents described outcome evaluation as a complicated undertaking with many barriers to implementation, including lack of resources and expertise (Sweeney, 2010).

It was not always easy for staff to identify the desired change in patrons' behaviors, attitudes, knowledge, or skills as a result of their participation in a library program. What did we really hope that toddlers would learn or do differently after they attended a series of storytimes? And if we knew the answer to that question, what would be the indicators of those changes? The indicators would almost always need to come from the patrons themselves, and librarians were professionally trained not to pry into the uses that library users made of the books they read or the answers they received in response to their reference questions. Dare we risk invading patron privacy by asking about how participation in a library service changed them? And of course, when the patrons were children, there were additional issues of parental consent and developmental concerns.

And yet, children's librarians recognized the value of knowing the outcomes of their work. They had long recognized that good intentions were not enough. In their hearts, they knew that storytimes and summer reading programs benefited children in many ways, but were hard-pressed to prove it. They knew that in more stringent budget situations, programs and activities without documented outcomes might be eliminated. So children's librarians began to get on board the "outcomes evaluation express." It is likely that many children's librarians used the training provided by IMLS and applied the logic model to their grant-funded projects, or hired professional evaluators to do the work for them. Unfortunately, the results of those efforts were usually not reported in the library press, and the knowledge gained was confined to local library jurisdictions.

III. Pioneering Outcome Evaluations of Library Services to Children and Teens

In spite of the methodological challenges presented by outcome evaluation, a few pioneering studies were conducted and their findings disseminated to a relatively broad audience.


In 2000 Cindy Mediavilla and Virginia Walter received an ALA Research Grant to develop a model for evaluating the outcomes of library homework assistance programs. As a conceptual framework, they used the six outcomes of positive youth development that were synthesized and articulated by the Public Libraries as Partners in Youth Development project:

• Youth contribute to their community.
• They feel safe in their environment.
• They have meaningful relationships with adults and peers.
• They achieve educational success.
• They develop marketable skills.
• They develop personal and social skills (Meyers, 2002).

The researchers used a mixed-method approach to collect data from teens, library staff, and parents in seven libraries throughout the country. They were interested in teasing out outcomes for two different types of participation in homework assistance programs: teens who used the homework centers as students, and teens who worked as volunteers or paid staff in the centers.

The results from the pilot study showed that homework centers contributed to positive youth development outcomes in a number of ways. For example, teens felt that they were contributing to their community whether they received homework assistance or helped to provide it. In the first instance, they said that by being good students they had a positive impact on their communities. Parents took a broader view, pointing out that students who used the homework centers were positive role models for their peers. Students who worked in library homework centers welcomed the opportunity to help other teens succeed. Those who lived in low-income communities had a particularly keen sense of giving back to their neighborhoods. For the most part, teens felt safe studying or working in the library after school. Some teens articulated a feeling of security that transcended safety from physical dangers: they felt comfortable and cared for by library staff. Everyone involved with the library homework centers saw them as places to learn skills that would someday be useful in the workplace. This went beyond obvious outcomes such as learning math and computer skills. Parents, teachers, and library staff noted that teens learned cooperation, discipline, courtesy, and problem-solving in the homework center environment. Teens with few employment opportunities in their communities found that working or volunteering in the homework programs gave them good training and a more pleasant work environment than other job opportunities that might have been open to them, such as fast food restaurants. Teens with higher personal career expectations saw their library work experience as a resume builder or as an asset on their college applications (Walter & Meyers, 2003).

A. Field Test in Library Youth Services

Dresang, Gross, and Holt (2006) used a national leadership grant from IMLS to develop and field test an outcome-based planning and evaluation (OBPE) model especially geared to library youth services. They field-tested their model at the Saint Louis Public Library and published the results in Dynamic Youth Services through Outcome-Based Planning and Evaluation. Their model, an adaptation of the logic model widely cited in guides to outcome evaluation, makes it clear that planning is an important and integral step in the process. These researchers show how outcomes flow logically from assessing community needs and the library's capacity for meeting those needs. In turn, the service response or program designed to meet those needs flows logically from the defined outcomes. The evaluation then provides data about whether the outcomes have been met and informs the next cycle of OBPE.

The team behind the OBPE model tested their methods on a specific program at the Saint Louis Public Library, and the results of this outcome-based evaluation are reported in the publication cited above. The program, Children's Access to and Use of Technology Evaluation (CATE), was an effort to learn how the digital divide affected low-income children in the city, to develop programs and services to bridge the divide for fourth through eighth grade students, and to develop the OBPE model for evaluating those services. Among the findings:

• Children were enthusiastic about using computers and thought they were more competent than they actually were.
• Children had little knowledge of the skills generally associated with information literacy.
• Children wanted more kid-friendly sites and activities.
• Boys and girls used computers differently but did not differ in their actual amount of usage.
• Computer users also valued other services and resources offered by the library.
• Teachers, parents, and children valued the library staff's assistance.

By clearly demonstrating the applicability of outcome-based planning and evaluation efforts to library services to children, Dresang et al. (2006) provided a useful foundation on which other evaluators and library professionals could build.


IV. Summer Reading Program Outcome Evaluations

Public libraries in the United States have been offering summer reading programs since the early 1900s (Locke, 1988). These reading promotion programs have been institutionalized as a basic service in many, if not most, public libraries. These programs were assumed to help children avoid the notorious "summer reading loss" that is well documented in educational research. As Matthews (2010) pointed out, however, there was little research documenting the actual outcomes of summer reading programs. Libraries had been citing the one study conducted by Heyns (1978) that suggested that library summer reading programs helped to reduce summer learning loss. There was nothing more current until the Dominican study, Public Library Summer Reading Programs Close the Reading Gap (Roman, Carran, & Fiore, 2010), was published.

Funded by IMLS, the Dominican study was a product of partnerships between the Johns Hopkins University Center for Summer Learning, the Colorado State Library, and the Texas State Library and Archives Commission. It was piloted at three public libraries, with the full study conducted at 11 sites across the United States. An advisory committee provided oversight. The study targeted children entering fourth grade. Children were given pretests and posttests to determine their reading levels before and after their library's summer reading programs. While there were some limitations to the study, including participant attrition and the lack of a nonparticipant control group, the study nevertheless provided new data from a national sampling of summer reading programs.

Among the findings from the Dominican study was the unsurprising result that children who participated in library summer reading programs scored higher on reading achievement tests at the beginning of the following school year than did students who had not participated. Children who participated tended to include more girls, Caucasians, and representatives of higher socioeconomic levels than those who did not participate. They came from families with more books at home than those of nonparticipants, and they were more confident and enthusiastic readers than children who did not take part in summer reading programs (Roman et al., 2010).

The Dominican study received a great deal of attention, being the first national study of summer reading that attempted to document the outcomes of this ubiquitous public library activity. There have, however, been other initiatives at the state level to capture the outcomes of library summer reading programs. The Colorado State Library, for example, offers mini-grants to public libraries that measure four related statewide outcomes, namely that children and teens will

• develop a habit of reading;
• think that reading is fun;
• become regular library users; and
• think that the Summer Reading Program has helped them read better.

In addition to quantifying results, Colorado librarians are asked to collect qualitative data about the four outcomes by talking to participants and their families. Although the results are reported to the Colorado State Library, they have not been disseminated to the profession (Colorado Summer Reading Program, n.d.).

V. The Case of the California Summer Reading Program

In California, the statewide Summer Reading Program is funded in part by an IMLS grant through the California State Library and is coordinated by the California Library Association. Participating libraries were asked to submit usage data, but there was no effort to collect outcome data until 2008. At that time Natalie Cole, the program coordinator, convened an advisory group of librarians from throughout the state to consider the implementation of an outcome component.

A preliminary needs assessment had been done in a meeting of more than 80 librarians at the California Library Association Conference. The advisory group revisited those findings and agreed that the most common needs that public libraries were trying to meet with summer reading programs were the relative lack of reading for pleasure and low levels of library usage in the summer. The group developed desired outcomes based on those needs for three different age levels: preschool, school age, and teen. They designed methods for testing those outcomes using surveys, interviews, and focus groups. Those outcomes were tested at nine public library sites in 2009 and 2010.

The pilot test, conducted over two summers, helped to determine the next steps. Many of the participating libraries had suffered staff shortages because of the budget cuts that were prevalent all over California in 2009. The reduced levels of staffing had made it difficult to implement the data collection methods for all age levels. It became clear to the advisory group that compromises would have to be made between methodological rigor and practicality. The research design for 2011 would be what the consultant to the project has called "outcome evaluation light."

The advisory group decided to broaden the outcomes evaluation in 2011 to any California library that wished to participate. The preschool segment would be eliminated but families would be added into the mix. Libraries with summer reading programs that targeted children under five would report results as family participation. The results to be measured for school age, teen, adult, and family participants were limited to two outcomes. The first outcome was defined as: Children [teens/adults/families] belong to a community (broadly defined) of readers and library users. The advisory group liked the notion of creating a reading community at the library and felt that this outcome captured the essence of the reading for pleasure that summer programs ideally encourage. Outcome One speaks to readers who are looking for a social setting in which they can interact with other readers. It also allows for a more subtle interpretation of the library as a haven for those who find reading to be a more personal and private activity. It assumed that any child, teen, adult, or family group who chose to participate in a library summer reading program identified with the library and with the summer program, and thereby became part of a community of interest.

The concepts of community, libraries, and reading are very closely linked. It is instructive to visit a library and watch how people of all ages interact with this public space. Parents read to their young children or spend time in the new Family Place sites popping up all over California and other parts of the United States. People help each other spontaneously at Internet stations. School children share homework assignments with classmates who forgot to bring the instructions home with them. Patrons chat with clerks as they check books in or out. Even people sitting quietly by themselves with a book or a magazine or their laptop in front of them seem to be enjoying the feeling of being part of a community.

Outcome One recognized that there are many kinds of communities besides the geographic ones in which we live. These communities of choice may be based on religion, voluntarism, hobbies, cultural or leisure activities, or ethnic pride. With new emphasis on community and community building in the Summer Reading Program, Californians are positioning the public library as a significant community of interest.

The librarians who had spearheaded the pilot project were still concerned about the many children and teens in their communities who were not being reached by their summer reading programs. Their experiences echoed findings from the Dominican study that children who participated in library summer reading programs were, for the most part, the library "regulars." They were children who already liked to read and would probably read for pleasure during the summer whether there was a library program or not. Studies repeatedly showed that the children from low socioeconomic groups who were most likely to suffer from summer learning loss were consistently underrepresented in summer reading programs. The Idaho Commission for Libraries had identified and tackled this problem with its "Bright Futures Summer Reading Opportunities: Reaching Underserved Children" (Idaho Commission for Libraries, 2011). California, however, had not articulated this need as an outcome for summer reading.

To solve this problem, the group adopted a second outcome, aimed at reaching out to the children, teens, and families who were not regular library users, by articulating Outcome Two: [Desired number] of [underserved target group] participate in the summer reading program. Libraries would be encouraged to analyze their own communities and identify the underserved populations that they wanted to reach. For example, a library might frame its outreach outcome as: 20 children from the Recreation Center day camp participate in the summer reading program. Or, 30 Spanish-speaking children participate in the summer reading program.

With this stated outcome, the summer reading program in California became both an outcome- and outreach-based program. This second outcome was similar to simple output measures in that it asked librarians to set a quantitative target. The advisory group felt, however, that because it called for changes in behavior by a specific group of people it qualified as an outcome measure.

Outcome Two also built on the community orientation of Outcome One. It acknowledged that the library is an important node in a web of support services engaged in the increasingly difficult task of helping families raise healthy children. Through outreach, Summer Reading Programs can draw more people of all ages into that web so they can be nurtured by it and can be enabled to nurture others as well. Outcome Two was intended to institutionalize public libraries' ongoing efforts to open their hearts and minds and doors to every man, woman, and child in their service areas.

California libraries participating in the statewide program in 2010 were encouraged, but not required, to conduct the designated outcome evaluation. A webinar conducted in January, and available in its archived form since then, was attended by more than 120 people. It was designed to motivate people to take part, to give some basic information and training, and to direct people to the extensive web site devoted to the project (Introduction to California's Outcome-Based Summer Reading Program, 2010). This web site gives detailed background information, directions for collecting data through surveys, interviews, and focus groups, and tips for effective outreach efforts (California Summer Reading Program, 2011).

In spite of the best efforts of the coordinator and the advisory group, only 10 library jurisdictions (with a total of 224 branches) elected to participate in the outcomes evaluation aspect of the Summer Reading Program in 2011. However, between them, the participating libraries yielded an impressive amount of data.

School age children, the traditional audience for summer reading programs, were predictably the most heavily represented participants. Four thousand two hundred and ninety-nine children between the ages of 4 and 13 participated in the outcome evaluation. They were given a menu of descriptors for the library in the summer time:

• A place to see friends.
• A place to meet new friends.
• A place to find books to read.
• A place to find other things to read.
• A place for activities and shows.
• A place to do homework or study.
• A place to use computers.
• A community center.
• A friendly place.
• A fun place.
• An awesome place.
• A place for me.

They were allowed to check as many statements as they wanted. By far the largest number of responses, 78%, went to "a place to find books to read." However, 42% said it was "a place to meet friends" and 60% said it was "a friendly place." These findings, taken with a 66% positive response to the statement that they "like to share books or talk about the books they read," were interpreted as indicators that Outcome One had been met. The summer reading program did seem to provide children with a feeling of belonging to a community of readers.
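
As a small illustration of how multi-select responses of this kind can be tallied into the percentages reported here, the sketch below counts the share of respondents checking each descriptor. The responses shown are hypothetical, not the actual California data.

```python
# Illustrative sketch: tallying check-all-that-apply survey responses into
# percentages of respondents. The sample responses below are hypothetical.
from collections import Counter
from typing import Iterable, List

def tally(responses: Iterable[List[str]]) -> dict:
    """Return the percentage of respondents who checked each descriptor."""
    responses = list(responses)
    if not responses:
        return {}
    counts = Counter(item for checked in responses for item in set(checked))
    total = len(responses)
    return {item: round(100 * n / total) for item, n in counts.items()}

# Three hypothetical respondents, each checking any number of descriptors.
sample = [
    ["A place to find books to read", "A friendly place"],
    ["A place to find books to read"],
    ["A place to meet new friends", "A place to find books to read"],
]
print(tally(sample))  # e.g. {'A place to find books to read': 100, ...}
```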

A smaller number of teens participated in the outcome effort: a total of 865 from 159 library branches. Even more teens than children described the library as a place to find books to read: 83%. More teens, 65%, also found the library to be a place to find other things to read. This makes sense, given the number of teens who like to read magazines and online content. While only 43% of teens described the library as a place to see friends, nearly as many, 39%, said it was a place to meet new friends. And 68% of the teens said they liked to talk about the books they read.


Only 4 library jurisdictions, with a total of 63 branches, reported on family participation in summer reading. Two hundred sixty-six family surveys for Outcome One were returned. Ninety-four percent said the library is a place to find books to read. Fifty-two percent said the library is a place to see friends or other families, and 45% said that it is a place to meet new friends. This is consistent with the observations of many librarians that families with preschool children tend to develop relationships with the other families they meet at storytimes and other library activities. A whopping 92% said they like to share books or talk about the books they read.

The data for Outcome Two are a little harder to parse. Even fewer libraries chose to implement the outreach-oriented outcome. Six library jurisdictions reached a total of 1064 previously underserved children, teens, and families through special outreach efforts. The range of people served is impressive: 73 migrant children attending summer school in Imperial County, 15 teens at a Los Angeles residential drug rehabilitation program, 15 family members from a social service agency in a low-income community, 8 parents and children from a college day care program, 3 homeless adults and 23 ESL students in Long Beach, 40 recipients of the Sacramento Food Bank Summer Program, 90 parents and children in a Head Start program, and 19 expectant and new parents in a low-income community in Santa Clara County.

While the number of libraries participating was a little disappointing, the positive comments from the librarians who took part helped to validate the value of the outcomes effort. One youth services coordinator welcomed the opportunity to develop a more intentional approach to summer reading in her system. Like several other managers, she also found that the data they received were welcomed by the library administrators, donors, and Board of Commissioners.

Many librarians reported significant lessons learned that will make a difference in how they implement aspects of the program next year, particularly the outreach component. Many reported that the experience led to new relationships with people working in various agencies that they partnered with during the summer. One librarian wrote,

One very positive lesson: all our outreach for Summer Reading does have an accumulative effect. For example, two of the Head Starts that were visited this year were happy to hand over rosters to the librarian because last year a librarian had visited them, and they had a successful experience then so they were happy to do it again. Just shows the subtle effects that outreach does have in our favor.

Two different libraries found that teens responded to the volunteer opportunities that had been created for the Summer Reading Program and redesigned their teen programs accordingly.


In addition to the surveys reported briefly above, librarians were encouraged to hold focus groups with representative groups of summer reading participants. Those who did so were impressed with what they learned about what works and what doesn't work so well in their programming, promotion, and outreach. They became converts to this qualitative research method as a means of getting in-depth feedback from their patrons.

In general, many librarians have become convinced of the value of having good outcomes data, both to share with stakeholders and to use in making internal decisions about where to put resources. Some library administrators made good use of the information gathered to build support for the summer reading program. More than one respondent asked to see the statewide data so they could see how their own library compared.

The coordinator of California's Summer Reading Program, Natalie Cole, hopes to at least double the number of libraries participating during 2012. It would be helpful to be able to offer some kind of incentive for participation, but funding for that is not available. However, word is beginning to spread about the benefits of participating in both the outcome and outreach components. Last year's participants are proving to be excellent recruiters. They can speak first-hand about how relatively easy it was to implement and how useful the results have been.

VI. Conclusion

The challenge of making outcome evaluation a standard aspect of youth services is that it is still perceived as a complex (and optional) activity by many practitioners. When evaluation is required by a funding agency, most will look for an outside professional to do the work. What Dresang et al. (2006) did with their OBPE model, and what the organizers of the outcome- and outreach-based California Summer Reading Program tried to do, was to make this tool available to front-line librarians and youth services coordinators. California librarians who conducted outcome evaluations of their summer reading programs learned that this is not rocket science and that it yields enormously useful information about their work. It contributed to a more intentional and focused approach to a program that had been conducted without much thought for decades.

Having worked for three years to implement a soft launch of an outcome- and outreach-based Summer Reading Program, California's proponents now recognize that ongoing education is needed in order to convince more librarians to embrace this approach. Without monetary incentives to offer, state organizers must communicate the benefits to organizations that adopt outcome evaluation as a standard management tool. In every print- and web-based promotional piece and at every training opportunity, they repeat the benefits that accrue. Outcome-based summer reading programs:

• Demonstrate meaningful results.
• Are relevant to the local community.
• Attract funding.
• Build capacity among staff.
• Contribute to improved management decision making.

In addition, California Summer Reading Program organizers are being even more proactive about training librarians in the skills needed to implement outcome evaluation. California libraries have been particularly hard hit by budget cuts, and many librarians are doing the work previously done by two or three professionals. Some have said that they just cannot contemplate doing anything different or new. It is therefore hard to convince them that proving the positive impacts of their services on their customers can help them compete for funding in bad economic times.

As noted earlier, in California, methods for data collection have been streamlined to one short survey, to be completed at the end of the summer by at least 100 participants, along with 2 focus groups for each age level targeted by the program. Tips for doing these have been posted on the web site and repeated in training workshops and webinars. Perhaps the most effective tool for convincing people that they can do this, however, has been the testimony of librarians who have done it. Their voices speak eloquently about the relative ease and value of outcome evaluation as a means to capture the results of the good work that youth services librarians do.

References

California Summer Reading Program. (2011). Retrieved from http://www.cla-net.org/displaycommon.cfm?an=1&subarticlenbr=64

Colorado Summer Reading Program: Outcome-Based Evaluation (OBE) Information. (n.d.). Retrieved from http://www.cde.state.co.us/cdelib/summerreading/Downloads/pdf/OBE_information.pdf

Dresang, E. T., Gross, M., & Holt, L. E. (2006). Dynamic youth services through outcome-based planning and evaluation. Chicago, IL: American Library Association.

Heyns, B. (1978). Summer learning and the effects of schooling. New York, NY: Academic Press.

Himmel, E. E., & Wilson, W. J. (1998). Planning for results: A public library transformation process. Chicago, IL: American Library Association.

Idaho Commission for Libraries. (2011). Bright futures summer reading opportunities: Reaching underserved children. Boise, ID: Author.


Introduction to California's outcome-based summer reading program [webinar]. (2010). Retrieved from http://infopeople.org/training/introduction_California's-outcome/based-summer-reading-program

Locke, J. L. (1988). The effectiveness of summer reading programs in public libraries in the United States. Doctoral dissertation, University of Pittsburgh, Pittsburgh, PA.

Matthews, J. (2010). Evaluating summer reading programs: Suggested improvements. Public Libraries, 49(4), 34–40.

McClure, C. R., Owen, A., Zweizig, D. L., Lynch, M. J., & Van House, N. A. (1987). Planning and role setting for public libraries: A manual of options and procedures. Chicago, IL: American Library Association.

Meyers, E. (2002). Youth development and libraries: A conversation with Karen Pittman. Public Libraries, 41(September–October), 256–260.

Nelson, S. (2001). New planning for results: A streamlined approach. Chicago, IL: American Library Association.

Nelson, S. (2008). Strategic planning for results. Chicago, IL: American Library Association.

Roman, S., Carran, D. T., & Fiore, C. D. (2010). The Dominican study: Public library summer reading programs close the reading gap. River Forest, IL: Dominican University Graduate School of Library and Information Science. Retrieved from http://www.dom.edu/academics/gslis/downloads/DOM_IMLS_BOOK_2010_FINAL_web.pdf

Sweeney, J. K. (2010, November). State of library evaluation in California 2010. Paper presented at the California Library Association Conference, Sacramento, CA.

Van House, N. A., Lynch, M. J., McClure, C. R., Zweizig, D. L., & Rodger, E. J. (1987). Output measures for public libraries: A manual of standardized procedures (2nd ed.). Chicago, IL: American Library Association.

Walter, V. A. (1992). Output measures for public library service to children: A manual of standardized procedures. Chicago, IL: American Library Association.

Walter, V. A. (1995). Output measures and more: Planning and evaluating public library services for young adults. Chicago, IL: American Library Association.

Walter, V. A., & Meyers, E. (2003). Teens and libraries: Getting it right. Chicago, IL: American Library Association.