Assessment in a Medium-Sized Academic Library: A Success Story (Advances in Librarianship, Volume 35)

<ul><li><p>Assessment in a Medium-Sized Academic Library: A Success Story</p><p>Carolyn Gutierrez and Jianrong Wang, Library, The Richard Stockton College of New Jersey, Galloway, NJ, USA</p><p>Abstract</p><p>This case study demonstrates the positive changes that evolved from a series of assessment activities. It shows that even smaller libraries can conduct assessment, given the support of colleagues and the library administration, and that librarians can take a proactive role rather than waiting for a mandate from college administration. Two years of LibQUAL+® survey results (2005 and 2008) were analyzed in depth using statistical correlation analyses. Respondents' comments were then categorized by dimension and analyzed to detect correlations. The information collected was used to track trends and to highlight the strengths and weaknesses of the library through the eyes of its users. A comparison of the survey results showed an increase in perceived service in all dimensions; however, user expectations rose even faster, especially with regard to e-resources, equipment, and study space. Positive results concerning staff expertise and service attitudes demonstrated that users valued the personal attention and capabilities of the library staff. The study shows that a user survey is only the first step in an assessment process: assessment can be effective only if follow-up actions are taken to address negative feedback and those actions are then communicated to all stakeholders. While assessment has become a necessity for many libraries, small- and medium-sized libraries often shy away from it due to limited resources. The Richard Stockton College Library undertook assessment in areas in which it could expect achievable results. Another outcome came in the form of additional resources, which narrowed the gap between library services and users' needs.</p><p>Keywords: Academic libraries; library assessment; evaluation; LibQUAL+®; data analysis</p><p>I. 
Introduction</p><p>In this period of tightened budgets and relatively free access to information online, academic librarians are discovering that the central role of the library in their institutions is no longer a given. It is now necessary to justify the library's position with tangible evidence of its contributions to the mission of the institution.</p><p>Contexts for Assessment and Outcome Evaluation in Librarianship, Advances in Librarianship, Vol. 35. © 2012 by Emerald Group Publishing Limited. ISSN: 0065-2830. DOI: 10.1108/S0065-2830(2012)0000035007</p></li><li><p>The Richard Stockton College of New Jersey (RSC) is a liberal arts college with over 8000 students, located in southern New Jersey. The library staff consists of 3 administrators, 5 librarians, 18 full-time staff, and 10 part-time staff. Although the library had participated in LibQUAL+® surveys and had assessed certain individual services, such as pre- and post-testing of student learning in information literacy sessions and analysis of periodical usage, an in-depth assessment of library services as a whole had never been undertaken.</p><p>In the summer of 2008, a Faculty Resource Network (FRN) seminar on library assessment at New York University, attended by the authors, transformed their attitudes about assessment by a library. The conference conveners, Hiller and Self (2008), emphasized using assessment creatively and adopting a positive, rather than defensive, attitude toward findings, even negative ones. They introduced seminar participants to the merits and applications of assessment and stressed that the lack of a statistical background was not a barrier: everyone, and every department in a library, can conduct and benefit from assessment on some level. Assessment can provide the opportunity to tell the library's story to administrators, emphasizing not only its strengths but also its weaknesses. It becomes part of an ongoing process toward improving library services and meeting user needs. 
According to the seminar conveners, whatever type of assessment libraries employ, following up on the results is crucial to success. Stakeholders need to be informed of the results and of the measures taken to remediate the problems identified through assessment.</p><p>Neither of the Stockton librarians attending the FRN seminar had a statistical background. Nevertheless, they returned to work with enthusiasm about assessment. They initiated a meeting of library faculty and administrators and persuaded them of its value; as a result, a Library Assessment Committee was established, consisting of the library director, two associate directors, five librarians, and the coordinator of information technology.</p><p>Although there are diverse ways to carry out assessment, many libraries use LibQUAL+®, a web-based assessment tool that collects and interprets library users' perceptions of library service quality. Since the RSC library had administered LibQUAL+® in 2005 and 2008, the Assessment Committee decided to start with an analysis of that data.</p><p>In both surveys, users expressed satisfaction with the services provided by the library staff but had some negative views about Information Control and Library as Place. In 2005 the library administration reviewed the LibQUAL+® results and shared them with the professional staff. Some changes were made, notably the establishment of a Graduate Student Lounge and increased library hours, but there the analysis more or less ended. The 2008 results showed some improvement, but the expectations of users had increased even more. The changes made as a result of the 2005 responses</p></li><li><p>did not seem to be adequately reflected in the 2008 survey, which was disappointing. "We responded and gave them what they wanted, and they're still unhappy" was the general sentiment expressed. The LibQUAL+® survey was again relegated to the shelf. This time the Assessment Committee took a different approach.</p><p>A. 
Literature Review</p><p>Since the focus of this article is not just on conducting assessment but also on using the results, the authors examined how other libraries utilized their LibQUAL+® findings. LibQUAL+® provides a quantitative representation of respondents' perceptions of library services but does not address the questions "Why do they feel that way?" or "How should we improve our services?" It is what libraries do with the LibQUAL+® analysis that can lead to evidence-based change. A detailed analysis of respondent comments, or a follow-up with qualitative research, can fill these gaps.</p><p>Gerke and Maness (2010), in their LibQUAL+® analysis, discovered which factors correlated with a high rate of satisfaction with the electronic resources collection: patron age, discipline, frequency of library web site use, and frequency of visits to the physical library. Interestingly, they found it was not a person's age or discipline that made a significant difference, but the frequency of their use of the library web site. The more attractive libraries with study space available also had the highest rates of on-site use, and there was a link between the frequency of library visits and satisfaction with electronic resources. This suggests that use of a library's physical space and of its web site is not divorced from satisfaction with the electronic resources collection; they may in fact reinforce one another. Gerke and Maness's study refutes the argument that brick-and-mortar libraries will soon become obsolete.</p><p>LibQUAL+® results can also be helpful when analyzed prior to a strategic planning process. Saunders (2008) discusses the value of drilling down into the LibQUAL+® data in preparation for strategic planning. LibQUAL+® data offer a contemporary snapshot of how patrons view their library, while strategic planning focuses on where the library should be in 5 or 10 years. 
Prior to their LibQUAL+® survey, Purdue librarians had an anecdotal understanding of patron use and perceptions, but the LibQUAL+® data helped them to produce a balanced strategic plan based on fact, not merely impression.</p><p>Another way of approaching the LibQUAL+® data is to examine the differences among patrons and their expectations. Jones and Kayongo (2008) reported that Notre Dame's libraries found users' comments extremely helpful in shedding light on the quantitative data. They discovered, after examining the comments</p></li><li><p>and mapping them to the LibQUAL+® survey questions, that undergraduates, graduate students, and faculty had differing needs and expectations. For undergraduates, access to the physical library, despite the availability of remote access to electronic resources, was still important: they wanted the library to be open 24 hours for studying and appreciated the personal attention received from library staff. Graduate students, on the other hand, were more concerned with print and/or electronic journal collection coverage and much less concerned with services or the physical library. Faculty echoed the graduate student concerns but expressed their dissatisfaction with the size of the collections even more vehemently. Both groups voiced unhappiness with the recall policy for books checked out of the library. In response, Notre Dame revised its recall policy, introduced 24-hour access during study weeks, and renovated the graduate student area.</p><p>Heath, Kyrillidou, and Askew (2004) provide an excellent compilation of articles on how libraries utilized their LibQUAL+® findings. One of the entries (Knapp, 2004) emphasized the importance of adopting a positive approach to the findings. Instead of reacting defensively to negative responses, Knapp's library system drew on the results to redesign public services at the University of Pittsburgh Library System. 
To increase student satisfaction with the quality of service, they launched new initiatives such as a digital reference service, a targeted instruction program, and a Peer-to-Peer Library Consultants Program.</p><p>Of particular interest was the way in which Lewis (2009), at East Carolina University, tied the LibQUAL+® results to student information literacy outcomes. Local questions added to the survey were specifically designed to garner information on how the information literacy program contributed to student learning and academic success.</p><p>Those surprised and dismayed by lower scores after switching from LibQUAL+® to LibQUAL+® Lite are directed to an evaluation of how sampling in LibQUAL+® Lite can affect scoring (Thompson, Kyrillidou, &amp; Cook, 2009). The authors provide linking equations to convert scores from the traditional LibQUAL+® items into the LibQUAL+® Lite protocol.</p><p>II. Data Analysis at Stockton</p><p>Both quantitative and qualitative analyses were utilized in analyzing the two years of results at Stockton. Quantitative data from LibQUAL+® elicited information on how users felt about library services, while the qualitative data from the comments indicated why users rated the services as they did, and where and how they would like the services to be improved.</p></li><li><p>A. Quantitative Analysis</p><p>The LibQUAL+® survey deals with three dimensions, Affect of Service, Information Control, and Library as Place, which are detailed in Appendix A. Statistical correlation analysis was used to compare the 2005 and 2008 results to determine whether there were any trends. Dimensions were analyzed by user group, and each question was correlated with its ratings on the five means (minimum, desired, perceived, adequacy, and superiority) defined by LibQUAL+®. Significance was sought by comparing the rating changes on each dimension's questions across user categories. 
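</p><p>The five-means correlation described above can be sketched in a few lines of plain Python. This is an illustrative reconstruction, not the committee's actual code: the question-level means below are invented placeholder values, not Stockton data.</p>

```python
# Pearson correlation between two of the five LibQUAL+ rating means
# (here minimum vs. perceived) across survey questions.
# All numbers are invented placeholders for illustration only.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# One entry per survey question: hypothetical minimum-expectation and
# perceived-service means on the LibQUAL+ 1-9 scale
minimum_means = [6.1, 6.3, 5.9, 6.5, 6.2]
perceived_means = [6.8, 7.0, 6.4, 7.3, 6.9]

r = pearson_r(minimum_means, perceived_means)  # near 1: expectations track perceptions
```

<p>A strong positive correlation of this kind is one numerical face of the pattern the survey revealed: expectations rising alongside, and faster than, perceived service.</p><p>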
To benchmark Stockton library service ratings, results were also compared with those of academic libraries in the Virtual Academic Library Environment (VALE), a New Jersey state consortium. The goal was threefold: (1) to identify areas in which feasible improvements might be made; (2) to offer constructive suggestions on how those improvements might be accomplished; and (3) to highlight areas that met with user approval. The results were as follows:</p><p>1. Improved user services</p><p>Faculty, students, and staff perceived that library service overall had improved in 2008. All three dimensions were rated higher in 2008 than in 2005. This was particularly true of Affect of Service, in which all questions were rated significantly higher (Fig. 1).</p><p>Fig. 1. Perceived services, 2005 and 2008. Source: LibQUAL+® Survey, Richard Stockton College of New Jersey, 2005 and 2008.</p><p>2. Rising expectations</p><p>Correlated with the rising scores for perceived services were rising user expectations. Data from the 2005 and 2008 surveys indicated that, overall, users' minimum and desired expectations were rising, and rising faster than perceived services. Expectations in areas such as remote access to e-resources, making information easily accessible for independent use, and the library as a getaway for study, learning, or research were significantly higher (Fig. 2).</p><p>Fig. 2. Expectations of services, minimum vs. desired, 2005 and 2008. Source: LibQUAL+® Survey, Richard Stockton College of New Jersey, 2005 and 2008.</p><p>3. Greater demand for resources</p><p>The adequacy gap score showed that the library was struggling to meet the minimum expectations of users in Information Control and Library as Place. "Print and/or electronic journal collections I require for my work" was rated lower in 2008 than in 2005, suggesting a greater gap between the services users expected and the services they perceived they were receiving. It was also evident that resource accessibility was regarded as important. The gap was also greater for "the library as a getaway for study, learning or research" and "community space for group learning and group study" (Fig. 3).</p><p>Fig. 3. Adequacy gap (perceived services minus minimum expectations), 2005 and 2008. Source: LibQUAL+® Survey, Richard Stockton College of New Jersey, 2005 and 2008.</p></li><li><p>B. Benchmarking</p><p>Eleven VALE libraries participated in the 2008 LibQUAL+® survey (see Appendix B), including Stockton. Among them were seven colleges and universities and four community colleges. The total number of respondents was 4475. Because no separate VALE data were available that excluded RSC data, the comparison provided in this article is a relative one.</p></li><li><p>Compared with other VALE institutions, RSC ranked higher in Perceived Service Means for Affect of Service, especially on "Employees who instill confidence in users" (0.37 higher) and "Employees who are consistently courteous" (0.34 higher), as shown in Fig. 4. However, RSC was rated lower in Information Control and Library as Place, as indicated in Figs. 5 and 6.</p><p>Fig. 4. Perceived service means by survey item, RSC vs. other VALE libraries (items include "Employees who instill confidence in users," "Giving users individual attention," "Employees who are consistently courteous," "Readiness to respond t...").</p></li></ul>
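<p>The adequacy gap plotted in Fig. 3 is a simple difference of means. The sketch below, in plain Python, uses invented placeholder numbers rather than the actual RSC figures; the function name and values are assumptions for illustration.</p>

```python
# Adequacy gap = perceived-service mean minus minimum-expectation mean,
# computed per LibQUAL+ dimension. Values below are invented
# placeholders, not the actual RSC survey results.
DIMENSIONS = ("Affect of Service", "Information Control", "Library as Place")

def adequacy_gaps(perceived, minimum):
    """Return the per-dimension adequacy gap, rounded to two decimals."""
    return {d: round(perceived[d] - minimum[d], 2) for d in DIMENSIONS}

perceived_2008 = {"Affect of Service": 7.5, "Information Control": 7.0, "Library as Place": 6.9}
minimum_2008 = {"Affect of Service": 7.0, "Information Control": 6.9, "Library as Place": 6.8}

gaps = adequacy_gaps(perceived_2008, minimum_2008)
# A gap near zero flags a dimension where the library barely meets
# users' minimum expectations.
```

<p>Applied to the real 2005 and 2008 per-dimension means, the same subtraction yields the gap scores charted in Fig. 3.</p>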

