
Three Summative Surveys in Assessing Information Literacy and Learning Outcomes

Ma Lei Hsieh, Instruction Librarian, Rider University
Patricia H. Dawson, Science Librarian, Rider University
VALE User Conference, Rutgers University, Piscataway, NJ, Jan. 5, 2011

Overview: Three Summative Surveys

Information Literacy Objectives Assessment Cycle
1. Develop survey questions
2. Install the survey in Google Docs
3. Test the questions
4. Before library instruction begins, students take the online survey
5. Data exported to Excel at semester end
6. Data tabulation and analysis
7. Findings shared among librarians
8. Librarians modify their teaching strategies
9. Determine the next cycle of assessment

Methodology
1. All surveys were administered online via Google Docs in Moore Library's labs.
2. In fall 2009 and spring 2010, the pre-test survey was administered to students attending research instruction (RI) sessions across a diverse range of disciplines.
3. In fall 2010, the pre-test survey was administered to students in core composition (CMP-125) classes attending RI sessions.
4. The post-test survey was administered to CMP-125 classes that returned for follow-up sessions, before that instruction began.

Methodology continued
5. An identity code was used to match students who took both the pre- and post-test surveys in fall 2010.
6. Records for the CMP-125 classes in fall 2009 (named S1.a) were extracted for comparison with the CMP-125 classes in fall 2010, because these are the most similar cohorts.
7. The assessment results of two fall 2010 classes with variations in teaching and testing methods were used for comparison.
8. One-minute papers were used in two classes to assess student learning in the sessions qualitatively.
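Cycle steps 5 and 6, together with methodology step 5, describe the mechanical part of the workflow: export the survey responses, tabulate percent correct per question, and pair pre- and post-tests by identity code. A minimal sketch of that step in Python/pandas follows; the file names and column layout (an "id_code" column plus Q1-Q10 scored 1/0) are assumptions made for illustration, not details taken from the actual survey files.

    import pandas as pd

    # Hypothetical exports of the survey responses (cycle step 5): one row
    # per student, an "id_code" column, and Q1..Q10 scored 1 (correct) /
    # 0 (incorrect).
    pre = pd.read_csv("pretest_fall2010.csv")
    post = pd.read_csv("posttest_fall2010.csv")

    questions = [f"Q{i}" for i in range(1, 11)]

    def percent_correct(df):
        # Share of students answering each question correctly (cycle step 6).
        return (df[questions].mean() * 100).round(0)

    # Pair students who took both surveys via the identity code
    # (methodology step 5), keeping only matched records.
    matched = pre.merge(post, on="id_code", suffixes=("_pre", "_post"))

    print(percent_correct(pre))            # per-question pre-test scores
    print("matched pre/post pairs:", len(matched))

A matched frame of this kind is what makes paired comparisons such as the N=44 pre/post table later in this deck possible.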
Survey Participants by Grade Level (S1 & S2)

      1st Year   2nd Year   3rd Year   4th Year   Grad.      Other    Total
S1    23% (259)  16% (184)  22% (247)  19% (214)  16% (177)  3% (36)  100% (1117)
S2    35% (304)  15% (128)  20% (175)  14% (126)  16% (138)  1% (9)   100% (880)

All years were well represented, with more freshmen in both surveys.

Participating Students' Areas of Study

Discipline    S1    S2
Business      39%   42%
Social Sci.   21%   20%
Education     21%   19%
Humanities     6%    7%
Science        5%    4%

Questions on S1 (Fall 2009)

Questions on S2 (Spring 2010)

Scores of S1 & S2

      Q1    Q2    Q3    Q4    Q5    Average
S1    70%   66%   27%   79%   51%   59%

      Q6    Q7    Q8    Q9    Q10   Average
S2    14%   43%   32%   29%   25%   29%

Most participants could differentiate scholarly journals from popular magazines (Q4). The majority of students understood the purposes of the catalog and the databases (Q1, Q2). Students were not likely to use an encyclopedia for background information (Q3).

Scores of S1 & S2 continued 1
About half of the participants could locate the library's full-text journals (Q5). On the whole, students were weak in search strategies and would have difficulty searching the library's resources (Q6-Q10).

Scores of S1 & S2 continued 2
Students had the most difficulty searching the catalog by subject (Q6). Participants were not likely to use books as a reliable source for detailed history on a subject (Q10). A large majority of students will need help with Boolean connectors (Q7, Q9): AND narrows a search to records containing both terms, while OR broadens it to records containing either.

Scores by Grade Level (S1 & S2)

      1st Year   2nd Year   3rd Year   4th Year   Grad.
S1    55%        59%                              63%
S2    29%        30%        28%                   30%

Frequency of Prior RI Sessions at Rider, S2 (Spring 2010)

                 0 times   1-2 times   3-4 times   5+ times
% of students    39%       41%         15%         5%

The frequency relied on students' memory; its precision is therefore unknown. On average, two-fifths of students had had no prior RI session. Three-fifths of freshmen had never had an RI session, as was true of one-fifth of seniors and 43% of graduate students.

Frequency of Prior RI Sessions by Scores, S2

Comparison of S1 & S2

Discipline       S1 Scores   S2 Scores
Humanities       58%         29%
Business         56%         28%
Education        63%         30%
Science          61%         29%
Social Science   62%         28%
Average          59%         29%

Differences in scores among areas of study are not significant. Some areas of study were underrepresented.

Conclusive Findings of S1 & S2
These two surveys, used as pre-tests, provide a snapshot of students' IL skills across many disciplines. Both indicate that even though the majority of students at the Lawrenceville campus of Rider University knew the purposes of the library's catalog and journal databases, many would have difficulty searching these resources effectively.

CMP-125 Classes by Grade Level, Fall 2009 (S1.a) & Fall 2010 (S3)

             S1.a (N=136)   S3 (N=177)
Freshman     12%            11%
Sophomore    74%            73%
Junior        7%             8%
Senior       3%              6%
Other        4%              3%

CMP-125 Classes by Discipline, Fall 2009 (S1.a) & Fall 2010 (S3)

             S1.a (N=136)   S3 (N=177)
Business     41%            30%
Soc. Sci.    16%            23%
Science      13%            14%
Education    9%             11%
Humanities   7%              8%
Undeclared   7%              5%
Other        7%             10%

Scores of S1.a & S3 (Q1-Q5)

        N     Q1    Q2    Q3    Q4    Q5    Average
S1.a    136   67%   64%   27%   74%   49%   47%
S3      177         75%   22%   81%   47%   59%

S3: Frequency of Prior RI Sessions by Scores
1. Frequency relied on students' memory; the precision is unknown.
2. Fall 2010 pre-test of CMP-125 students.

Comparison of the Fall 2010 (S3) Pre- and Post-test Surveys (N=44)

        Q1    Q2    Q3    Q4    Q5    Q6    Q7    Q8    Q9    Q10   Ave.
Pre     73%   75%   27%   82%   45%   7%    45%   7%    36%   25%   42%
Post    75%   61%   20%   68%   48%   23%   30%   27%   25%   30%   41%
Diff.   2%    -14%  -7%   -14%  3%    16%   -15%  20%   -11%  5%    -1%

1. These students took both the pre- and post-test surveys.
2. They had strengths and weaknesses similar to the previous cohorts'.
3. Students improved on five items but declined on five.

Comparison of the Fall 2010 Pre- and Post-test Surveys (N=44) continued
1. Scores improved on the concepts of finding and using books (Q1, Q6, Q10), on truncation (Q8), and on finding full-text journals (Q5).
2. Scores decreased on the concepts of databases (Q2) and on searching with the Boolean connectors AND and OR (Q7, Q9).
3. Scores also decreased on identifying scholarly journals (Q4).
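The Diff. row and the averages in the table above can be verified with a few lines of Python. The score lists are copied from the slide; the snippet itself is only a worked-arithmetic check, not the authors' analysis code.

    # Q1..Q10 percent-correct scores, copied from the N=44 table above.
    pre = [73, 75, 27, 82, 45, 7, 45, 7, 36, 25]
    post = [75, 61, 20, 68, 48, 23, 30, 27, 25, 30]

    diff = [b - a for a, b in zip(pre, post)]
    print("diff per question:", diff)                        # 2, -14, -7, ...
    print("pre average: %.0f%%" % (sum(pre) / len(pre)))     # 42%
    print("post average: %.0f%%" % (sum(post) / len(post)))  # 41%
    print("items improved:", sum(d > 0 for d in diff))       # 5 of 10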
Comparisons of Three Other Classes

Class A: In fall 2010, a CMP-125 class took the pre-test before instruction and the post-test right after the RI sessions. The professor offered students one point of credit as an incentive if their post-test scores beat the previous cohorts'.

Class B: Educational Opportunity Program (EOP) freshmen took the survey two months after receiving four RI sessions in the summer.

Class C: A one-minute paper administered at the end of the second RI session asked what students still found confusing.

Results of Class A (N=16)
This CMP-125 class received credit for a post-test taken immediately after instruction. It averaged 58% on the post-test, 17% higher than its peers in the other CMP-125 classes. The incentive may have helped students pay closer attention, and students remembered the concepts better right after the instruction.

Results of Class B (N=15)
Two EOP sections received four RI sessions during the summer, with one group receiving another post-test later in the fall semester. In the summer, after the four RI sessions, both classes improved significantly, with post-test scores of 62% versus pre-test scores of 38%. The summer pre- and post-test questions differ from those of S3, but they cover additional core concepts. Students were given gift incentives for good learning in the summer. The group that took the post-test given to the CMP-125 classes during the fall semester averaged 37%, 4% lower than the S3 post-test scores, possibly indicating low retention of the skills from the summer sessions.

Findings of Class C (N=31)
From the one-minute paper, "What do you still find confusing?":
- Boolean connector questions (using AND/OR) = 2
- Citation questions (how to cite when there is no date or author) = 2
- NoodleBib (citation software program)
- Truncation
- What other resources are available to use?
- How to organize gathered information for the paper
- Defining the topic
- Database organization requires too many clicks (home page confusing?)
- Need step-by-step instructions in writing

Findings continued 1

Findings continued 2

Implications

Implications continued

Next Steps?

Next Steps? continued