
Page 1: WILU Assessment Rubrics Workshop

Welcome!

WILU May 2014

Rubric Assessment (à la RAILS) for

Your Library’s Instruction Program

Claire Holmes [email protected]
Carroll Wilkinson [email protected]

Page 2: WILU Assessment Rubrics Workshop

Agenda for Today:

• Background on Assessment, RAILS & Rubrics

• Norming & Rating Sessions

• Reflections & Questions

Page 3: WILU Assessment Rubrics Workshop

Assessment…

• Knowing what you are doing

• Knowing why you are doing it

• Knowing what students are learning as a result

• Changing because of the information

(Debra Gilchrist, Dean of Libraries and Institutional Effectiveness, Pierce College, from Assessment: Demonstrating the Educational Value of the Academic Library, ACRL Assessment Immersion, 2011)

Page 4: WILU Assessment Rubrics Workshop

Information Literacy Instruction Assessment Cycle (ILIAC):

• Identify learning outcomes
• Create and enact learning activities
• Gather data to check for learning
• Interpret data
• Enact decisions to increase learning

Oakleaf, Megan. "The Information Literacy Instruction Assessment Cycle: A Guide for Increasing Student Learning and Improving Librarian Instructional Skills." Journal of Documentation, 65(4), 2009.

Page 5: WILU Assessment Rubrics Workshop

The Institute of Museum and Library Services is the primary source of federal support for the nation’s 123,000 libraries and 17,500 museums. The Institute's mission is to create strong libraries and museums that connect people to information and ideas.

Megan Oakleaf, founder of all things RAILS. (more @ www.railsontrack.info)

Page 6: WILU Assessment Rubrics Workshop

www.railsontrack.info

Page 7: WILU Assessment Rubrics Workshop

RAILS Project Purposes

• Investigated an analytic rubric approach to IL assessment in higher education

• Developed a suite of IL rubrics

• Investigated rubric reliability & validity

• Developed training materials for training/norming/scoring

• Explored indicators of rater expertise

Page 8: WILU Assessment Rubrics Workshop

RAILS Participants’ Purposes

• Professional development opportunity

• Develop rubrics for use on campuses

• Identify opportunities for assessment within the curriculum

• Gain experience in norming

• Assess student work to learn about their information literacy skills

Page 9: WILU Assessment Rubrics Workshop

New IL framework…

Our shift to examining and assessing student learning through the lenses of meta-literacy & threshold concepts will require time and consideration…

Page 10: WILU Assessment Rubrics Workshop

Framework for Information Literacy for Higher Education

• A growing and developing document that introduces a new definition of IL and explains why and how IL has changed since 2000.

• May be adopted by the ACRL Board in August 2014, after further open discussion and final revision.

Page 11: WILU Assessment Rubrics Workshop

IL in 2014

• Meta-literacy: expansion of the scope of traditional information skills

• Threshold concepts: the core foundational ideas that, once grasped by the learner, create new perspectives

Page 12: WILU Assessment Rubrics Workshop

Teaching Values: A new approach?

Dispositions: values behind the development of information-literate capabilities

• Persistence and adaptability
• Critical thinking
• Value of intellectual curiosity

Page 13: WILU Assessment Rubrics Workshop

Understanding by Design

1. What do you want students to learn? (outcome)

2. How will you know that they have learned it? (assessment)

3. What activities/assignments help them learn, offer evidence of that learning, and also provide assessment data? (teaching method & assessment)

(Wiggins & McTighe, 2006)

Page 14: WILU Assessment Rubrics Workshop

Performance/Integrated Assessment

Students reveal their learning when they are provided with complex, authentic LEARNING ACTIVITIES to explain, interpret, apply, shift perspective, empathize, and self-assess.

What we assess. What they learn.

(Megan Oakleaf, Assessment: Demonstrating the Educational Value of the Academic Library, ACRL Assessment Immersion, 2011)

Page 15: WILU Assessment Rubrics Workshop

5 Questions for Assessment Design:

1. Outcome: What do you want the student to be able to do?

2. IL Curriculum: What does the student need to know in order to do this well?

3. Pedagogy: What type of instruction will best enable the learning?

4. Assessment: How will the student demonstrate the learning?

5. Criteria for evaluation: How will you know the student has done well?

(Lisa Hinchliffe, Student Learning Assessment Cycle. ACRL Assessment Immersion, 2011)

Page 16: WILU Assessment Rubrics Workshop

Evidence of “authentic” student learning:

For instance, the research worksheet in your packet that asks students to break down and practice sequential steps in the search process.

Brainstorm… What other possible examples of evidence of student learning do we collect? What could we collect?

Page 17: WILU Assessment Rubrics Workshop

Brainstorm ideas…

Page 18: WILU Assessment Rubrics Workshop

Evidence: Possible examples of authentic student learning…

• Research journal
• Reflective writing
• “Think aloud”
• Self or peer evaluation
• Works cited page
• Annotated bibliography
• Posters
• Multimedia presentations
• Speeches
• Open-ended question responses
• Group projects
• Performances
• Portfolios
• Library assignments
• Worksheets
• Concept maps
• Citation maps
• Tutorial responses
• Blogs
• Wikis
• Lab reports

Page 19: WILU Assessment Rubrics Workshop

Why create and use a rubric?

1. What are our expectations of students completing this assignment?

2. What specific learning outcomes do we want to see reflected in the completed assignment?

3. What evidence can we find that will demonstrate learning success?

4. Are there clear differences between the various levels of student work in this assignment?

Page 20: WILU Assessment Rubrics Workshop

Norm!

Page 21: WILU Assessment Rubrics Workshop

• 2 dimensions: (1) criteria and (2) levels of performance

• grid or table format

• judges quality

• translates unwieldy data into accessible information

(Image: thefirstgradediaries.blogspot.com)
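The grid structure above maps directly onto a small data structure, which is handy when scores are later tallied by machine. A minimal Python sketch; the criterion names and shortened level descriptions are our paraphrases for illustration (the full RAILS wording appears on the sample rubric below):

```python
# A rubric is a grid: criteria (rows) crossed with performance levels (columns).
# Inner-dict keys are the ordered scores; values are the performance descriptions.
rubric = {
    "Determines key concepts": {
        3: "Multiple key concepts reflect the topic accurately.",
        2: "Some concepts reflect the topic, but the breakdown is incomplete.",
        1: "Concepts reflect the topic inaccurately.",
        0: "No concepts determined.",
    },
    "Uses citations": {
        3: "Standard citation style used consistently and correctly.",
        2: "Consistent style, with minor format or punctuation errors.",
        1: "Style attempted, but bibliographic elements incomplete.",
        0: "No citations, or no common citation elements.",
    },
}

def print_scoring_sheet(rubric):
    """Render the grid as a plain-text scoring sheet, one criterion per block."""
    for criterion, levels in rubric.items():
        print(criterion)
        for score in sorted(levels, reverse=True):
            print(f"  [{score}] {levels[score]}")

print_scoring_sheet(rubric)
```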

Page 22: WILU Assessment Rubrics Workshop

Norming is Crucial

• “I know it when I see it” does not mean “I can articulate it.”

• Norming is critical for establishing shared understanding of the rubric and achieving greater inter-rater reliability.

Page 23: WILU Assessment Rubrics Workshop

Reasons why we norm…

Rubrics are powerful tools that provide structure and consistency to assessment. If more than one rater uses a rubric, norming is crucial. Norming facilitates agreement, identifies misunderstandings, and minimizes measurement errors.

Holmes, C. & Oakleaf, M. (2013). The Official (and Unofficial) Rules for Norming Rubrics Successfully. Journal of Academic Librarianship, 39(6), 599-602.

Page 24: WILU Assessment Rubrics Workshop

Inter-rater Reliability

• From the Cambridge Dictionary of Sociology: “Measurement gives rise to consideration of the issues of reliability and validity. Reliability refers to the ability to repeat the results of a measurement accurately (common forms include inter-rater reliability; test-retest reliability; and measures of internal consistency, including split-half and coefficient alpha).”

• From Sage’s Encyclopedia of Survey Research Methods: “The concept of inter-rater reliability essentially refers to the relative consistency of the judgments that are made of the same stimulus by two or more raters…. An important factor that affects the reliability of ratings made by a group of raters is the quantity and the quality of the training they receive.”
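To make “relative consistency of the judgments” concrete: given two raters’ scores for the same artifacts, simple percent agreement and chance-corrected agreement (Cohen’s kappa) fall out of a few lines of arithmetic. A hand-rolled Python sketch with made-up scores, not RAILS data:

```python
from collections import Counter

def percent_agreement(rater1, rater2):
    """Share of artifacts on which the two raters gave the same score."""
    return sum(a == b for a, b in zip(rater1, rater2)) / len(rater1)

def cohens_kappa(rater1, rater2):
    """Agreement corrected for chance, based on each rater's score frequencies.

    Assumes the raters are not already in perfect chance agreement (p_e < 1).
    """
    n = len(rater1)
    p_o = percent_agreement(rater1, rater2)
    c1, c2 = Counter(rater1), Counter(rater2)
    # Chance agreement: probability both raters pick score k independently.
    p_e = sum((c1[k] / n) * (c2[k] / n) for k in set(c1) | set(c2))
    return (p_o - p_e) / (1 - p_e)

# Scores (0-3) from two raters for ten student worksheets (invented data).
rater_a = [3, 2, 2, 1, 3, 0, 2, 1, 3, 2]
rater_b = [3, 2, 1, 1, 3, 0, 2, 2, 3, 2]
print(percent_agreement(rater_a, rater_b))  # 0.8
print(cohens_kappa(rater_a, rater_b))       # ~0.71
```

Kappa rewards agreement beyond what the raters’ score distributions would produce by chance, which is one reason rater training (norming) raises it.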

Page 25: WILU Assessment Rubrics Workshop

SAMPLE RAILS RUBRIC (green handout in your packet)

Each criterion is rated at Performance Level 3, 2, 1, or 0.

1. Determines Key Concepts
Level 3: Student determines multiple key concepts that reflect the research topic/thesis statement accurately.
Level 2: Student determines some concepts that reflect the research topic/thesis statement, but concept breakdown is incomplete or repetitive.
Level 1: Student determines concepts that reflect the research topic/thesis statement inaccurately.
Level 0: Student does not determine any concepts that reflect the research question/thesis statement.

2. Identifies Synonyms and Related Terms
Level 3: Student identifies relevant synonyms and/or related terms that match key concepts.
Level 2: Student attempts synonym (or related term) use, but the synonym list is incomplete or not fully relevant to key concepts.
Level 1: Student identifies synonyms that inaccurately reflect the key concepts.
Level 0: Student does not identify synonyms.

3. Constructs a Search Strategy Using Relevant Operators
Level 3: Student constructs a search strategy using an appropriate combination of relevant operators (for example: and, or, not) correctly.
Level 2: Student constructs a search strategy using operator(s), but uses operators in an incomplete or limited way.
Level 1: Student constructs a search strategy using operators incorrectly.
Level 0: Student does not use operators.

4. Uses Evaluative Criteria to Select Source(s)
Level 3: Student uses evaluative criteria to provide an in-depth explanation of the rationale for the source selected.
Level 2: Student uses evaluative criteria to provide a limited/superficial explanation of the rationale for the source selected.
Level 1: Student attempts to use evaluative criteria, but does so inaccurately or incorrectly.
Level 0: Student does not use evaluative criteria.

5. Uses Citations
Level 3: Student uses an appropriate standard citation style consistently and correctly.
Level 2: Student uses an appropriate standard citation style consistently (bibliographic elements intact), but with minimal format and/or punctuation errors.
Level 1: Student attempts an appropriate standard citation style, but does not include all bibliographic elements consistently or correctly.
Level 0: Student does not include common citation elements or does not include citations.
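Once raters apply a rubric like this one to a batch of student work, tallying scores per criterion shows where students cluster at the lower levels and where instruction might change. A hypothetical sketch; the record layout and criterion keys are ours, not part of RAILS:

```python
from collections import Counter

def criterion_summary(records):
    """records: one {criterion: score} dict per scored artifact.

    Returns per-criterion score distributions -- a quick way to spot
    criteria where many students land at the low performance levels.
    """
    summary = {}
    for record in records:
        for criterion, score in record.items():
            summary.setdefault(criterion, Counter())[score] += 1
    return summary

# Invented scores for three artifacts against two of the rubric's criteria.
records = [
    {"key_concepts": 3, "citations": 1},
    {"key_concepts": 2, "citations": 1},
    {"key_concepts": 3, "citations": 0},
]
for criterion, dist in criterion_summary(records).items():
    print(criterion, dict(dist))
# key_concepts {3: 2, 2: 1}
# citations {1: 2, 0: 1}
```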

Page 26: WILU Assessment Rubrics Workshop

Rubric Norming Process

1. Facilitator thinks aloud through scoring several examples.

2. Raters independently score a set of examples that reflects the range of artifacts.

3. Raters come together to review their scores to identify patterns of consistent and inconsistent scores.

4. Discuss and then reconcile inconsistent scores.

5. Repeat the process of independent scoring on a new set of examples.

6. Again, raters come together to review their scores to identify patterns of consistent and inconsistent scores.

7. Discuss and then reconcile inconsistent scores.

This process is repeated until raters reach consensus about applying the scoring rubric. Ordinarily, two to three of these sessions calibrate raters’ responses.
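Steps 3 and 6, spotting inconsistent scores, can be supported mechanically once everyone’s scores are collected. A small sketch, assuming scores are keyed by artifact; the spread threshold is our choice, not part of the published norming procedure:

```python
def flag_for_discussion(scores, max_spread=0):
    """scores: {artifact_id: [one score per rater]}.

    Returns the artifacts whose rater scores spread wider than max_spread;
    these are the inconsistent scores to reconcile in group discussion.
    """
    return {
        artifact: ratings
        for artifact, ratings in scores.items()
        if max(ratings) - min(ratings) > max_spread
    }

# Round-1 scores from three raters on one rubric criterion (invented data).
round1 = {
    "worksheet_01": [3, 3, 3],
    "worksheet_02": [2, 1, 2],
    "worksheet_03": [3, 1, 2],
}
for artifact, ratings in flag_for_discussion(round1).items():
    print(artifact, ratings)  # worksheet_02 and worksheet_03 need discussion
```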

Page 27: WILU Assessment Rubrics Workshop

Workshop Norming Practice

Round 1

• For the first student work sample, Claire will “norm aloud.”

• Participants will rate 1 work sample individually.

• Group discussion: Can we reach consensus on what constitutes evidence for each performance level?

Page 28: WILU Assessment Rubrics Workshop

Norm, norm, norm!

Page 29: WILU Assessment Rubrics Workshop

Keep in mind…

• An info lit skills rubric does not score discipline content; it scores information literacy skills.

• You can only score what you can see.

Page 30: WILU Assessment Rubrics Workshop

Norming: Round 2

• Participants will rate 2-3 more work samples individually.

• Group discussion: Are we closer to consensus?

• Do we need to establish rating ground rules?

• Does the rubric need to be modified?

Page 31: WILU Assessment Rubrics Workshop

Access Needed Info: Original Rubric

Levels: Advanced / Developing / Beginning

1. Determine Key Concepts
Advanced: Student determines keywords/subject/subheadings that describe the research question/thesis fully, including relevant variants.
Developing: Student determines keywords/subject/subheadings that describe the research question/thesis partially.
Beginning: Student does not determine keywords/subject/subheadings that describe the research question/thesis.

2. Access the Needed Information
Advanced: Student accesses information using effective, well-designed search strategies. Demonstrates persistence and ability to refine search.
Developing: Student accesses information using simple search strategies.
Beginning: Student accesses information randomly.

3. Retrieves Relevant Information (Determine the extent of information needed)
Advanced: Student retrieves information sources that fit search parameters, relate to concepts, or answer the research question.
Developing: Student retrieves information sources that partially fit search parameters, relate to concepts, or answer the research question.
Beginning: Student does not retrieve information that fits search parameters, relates to concepts, or answers the research question.

Page 32: WILU Assessment Rubrics Workshop

What do norming revisions look like?

(Levels: Advanced / Developing / Beginning. Wording struck out during norming is shown in [brackets]; the replacement wording follows in plain text.)

1. Determine Key Concepts
Advanced: Student determines keywords/subject/subheadings that fully describe the research question/thesis [fully] including relevant variants.
Developing: Student determines keywords/subject/subheadings that partially describe the research question/thesis [partially].
Beginning: Student does not determine keywords/subject/subheadings that describe the research question/thesis.
(pharm disease state & drug; nursing multi-faceted, omit shortage)

2. Access the Needed Information
Advanced: Student accesses information using [effective, well-designed search strategies] a logical progression of advanced search strategies such as limits, Boolean searches, or combined searches. Demonstrates persistence and ability to refine search.
Developing: Student [accesses information using simple search strategies] accesses information using advanced search strategies, such as limits, Boolean searches, or combined searches.
Beginning: Student [accesses information randomly] accesses information using only simple search strategies.

3. Retrieves Relevant Information (Determine the extent of information needed)
Advanced: Student retrieves information sources that fully fit search parameters and relate to concepts or answer the research question.
Developing: Student retrieves information sources that partially fit search parameters or relate to concepts or answer the research question.
Beginning: Student does not retrieve information sources that either fit search parameters or relate to concepts or answer the research question.

Page 33: WILU Assessment Rubrics Workshop

Norming/Rating Discussion

• How do we achieve consensus?

• What was challenging?

Page 34: WILU Assessment Rubrics Workshop

Rubrics – Benefits

Learning

• Articulate and communicate agreed-upon learning goals

• Provide direct feedback to learners

• Facilitate self-evaluation

• Focus on learning standards

Page 35: WILU Assessment Rubrics Workshop

More benefits of a (normed) rubric…

Data
• Facilitate consistent, accurate, unbiased scoring
• Deliver data that is easy to understand, defend, and convey
• Offer detailed descriptions necessary for informed decision-making
• Can be used over time or across multiple programs

Other
• Are inexpensive to design & implement

Page 36: WILU Assessment Rubrics Workshop

Rubrics – Limitations

• Possible design flaws that impact data quality

• Require significant time for development

• Sometimes fail to balance between holistic and analytic focus

• May fail to balance between generalized wording and detailed description

• Can lack differentiation between performance levels

Page 37: WILU Assessment Rubrics Workshop

RAILS Lessons

• Start with established partners and existing librarian/disciplinary faculty collaborations

• Evaluate a skill relevant to many campus partners (e.g., using information legally and ethically)

• Include those who can help disseminate results and promote IL assessment efforts across campus

• Meet with stakeholders regularly to review and improve the assignment and rubric

Page 38: WILU Assessment Rubrics Workshop

More RAILS Lessons

• Explicit, detailed performance descriptions are crucial to achieving inter-rater reliability.

• Raters appear to be more confident about their ratings when the student artifacts under analysis are concrete, focused, and shorter in length.

• The best raters “believe in” outcomes, value constructed consensus (or “disagree and commit”), negotiate meaning across disciplines, develop shared vocabulary, etc.

Page 39: WILU Assessment Rubrics Workshop

Information Literacy Instruction Assessment Cycle (ILIAC):

• Identify learning outcomes
• Create and enact learning activities
• Gather data to check for learning
• Interpret data
• Enact decisions to increase learning

Oakleaf, Megan. "The Information Literacy Instruction Assessment Cycle: A Guide for Increasing Student Learning and Improving Librarian Instructional Skills." Journal of Documentation, 65(4), 2009.

Page 40: WILU Assessment Rubrics Workshop

Using Assessment Results…

Improvements within the library
• Instruction
• Relationships
• Impact on learning

Growth within the field
• Conferences
• Publications

Page 41: WILU Assessment Rubrics Workshop

References

Arter, J. (2000). Rubrics, scoring guides, and performance criteria: Classroom tools for assessing and improving student learning. Retrieved from http://eric.ed.gov/?id=ED446100

Bresciani, M., Zelna, C., & Anderson, J. (2004). Assessing student learning and development: A handbook for practitioners. Washington, DC: NASPA-Student Affairs Administrators in Higher Education.

Wiggins, G. P., & McTighe, J. (2006). Understanding by design. Upper Saddle River, NJ: Pearson Education.

Wiggins, G. P. (1998). Educative assessment: Designing assessments to inform and improve student performance. San Francisco, CA: Jossey-Bass.

Page 42: WILU Assessment Rubrics Workshop

Selected Readings:

Diller, K. R., & Phelps, S. F. (2008). Learning outcomes, portfolios, and rubrics, oh my! Authentic assessment of an information literacy program. portal: Libraries and the Academy, 8(1), 75-89.

Fagerheim, B. A., & Shrode, F. G. (2009). Information literacy rubrics within the disciplines. Communications in Information Literacy, 3(2), 158-170.

Holmes, C., & Oakleaf, M. (2013). The official (and unofficial) rules for norming rubrics successfully. Journal of Academic Librarianship, 39(6), 599-602.

Knight, L. A. (2006). Using rubrics to assess information literacy. Reference Services Review, 34(1), 43-55.

Oakleaf, M. (2007). Using rubrics to collect evidence for decision-making: What do librarians need to learn? Evidence Based Library and Information Practice, 2(3), 27-42.

Oakleaf, M. (2009). The information literacy instruction assessment cycle: A guide for increasing student learning and improving librarian instructional skills. Journal of Documentation, 65(4), 539-560.

Oakleaf, M., Millet, M., & Kraus, L. (2011). All together now: Getting faculty, administrators, and staff engaged in information literacy assessment. portal: Libraries and the Academy, 11(3), 831-852.

Stevens, D. D., & Levi, A. (2005). Introduction to rubrics: An assessment tool to save grading time, convey effective feedback, and promote student learning. Sterling, VA: Stylus Publishing.

Page 43: WILU Assessment Rubrics Workshop

WILU: May 2014

Rubric Assessment (à la RAILS)

for Your Library’s Instruction Program

Claire Holmes, Towson University, [email protected]

Carroll Wilkinson, West Virginia University, [email protected]

SlideShare URL:

Thank you!