
  • Using Rubrics/Scoring Guides in Program Assessment

    Carrie Zelna, Ph.D., Director, Office of Assessment; Associate Vice Provost, Division of Academic and Student Affairs

    Stephany Dunstan, Ph.D., Associate Director, Office of Assessment, Division of Academic and Student Affairs

  • Outcomes

    • Identify possible opportunities for using rubric/scoring guides as an assessment tool at the student level, class/course level, or program level

    • Describe the types of rubrics/scoring guides that can be applied to student, class/course, or program assessment

    • Identify the steps necessary to apply rubrics/scoring guides systematically (norming, sampling, analysis)

  • Levels of assessment

    • Individual/student level

    • Class/course level

    • Program/Curriculum level

  • Four Steps of Assessment

    From Suskie, L. (2009). Assessing Student Learning: A Common Sense Guide (2nd ed.). Jossey-Bass.

    1. Establish Learning Goals (Plan)

    2. Provide Learning Opportunities (Act)

    3. Assess Student Learning (Observe)

    4. Use the results (Reflect)

    Linda Suskie, in a 4/4/2008 email to the ASSESS listserv: “…understand that assessment is action research, not experimental research. While it is systematic, action research is context-specific, informal, and designed to inform individual practice. As such, it doesn't have the precision, rigor, or generalizability of experimental research.”

    — P. Steinke & C. Zelna

  • Rubric: Definition and Purpose

    • Rubric: “a scoring tool that lays out the specific expectations for an assignment” (Stevens & Levi, 2005, p. 3)

    • It is a way of organizing criteria to systematically determine whether the outcome is met, based on data gathered through papers, observation, document analysis, or some other appropriate method.

    • When you review the data in the aggregate, a rubric can help identify patterns of strengths and weaknesses that might allow for enhancements to the program.

  • Constructed Response

    • Short-Answer Essay Questions

    • Concept Maps

    • Identifying Themes

    • Making Predictions

    • Summaries

    • Explain Your Solution

    Course Assessment: Rubrics/Scoring Guides

    http://jfmueller.faculty.noctrl.edu/toolbox/tasks.htm


  • Product/Performance

    “...reveals their understanding of certain concepts and skills and/or their ability to apply, analyze, synthesize or evaluate those concepts and skills” *

    • Research Paper

    • Capstone Project

    • Article Reviews

    • Film Analysis

    • Case Study

    • Error Analysis

    • Panel Discussion

    • Fishbowl Discussion

    • Oral Presentations

    Course Assessment: Rubrics/Scoring Guides

    * http://jfmueller.faculty.noctrl.edu/toolbox/tasks.htm


  • Types of Rubrics

    1. Holistic

    2. Check-list

    3. Rating Scale

    4. Descriptive

    5. Structured Observation Guide

  • Examples

  • Holistic (Suskie, 2009)

  • Checklist Rubric (Suskie, 2009)

  • Rating Scale Rubric (Suskie, 2009)

  • Descriptive Rubric (Suskie, 2009)

  • Structured Observation Guide (Suskie, 2009)

  • Things to consider

    • Levels of Assessment

    • Individual

    • Alignment from outcome to assignment to rubric

    • Additional information to be measured (format, program outcomes, etc.)

    • Testing the rubric

    • Class/Course

    • All of the above

    • Sampling

    • Analysis: Aggregate data

    • Program

    • All of the above

    • Norming/multiple raters

  • Testing Your Rubric/Scoring Guide

    • Metarubric: use a metarubric (a rubric for evaluating rubrics) to review your work

    • Peer review: ask one of your colleagues to review the rubric and provide feedback on content

    • Student review: ask several students to review the rubric if appropriate (students in a class may help you create it)

    • Test with products: once you feel the rubric is ready, test it with samples of student work (3–5 products?)

  • Sampling: Being Systematic

    • Should you sample at all, or review all the products?

    • Mary Allen's rule of thumb: a sample of 50 to 75 is sufficient for assessment

    • Consider the attitudes of the faculty towards data:

    • Quantitative approaches

    • Qualitative approaches

    **The Office of Assessment can help select a random sample of students for your assessment; a sketch of one way to draw such a sample follows.
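    Below is a minimal sketch of drawing such a sample, assuming products are keyed by student ID. The cohort, the sample size of 60 (within Allen's 50–75 rule of thumb), and the fixed seed are all illustrative choices, not a prescribed procedure.

```python
import random

def draw_sample(student_ids, n=60, seed=2019):
    """Draw a reproducible simple random sample of student products."""
    if n >= len(student_ids):
        return list(student_ids)   # small cohort: review everything
    rng = random.Random(seed)      # fixed seed so the draw can be repeated
    return rng.sample(list(student_ids), n)

# Hypothetical cohort of 250 capstone papers
cohort = [f"S{i:04d}" for i in range(1, 251)]
sample = draw_sample(cohort)
print(len(sample), sample[:5])
```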

  • Analysis

    • Individual vs. Aggregate Scores

    • Average Score (Mean) by Dimension and Total

    • Total Score: review total scores to get the big picture

    • Dimension: review dimension scores to look for patterns

    • Frequency Distributions

    • Scale: tabulate frequencies by scale point to get a clearer understanding of the data (a worked example and a short computational sketch follow)

  • Scoring the Data

    Each dimension is scored 0–3 (maximum total 18). Dimension columns, left to right: Separation/Objectivity (S/O), Dissonance (Dis), Understanding/Change in Perspective (U/C), Self-Perception (S-P), Resolution (Res), Application/Verification (A/V).

    | ID | Class | Age | Gender | Paper Length | S/O | Dis | U/C | S-P | Res | A/V | Total |
    |----|-------|-----|--------|--------------|-----|-----|-----|-----|-----|-----|-------|
    | A  | FR | 19 | F | 5  | 3 | 3 | 3 | 3 | 3 | 3 | 18 |
    | B  | SR | 21 | M | 3  | 3 | 3 | 3 | 3 | 3 | 2 | 17 |
    | C  | FR | 18 | F | 7  | 3 | 3 | 3 | 2 | 2 | 3 | 16 |
    | D  | SR | 21 | M | 5  | 3 | 3 | 3 | 2 | 3 | 2 | 16 |
    | E  | SO | 19 | F | 9  | 2 | 3 | 3 | 2 | 2 | 3 | 15 |
    | F  | FR | 18 | M | 3  | 3 | 3 | 3 | 2 | 3 | 0 | 14 |
    | G  | SO | 20 | M | 3  | 3 | 3 | 3 | 0 | 3 | 2 | 14 |
    | H  | SO | 19 | M | 5  | 2 | 2 | 3 | 2 | 2 | 2 | 13 |
    | I  | FR | 18 | M | 8  | 3 | 3 | 3 | 2 | 2 | 0 | 13 |
    | J  | JR | 20 | F | 5  | 2 | 2 | 2 | 2 | 3 | 2 | 13 |
    | K  | SO | 20 | M | 5  | 3 | 3 | 2 | 2 | 2 | 1 | 13 |
    | L  | FR | 18 | M | 7  | 2 | 3 | 2 | 2 | 2 | 2 | 13 |
    | M  | JR | 20 | F | 3  | 3 | 3 | 3 | 0 | 2 | 0 | 11 |
    | N  | FR | 18 | F | 5  | 2 | 2 | 2 | 2 | 2 | 0 | 10 |
    | O  | SO | 22 | M | 4  | 2 | 3 | 2 | 2 | 2 | 0 | 11 |
    | P  | FR | 18 | F | 6  | 2 | 3 | 1 | 2 | 1 | 1 | 10 |
    | Q  | FR | 19 | M | 9  | 2 | 2 | 1 | 2 | 1 | 1 | 9  |
    | R  | FR | 18 | M | 3  | 2 | 3 | 2 | 1 | 1 | 0 | 9  |
    | S  | FR | 18 | M | 15 | 2 | 1 | 1 | 1 | 1 | 1 | 7  |
    | T  | SO | 20 | F | 4  | 1 | 2 | 0 | 1 | 2 | 1 | 7  |
    | Average |  |  |  |  | 2.5 | 2.8 | 2.4 | 1.8 | 2.2 | 1.4 | 13.1 |

    [Bar chart: frequency of each scale point (3, 2, 1, 0) for the six rubric dimensions; the counts are tabulated below.]

    Frequencies of each scale point by dimension:

    | Dimension | Scale: 3 | Scale: 2 | Scale: 1 | Scale: 0 |
    |-----------|----------|----------|----------|----------|
    | Separation/Objectivity | 9 | 10 | 1 | 2 |
    | Dissonance | 14 | 5 | 1 | 2 |
    | Understanding/Change in Perspective | 10 | 6 | 3 | 3 |
    | Self-Perception | 2 | 13 | 3 | 4 |
    | Resolution | 6 | 10 | 4 | 2 |
    | Application/Verification | 3 | 6 | 5 | 8 |
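    To make the analysis above concrete, here is a minimal computational sketch of the aggregate statistics: mean score per dimension, frequency counts by scale point, and the mean total. Only rows A and T from the scoring table are included to keep it short; a real run would load all twenty students.

```python
from collections import Counter
from statistics import mean

DIMENSIONS = ["Separation/Objectivity", "Dissonance",
              "Understanding/Change in Perspective", "Self-Perception",
              "Resolution", "Application/Verification"]

# Dimension scores per student (0-3 scale); rows A and T from the table above
scores = {
    "A": [3, 3, 3, 3, 3, 3],
    "T": [1, 2, 0, 1, 2, 1],
}

for i, dim in enumerate(DIMENSIONS):
    column = [row[i] for row in scores.values()]
    freq = Counter(column)
    counts = [freq.get(s, 0) for s in (3, 2, 1, 0)]   # scale 3..0, as charted
    print(f"{dim}: mean={mean(column):.1f}, freq(3,2,1,0)={counts}")

totals = [sum(row) for row in scores.values()]
print(f"Total score: mean={mean(totals):.1f}")
```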

  • Example: University of Virginia

  • Other “Scoring” Options

    • Structured Observation Guide:

    • Thematic approach: use qualitative coding to determine what you are seeing.

  • Norming with Multiple Raters

    • Will multiple reviewers look at each product?

    • Consider the time frame (by when must all products be scored?)

    • Will raters score together in the same physical location?

    • Spend time walking through the rubric/scoring guide as a group.

    • Review one product individually, then compare responses as a group: share why scores were chosen, discuss, and reach consensus.

    • Repeat the process above as needed until you feel the group is on the same page (typically 5 to 10 products)

    • This could result in additional changes to the rubric/scoring guide in some cases

    • Consider doing this throughout the process to ensure that you are not drifting (“recalibrate”)
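    As one hedged sketch of checking calibration during norming, the snippet below computes exact and adjacent (within one point) agreement between two raters who scored the same ten products on a single dimension; the scores are invented. Low agreement on a dimension is a cue to revisit that row of the rubric and recalibrate.

```python
def agreement(rater_a, rater_b):
    """Return (exact, within-one-point) agreement rates for paired scores."""
    pairs = list(zip(rater_a, rater_b))
    exact = sum(a == b for a, b in pairs) / len(pairs)
    adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)
    return exact, adjacent

# Hypothetical scores from two raters on the same ten products, one dimension
rater_1 = [3, 2, 3, 1, 2, 3, 2, 0, 2, 3]
rater_2 = [3, 2, 2, 1, 3, 3, 2, 1, 2, 3]

exact, adjacent = agreement(rater_1, rater_2)
print(f"exact agreement: {exact:.0%}, within one point: {adjacent:.0%}")
```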

  • Creating a Rubric

    1. What are you trying to assess? (Outcomes)

    2. What does the outcome ‘look’ like?

    3. How can you group the items/elements/criteria/dimensions of the outcome?

    4. List these elements with a brief description of the best-case scenario

    5. How will the student demonstrate this learning? Does the assignment align with the list of elements?

    6. What else needs to be in the rubric to grade the work?

    7. Do you need specific information for the curriculum?
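    Purely as an illustration of steps 1–4, a rubric can be captured as a small data structure so the criteria and level descriptors travel with the scoring. The outcome, dimensions, and descriptors below are invented examples, not a recommended rubric.

```python
# Invented example rubric: two dimensions, descriptive levels 0-3
RUBRIC = {
    "outcome": "Students analyze a problem from multiple perspectives",
    "dimensions": {
        "Identifies perspectives": {
            3: "Names and fairly represents several relevant perspectives",
            2: "Names more than one perspective, with minor gaps",
            1: "Acknowledges only one alternative perspective",
            0: "Presents a single viewpoint as the only one",
        },
        "Uses evidence": {
            3: "Supports each claim with credible, cited evidence",
            2: "Supports most claims; sourcing is uneven",
            1: "Evidence is sparse or weakly tied to claims",
            0: "No supporting evidence",
        },
    },
}

def descriptor(dimension, score):
    """Look up the descriptor that a given score corresponds to."""
    return RUBRIC["dimensions"][dimension][score]

print(descriptor("Uses evidence", 2))
```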

  • References/Resources

    • AAC&U VALUE Rubrics: http://www.aacu.org/value/rubrics/index_p.cfm?CFID=37317515&CFTOKEN=54026278

    • Allen, M. J. (2006). Assessing General Education Programs. Bolton, MA: Anker Publishing Co.

    • Stevens, D. D., & Levi, A. J. (2005). Introduction to Rubrics. Sterling, VA: Stylus Publishing, LLC.

    • Suskie, L. (2009). Assessing Student Learning: A Common Sense Guide (2nd ed.). San Francisco, CA: Jossey-Bass.
