Constructing Rubrics for Open-ended Activities
Share the Future IV, 18 March 2003

TRANSCRIPT

Page 1

Constructing Rubrics for Open-ended Activities

Share the Future IV

18 March 2003

Page 2

Workshop Materials

• Printed copies of materials will be passed out today

• Also available at the Foundation Coalition web page:

www.foundationcoalition.org

Page 3

Form Groups

• This workshop includes two group activities

• Form groups of 4 or 5

• Line up in order of the distance you traveled to attend this conference

Page 4

Workshop Presenters

• Susan Haag, Director of Assessment, College of Engineering & Applied Sciences, Arizona State University

• Ann Kenimer, Associate Professor, Department of Agricultural Engineering, Texas A&M University

Page 5

Susan Haag, ASU

• Director of Assessment & Evaluation, CEAS
• Specialist in Instructional Technology and Educational Psychology, ASU
• Ph.D., Policy Studies/Psychology, ASU
• Develops and instructs courses for online delivery, both fully virtual and web-enhanced
• Teaches graduate research methods
• FC Assessment and Evaluation involvement since 1998

Page 6

Ann Kenimer, Texas A&M

• B.S., M.S., Agricultural Engineering, Virginia Tech
• Ph.D., Agricultural Engineering, University of Illinois
• Teaches engineering design processes, fundamental problem solving, environmental engineering
• FC Assessment and Evaluation involvement since 2000

Page 7

Workshop Agenda

• Rubrics:
  – Common Terms
  – What is a Rubric?
  – How Are Rubrics Used?
  – Examples of Rubrics
  – Characteristics of a Rubric

• Team Activities:
  – Use & Evaluate a Rubric
  – Develop a Rubric

• Common Problems and Solutions:
  – Resources

• Wrap Up

Page 8

Terms Used in Workshop

• Qualitative Assessment (open-ended data)
  – Content analysis
  – Rubric
  – Checklist
  – Inter-rater / Intra-rater

• “Objective” Assessment (closed-ended data)
  – Forced-choice response

• Pre-Determined Criteria

• Reliability

• Validity
  – Theoretical
  – Face
  – Criterion

Page 9

What is a Rubric? (Pre-Determined Criteria)

• Definition of Rubric [3,9]:
  – A systematic scoring methodology to make qualitative assessment and evaluation more reliable and objective by applying pre-determined criteria

  e.g., descriptive criteria are developed to serve as guidelines for scorers to assess, rate, and judge student performance

Page 10

What is a Rubric? (“Open-ended” Data)

• A tool used in the qualitative assessment of “open-ended” data, such as…
  – Written or oral narratives
  – Diagrams or models
  – Written or oral enumerations
  – Behavioral demonstrations
  …of a student’s knowledge, applied skill, or ability to perform

Page 11

How Are Rubrics Used? (“Open-ended” Data)

• Advantages and drawbacks of assessing “open-ended” data [7]

  Advantages:
  – Can yield “rich” information (i.e., individual, creative, complex, fine-tuned)

  Drawbacks:
  – Involves “subjectivity” in interpreting and scoring data (i.e., the judgments of the individuals scoring), as contrasted with “objective” tests
  – Problems with reliability (both inter-rater and intra-rater, across time)

Page 12

How Are Rubrics Used? (“Open-ended” Data)

• Other methods of qualitative assessment used with open-ended data:
  – Content analysis and coding [10]
  – Inventory checklists [11]
  – Rubrics

Page 13

How Are Rubrics Used? (Diagnostic Feedback)

• Descriptions of performance standards may serve to communicate to students what is expected of quality performance [5]

  e.g., ideal, expected performance described in a rubric can be explicitly compared with individual performance in order to convey what aspects of performance need improvement

Page 14

Rubric Types

Rubrics may be used “holistically” or “analytically”…

• “Holistic” rubric [5]:
  – The entire response is evaluated and scored as a single performance category

• “Analytical” rubric [5]:
  – The response is evaluated with multiple descriptive criteria for multiple performance categories
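To make the distinction concrete, here is a minimal sketch (not from the workshop materials) of an analytical rubric represented as a Python data structure: one scale of level descriptors per criterion, and one recorded level per criterion. The criterion names and descriptors are invented for illustration.

```python
# Sketch of an analytical rubric: each criterion carries its own scale
# of level descriptors; a scorer records one level per criterion.
# Requires Python 3.9+ for the dict[int, str] annotation.
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    levels: dict[int, str]  # score level -> descriptive criterion

ANALYTIC_RUBRIC = [
    Criterion("Organization", {
        3: "Ideas ordered logically; whole/part structure is clear",
        2: "Mostly ordered; some transitions unclear",
        1: "No discernible structure",
    }),
    Criterion("Accuracy", {
        3: "Information stated completely and correctly",
        2: "Minor omissions or errors",
        1: "Major errors or gaps",
    }),
]

def feedback(scores: dict[str, int]) -> dict[str, str]:
    """Map a scorer's per-criterion levels back to the matching
    descriptors, usable as diagnostic feedback for the student."""
    by_name = {c.name: c for c in ANALYTIC_RUBRIC}
    return {name: by_name[name].levels[level] for name, level in scores.items()}

print(feedback({"Organization": 3, "Accuracy": 2}))
```

A holistic rubric would collapse this list to a single criterion whose descriptors characterize the response as a whole, as in the math-problem example on the next two pages.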

Page 15

Rubric Types: Example

• “Holistic” rubric for open-ended math problems [11]

• Criteria for Demonstrated Competence (6 points) – Description of Exemplary Response:
  – Gives a complete response with a clear, coherent, unambiguous, and elegant explanation; includes a clear and simplified diagram; communicates effectively to the identified audience; shows understanding of the problem’s mathematical ideas and processes; identifies all the important elements of the problem; may include examples and counter-examples; presents strong supporting arguments.

Page 16

Rubric Types: Example

• “Holistic” rubric for open-ended math problems…

• Criteria for Inadequate Response (2 points) – Description of a Response that Begins, but Fails to Complete, the Problem:
  – Explanation is not understandable; diagram may be unclear; shows no understanding of the problem situation; may make major computational errors.

Page 17

Rubric Types: Example

• Analytical rubric

• Scoring rubrics for program objectives:
  – Life-long learning
  – Impact in a global/societal context

• Daina Briedis, Michigan State University

Page 18

Characteristics of a Rubric (Reliability)

A good rubric must possess “reliability”

• Definition of Reliability [4]:
  – The extent to which the measuring instrument yields responses that are consistent and stable across time (intra-rater) and between different scorers (inter-rater)

Page 19

Characteristics of a Rubric (Validity)

A good rubric must possess “validity”

• Definition of Validity [1]:
  – The extent to which what is being measured by an instrument is actually what is intended. Are the test and rubric actually measuring the desired performance outcomes? (construct, criterion, and face validity)

Page 20

Team Activity I

• Evaluate a Rubric

Page 21

The Sample Rubric

• Developed for use in a freshman-level introduction to design class

• Used to evaluate oral presentations made by freshman design teams

• Used by a panel of 3 to 4 faculty:
  – the course instructor
  – faculty invited for one day to serve on the review panel
  – panel membership changed over the 5 days of presentations

Page 22

Your Task

• With your group, discuss:
  – The merits of the sample rubric and how it was used
  – Potential problems with the sample rubric and how it was used
  – What you might do to improve the rubric and its use

• We’ll share ideas in about 15 minutes

Page 23

Your Ideas

• What are your thoughts on:
  – The merits of the sample rubric and how it was used
  – Potential problems with the sample rubric and how it was used
  – What you might do to improve the rubric and its use

Page 24

Sample Rubric: Results

Student A

                     Individual                    Team
Reviewer    Style   Content   Part.   Prof.   Style   Content
   1          5        5        5       4       4        5
   2          2        2        4       3       2        3
   3          4        4        4       4       3        4
   4          4        3        5       4       4        4

Page 25

Sample Rubric: Results

Student B

                     Individual                    Team
Reviewer    Style   Content   Part.   Prof.   Style   Content
   1          2        3        3       3       3        4
   2         4.5       4        –       –       –        –
   3          3        4        5       4       4        4
   4          4        4        4       5       4        5

(Reviewer 2 recorded only two scores; the remaining cells were left blank.)
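One quick way to see the problem in these results is to compute, for each criterion, how far apart the four reviewers' ratings fall. Here is a minimal Python sketch (not part of the workshop materials), using Student A's scores as transcribed above:

```python
# Student A's scores from the table above: one row per reviewer,
# one column per rubric criterion (1-5 scale).
CRITERIA = ["Ind. Style", "Ind. Content", "Part.", "Prof.",
            "Team Style", "Team Content"]
STUDENT_A = [
    [5, 5, 5, 4, 4, 5],  # reviewer 1
    [2, 2, 4, 3, 2, 3],  # reviewer 2
    [4, 4, 4, 4, 3, 4],  # reviewer 3
    [4, 3, 5, 4, 4, 4],  # reviewer 4
]

# Spread (max minus min) of the four reviewers' scores per criterion.
for j, name in enumerate(CRITERIA):
    scores = [row[j] for row in STUDENT_A]
    print(f"{name:>13}: {scores}  spread = {max(scores) - min(scores)}")
```

On a 5-point scale, spreads of 3 points on Style and Content for the same presentation are exactly the inter-rater inconsistency the activity asks groups to notice.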

Page 26

Sample Rubric: Changes

• Better definitions of what presentation qualities deserve what score

• Better training of reviewers

• More consistency of panel members from one day to the next

Page 27

Constructing a Rubric

Note: there are two components involved in this assessment and evaluation methodology:

• The test instrument given to the students

• The scoring rubric used by the evaluators

Page 28

Constructing a Rubric [3,6,9]

1. Develop appropriate performance goals and objectives

2. Select the assessment tasks that reflect and demonstrate the performance goals

3. Differentiate between performance levels and assign relative values to each of the levels [establish “expert” level; establish target students’ developmental level]

Page 29

Constructing a Rubric

4. Develop descriptive criteria for each level of performance which correspond with local norms [holistic or analytical]

5. Train scorers in application of the rubric

6. Pilot both the test and the scoring rubric [for inter-rater & intra-rater consistency, apply cross-checking methods]

7. Modify test items and scoring rubric based upon scoring results & content analysis of responses

Page 30

Develop Appropriate Performance Objectives and Tasks: Example [5]

Objective: Communication
When students work in design teams, they are able to communicate within and outside the team as required to support team productivity and to achieve the intended impact:

• Organize information in ways that make it understandable as a whole and as parts of the whole

• Present information in the format, vocabulary, level, quality, and appearance that are understood and appreciated by the intended audience

• State and document information completely and accurately

• Express information of value for enhancing products of internal and external clients

• Make information available to and readily usable by other users

Page 31

Team Activity II

• Develop a rubric for:
  – Laboratory report
  – Engineering design project

• We’ll discuss your rubrics in about 20 minutes

Page 32

Team Activity II

• Discussion
• What does your rubric contain?
• How might you apply this activity to your courses?

Page 33

Common Problems (Transferability & Repeatability)

• Validity
  – Transferability of assessment question interpretation
  – Transferability of specifications for expected performance

• Changes in curriculum or instruction
• Changes in performance standards
• Changes in students’ prior knowledge

Transferability and repeatability of test questions and rubric criteria:
• Across similar or different courses
• Over time, or across locales
• Across populations
• Across scorers

Page 34

Common Problems (Transferability & Repeatability, cont.)

• Reliability (interacts with validity)
  – Inter-rater
  – Intra-rater (tends to be more validity-sensitive)

• Different scorers
• Changes in scorers’ knowledge

Transferability and repeatability of test questions and rubric criteria:
• Across similar or different courses
• Over time, or across locales
• Across populations
• Across scorers

Page 35

Solutions to Common Problems (Transferability & Repeatability, cont.)

Validity – address the following:

• “Theoretical” validity [2]: review literature & other resources for precedents

• “Criterion” validity [2]: ask a sample of experts, novices (if appropriate), and the target population to respond

• “Face” validity [12]: ask a relevant sample of “local” users to respond and critique

• Content: analyze responses & compare the target population to “local” users, to experts, to novices (if appropriate), and to the rubric criteria

Page 36

Solutions to Common Problems (Transferability & Repeatability, cont.)

Validity, cont.

• Modify test questions, if necessary, where content analysis of responses from the target population and/or local users reveals discrepancies with the rubric

• Modify rubric criteria or scoring standards to align with expert content and performance levels, or with local-user content and performance levels if these differ from expert results

Page 37

Solutions to Common Problems (Transferability & Repeatability, cont.)

Reliability – train and manage scorers for intra-rater consistency:

• By having them take the test, then score their own and another scorer’s test, then justify their scoring to a third party

• By having them review and re-score the 1st test they scored after they have completed scoring their 5th test, and

• By having them review and re-score the first 5 tests scored after having completed scoring 10 tests, continuing this pattern

Page 38

Solutions to Common Problems (Transferability & Repeatability, cont.)

Reliability – train and manage scorers for inter-rater consistency:

• By duplicating a sampling of all tests and having all scorers evaluate and score each test

• By having all scorers review each other’s scoring of this common set of tests, discuss discrepancies, arrive at consensus on interpretation and application of the rubric criteria, and jointly re-score discrepant tests

• By having all scorers periodically and repeatedly review each other’s scored tests, individually re-score them, then discuss and jointly re-score two tests

Page 39

Solutions to Common Problems (Transferability & Repeatability, cont.)

Reliability – controls:

• Halfway through the scoring job, have an outsider sample each scorer’s scored tests, and have each scorer justify his/her scoring of the same items across several tests

• Report both intra-rater and inter-rater inconsistencies to scorers for their correction

• Repeat the process near the end of the scoring job

• Also calculate and examine inter-rater and intra-rater consistency rates by test subject and by test item, as well as inter-item correlations [8]
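The last two bullets lend themselves to a small computation. Below is a minimal sketch in Python, using only the standard library, of an exact-agreement rate between two raters and a Pearson correlation between two rubric items; all score values are hypothetical.

```python
# Minimal sketch of the consistency statistics named above; all
# scores are hypothetical, on the same 1-5 scale as the sample rubric.
from statistics import correlation, mean  # statistics.correlation needs Python 3.10+

# Inter-rater consistency: fraction of tests on which two raters agree exactly.
rater_1 = [5, 4, 3, 4, 5, 2, 4, 3]  # one score per test
rater_2 = [5, 3, 3, 4, 4, 2, 4, 4]
agreement = mean(a == b for a, b in zip(rater_1, rater_2))
print(f"exact inter-rater agreement: {agreement:.0%}")

# Inter-item correlation: do two rubric items rank the same tests similarly?
item_style = [5, 2, 4, 4, 3, 5]    # scores on one item across six tests
item_content = [5, 2, 4, 3, 3, 4]  # scores on a second item, same tests
print(f"inter-item Pearson r: {correlation(item_style, item_content):.2f}")
```

The same agreement computation covers the intra-rater case when the second list holds a rater's re-scores of the same tests at a later time; chance-corrected indices such as Cohen's kappa refine the raw agreement rate.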

Page 40

Resources: Citation References

1. Bergeson, Terry. “Scoring the WASL Open-Ended Items.” Office of Superintendent of Public Instruction web page, 1998. Accessed 1 May 2002. <http://www.k12.wa.us/assessment/assessproginfo/subdocuments/TechReports/g4part4.pdf>

2. Cronbach, Lee J., and Paul E. Meehl. “Construct Validity in Psychological Tests.” Psychological Bulletin (1955). Accessed 11 June 2002. <http://psychclassics.yorku.ca/Cronbach/fl>

3. Ebert-May, Diane. “Classroom Assessment Techniques: Scoring Rubrics.” Field-tested Learning Assessment Guide (FLAG) web site, 1999. Accessed 11 June 2002. <http://www.flaguide.org/cat/rubrics/rubrics1.htm>

4. Graduate School of Education & Information Studies, CRESST, UCLA. Assessment glossary. <http://www/Rubrics/CRESSTUCLAassementglossary.html>

Page 41

Resources: Citation References

5. Davis, D.C., K.L. Gentili, D.E. Calkins, and M.S. Trevisan. “Transferable Integrated Design Engineering Education (TIDEE) Project.” October 1998. Accessed 29 May 2002. <http://www.cea.wsu.edu/TIDEE/monograph.html>

6. Moskal, Barbara M. “Scoring Rubrics: What, When and How?” Practical Assessment, Research & Evaluation (2000). Accessed 1 May 2002. <http://ericae.net/pare/getvn.asp?v=7&n=3>

7. Rowntree, Derek. “Designing an Assessment.” Home page, June 2000. Accessed 11 June 2002. <http://iet.open.ac.uk/pp/D.G.F.Rowntree/derek.html>

8. Rudner, Lawrence M. “Reducing Errors Due to the Use of Judges.” ERIC/TM Digest ED355254 (1992). Accessed 11 June 2002. <http://ericae.net/db/edo/ED355254.htm>

Page 42

Resources: Citation References

9. Seattle School District. “What Is a Rubric?” (2000). Accessed 1 May 2002. <http://ttt.ssd.k12.wa.us/dwighth/rubricclass.htm>

10. Stemler, Steve. “An Overview of Content Analysis.” Practical Assessment, Research & Evaluation (2001). Accessed 11 June 2002. <http://ericae.net/pare/getvn.asp>

11. Summer Technology Institute at Western Washington University. “Rubric for Open-Ended Math Problems.” California CAP Math Report (1989). Accessed 11 June 2002. <http://ttt.ssd.k12.wa.us/dwighth/rubricclass.htm>

12. Trochim, William M.K. “Measurement Validity Types.” William M.K. Trochim Cornell University home page (2002). Accessed 11 June 2002. <http://trochim.human.cornell.edu/kb>

Page 43

Wrap Up

• Please complete the workshop evaluation forms

• Thank you!