
Page 1: Developing Rubrics

DEVELOPING RUBRICS

Dr. Jesus R. Dela Rosa
Instructor, English Language Center

JIC ABET Member
December 05, 2011

Page 2: Developing Rubrics

Outline

Objectives
Introduction
Discussion
- Rubrics
- Proposed rubrics for oral presentation, lab work, and writing report
Academic Exchanges

Page 3: Developing Rubrics

Objectives

Discuss rubrics briefly
Explain a proposed rubric for oral presentation
Gather feedback on its content and form from the participants

Page 4: Developing Rubrics

Introduction

Delivery of AS programs

Different courses = different assessment tools

Bloom's learning domains: cognitive (head), affective (heart), psychomotor (hands)

Page 5: Developing Rubrics

Introduction

Different tools to assess outcomes
- Direct = what students/learners can do
- Indirect = what other people think and feel about what students/learners can do/have (?)

Criterion 4: Continuous Improvement
- In the Upper State U model of the Self-Study Report (SSR), 41 pages out of 100 (41%) cover how we measure student outcomes/performance through assessment tools clearly specified in our Course Assessment Charts.

Some reminders to SSR writers
- Use various assessment tools (direct/indirect)
- Use of rubrics is encouraged in some courses

Page 6: Developing Rubrics

Discussion: Why rubrics?

A hype? (hot issue)
From test to task (performance-based, not knowledge-based)
"Tell me, I listen; teach me, I learn; show me how, I learn and live."
ABET is outcome-based and does not believe in "one size fits all" assessment (triangulation).

Page 7: Developing Rubrics

Why rubrics?

Written tests: a thing of the past?
Alternative assessment tools

The bottom line: rubrics ground learning in the real world where students live.
Student-centered; standards-driven (criterion-referenced)

Page 8: Developing Rubrics

What are rubrics?

Rubrics are performance-based assessments that evaluate student performance on any given task or set of tasks that ultimately leads to a final product or learning outcome (overall assessment = Student Outcomes).

Rubrics use specific criteria as a basis for evaluating or assessing student performances. Where a plain marking sheet tends to be subjective, a rubric is more objective because it provides narrative descriptions (descriptors) of possible performance related to a given task.

http://www.teach-nology.com/currenttrends/alternative_assessment

Page 9: Developing Rubrics

Three components of a rubric
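The graphic for this slide is not included in the transcript. Based on the terms used elsewhere in the deck (criteria, rating scales, and narrative descriptors), here is a minimal sketch of how those three components might be represented; the class and field names are illustrative assumptions, not part of the presentation.

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    """One row of a rubric: what is being judged and how each level looks."""
    name: str                    # e.g. "Makes eye contact"
    descriptors: dict[str, str]  # rating level -> narrative descriptor

@dataclass
class Rubric:
    """A rubric: criteria, a rating scale, and descriptors for each level."""
    scale: list[str]             # ordered rating levels
    criteria: list[Criterion]

# Illustrative fragment based on the oral-presentation sample later in the deck.
oral_rubric = Rubric(
    scale=["Inadequate", "Developing", "Proficiency", "Mastery"],
    criteria=[
        Criterion(
            name="Makes eye contact",
            descriptors={
                "Inadequate": "never or rarely makes eye contact",
                "Mastery": "usually makes eye contact",
            },
        ),
    ],
)
```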

Page 10: Developing Rubrics

Types of rubrics (according to rating scales)

Holistic
- More global
- Views the final product as a set of interrelated tasks contributing to the whole
- Anchor points are used to assign value to descriptions of products or performances that contribute to the whole

Page 11: Developing Rubrics

Holistic rubric

- Holistic scoring proves to be efficient and quick
- One score provides an overall impression of ability on any given product or work
- Often written generically and can be used with many tasks
- Saves time by minimizing the number of decisions raters must make
- Raters, if trained properly, tend to apply them consistently, resulting in more reliable measurement

Page 12: Developing Rubrics

Holistic rubric: Disadvantages

- Scoring does not provide detailed information about student performance in specific areas of content or skill
- No specific feedback about students' strengths/weaknesses
- Does little to separate the tasks
- Performances may meet criteria in two or more categories, making it difficult to select the one best description
- Criteria cannot be differentially weighted

Page 13: Developing Rubrics

Holistic rubric: A sample
Oral Presentation Scoring Rubric

Mastery = usually makes eye contact; volume is always appropriate; enthusiasm present throughout presentation; summary is completely accurate
Proficiency = usually makes eye contact; volume is always appropriate; enthusiasm is present in most of presentation; only one or two errors in summary
Developing = sometimes makes eye contact; volume is sometimes appropriate; occasional enthusiasm in presentation; some errors in summary
Inadequate = never or rarely makes eye contact; volume is inappropriate; rarely shows enthusiasm in presentation; many errors in summary
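To make the "one overall score" idea concrete, here is a minimal sketch, assuming the four levels above map to the numeric anchor points 1 through 4; the numbers are my assumption for illustration, since the slide names the levels but does not assign values.

```python
# Anchor points assumed for illustration only.
HOLISTIC_LEVELS = {
    "Inadequate": 1,
    "Developing": 2,
    "Proficiency": 3,
    "Mastery": 4,
}

def holistic_score(level_chosen_by_rater: str) -> int:
    """The rater picks the one level that best describes the whole performance;
    that single choice is the entire score."""
    return HOLISTIC_LEVELS[level_chosen_by_rater]

print(holistic_score("Proficiency"))  # -> 3
```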

Page 14: Developing Rubrics

Analytic rubric

- Scoring breaks down the objective or final product into component parts.
- Each part is scored independently.
- The total score is the sum of the ratings for all of the parts being evaluated.
- Useful in giving feedback on areas of student performance (strengths/weaknesses).
- Dimensions can be weighted to reflect relative importance.
- Progress over time can be demonstrated when used repeatedly.

Page 15: Developing Rubrics

Analytic rubric: Disadvantages

- More time to prepare (ask any JIC ABET member!)
- More possibilities for raters to disagree
- More difficult to achieve intra- and inter-rater reliability on all of the criteria/dimensions

Page 16: Developing Rubrics

Analytic rubric: A sample

Criteria                 Never   Sometimes   Always   Score
Makes eye contact          0         3          4      ___
Volume is appropriate      0         2          4      ___
Enthusiasm is evident      0         2          4      ___
Summary is accurate        0         4          8      ___
Total                                                   ___
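As a quick check on the arithmetic of the analytic approach described on the previous slide, here is a minimal sketch that encodes the point table above and sums the parts, assuming the rater records one rating per criterion; the maximum total is 4 + 4 + 4 + 8 = 20, and the student in the example is hypothetical.

```python
# Point table copied from the sample analytic rubric above.
POINTS = {
    "Makes eye contact":     {"Never": 0, "Sometimes": 3, "Always": 4},
    "Volume is appropriate": {"Never": 0, "Sometimes": 2, "Always": 4},
    "Enthusiasm is evident": {"Never": 0, "Sometimes": 2, "Always": 4},
    "Summary is accurate":   {"Never": 0, "Sometimes": 4, "Always": 8},
}

def analytic_total(ratings: dict[str, str]) -> int:
    """Each criterion is scored independently; the total is the sum of the parts."""
    return sum(POINTS[criterion][rating] for criterion, rating in ratings.items())

# Hypothetical student: 4 + 2 + 4 + 4 = 14 out of a possible 20.
print(analytic_total({
    "Makes eye contact": "Always",
    "Volume is appropriate": "Sometimes",
    "Enthusiasm is evident": "Always",
    "Summary is accurate": "Sometimes",
}))  # -> 14
```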

Page 17: Developing Rubrics

Holistic or analytic?

- Left to the better judgment of the experts
- Six criteria at most (holistic); more criteria (analytic)

Whether holistic or analytic, important factors in developing effective rubrics:
- Use clear criteria to rate a student's work.
- The performance being evaluated must be directly observable.
- Students should be informed of the criteria to which they are being held accountable.

Page 18: Developing Rubrics

Proposed rubric for oral presentation (Rbrc jessedit.docx)

- Analytic as to type (7 criteria; more? fewer?)
- Criteria mapped to JIC ABET performance indicators (PIs) to establish consistency
- Four scales (1-Beginning; 2-Developing; 3-Competent; 4-Outstanding)
- Provision for scoring each criterion
- Assessment report for oral presentation
- Results form part of the applicable PIs and SOs (see the sketch below)
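The slide says rubric results feed into performance indicators (PIs) and student outcomes (SOs) but does not show the mechanics. Below is a minimal sketch of one plausible aggregation, assuming each rubric criterion is tagged with a PI code (in the style of the lab-work rubric later in this deck) and that a PI result is simply the average of the 1-4 criterion scores mapped to it; both the mapping and the averaging rule are my assumptions, not the committee's documented procedure.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical mapping of oral-presentation criteria to PI codes; not the official mapping.
CRITERION_TO_PI = {
    "Organization": "a.1",
    "Delivery": "g.1",
    "Content accuracy": "i.2",
    "Use of visuals": "g.2",
}

def pi_results(criterion_scores: dict[str, int]) -> dict[str, float]:
    """Average the 1-4 criterion scores under each performance indicator."""
    by_pi = defaultdict(list)
    for criterion, score in criterion_scores.items():
        by_pi[CRITERION_TO_PI[criterion]].append(score)
    return {pi: mean(scores) for pi, scores in by_pi.items()}

# One student's scores on the four-point scale (1-Beginning ... 4-Outstanding).
print(pi_results({"Organization": 3, "Delivery": 4, "Content accuracy": 3, "Use of visuals": 2}))
```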

Page 19: Developing Rubrics

Validation and Reliability Concerns

Rubrics, just like other assessment tools, have validation and reliability concerns; this may be left to the AS programs or to JIC-ABET to decide (a simple inter-rater agreement check is sketched after this list).

Content validity:
- Let colleague(s) review your rubric (documented/recorded)
- Inform students how it works (beginning of semester?)
- Check if it is manageable (pilot testing; to be documented)
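The deck raises intra- and inter-rater reliability without prescribing a method. One simple, rough way to get a first read on inter-rater consistency is the exact-agreement rate between two raters scoring the same items with the same rubric; the sketch below computes that rate, with made-up scores for illustration, and is only a starting point rather than a full reliability analysis.

```python
def exact_agreement_rate(rater_a: list[int], rater_b: list[int]) -> float:
    """Fraction of items on which two raters gave the identical 1-4 score."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must score the same items")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Made-up scores from two raters on ten items, on the 1-4 scale.
rater_a = [3, 4, 2, 3, 3, 4, 1, 2, 3, 4]
rater_b = [3, 4, 3, 3, 2, 4, 1, 2, 3, 3]
print(exact_agreement_rate(rater_a, rater_b))  # -> 0.7
```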

Page 20: Developing Rubrics

Summary

- AS programs have different courses
- Different courses have different learning domains
- Rubrics as direct assessment tools: why, what, types, differences, advantages/disadvantages, and which rubric to use
- Proposed rubric for oral presentation prepared by JIC ABET: parts; assessment report
- Validity and reliability concerns going forward: content validity (colleagues and students); other validation/reliability concerns

Page 21: Developing Rubrics

Conclusions

- Rubrics are direct assessment tools that are necessary to measure learning outcomes.
- Rubrics require some time to develop and validate.
- There are validity and reliability concerns in the development and administration of rubrics.
- The JIC-ABET Committee has prepared rubrics for guidance of, and fine-tuning by, the AS programs.

Page 22: Developing Rubrics

Proposed Rubric for Lab Works

Each criterion lists its performance indicators (PIs) and measuring methods, followed by descriptors for the four scale levels (1-Beginning, 2-Developing, 3-Competent, 4-Outstanding); a score is recorded per criterion.

1. Safety Rules (PIs: c.1; Measuring methods: Observations, Exams)
   1-Beginning: Safety procedures were ignored. Always needs assistance.
   2-Developing: Experiment is carried out with some attention to relevant safety procedures.
   3-Competent: Experiment is generally carried out with attention to relevant safety procedures.
   4-Outstanding: Experiment is carried out with full attention to relevant safety procedures.

2. Identification of Equipment (PIs: c.2; Measuring methods: Observations, Exams)
   1-Beginning: Fails to identify suitable equipment most of the time. Always needs assistance.
   2-Developing: Identifies suitable equipment sometimes. Some assistance is needed.
   3-Competent: Always identifies suitable equipment.
   4-Outstanding: Always identifies equipment and explores other equipment.

3. Use of Equipment (PIs: c.2; Measuring methods: Observations, Exams)
   1-Beginning: Fails to use equipment properly. Always needs assistance.
   2-Developing: Uses equipment properly most of the time. Some assistance is needed.
   3-Competent: Always uses equipment properly.
   4-Outstanding: Always uses equipment properly and explores it.

4. Experimental Procedures (PIs: c.3, i.2, i.3; Measuring methods: Observations, Exams)
   1-Beginning: Fails to read lab manuals and follow experimental procedures. Always needs assistance.
   2-Developing: Reads lab manuals and follows experimental procedures most of the time. Some assistance is needed.
   3-Competent: Always reads lab manuals and follows experimental procedures.
   4-Outstanding: Always reads lab manuals in advance, follows experimental procedures, and suggests improvements.

Page 23: Developing Rubrics

Proposed Rubric for Lab Works (continued)

5. Discipline (PIs: d.3, h.3; Measuring methods: Observations)
   1-Beginning: Does not abide by the rules most of the time.
   2-Developing: Sometimes abides by the rules.
   3-Competent: Abides by the rules most of the time.
   4-Outstanding: Always abides by the rules.

6. Punctuality (PIs: i.2; Measuring methods: Observations, SIS attendance records)
   1-Beginning: Comes late or does not attend the lab session most of the time. Always misses the deadlines.
   2-Developing: Comes late sometimes. Misses the deadline sometimes.
   3-Competent: Always comes on time. Always meets the deadlines.
   4-Outstanding: Always comes on time. Always ahead of the deadline.

7. Participation (PIs: d.1, d.2; Measuring methods: Observations)
   1-Beginning: Does not perform any duties of assigned team role. Always relies on others to do the work.
   2-Developing: Performs very few duties. Rarely does the assigned work; often needs reminding.
   3-Competent: Performs nearly all duties. Usually does the assigned work; rarely needs reminding.
   4-Outstanding: Performs all duties of assigned team role. Always does the assigned work without having to be reminded.

Page 24: Developing Rubrics

Proposed Rubric for Project Reports

1. Introduction (PIs: e.1, e.2; Measuring methods: Review reports)
   1-Beginning: Introduction is missing.
   2-Developing: Introduction is confusing.
   3-Competent: Introduction is adequate.
   4-Outstanding: Introduction provides background and a forecast of the document. Problem or situation is defined clearly with orienting material for the audience.

2. Organization (PIs: i.1, i.2, a.1, a.2; Measuring methods: Review reports)
   1-Beginning: Points are not ordered.
   2-Developing: Report is disorganized and layout is somewhat weak.
   3-Competent: Most points are ordered well. No major problems with layout.
   4-Outstanding: Points are clearly presented in a logical order. Easily followed. Page layout is effective.

3. Language (PIs: f.3; Measuring methods: Review reports)
   1-Beginning: Choice of words is poor. Grammar, spelling, and punctuation are poor.
   2-Developing: Choice of words is sometimes not appropriate. Grammar, spelling, and punctuation are not satisfactory.
   3-Competent: Choice of words is appropriate most of the time. Minor problems with grammar, spelling, and punctuation.
   4-Outstanding: Wording is concise, clear, and easy to follow. Consistently proper grammar, spelling, and punctuation.

4. Content (PIs: i.1, i.2, i.3, g.1, g.2, b.1, b.2, b.3, a.1, a.2; Measuring methods: Review reports)
   1-Beginning: Most elements are missing, erroneous, or unrelated.
   2-Developing: Some elements are missing or inadequately discussed.
   3-Competent: Most of the elements are adequately discussed.
   4-Outstanding: All of the elements are adequately discussed and supported with additional relevant information.

5. Conclusions (PIs: e.1, e.2; Measuring methods: Review reports)
   1-Beginning: Conclusions are missing.
   2-Developing: Conclusions are inadequate.
   3-Competent: Most but not all points are contained in the conclusion.
   4-Outstanding: Clear, insightful conclusions.

6. Visuals (PIs: f.2, g.2, a.1, a.2; Measuring methods: Review reports)
   1-Beginning: Graphs/tables/figures/pictures are missing.
   2-Developing: Graphs/tables/figures/pictures are inadequate.
   3-Competent: Graphs/tables/figures/pictures are adequate.
   4-Outstanding: Graphs/tables/figures/pictures are adequate, well labeled, and well presented.

7. References (PIs: h.1, f.4, g.1, g.2, a.1, a.2; Measuring methods: Review reports)
   1-Beginning: References are not listed.
   2-Developing: Some references are used but not referred to or listed properly.
   3-Competent: References are used, referred to, and listed properly.
   4-Outstanding: References are extensively used and properly listed.

Page 25: Developing Rubrics

THANK YOU!