
UbD Think like an Assessor Stage 2

Dr. Robert Mayes, University of Wyoming

Science and Mathematics Teaching Center

[email protected]

Assessor – 3 basic questions
1. What kind of evidence do we need to support the attainment of goals?
   Tasks that reveal understanding, such as comparing and contrasting or summarizing key concepts
2. What specific characteristics in student responses, products, or performances should we examine to determine the extent to which the desired results were achieved?
   Criteria, rubrics, and exemplars are needed
3. Does the proposed evidence enable us to infer a student’s knowledge, skill, or understanding?
   Validity and reliability concerns

Stage 2: Evidence – Think like an assessor, not an activity designer

Assessor:
What should be sufficient and revealing evidence of understanding?
What performance tasks must anchor the unit and focus the instructional work?
Against what criteria will I distinguish work?

Activity Designer:
What would be interesting and engaging activities on this topic?
What resources and materials are available on this topic?
How will I give students a grade and justify it to parents?

Stage 2: Evidence – Think like an assessor, not an activity designer

Assessor:
How will I be able to distinguish between those who really understand and those who don’t (though they seem to)?
What misunderstandings are likely? How will I check for those?

Activity Designer:
What will students be doing in and out of class? What assignments will be given?
Did the activities work? Why or why not?

Continuum of Assessment Methods
Assessment methods vary in several characteristics:
Scope: simple to complex
Time frame: short-term to long-term
Setting: decontextualized to authentic
Structure: highly structured to ill-structured
Move from a snapshot to a scrapbook of evidence
Self-assessment of sources of evidence (HO)

Continuum: Informal checks → Observation/Dialogue → Quiz/Test → Academic Prompt → Performance Task

Collecting a Range of Evidence
Activity (HO): determine a range of assessment evidence you may use related to the enduring understanding, the topics important to know and do, and the topics worth being familiar with.
Which assessment methods best fit the 3 categories?
Worth being familiar with
Important to know and do
Enduring Understanding

Academic Prompt Assessments
Open-ended question or problem that requires the student to prepare a specific academic response
Think critically and prepare a response
Require a constructed response under exam conditions
Divergent – no single best answer
Subjective, judgment-based scoring using criteria or a rubric
May or may not be secure
Often ill-structured – require development of a strategy
Involve analysis, synthesis, and evaluation

Performance Task Assessments
Complex challenges that mirror the issues and problems faced by adults
Real or simulated settings; authentic
Require the student to address an audience under non-exam conditions
Divergent – no single best answer
Subjective, judgment-based scoring using criteria or a rubric
Greater opportunity to personalize the task
Not secure – students are given the criteria in advance

Performance Task – 6 Facets
Activity: Use the 6 Facets of Understanding to generate a performance task related to your enduring understanding
Questioning for Understanding (HO)
Performance Verbs (HO)
Performance Task Creation (HO)
Performance Task Brainstorming (HO)

Performance Task – GRASPS
Creating a performance task with context and roles:
Goal
Role
Audience
Situation
Product, Performance, and Purpose
Standards and Criteria for Success

Performance Task – GRASPS
Activity: Create a performance task using GRASPS
GRASPS Performance Task Scenario (HO)
Student Roles and Audiences (HO)
Possible Products and Performances (HO)

Assessor Question 2: Determine achievement

What specific characteristics in student responses, products, or performances should we examine to determine the extent to which the desired results were achieved?
Criteria, rubrics, and exemplars are needed

Designing Scoring Rubrics
Rubric: a criterion-based scoring guide for evaluating a product or performance along a continuum.
Consists of:
Evaluative Criteria – qualities that must be met for work to measure up to a standard
Fixed Measurement Scale – often 4 or 5 levels
Indicators – descriptive terms for differentiating among degrees of understanding, proficiency, or quality

Rubric Types

Holistic – provides an overall impression of the elements of quality and performance levels in a student’s work

Analytic – divides a student’s performance into two or more distinct dimensions (criteria) and judges each separately

Recommend use of analytic rubrics with a minimum of:
Criteria for understanding (HO)
Criteria for performance
Using Facet-Related Criteria (Figure 8.3, p. 178)

Rubric Types
Generic – general criteria in a given performance area
Can be developed before the specific task is defined
Example: General Problem Solving Rubric
Example: Generic Rubric for Understanding (HO)
Task-Specific – designed for use with a particular assessment activity
Task dependent, so cannot be used to evaluate related performance tasks

Rubric Types
Longitudinal Rubric – progression from naïve to sophisticated understanding
Increased understanding of complex functions and interrelatedness of concepts
Greater awareness of how the discipline operates
Greater personal control over and flexibility with knowledge

Effective Rubrics
Relate specific task requirements to more general performance goals
Discriminate among different degrees of understanding or proficiency according to significant features
Do not combine independent criteria in one column of the rubric
Use student anchors to (Anchor design, p. 181):
Set standards based on student artifacts
Provide consistency in judgment of student work
Equip students to do more accurate and productive self-assessment

Effective Rubrics
All potential performances should fit somewhere in the rubric
Rely on descriptive language (what quality looks like), not comparative or value language, to make distinctions
Avoid making the lowest score point sound bad; it should describe novice or ineffective performance
Highlight judging the performance’s impact as opposed to over-rewarding just process or effort

Assessor Question 3: Valid and Reliable

Does the proposed evidence enable us to infer a student’s knowledge, skill, or understanding?

Validity: did we measure what we meant to measure?
Does the evidence indicate understanding of the expressed outcomes?
Are the performances appropriate to the understanding sought?
Do not pay so much attention to correctness that degree of understanding is lost.

Validity
Two key validity questions for assessment tasks:
Could a student do well on this performance task, but really not demonstrate the understanding you are after?
Could a student perform poorly on this task, but still have significant understanding of the ideas and show them in other ways?

Activity: determining validity (Figure 8.5)

Validity
Two key validity questions for a rubric:
Could the proposed criteria be met, but the performer still not demonstrate deep understanding?
Could the proposed criteria not be met, but the performer nonetheless still show understanding?

Reliability
Reliable assessments reveal a credible pattern, a clear trend
Need for multiple pieces of evidence (a scrapbook) rather than just a snapshot of student performance

Have parallel assessments on the same concept using multiple assessment formats.

Dr. Robert Mayes, University of Wyoming

Science and Mathematics Teaching Center

[email protected]