Leominster Public Schools: District Determined Measures
Dr. Deborah A. Brady
Ribas Associates, Inc.
First Hour
Overview of District Determined Measures
The Timeline
Quality Assessments
Tools from DESE
Resources: rubrics and Core Curriculum Objectives (CCOs)
Second Hour
Job-alike groups and departments work together
Beth Pratt and Deb Brady will go from group to group
Product: facilitators hand in meeting minutes and any unanswered questions
By the end of the workshop, participants will:
1. Understand the quality expectations and assessment criteria for DDM assessments
2. Begin to draft a schedule for this year for your team or department
3. Begin the process of developing DDMs (if there is time) by using the Quality Tracking Tool on at least one possible DDM and using the Educator Alignment Tool to consider local assessment needs
4. Email or send a hard copy of your group’s meeting minutes, including progress, remaining questions, and what you will need to be successful
DESE Tools:
• Quality Tracking Tool (Excel file)
• Educator Assessment Tool (Excel file)
• Core Curriculum Objectives (CCOs)
• Example assessments (mainly commercial; some local)
• Model Curriculum Units with rubrics (Curriculum Embedded Performance Assessments)
• Rubrics: Cognitive Rigor Matrices for reading, writing, math, and science
• Research: NY and NYC, Achieve.org, PARCC, and many others
SY 2014
• September: pilot plan for at least 5 DDMs
• December: Implementation Extension Request Form
• Pilot at least 5 DDMs (the scores do not count)
• June: final plan for assessing all teachers with at least 2 DDMs
SY 2015
• Collect the first year’s data on DDMs for all educators (except waivered areas)
SY 2016
• Collect the second year of data for all educators
• Issue Student Impact Ratings for all except waived grades/courses/subjects
Pilot Year SY2014
SEPTEMBER: DESE received B-R’s plan for:
• Early grade literacy (K-3)
• Early grade math (K-3)
• Middle grade math (5-8)
• High school “writing to text” (PARCC multiple texts)
• PLUS one more non-tested course, for example: fine arts, music, PE/health, technology, media/library, or other non-MCAS-growth courses, including grade 10 math and ELA, and science
DECEMBER: Implementation Extension Request Form for specific courses in the JUNE PLAN
BY JUNE: the plan for all other DDMs must be ready for implementation in year 2 (SY2015): at least two measures per educator, at least one of them “local” (non-MCAS).
The scores will not count for those who pilot DDMs in 2014.
SY 2015
All professional personnel will be assessed with 2 DDMs, at least one local:
• Guidance
• Principals, assistant principals
• Speech therapists
• School psychologists
• Nurses
• All teachers not yet assessed, general and special education
YEAR 2
The scores will count as the first half of the “impact score,” with the waivered courses as the only exception.
SY2016
“Impact Ratings” will be given to all licensed educational personnel and sent to DESE
• Two measures for each educator
• At least one local measure for everyone
• Some educators will have two local measures
• Locally determined measures can include Galileo, DRA, MCAS-Alt
• The MCAS growth score can be one measure: the average of two years’ scores, and a two-year trend
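An illustrative example (numbers invented): an educator whose students’ median growth score is 42 in the first year and 48 in the second would report a two-year average of 45 and a rising trend.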
Year 3
“Impact Ratings” are based upon two years’ growth scores for two different assessments, one local.
DESE is still rolling out the evaluation process and District Determined Measures.
From the Commissioner:
“Finally, let common sense prevail when considering the scope of your pilots. I recommend that to the extent practicable, districts pilot each potential DDM in at least one class in each school in the district where the appropriate grade/subject or course is taught. There is likely to be considerable educator interest in piloting potential DDMs in a no-stakes environment before year 1 data collection commences, so bear that in mind when determining scope.”
Everyone earns two ratings:
• Summative Performance Rating: Exemplary, Proficient, Needs Improvement, Unsatisfactory
• Impact Rating on Student Performance: High, Moderate, Low
*Most districts will not begin issuing Impact Ratings before the 2014-2015 school year.
Impact Rating on Student Performance
A sample student’s results (ID 4503699), scaled score / student growth percentile:
• 244 / 25 SGP
• 230 / 35 SGP
• 225 / 92 SGP
Note that the scaled score can decline while the growth percentile rises: achievement and growth are different measures.
Types:
• On demand (timed and standardized)
• Mid-year and end-of-year exams
• Projects
• Portfolios
• Capstone courses
• Unit tests
Formats:
• Multiple choice
• Constructed response
• Performance (oral, written, acted out)
MCAS growth scores can serve as one measure (ELA and math, grades 4-8; not grade 3, not high school).
MCAS growth scores must be used when available, but all educators will have 2 different measures.
The MA Model Curriculum Unit rubrics can be used (available online).
• Galileo
• BERS-2 (Behavioral Rating Scales)
• DRA (reading)
• Fountas and Pinnell Benchmark
• DIBELS (fluency)
• MCAS-Alt
• MAP
Why (beyond evaluation impact) determining these measures is important to every educator
Assessment Quality
• Validity
• Reliability
• Rigor
• Scoring guides
• Inter-rater reliability
You will receive tools for these areas today
Calibration of scorers (see the sketch after this list)
Developing assessment protocols
Are all assessments of equally appropriate rigor K-12?
Integrity of scores
“Assessment creep”
Training assessors
Time
Tabulating growth scores from student scores
Organizing and storing scores
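Calibration of scorers comes down to inter-rater agreement. Below is a hedged sketch, with invented rubric scores rather than district data, of two common agreement statistics: raw percent agreement and Cohen’s kappa, which corrects for agreement expected by chance.

```python
from collections import Counter

# Two raters score the same ten pieces of student work on a 1-4 rubric.
# These scores are illustrative, not real calibration data.
rater_a = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
rater_b = [3, 2, 3, 3, 1, 2, 4, 4, 2, 3]

n = len(rater_a)
agree = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # raw agreement

# Agreement expected by chance, from each rater's marginal distribution.
pa, pb = Counter(rater_a), Counter(rater_b)
expected = sum(pa[k] * pb[k] for k in pa) / n**2

kappa = (agree - expected) / (1 - expected)  # chance-corrected agreement
print(f"agreement={agree:.2f}, kappa={kappa:.2f}")
```

On these invented scores the raters agree 80% of the time, with kappa near 0.71; a calibration session would aim to push both numbers up before live scoring.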
Capitalize on what you are already doing
• Writing to text: 9-12? K-12?
• Research: K-12? Including specialists?
• Art, music, PE, health: present practices
• Math: one focus K-12?
“Buy, borrow, or build your own” (DESE)
Tools to assess Alignment
Tools to assess Rigor
Tools to assess the quality of student work
Alignment
Alignment to the Common Core, PARCC, and the district curriculum.
Shifts for the Common Core have been made:
• Complex texts
• Multiple texts
• Argument, informational, and narrative writing
• Math practices
• Depth over breadth
Rigor
Reliability
• Internal consistency
• Test-retest
• Alternate forms / split half
• Inter-rater reliability
Reliability is rated from 0 to 1 (none to 100%).
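A minimal sketch of one of these checks, split-half reliability with the Spearman-Brown correction; the student-by-item data are simulated for illustration:

```python
import numpy as np

def split_half_reliability(item_scores: np.ndarray) -> float:
    # Correlate odd-item and even-item half-test totals, then apply the
    # Spearman-Brown correction to estimate full-test reliability.
    odd = item_scores[:, 0::2].sum(axis=1)
    even = item_scores[:, 1::2].sum(axis=1)
    r = np.corrcoef(odd, even)[0, 1]
    return 2 * r / (1 + r)

# Simulated data: 30 students x 20 right/wrong items, driven by a latent
# ability so that items correlate (as they would on a coherent test).
rng = np.random.default_rng(0)
ability = rng.normal(size=(30, 1))
data = (ability + rng.normal(size=(30, 20)) > 0).astype(int)
print(round(split_half_reliability(data), 2))  # a value between 0 and 1
```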
Validity
Are you measuring what you intend to assess?
• Content validity (= curriculum)
• Consequential validity (good or bad impact): does this assessment narrow the curriculum?
• Relationships (to SAT, to grades): correlation measurement, rated from -1 to +1
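A small sketch of that correlation evidence: Pearson’s r between DDM scores and an outside measure such as course grades. The numbers are invented, and a real validity study would need far more students:

```python
import numpy as np

ddm_scores = np.array([55, 62, 70, 71, 80, 88])  # hypothetical DDM results
grades = np.array([70, 75, 78, 74, 85, 92])      # the same students' course grades

r = np.corrcoef(ddm_scores, grades)[0, 1]        # Pearson correlation
print(round(r, 2))  # between -1 and +1; values near +1 suggest convergent evidence
```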
Last | First | Grade | Course | DDM 1 | DDM 2 | DDM 3
Smith | Abby | 1 | ELA | DRA | F&P Benchmark |
Smith | Abby | 1 | Math | Unit Test | Galileo |
Jones | Bob | 4 | ELA | MCAS Growth | Unit Benchmark | Galileo
Jones | Bob | 4 | Math | MCAS Growth | Unit Benchmark | Galileo
Adams | John | 9 | ELA | WTT Unit | |
Adams | John | 10 | ELA | WTT Unit | |
Adams | John | 11 | Humanities | WTT Unit | |
Cambridge | Anne | Alg 1 | Math | WTT Unit | |
Cambridge | Anne | Geom | Math | WTT Unit | |
Washington | Greg | Mixed | Art 1 | WTT Unit | Portfolio |
“Borrow, Buy, or Build”
PRIORITY: use the Quality Tracking Tool to assess each potential DDM to pilot this year for your school (one final district copy on a computer).
CCOs will help if this is a district-developed tool.
If there is additional time, use the Educator Assessment Tool to begin looking at developing 2 assessments for all educators for next year.
Is the measure aligned to content? Does it assess what is most important for students to learn and be able to do?
Does it assess what the educators intend to teach?
(VALIDITY)
Objectives:
1 Students analyze how specific details and events develop or advance a theme, characterization, or plot of a grade 9 literary text, and they support their analysis with strong and thorough textual evidence that includes inferences drawn from the text.
2 Students analyze how the structure, syntax, diction, and connotative or figurative meanings of words and phrases inform the central idea or theme of a grade 9 literary text, and they support their analysis with strong and thorough textual evidence that includes inferences drawn from the text.
3 Students analyze how specific details, concepts, or events interact to develop or advance a central idea of a grade 9 informational text, and they support their analysis with strong and thorough textual evidence that includes inferences drawn from the text.
4 Students analyze how cumulative word choice, rhetoric, syntax, diction, and the technical, connotative, or figurative meanings of words and phrases support the central idea or author’s purpose of a grade 9 informational text.
5 Students produce clear and coherent writing to craft an argument, in which the development, organization, and style are appropriate to their task, purpose, and audience, using such techniques as the following:
introducing precise claim(s), distinguishing the claim(s) from alternate or opposing claims, and creating an organization that establishes clear relationships among claim(s), counterclaims, reasons, and evidence;
developing claim(s) and counterclaims fairly, supplying evidence for each while pointing out the strengths and limitations of both in a manner that anticipates the audience’s knowledge level and concerns;
using words, phrases, and clauses to link the major sections of the text, create cohesion, and clarify the relationships between claim(s) and reasons, between reasons and evidence, and between claim(s) and counterclaims;
establishing and maintaining a formal style and objective tone while attending to the norms and conventions of the discipline in which they are writing;
providing a concluding statement or section that follows from and supports the argument presented; and
demonstrating command of the conventions of Standard English.
ELA-Literacy, grade 9; English 9-12: https://wested.app.box.com/s/pt3e203fcjfg9z8r02si
Assessment: Hudson High School Portfolio Assessment for English Language Arts and Social Studies (publisher website/sample at the link above)
Designed to be a measure of student growth over time in high school ELA and social science courses. The student selects work samples to include and uploads them to an electronic site. Includes guiding questions for students and scoring criteria. The portfolio scoring rubric can be adapted for use in all high school ELA and social science courses. Generalized grading criteria for a portfolio. Could be aligned to a number of CCOs, depending on the specification of assignments.
Traditional Assessment | Non-Traditional Assessment | Administration/Scoring
Traditional end-of-grade assessment | Pre/post or repeated measures | Paper/pencil
Traditional end-of-course assessment | Performance task rubric | Computer supported
Selected response | Portfolio or work sample rubric | Computer adaptive
Short constructed response | Project-based rubric | Machine scored
Writing prompt/essay | Observation rubric or checklist | Scored locally
Other: | | Scored off-site
Buy, Borrow, Build
Each sample DDM is evaluated
Hudson’s evaluation: see the description of the Hudson portfolio assessment above.
Many are standardized assessments
Is the measure informative? Do the results of the measure inform educators about curriculum, instruction, and practice?
Does it provide valuable information to educators about their students?
Does it provide valuable information to schools and districts about their educators?
31
• Pre-test/post-test
• Repeated measures (running records)
• Holistic evaluation (portfolio)
• Post-test only (only when the assessment lacks a norm, like AP, to use as a baseline)
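For the pre/post design, one minimal way to turn raw scores into a growth measure (an illustration, not DESE’s prescribed method) is percent of possible growth, (post - pre) / (max - pre), which makes students who started at different points comparable:

```python
def percent_of_possible_growth(pre: float, post: float, max_score: float = 100) -> float:
    # Share of the available headroom that the student actually gained.
    if pre >= max_score:  # already at the ceiling; no growth is measurable
        return 0.0
    return (post - pre) / (max_score - pre)

# Invented pre/post scores out of 100 for three students.
scores = {"student1": (40, 70), "student2": (60, 75), "student3": (85, 91)}
for sid, (pre, post) in scores.items():
    print(sid, f"{percent_of_possible_growth(pre, post):.0%}")
# student1 50%, student2 38%, student3 40%: comparable growth figures
# despite very different starting points.
```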
For Assessing Rigor and Alignment
1. Daggett’s Rigor/Relevance Scale
2. DESE’s Model Curriculum (Understanding by Design)
3. Curriculum Embedded Performance Assessments from the MA Model Curriculum
4. PARCC’s task descriptions
5. PARCC’s rubrics for writing
Rubric (score levels 1 to 6):

Topic development: the writing and artwork identify the habitat and provide details.
1. Little topic/idea development, organization, and/or details; little or no awareness of audience and/or task
2. Limited or weak topic/idea development, organization, and/or details; limited awareness of audience and/or task
3. Rudimentary topic/idea development and/or organization; basic supporting details; simplistic language
4. Moderate topic/idea development and organization; adequate, relevant details; some variety in language
5. Full topic/idea development; logical organization; strong details; appropriate use of language
6. Rich topic/idea development; careful and/or subtle organization; effective/rich use of language

Evidence and content accuracy: the writing includes academic vocabulary and characteristics of the animal or habitat, with details.
1. Little or no evidence is included and/or content is inaccurate
2. Use of evidence and content is limited or weak
3. Use of evidence and content is included but is basic and simplistic
4. Use of evidence and accurate content is relevant and adequate
5. Use of evidence and accurate content is logical and appropriate
6. A sophisticated selection and inclusion of evidence and accurate content contribute to an outstanding submission

Artwork: identifies special characteristics of the animal or habitat, to an appropriate level of detail.
1. Artwork does not contribute to the content of the exhibit
2. Artwork demonstrates a limited connection to the content (describing a habitat)
3. Artwork is basically connected to the content and contributes to the overall understanding
4. Artwork is connected to the content of the exhibit and contributes to its quality
5. Artwork contributes to the overall content of the exhibit and provides details
6. Artwork adds greatly to the content of the exhibit, providing new insights or understandings
New York State and New York City examples
Portfolio (DESE-approved, from Hudson PS)
Connecticut: Specific tasks (Excellent for the Arts, Music)
PARCC question and task prototypes http://www.parcconline.org/samples/item-task-prototypes
Purpose:
• Discuss possible assessments
• Consider what you need to accomplish this year, using the schedule and checklist
• Use the Quality Tracking Tool on one assessment to understand how it supports your district, school, or department
• Look at the Educator Alignment Tool to consider the “singletons” that may need to be addressed in your district, school, or department
Product:
• Email or hard copy to Beth Pratt with the minutes of your group’s meeting, including the assessments you are working on, next steps, and what you need to be successful
1. Measure growth
2. Employ a common administration procedure
3. Use a common scoring process
4. Translate these assessments to an Impact Rating (see the sketch after this list)
5. Assure comparability of assessments (rigor, validity).
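A hedged sketch of step 4: collapsing a roster of growth results into a Low/Moderate/High rating by taking the median. The 35/65 cut points echo DESE’s published guidance for median student growth percentiles, but the actual bands are a local decision:

```python
from statistics import median

def impact_band(growth_percentiles: list[float]) -> str:
    # The median is robust to a few outlying students.
    m = median(growth_percentiles)
    if m < 35:
        return "Low"
    if m <= 65:
        return "Moderate"
    return "High"

print(impact_band([22, 48, 51, 63, 70]))  # -> "Moderate" (median is 51)
```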