SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

TRANSCRIPT

Page 1: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

SEDL

January 10–11, 2013

UTOP Training

Presented by Mary Walker

Page 2: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Who am I? Who are you?

Pick a partner and listen carefully — you will introduce your partner to the group at the end of this activity!

Introduce yourself by providing:
Your name and current position
Experience and background that led to your current position

One significant formal or informal educational experience that changed or impacted your thinking about teaching and learning

Who are we?

2

Page 3: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Measuring Effective Teaching: 3-Minute paper

What does effective teaching look like?

How can we measure effective teaching?

How can we help teachers reflect on their teaching and improve it?

3

Page 4: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Some Key Features of Effective Teaching [Embedded in the UTeach model]

Organized, well-managed, on-task classroom
Attention to issues of diversity and access
Incorporating inquiry/investigative learning
Using technology for teaching and learning
Fluid and accurate communication of content
Fostering student-student collaboration
Formative assessment of student progress
Applications and interdisciplinary connections

Critical practices of self-reflection
Facilitating classroom discussion and “student talk”

Research in Education; NSES, NRC, NCTM Standards

4

Page 5: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

RTOP [Reformed Teaching Observation Protocol]
http://physicsed.buffalostate.edu/AZTEC/RTOP/RTOP_full/about_RTOP.html

COP [Classroom Observation Protocol]
www.horizon-research.com/instruments/lsc/cop.pdf

http://www.horizon-research.com/instruments/hri_instrument.php?inst_id=14

Observational protocol development for research

5

Page 6: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Focuses on mathematics and science content and concepts that are significant and developmentally appropriate

Observers MUST have knowledge of STEM standards and course expectations

Values and evaluates what students are doing, not just what the teacher is doing

Provides feedback STEM teachers want and need in order to grow professionally

What makes UTOP unique?

6

Page 7: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Description of UTOP

Full version has 27 indicators (teaching and student behaviors) in 4 domains:
Classroom Environment
Lesson Structure
Implementation
Mathematics/Science Content

1–5 scale [DK/NA options]
Section Synthesis Ratings: “the human average”
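As we read “the human average” (our interpretation, not stated on the slide): after scoring each indicator in a section, the observer assigns a holistic synthesis rating for that section rather than computing an arithmetic mean. A hypothetical illustration: a rater might score the seven Classroom Environment indicators 4, 3, 4, 5, 4, 3, and 2 (mean of roughly 3.6) yet assign a section synthesis rating of 4 if, in the rater's judgment, that value best characterizes the section as a whole.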

7

Page 8: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Excerpt from the UTOP form, Section 1 (Classroom Environment). Each indicator has a rating box, links to a Description, a Rubric, and Specific Rating Examples, and an Evidence box:

1.1 The classroom environment encouraged students to generate ideas, questions, conjectures, and/or propositions that reflected engagement or exploration with important mathematics and science concepts.

1.2 Interactions reflected collegial working relationships among students (e.g., students worked together productively and talked with each other about the lesson). *It is possible that this indicator was not applicable to the observed lesson. You may rate NA in this case.

1.3 Based on conversations, interactions with the teacher, and/or work samples, students were intellectually engaged with important ideas relevant to the focus of the lesson.

1.4 The majority of students were on task throughout the class.

1.5 The teacher’s classroom management strategies enhanced the classroom environment.

1.6 The classroom is organized appropriately such that students can work in groups easily, get to lab materials as needed, the teacher can move to each student or student group, etc.

1.7 The classroom environment established by the teacher reflected attention to issues of access, equity, and diversity for students (e.g., cooperative learning, language-appropriate strategies and materials, attentiveness to student needs).

From the online manual, rubric, description, and example excerpts for Indicator 1.2:

This indicator should be rated a 1 if there is group work during the lesson, but the group work is highly unproductive. This could include behavior where the majority of the groups are socializing, off-task, arguing, or ignoring each other, as well as regular instances of students copying and/or certain group members doing all of the work.

This indicator should be rated a 2 if …

This indicator assesses the degree to which students have learned to be collegial, respectful, cooperative, and interactive when working in groups. Evidence of collegial working relationships among students includes collaborative discussions about topics relevant to the lesson and successful distributing of roles and responsibilities within each group…

Rating of 3 Example: The students were put into debate groups for this class period - one group would debate another group, while the rest of the student groups were in the audience. The groups worked together smoothly - the students were able to pick who was doing what part of the debate, coordinate their arguments, and split the time slots when necessary. The audience also would occasionally compare their notes during breaks…

UTOP AND ONLINE MANUAL

8

Page 9: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Pilot Study

Developed and tested UTOP in some of our graduates’ classrooms — 2007 to 2009

Conducted 83 observations of:
UTeach Graduates (N=21)
Non-UTeach Graduates (N=15)

Novice teachers (most 0–3 years exp)

Math, science, and computer science classes; middle and high school

9

Page 10: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Pilot Study

After starting out at similar levels, UTeach graduates gained higher UTOP scores over time [without seeing their own data!]
Teaching experience was a significant predictor of UTOP scores for the UTeach group (p < .05)

Noyce Scholars rated significantly higher on UTOP than other groups (p < .01)

Key Question: Is the UTOP a valid and reliable instrument that measures important components of effective teaching?

10

Page 11: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

MET/NMSI Study

UTOP study conducted in partnership with the Gates Foundation’s Measures of Effective Teaching project and the National Math and Science Initiative

Opportunity to examine reliability, consistency, factor structure

Look to see if we can connect teaching behaviors on UTOP to teacher value-added gains

11

Page 12: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

The MET Project: Year One

3,000 teachers from 7 school districts, 7 states
Various subjects (mathematics, English, science) and grade levels

Multiple measures of effectiveness (observations, value-added, student surveys, teacher exams)

Multiple video lessons of each teacher
Multiple classroom observation instruments

Charlotte Danielson’s FFT
CLASS protocol (Pianta et al.)
MQI Rubric (Ball et al.)
UTOP (UTeach group)

12

Page 13: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

MET/NMSI Study

99 raters (math and science master teachers with LTF/AP) scored 994 video lessons of 250 teachers using UTOP

All lessons grades 4–8 mathematics
One-third of videos double-scored

13

Page 14: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Most of the 4–8 math video lessons from this national sample did not score highly on the UTOP

Many middle school math teachers were observed teaching problematic content, mostly formulaic/key-word-type approaches.
Raters identified problematic content issues in around one half of all lessons.

Results

14

Page 15: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Surface-level engagement often seen, but little emphasis on conceptual understanding

“Orderly but unambitious”

Results

15

Page 16: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

If 1 observer comes in 1 time per year to observe a teacher with the UTOP:
33% of the variance in scores is due to teacher characteristics
66% is due to rater bias and the characteristics of the lesson

If 4 observers each come in 1 time per year to observe a teacher with the UTOP:
66% of the variance in scores is due to teacher characteristics

Results are similar for 2 observations per year with two different observers present at each observation.

This reliability is close to or higher than that of other MET instruments. (A rough arithmetic check of these figures follows.)
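A rough check, not taken from the MET report itself: these percentages behave like reliability coefficients under the Spearman-Brown prophecy formula for averaging independent ratings, assuming a single-observation reliability of about 0.33 (an assumption read off the 33% figure above; the actual MET analysis used a fuller variance decomposition):

reliability(n) = n × r / (1 + (n − 1) × r), where r is the single-observation reliability

reliability(4) = (4 × 0.33) / (1 + 3 × 0.33) ≈ 1.32 / 1.99 ≈ 0.66

So averaging four independent observations of a teacher would be expected to roughly double the share of score variance attributable to the teacher, consistent with the 33% and 66% figures on this slide.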

Reliability

16

Page 17: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

MET/NMSI Study: Value-Added Correlations

Are the teaching behaviors measured on the UTOP associated with higher student learning gains on standardized assessments and tests of conceptual understanding?

17

Page 18: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Value-Added Correlations

This graph is copied from the released 2012 MET Report

18

Page 19: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Multiple observations with STEM-knowledgeable, trained observers are necessary for reliability

Correlations between student test score gains and teacher UTOP scores are weakly positive at best

Observations, student perception survey data, and student learning measure different things but all are needed to get a complete picture of what happens in the classroom

Teaching Effectiveness Studies: Lessons learned

19

Page 20: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Measures of Effective Teaching (MET) Project (2012). Gathering Feedback for Teaching. http://www.metproject.org/downloads/MET_Gathering_Feedback_Research_Paper.pdf

Walkington, C., Walker, M., & Marder, M. (2011, July). Developing tools to evaluate the practice of Noyce Scholars: The UTeach Classroom Observation Protocol. Presentation at the NSF Robert Noyce Teacher Scholarship Program Conference. Washington, DC.

Walkington, C., et al. (2011). Development of the UTeach Observation Protocol: A classroom observation instrument to evaluate mathematics and science teachers from the UTeach preparation program. http://www.cwalkington.com/UTOP_Paper_2011.pdf

UTOP studies

20

Page 21: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Can a classroom observation tool [UTOP] change the way we teach and students learn when used by a professional learning community of teachers/administrators/university facilitators?

What other support structures and resources are needed to get the most out of UTOP observations — PLCs? Coaching and mentoring by facilitators or colleagues? On-demand PD?

So, Why UTOP? Putting lessons learned into practice

21

Page 22: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Learn to use UTOP by using the tool

http://uteach.utexas.edu/UTOP

UTOP Video Version
UTOP Manuals (Full and Video)
UTOP Full Version

All videos from http://www.timssvideo.com/

8th-grade mathematics lesson – US3 Exponents
8th-grade science lesson – AU4 Energy Transfer
For extra practice: Go to the TIMSS website, view JP2, score with a friend, and post to discussion forum [email with details to follow]

Overview: UTOP training agenda

22

Page 23: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Rate each indicator with a 1–5, typed into the box in the Word document

Type 1–5 sentences of supporting evidence into the “Evidence” box to justify each numerical rating.

How to use the UTOP

23

Page 24: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Supporting Evidence should be:
Specific: based on specific quotes and interactions from the video
No opinions! You must justify your rating based on evidence.
Somewhat brief: try to average 3–4 sentences
Your evidence is how you prove to us that you actually watched the video

Bad Supporting Evidence:
Is brief
Is vague
Is opinionated
Is not really related to the indicator’s intent
Gives the teacher directive feedback (“You should have…”)
Is too specific to your knowledge/background
A hypothetical illustration of stronger versus weaker evidence follows.
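Hypothetical illustration (invented for this training summary, not drawn from an actual scored lesson):

Weaker evidence for Indicator 1.4 (majority of students on task): “The students seemed engaged and the teacher kept them busy the whole period.”

Stronger evidence for Indicator 1.4: “During the 15-minute group task, all but two students were working the assigned problems; the teacher redirected the two off-task students about halfway through, and they rejoined their group within a minute. In the closing discussion, most students responded to prompts or compared answers with a partner.”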

What is evidence for ratings?

24

Page 25: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

View Video 1, taking field notes
Discuss with partner or group how to score Video 1 on each indicator AND come to consensus on the Synthesis Rating

Whole-group discussion and comparison with “expert” ratings
http://uteach.utexas.edu/UTOP/

Video 1: Energy

25

Page 26: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

View Video 2, taking field notes

Discuss “big ideas” and impressions with partner or group

End Day One

Video 2: Exponents

26

Page 27: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Review what you wrote earlier in the day about measuring effective teaching. Has anything changed? Why or why not?

Based on the day’s training, describe how you would use an instrument like the UTOP to help a teacher learn to grow professionally.

Exit ticket/Homework

27

Page 28: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Discuss answers to EXIT TICKET questions:
Review what you wrote about effective teaching. Has anything changed? Why or why not?

Based on the training so far, describe how you would use an instrument like the UTOP to help a teacher learn to grow professionally.

Discuss your responses with your group.
Choose a spokesperson to summarize for reporting out.

Warm-up

28

Page 29: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Whole-group discussion of Video 2 and report out:
Synthesis Rating for each domain
Highest indicator rating
Lowest indicator rating

View “teaser” of Video 3, JP2, Changing Shape without Changing Area
Begin field notes
Complete with a colleague(s) at a later time

Email completed UTOPs to me for posting [Details to follow]

Day two: UTOP training (continued)

29

Page 30: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Can a classroom observation tool [UTOP] change the way we teach and students learn when used by a professional learning community of teachers/administrators/university facilitators?

What other support structures and resources are needed to get the most out of UTOP observations — PLCs? Coaching and mentoring by facilitators or colleagues? On-demand PD?

So, Why UTOP? Putting lessons learned into practice

30

Page 31: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

What are we doing with the UTOP?
Video annotation of user manual
Develop web-based training and recalibration modules

Develop targeted professional development modules

Manor Study
Preservice AT UTOP evaluations

Wrap up

31

Page 32: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

Profiles of two schools
Profiles show areas of strengths and weaknesses

Teacher’s individual profile is shared in private conference
Can compare to own school profile

Teachers choose what areas to work on
Facilitators assist in forming PLCs focused on similar areas

Facilitators can pair appropriate mentors
Facilitators can provide “on-demand” PD

Data reveal example

32

Page 33: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

What are you going to do with the UTOP?

Your Turn

33

Page 34: SEDL January 10–11, 2013 UTOP Training Presented by Mary Walker

[email protected]

[email protected]

Contact information

34