
TEACHER BEHAVIOR AND STUDENT ACHIEVEMENT:

A STUDY IN SYSTEM DYNAMICS SIMULATION

A Dissertation

Presented to

the Faculty of the Graduate School

The University of Memphis

In Partial Fulfillment

of

the Requirements for the Degree

Doctor of Education

by

Jorge O. Nelson

December, 1995


Copyright © Jorge O. Nelson, 1995

All rights reserved


DEDICATION

This dissertation is dedicated to my parents

Doris and Gene Nelson

who have given me the love of learning.


ACKNOWLEDGMENTS

I would like to thank my major professor, Dr. Frank Markus, for his

understanding and guidance. I would also like to thank the other committee

members, Dr. Robert Beach, Dr. Randy Dunn, Dr. Ted Meyers and Dr. Tom

Valesky, for their comments and suggestions on this project. This project was

not without difficulties, but without the support and interest of all the committee

members, there would not have been a project at all.

I would like to thank my mother, Doris Nelson, for her integrity and belief

in my potential. I would like to thank my father, Gene Nelson, and his father, my

grandfather, Andy Nelson, for starting me on this journey as I hope to help my

son with his someday.


ABSTRACT

Nelson, Jorge O. Ed.D. The University of Memphis. December 1995. Teacher behavior and student achievement: A study in system dynamics simulation. Major Professor: Frank W. Markus, Ph.D.

Systemic change has been identified as a necessary step in improving

public schools. A tool to help define and improve the system of schools could be

helpful in such reform efforts. Computer simulations are common

tools for improvement and training in business and industry. These computer

models are used by management for continually improving the system as well as

training employees to work within the system. Education is overdue for such a tool to help identify and continually improve the system of public schools.

A simple beginning in creating a tool for schools is the development and

validation of a simulation of arguably the most basic and profound process in education: the relationship between student and teacher. The following study

was an attempt to develop a computer simulation of a single component in the

entire system of education: a model of identified teacher behaviors which affect

student achievement in a regular classroom setting.

Forrester's (1968) system dynamics method for computer simulation was

used as a theoretical approach in creating the model. A knowledge synthesis

matrix of research-based conclusions as identified by Brophy and Good (1986)

was used to ensure that valid, research-based conclusions were represented in the

model. A theory of direct instruction, adapted from Hunter's (1967) Instructional

Theory Into Practice (ITIP) model, was used for the structural framework of the

teaching paradigm. The simulation was developed on a Macintosh platform


running in the ithink! system dynamics software environment (High Performance

Systems, Inc., 1994).

Data were gathered in the form of simulation outcomes of the knowledge

synthesis findings modeled in a computer simulation. Volunteers were solicited to experience the simulation in two validation sessions. Results of

two sets of six individual simulation sessions, wherein teachers, school

administrators and university professors manipulated the inputs during

simulation sessions, were used for validation purposes.

It was concluded that knowledge synthesis findings from a body of

research-based studies could be simulated on a computer in a system dynamics

environment. The implications of the simulation of research-based conclusions

in a computer model are that such simulations may provide educators with a

tool to experiment with processes that go on in the regular classroom without

causing any strain or stress to students, teachers, or any other part of the real-world system in the process.

It is recommended that simulations such as this one be used to help

educators better understand the interdependencies which exist in the school

setting. Teachers, administrators, and researchers can see the "big picture" as

they manipulate the research-based findings in this "microworld" of a classroom.

It is also recommended that more simulations such as this one be developed; by improving the process and product of educational simulations, such studies will help educators better understand schools as complex, dynamic

systems.


TABLE OF CONTENTS

Chapter Page

I. INTRODUCTION AND LITERATURE REVIEW

Introduction...................................................................................1

Problem Statement .......................................................................2

Purpose of the Study ....................................................................3

Research Question ........................................................................3

Review of the Literature ..............................................................3

Counterintuitive Behaviors ....................................................5

Complexity in Schools.............................................................6

Cognition in Simulation..........................................................7

Promises of Simulation in Educational Reform...................9

Teacher/Student Simulation Component ...............................9

A Model of Direct Instruction ..............................................10

Knowledge Synthesis Matrix ...............................................12

Simulation Software ..............................................................23

Limitations of the Study ............................................................42

II. RESEARCH METHODOLOGY

Simulation Modeling..................................................................43

Model Criteria & Knowledge Synthesis .............................45

System Definition...................................................................45

Grouping of Variables & Data Identification.....................46

Mapping the Model ...............................................................47

Model Translation..................................................................57

Model Calibration ..................................................................57

Model Validation ...................................................................65


III. RESULTS

Simulation Model of Knowledge Synthesis........................... 67

Validation Results.......................................................................67

Pre-simulation Questionnaire ..............................................68

Simulation Sessions ...............................................................71

Post-simulation Questionnaire ............................................72

Additional Findings ...................................................................74

IV. SUMMARY, CONCLUSIONS, IMPLICATIONS AND

RECOMMENDATIONS

Summary......................................................................................77

Conclusions .................................................................................78

Implications .................................................................................80

Teachers and Instruction.......................................................81

Administrators .......................................................................82

Research...................................................................................83

Schools as Learning Organizations .....................................84

Recommendations ......................................................................85

REFERENCES...........................................................................................88

APPENDIX A............................................................................................93

APPENDIX B ............................................................................................95

APPENDIX C............................................................................................97

APPENDIX D............................................................................................99

APPENDIX E ..........................................................................................103

APPENDIX F ..........................................................................................106

VITA.........................................................................................................120


LIST OF TABLES

Table Page

1. The Instructional Theory Into Practice (ITIP) Direct Instruction Model....................................11
2. Knowledge Synthesis Matrix of Teacher Behaviors that Affect Student Achievement................14
3. Inclusion Criteria by Smith and Klein (1991) as Applied to Knowledge Synthesis Report by Brophy and Good (1986)...........................................................22
4. Example of Equations Generated From a Stock and Flow Diagram.............................................................33
5. Example of "Steady State" Simulation Session versus Actual Respondent Data Entry.........................................62
6. Study 1: Pre-Simulation Questionnaire Data Collection Report ...................................................................69
7. Study 2: Pre-Simulation Questionnaire Data Collection Report ...................................................................70
8. Study 1: Post-Simulation Questionnaire Data Collection Report ...................................................................75
9. Study 2: Post-Simulation Questionnaire Data Collection Report ...................................................................76


LIST OF FIGURES

Figure Page

1. Four basic constructs of system dynamics notation ..................24
2. High-level description of entire simulation model....................30
3. Example of submodel .....................................................................32
4. Example of initial stock data entry...............................................34
5. First example of converter data entry ..........................................35
6. Second example of converter data entry .....................................35
7. First example of initial flow data entry........................................36
8. Second example of initial flow data entry...................................37
9. Example of simulation output ......................................................38
10. Example of sensitivity analysis data entry..................................40
11. Example of four sensitivity analyses runs...................................41
12. Flow diagram of the simulation modeling sequence ................44
13. Feedback circle diagram of how teacher behaviors may affect student performance ...................................................46
14. Student sector ..................................................................................48
15. Teacher sector ..................................................................................49
16. Giving information .........................................................................50
17. Questioning the students ...............................................................51
18. Reacting to student responses.......................................................52
19. Handling seatwork .........................................................................53
20. Teacher burnout ..............................................................................54


21. System dynamics model of teacher behavior as it affects student achievement ..................................................56
22. Post-question wait time impact ....................................................58
23. Current student achievement........................................................60
24. Example of "steady state" output using data set from table 5.............................................................63
25. Example of respondent #6 output using data set from table 5.............................................................64


CHAPTER I

INTRODUCTION AND LITERATURE REVIEW

Introduction

Continuous improvement in American public education is perhaps one of

the most important issues facing educators in the future. Digate and Rhodes

(1995) describe an increase in complaints regarding the improvement of public schools. New improvement strategies and tactics must be developed to counteract this growing perception of mediocrity

in our schools.

Recent educational reforms have not been successful in upgrading school

districts in the United States (Bell, 1993; Reilly, 1993). These reform efforts—for

example, raised student achievement standards, more rigorous teaching

requirements, a more constructive professional teaching environment, and

changes in the structure and operation of individual schools—have not satisfied

the changing needs of today’s society. Educational leaders must find effective

ways to improve the quality of public education if the gap between what society

desires and what education delivers is to be reduced.

One way to improve quality in schools is to approach reform efforts using

Forrester’s (1968) computer-based system dynamics point of view. Lunenberg and

Ornstein (1991) reaffirmed that "one of the more useful concepts in

understanding organizations is the idea that an organization is a system" (p. 17).

Darling-Hammond (1993), Kirst (1993), and others believe that problems in

educational reform are to be found in an excessive focus on specific areas within

education, and an insufficient focus on the entire system, as illustrated in

Darling-Hammond’s "learning community" concept (1993, p. 760). A closer look


at schools as systems promises to help educational leaders better understand the

problematic environment they face.

Other than an attempt 20 years ago to simulate student performance in the

classroom (Roberts, 1974), means to study schools as systems, for the most part,

have been unavailable to the practitioner. With the confluence of the affordable,

powerful desktop computer; system dynamics modeling languages; and the

graphical user interface comes the potential to create such a tool: a simulation

model of schools to be used by educational leaders. An initial modeling of one of

the most critical subsystems within schools—the interaction between student and

teacher—is a logical place to begin the simulation modeling process due to the

large body of research-based findings describing this area (e.g., Wittrock's

Handbook of Research on Teaching, 1986).

Using simulation to better understand the behavior of schools,

educational leaders can develop and implement more effective policies. This

saves "real" schools from yet another failed reform, while providing a cyberspace

for developing, testing, practicing, and implementing educational policies.

Problem Statement

One relevant component necessary for creating a simulation of public

education is the relationship between teacher behavior and student achievement.

This study simulates the interaction of a number of research-based conclusions

about teacher behaviors that directly or indirectly affect student achievement.

While there have been numerous studies that identify isolated teacher behaviors

in context-specific classrooms, there has been little or no effort to integrate these

findings into a generalizable simulation model. Only one study, Roberts’ (1974)

simulation exercise, was found to have attempted this kind of simulation.


The nonlinear complexity of outcomes found in human interactions,

sometimes counterintuitive in nature, can make it difficult for teachers to

understand and choose appropriate behavior(s) for a given classroom setting

(Senge, 1990a). There needs to be a generalizable model, using a system

dynamics approach, which illustrates how research-based conclusions regarding

teacher behaviors affect student achievement for a particular population.

Purpose of the Study

The purpose of this study was to create a system dynamics simulation of research-based conclusions about teacher behaviors that affect student achievement.

Research Question

Can the knowledge synthesis findings of teacher behaviors that affect

student achievement be usefully modeled in a system dynamics computer

simulation?

Review of the Literature

Currently, there is no tool to help educational leaders integrate

knowledge about the systemic areas needed for reform. The following review of

the literature provides some rationale for such a tool by: identifying the need due

to counterintuitive behaviors and complexity in public education; recognizing

cognition in, and promises of, simulation as the tool; defining a teacher/student


simulation component based on an existing model of direct instruction;

identifying research-based conclusions in a knowledge synthesis matrix format;

and identifying a simulation software package which can be used to create the

simulation exercise.

Rhodes (1994) describes some uses of technology which: support the organizational interactions that align and connect isolated actions of individuals and work groups as they attempt to separately fulfill the common aims of the organization; and ensure that experiential ‘data’ turns into institutionalized knowledge as the organization ‘learns’. (p. 9)

Prigogine and Stengers (1984) report that "we are trained to think in terms

of linear causality, but we need new ‘tools of thought’: one of the greatest

benefits of models is precisely to help us discover these tools and learn how to

use them" (p. 203).

Schools are complex organizations. Senge (1990a) points out that

counterintuitive behaviors appear in these kinds of organizations. The variables

are confusing. For example, Waddington’s (1976) housing project case

hypothesized that by simply building housing projects, slums would be cleared

out. However, such projects attract larger numbers of people into the area; if those people remain unemployed and poor, the projects become overcrowded, creating more problems than before. An awareness of

such counterintuitive behavior is a prerequisite if educational leaders are to

develop and implement successful reform efforts.

Counterintuitive Behaviors


Four major counterintuitive behaviors of systems that cognitively

challenge educational leaders have been identified by Gonzalez and Davidsen

(1993). These behaviors are: a) origin of dynamic behavior, b) lag times, c)

feedback, and d) nonlinearities.

Origin of Dynamic Behavior

Dynamic behavior is the interaction between systemic variables and the

rate of resources moving into and out of those variables. The origin of dynamic

behavior itself is confusing to humans in that the accurate identification of all

relevant variables and rates can be difficult.
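The stock-and-flow mechanism behind dynamic behavior can be sketched in a few lines of Python. This is a hypothetical illustration, not part of the ithink! model described later; the function name and the rate values are invented here:

```python
# A "stock" (e.g., student knowledge) accumulates the difference
# between an inflow rate and an outflow rate over time.

def simulate_stock(initial, inflow, outflow, steps, dt=1.0):
    """Euler integration of a single stock, the core operation
    in system dynamics tools."""
    stock = initial
    history = [stock]
    for _ in range(steps):
        stock += (inflow - outflow) * dt  # net rate changes the stock
        history.append(stock)
    return history

# A constant net inflow of 2 units per step yields steady accumulation.
print(simulate_stock(initial=10.0, inflow=3.0, outflow=1.0, steps=5))
# → [10.0, 12.0, 14.0, 16.0, 18.0, 20.0]
```

Even this trivial case shows why accurate identification matters: misjudging either rate changes every subsequent value of the stock.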

Lag Time

In systems, the element of time can create counterintuitive behaviors

through lags between systemic variables and rates of moving resources. When management ignores lag time and postpones corrective actions until it is too late, oscillatory behaviors can appear in system outputs. Short-term gains can actually create long-term losses.
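The oscillation described above can be reproduced with a short, hypothetical sketch (the target, gain, and lag values are invented for illustration): a decision maker corrects a level toward a target, but acts on information that is several steps old.

```python
from collections import deque

def simulate_with_lag(target, initial, gain, lag, steps):
    """Correct a level toward a target using information `lag` steps old."""
    level = initial
    delayed = deque([initial] * lag, maxlen=lag)  # perception delay line
    history = [level]
    for _ in range(steps):
        perceived = delayed[0]               # oldest stored observation
        level += gain * (target - perceived)
        delayed.append(level)
        history.append(level)
    return history

trace = simulate_with_lag(target=100, initial=0, gain=0.5, lag=2, steps=40)
print(max(trace))  # → 125.0: the level overshoots the target of 100,
                   # then oscillates above and below it before settling
```

With a lag of one step the same policy approaches the target monotonically; the longer delay alone creates the oscillation.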

Feedback

The identification of feedback usage can illustrate how actions can

reinforce or counteract each other (Senge, 1990a). This provides educational

leaders with the ability to discover interrelationships rather than linear cause-effect chains. This discovery provides the opportunity to focus on recurring

patterns.

Nonlinearities


Schools operate in response to the effects of complex interrelationships

rather than linear cause and effect chains (Senge, 1990a). Sometimes these

interrelationships operate at a point where small changes in the system result in

a "snowballing" effect that seems out of proportion to the cause (Briggs, 1992). A

better grasp of the nonproportional interrelationships in systems is needed for

decision-making and policy analysis.
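The "snowballing" effect can be made concrete with a hypothetical compounding sketch (the rates and step count are invented for illustration): in a reinforcing loop, a small change in the rate produces an outcome change far out of proportion to the cause.

```python
def reinforce(value, rate, steps):
    """Feed each step's result back into the next (a reinforcing loop)."""
    for _ in range(steps):
        value += value * rate
    return value

base = reinforce(100.0, 0.05, 50)    # 5% reinforcement per step
bumped = reinforce(100.0, 0.06, 50)  # rate nudged by a single point
print(round(bumped / base, 2))  # → 1.61: a one-point change in the cause
                                # yields a ~60% change in the outcome
```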

Complexity in Schools

In focusing on the nonlinear system aspects of schools, one must

incorporate concepts from the emerging science of complexity, or the study of "the

myriad possible ways that the components of a system can interact" (Waldrop,

1992, p. 86). Complexity is a developing discipline that accounts for the newly

discovered relationship between order and chaos. Wheatley (1992) describes

how these two traditionally opposing forces are linked together.

Those two forces [chaos and order] are now understood as mirror images, one containing the other, a continual process where a system can leap into chaos and unpredictability, yet within that state be held within parameters that are well-ordered and predictable. (p. 11)

Schools are complex environments that evolve over time. Educational

leaders generally measure and even forecast social and political climates before

implementing reform efforts, but they must also understand how unexpected

changes in the school district can create the need to adapt policies and

procedures if success is to be achieved. O’Toole (1993) argues that the goal of

policy is simplification. "Before an executive can usefully simplify, though, she

must fully understand the complexities involved" (p. 5). Wheatley (1992) states

that "when we give up myopic attention to details and stand far enough away to


observe the movement of the total system, we develop a new appreciation for

what is required to manage a complex system" (p. 110).

Using the science of complexity as a theoretical foundation, educational

leaders can observe and better understand the interrelationships found in school

districts, and therefore, increase their competence in implementing successful

reform measures. Simulation is a tool for representing reality in social settings and has been studied since the 1960s (Forrester, 1968; Gonzalez & Davidsen,

1993; Hentschke, 1975; Richardson, 1991; Richardson & Pugh, 1981; Roberts,

1974; Senge, 1990b; Waldrop, 1992). Richardson (1991) describes how "the use of

digital simulation to trace through time the behavior of a dynamic system makes

it easy to incorporate nonlinearities" (p. 155). Only recently has the technology

caught up with the theory. Simulation now combines the power of systems theory with the emerging science of complexity in a computer model that practitioners in the field can use as a management tool.

Cognition in Simulation

Complex simulations create an opportunity for the participant to master

both experiential and reflective cognition (Norman, 1993).

They [simulations] support both experiential and reflective processes: experiential because one can simply sit back and experience the sights, sounds, and motion; reflective because simulators make possible experimentation with and study of actions that would be too expensive in real life. (p. 205)

For over 20 years the cognitive processes of subjects have been studied in

laboratory settings, where subjects interact with complex systems in such a way that

"the comprehensive work . . . has provided a rich theory of human cognition and


decision-making in . . . problem situation[s], including social systems" (Gonzalez

& Davidsen, 1993, p. 7).

In experiential cognition, the participant’s skills are developed and refined

to the point of automatic reflexive action during simulation sessions (Norman,

1993; Schön, 1983). In reflective cognition, the participant’s reasoning and decision-making skills are developed and refined to the point where reflective thought is automatic both before and after simulation sessions.

One important element to be considered when participants experience

simulation sessions is Erikson’s (1963) notion of play, or the attempt to

synchronize the bodily and social processes with the self.

When man plays he must intermingle with things and people in a similarly uninvolved and light fashion. He must do something which he has chosen to do without being compelled by urgent interests or impelled by strong passion; he must feel entertained and free of any fear or hope of serious consequences. He is on vacation from social and economic reality—or, as is most commonly emphasized: he does not work. (p. 212)

Experiential cognition is reinforced when participants lose themselves

playing with simulations—similar to the child absorbed while playing at an

arcade-style video game. The results from these simulation sessions are a

heightened experience, or flow (Csikszentmihalyi, 1990).

Playing simulations also provides a collaborative environment for

educational leaders—a sharing of new ideas with colleagues who also play the

simulation. The players develop a standardized vocabulary of school district simulation terms. Hass and Parkay’s (1993) findings regarding simulation

sessions of the M-1 tank indicate that, during simulated stressful conditions, groups in tank simulators were as useful for teaching teaming skills as they were

for teaching the mechanics of tank operation. Simulations can help to build

teams as well as teach appropriate skills to individuals.


Promises of Simulation in Educational Reform

Computer simulation promises a number of outcomes for educational

reform. People who are not scientists will be able to create models of various

policy options, without having to know all the details of how that model actually

works (Waldrop, 1992). Such models would be management flight simulators

for policy, and would allow educational leaders to practice crash-landing school

reform measures without taking 250 million people along for the ride. The

models wouldn’t even have to be too complicated, so long as they gave people a

realistic feel for the way situations developed and for how the most important

variables interact.

The technology, rationale, and expertise required for creating school

simulations are all readily available. It is now up to educational leaders to come

forth and demand system simulations as viable data-based decision-making tools

for the coming millennium. Any other choice may well create less than desirable outcomes.

Teacher/Student Simulation Component

In searching for a beginning to school simulations, a logical starting point

is the interaction between student and teacher. The focus of this study is the

delivery of instruction from the teacher to the student as identified in relevant

teacher behaviors which significantly affect student achievement. Twenty years

ago Roberts (1974) made an attempt to describe such an integration of research

findings and personal experiences into a teacher-behavior/student-performance

simulation. The Roberts study occurred when computer simulation was an


unwieldy, expensive, and complex venture better left to computer programming

experts. The outputs from the simulation were simple two-dimensional graphs

showing directional movement in trend lines after variables were manipulated.

One drawback of this type of simulation was that educators had to depend on

programming experts to generate outputs through data entry. Educators could

not input data themselves, nor could they manipulate variables "on the fly".

Educators need to control data themselves to develop a working knowledge of

the nonlinear outcomes of public education (Nelson, 1993). With the newer,

more powerful, easier-to-use technologies available today comes the potential for

everyone to develop and "play" simulations by themselves and develop greater

understanding of the complexities in schools (Waldrop, 1992).

A Model of Direct Instruction

In searching for an existing educational model of teacher/student

interaction upon which the simulation would be based, Corno and Snow (1986)

reported that a large body of research findings fully support an agreed-upon

description of direct instruction, or "a form of . . . teaching that promotes on-task

behavior, and through it, increased academic achievement" (p. 622). Using direct

instruction as a model for this simulation exercise helped to ensure that a

sufficient number of valid research-based findings would be available for

construction and validation purposes.

For this study, a conceptual framework for describing the interaction

during direct instruction between teacher and student was found in Hunter’s

(1967) Instructional Theory Into Practice (ITIP) model of effective teaching. The

ITIP model identified "cause-effect relationships among three categories of

decisions the teacher makes: a) content decisions, b) learner behavior decisions,


and c) teaching behavior decisions" (Bartz & Miller, 1991, p. 18). This model was

chosen due to the repetition of its essential elements as reported in numerous

studies dating back to World War II (Gagné, 1970; Good & Grouws, 1979; Hunter

& Russell, 1981; War Manpower Commission, 1945). Table 1 is an outline of the

main elements found in the ITIP model for direct instruction.

TABLE 1

THE INSTRUCTIONAL THEORY INTO PRACTICE (ITIP)

DIRECT INSTRUCTION MODEL ________________________________________________________________________

1. Provide an Anticipatory Set

2. State the Objectives and Purpose

3. Provide Input in the Form of New Material

4. Model Appropriate Examples

5. Check for Student Understanding

6. Provide Guided Practice with Direct Teacher Supervision

7. Provide Independent Practice ________________________________________________________________________

Source: Adapted from Improved Instruction (Hunter, 1967).

In the ITIP model, as in other teaching models, there are certain elements

that need to be addressed when determining which direct instruction strategies

are deemed effective. Those elements include student socioeconomic status

(SES)/ability/affect, grade level, content area, and teacher intentions/objectives

(Brophy & Good, 1986). How these elements interact with selected teaching strategies in an ITIP model for direct instruction is one problematic area that educators need to address before they "tinker"

with a classroom situation. Occasionally these interactions have been found

more significant than main effects in classroom settings (Brophy & Evertson,

1976; Solomon & Kendall, 1979). These conclusions suggest "qualitatively

different treatment for different groups of students. Certain interaction effects

appear repeatedly and constitute well-established findings" (Brophy & Good,

1986, p. 365). A simulation of these interactions is one way for educators to

develop better strategies for direct instruction to increase student achievement.

The literature firmly supported direct instruction as a model for teacher

behaviors as they affect student achievement. The next step was to define a set of

research-based conclusions regarding teacher behaviors that affect student

achievement. It was recognized that there were potentially scores of available

variables which may have influenced the setting of this model, so this study

relied upon gathered research-based conclusions reported in a knowledge

synthesis matrix from Brophy and Good (1986) as outlined below.

Knowledge Synthesis Matrix

The knowledge synthesis matrix of researchers' findings (Table 2) is

evidence of identified conclusions regarding the relationship between teacher

behavior and student achievement. Knowledge synthesis has been identified as

serving four general purposes:

1. to increase the knowledge base by identifying new insights, needs,

and research agendas that are related to specific topics;

2. to improve access to evidence in a given area by distilling and

reducing large amounts of information efficiently and effectively;

3. to help readers make informed decisions or choices by increasing

their understanding of the synthesis topic; and

4. to provide a comprehensive, well-organized content base to

facilitate interpretation activities such as the development of textbooks, training

tools, guidelines, information digests, oral presentations, and videotapes (Smith

& Klein, 1991, p. 238).

A logical addition to general purpose number four might include another

entry for interpretation activities in the form of simulations.

Within the knowledge synthesis, some of the research-based conclusions

were context-specific, whereby certain teacher behaviors enhanced student

achievement at one grade level, but hindered student achievement at another.

Therefore, such context-specific findings were included in the model at the

appropriate level and magnitude.

Criteria for Selection

To ensure that findings reported in the knowledge synthesis matrix were

of sufficient significance, a summary and integration of consistent and replicable

findings as identified by Brophy and Good (1986) was selected for the knowledge

synthesis variables using an adaptation of Smith and Klein's (1991) inclusion

criteria for acceptance. These variables have been "qualified by reference to

grade level, student characteristics, or teaching objective" (Brophy & Good, 1986,

p. 360). The following is a brief description of the research-based conclusions

integrated and summarized in Brophy and Good's (1986) knowledge synthesis

used as initial data for this study.

TABLE 2

KNOWLEDGE SYNTHESIS MATRIX OF

TEACHER BEHAVIORS THAT AFFECT STUDENT ACHIEVEMENT
________________________________________________________________________

Student: Socio-econ. Status; Grade Level

Teacher: Management Skills & Org.; Experience; Expectation

Giving Information: Vagueness in Terminology; Degree of Redundancy; Question/Interactions

Questioning the Students: Student Call-outs; Higher Level Questioning; Post-Ques. Wait Time; Response Rate

Reacting to Student Responses: Teacher Praise; Negative Feedback; Positive Feedback

Handling Seatwork & Homework: Independent Success Rate; Available Help

Teacher Burnout1: Hours Worked per Day; Enthusiasm
________________________________________________________________________

Source: Adapted from Teacher behavior and student achievement. Brophy & Good (1986).

1 Optional set of variables for experimentation use only.

Relationship of Knowledge Synthesis to Simulation Variables

The knowledge synthesis matrix described four divisions of variables of

lesson form and quality as well as two areas of context-specific findings,

describing both teacher and student (Brophy & Good, 1986). Seventeen variables

were represented throughout these six areas. An experimental division, teacher

burnout, was added as an optional seventh component along with two more

variables, to introduce a chaotic function into the simulation model

for discussion and experimentation after the validation was completed.

Category 1—student. There were two context-specific variables in the

student arena which affected achievement: grade level and socioeconomic status

(SES). Grade level was described as early grades (1-6) and late grades (7-12).

Regarding SES, Brophy and Good (1986) stated:

SES is a 'proxy' for a complex of correlated cognitive and affective differences between sub-groups of students. The cognitive differences involve IQ, ability, or achievement levels. Interactions between process-product findings and student SES or achievement-level indicate that low-SES-achieving students need more control and structure from their teachers; more active instruction and feedback, more redundancy, and smaller steps with higher success rates. This will mean more review, drill, and practice, and thus more lower-level questions. Across the year it will mean exposure to less material, but with emphasis on mastery of the material that is taught and on moving students through the curriculum as briskly as they are able to progress. (p. 365)

Students have differing needs in upper and lower grade levels as well as

in higher and lower socioeconomic levels. For example, regarding

socioeconomic status, Solomon and Kendall (1979) found that 4th grade students

from a high socioeconomic status (SES) need demanding and impersonal

instructional delivery, and students from a low SES need more warmth and

encouragement in delivery strategies. Brophy and Evertson (1976) found that

low SES students needed to get 80% of questions answered correctly before they

move on to newer content, but high SES students could accept a 70% rate of

success before they move on to newer content. SES is a variable which affects the

way students achieve in school.

To illustrate differences in student achievement at upper and lower grade

levels, Brophy and Good report on the impact of praise at differing levels.

"Praise and symbolic rewards that are common in the early grades give way to

the more impersonal and academically centered instruction common in the later

grades" (1986, p. 365). This example describes how teachers need to be aware of

the use of praise when trying to increase student achievement. Praise has been

shown to elicit a more positive impact on younger children than on their older

counterparts.

Grade level and socioeconomic status are variables that affect student

achievement at differing rates and levels. Teachers need to be aware of these

differences to better meet the individual needs of their students.

Category 2—teacher. The teacher category had three types of variables

identified in Brophy and Good's (1986) knowledge synthesis report. Those were

management skills and organization, experience, and expectation.

Management skills and organization played a large part in affecting

student achievement across the grade levels. "Students learn more in classrooms

where teachers establish structures that limit pupil freedom of choice, physical

movement, and disruption, and where there is relatively more teacher talk and

teacher control of pupils' task behavior" (Brophy & Good, 1986, p. 337). Teachers

who are more organized have a more positive effect on student achievement

than otherwise.

Another conclusion showed that teachers need time—experience—to

develop their expertise and increase their effectiveness, because ". . . the majority

of [first year] teachers solved only five of the original 18 teaching problems of

first-year teachers in fewer than three years. Several years may be required for

teachers to solve problems such as classroom management and organization"

(Alkin, Linden, Noel, & Ray, 1992, p. 1382). Teachers who have more teaching

experience have been shown to have a more positive effect on student

achievement than otherwise.

A number of findings were integrated under the expectation variable in

Brophy and Good's knowledge synthesis report. "Achievement is maximized

when teachers . . . expect their students to master the curriculum" (Brophy &

Good, 1986, p. 360). "Early in the year teachers form expectations about each

student's academic potential and personality. . . If the expectations are low . . .

the student's achievement and class participation suffers" (Dunkin, 1987, p. 25).

Expectation was also found to be context specific in increasing student

achievement, showing that ". . . in the later grades . . . it becomes especially

important to be clear about expectations . . ." (Brophy & Good, 1986, p. 365).

Expectation is a variable which teachers must address when thinking about

increasing student achievement, generally speaking as well as in context.

The initial two categories were based upon the context-specific variables

of both student and teacher. The remaining four categories of variables were

based upon lesson design and quality of instruction.

Category 3—giving information. One of the variables found in this group

was identified as vagueness in terminology. "Smith and Land (1981) report that

adding vagueness terms to otherwise identical presentations reduced student

achievement in all of 10 studies in which vagueness was manipulated" (Brophy

& Good, 1986, p. 355). Teachers need to be clear and precise in their delivery

methods if student achievement is to be maximized. If a teacher uses vague

terminology in his/her delivery, student achievement is not maximized.

A second variable in the giving information group was the degree of

redundancy in the delivery of instruction. "Achievement is higher when

information is presented with a degree of redundancy, particularly in the form of

repeating and reviewing general rules and key concepts" (Brophy & Good, 1986,

p. 362). When teachers repeat new concepts, student achievement was shown to

increase more than if new concepts were not repeated.

In the same group yet a third variable was identified as the number of

questions/interactions per classroom period. "About 24 questions were asked per 50

minute period in the high gain classes, . . . In contrast, only about 8.5 questions

were asked per period in the low-gain classes . . . " (Brophy & Good, 1986, p.

343). When teachers initiate a higher incidence of questions/interactions with

their students, achievement has been shown to increase more than when teachers

elicit a lower incidence of question/interactions.

Category 4—questioning the students. A fourth category or group of

variables, questioning the students, contained four research-based conclusions

which affect student achievement: students call-outs (i.e. speaking without raising

their hands), higher-level questioning, post-question wait time, and response rate.

One variable, student call-outs, was shown to make a difference in

achievement because "student call-outs usually correlate positively with

achievement in low-SES classes but negatively in high-SES classes" (Brophy &

Good, 1986, p. 363). Student call-outs are defined as incidents where students

"call-out" answers to questions without raising their hands or other forms of

classroom management for responses. In low-SES situations, student call-outs

correlate with higher achievement, where the student who is frequently shy or

withdrawn takes a risk and calls out an answer. This situation often helps a

student's self-esteem if the call-out is not prohibited or frowned upon by the

teacher. In high-SES situations student call-outs tend to distract the more secure

student body and, in turn, lower student achievement levels due to an increase in

off-task behaviors.

The higher level questioning variable illustrated how ". . . the frequency of

higher-level questions correlates positively with achievement, the absolute

numbers on which these correlations are based typically show that only about

25% of the questions were classified as higher level" (Brophy & Good, 1986,

p. 363). Teachers who use higher-level questions about one-fourth of the time

had better results in increasing student achievement than any other amount of

higher-level questioning of the students.

The post-question wait time variable described how teachers gave students a

few seconds to think about a question before soliciting answers to the question.

"Studies . . . have shown higher achievement when teachers pause for about three

seconds (rather than one second or less) after a question, to give the students

time to think before calling on one of them" (Brophy & Good, 1986, p. 363). For

example, Hiller, Fisher, and Kaess (1969); Tobin (1980); and Tobin and Capie

(1982) all report that three seconds seems to be a desirable amount of wait time

teachers need to follow before calling on students for responses to questions.

Wait time gives slower students an extra moment or two to further process the

question before they attempt to reply with the answer. This quantitative wait

time data was used as a desirable benchmark in a "wait time" variable

requirement, with less time creating less than desirable output from the model.
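Such a benchmark requirement can be illustrated as a simple effect function (a hypothetical Python sketch; the linear shape and the function name are assumptions for illustration, not the dissertation's actual formulation):

```python
def wait_time_effect(seconds):
    """Hypothetical effect curve for post-question wait time.

    About three seconds is the desirable benchmark (Brophy & Good, 1986);
    shorter waits scale the output down linearly, and waiting longer than
    the benchmark adds nothing further.
    """
    benchmark = 3.0
    return min(seconds, benchmark) / benchmark

# Illustrative values:
# wait_time_effect(3.0) yields the full effect of 1.0;
# wait_time_effect(1.0) yields roughly a third of it.
```

The linear ramp is only one plausible shape; a model builder could substitute any curve that penalizes wait times under the three-second benchmark.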

The fourth variable in this category was identified as response rate from the

student to teacher input. "Optimal learning occurs when students move at a

brisk pace but in small steps, so that they experience continuous progress and

high success rates (averaging perhaps 75% during lessons when a teacher is

present, and 90-100% when the students must work independently)" (Brophy &

Good, 1986, p. 341). Teachers who monitor student success and move on to new

material or independent work after seeing about 75% success rate from the

students elicited higher gains in student achievement than any other

response rate.

Category 5—reacting to student responses. The fifth category, as

identified by Brophy and Good (1986), describes general lesson form and quality

and was named reacting to student responses. This category integrates three

research-based conclusions: teacher praise, negative and positive feedback.

Teacher praise was shown to affect students' achievement in a

socioeconomic status context. "High-SES students . . . do not require a great deal

of . . . praise. Low-SES students . . . need more . . . praise for their work" (Brophy

& Good, 1986, p. 365). Praise was also found to affect students differently at

differing grade levels. "Praise and symbolic rewards that are common in the

early grades give way to the more impersonal and academically centered

instruction common in the later grades" (p. 365).

The type of negative feedback given to students was found to affect their

achievement. "Following incorrect answers, teachers should begin by indicating

that the response is not correct. Almost all (99%) of the time, this negative

feedback should be simple negation rather than personal criticism, although

criticism may be appropriate for students who have been persistently

inattentive" (Brophy & Good, 1986, p. 364). Teachers who almost never

personally criticized their students had higher achievement from those students

than teachers who personally criticized their students.

Positive feedback was a third and final variable identified in the Brophy and

Good (1986) knowledge synthesis category of reacting to student responses.

"Correct responses should be acknowledged as such, because even if the

respondent knows that the answer is correct, some of the onlookers may not.

Ordinarily (perhaps 90% of the time) this acknowledgment should take the form

of overt feedback" (p. 362). Teachers who acknowledge success in student

responses elicit more student achievement than those who ignore successful

responses from their students.

Category 6—handling seatwork. A sixth category of variables used to

illustrate how teacher behaviors affect student achievement in the knowledge

synthesis was handling seatwork. Within this category were two conclusions

defined in the research: independent success rate, and available help.

These two interrelated conclusions, independent success rate and available

help, were shown to affect achievement.

"For assignments on which students are expected to work on their own, success

rates will have to be very high—near 100%. Lower (although still generally high)

success rates can be tolerated when students who need help get it quickly"

(Brophy & Good, 1986, p. 364). Teachers who successfully taught the lesson and,

therefore, had students who could work independently with high success rates

elicited higher achievement gains than those teachers who needed to help

students who did not "get" the lesson in the first place.

Category 7—teacher burnout (experimental in nature). A seventh and

optional category was added to the model. This category has been titled teacher

burnout and was included to add a chaotic function for experimental purposes.

This part of the simulation model was "turned off" during validation studies to

ensure that only the findings described in the knowledge synthesis were

simulated during validation.

The assumption used in this category was adapted from a simulation of

employee burnout by High Performance Systems, Inc. (1994). In this simulation

it was assumed that "a certain amount of burnout accrues from each hour of

overtime that's worked" (p. 166). This adaptation of a burnout variable was

added to an enthusiasm variable as defined by Brophy and Good (1986). They

found that "enthusiasm . . . often correlates with achievement" (p. 362).

In the High Performance Systems burnout simulation, burnout impacted

the work yield of the employee. In this study, the adaptation attempted to

illustrate how the burnout of the teacher impacted the enthusiasm of the teacher,

thereby affecting the achievement of the student.
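Under that assumption, the burnout-enthusiasm link might be sketched as follows (a Python illustration; the accrual rate, the linear damping, and the function name are illustrative guesses, not values from the High Performance Systems model):

```python
def enthusiasm_after_burnout(base_enthusiasm, overtime_hours,
                             burnout_per_hour=0.01):
    """Sketch: burnout accrues from each hour of overtime worked
    (adapted from High Performance Systems, 1994) and scales teacher
    enthusiasm down. Both rates here are hypothetical placeholders."""
    burnout = min(1.0, burnout_per_hour * overtime_hours)  # cap at total burnout
    return base_enthusiasm * (1.0 - burnout)

# Illustrative: 50 overtime hours at 0.01 burnout per hour halves enthusiasm.
```

Because enthusiasm in turn feeds the achievement calculation, damping it this way lets the optional burnout sector perturb the rest of the model during experimentation.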

Brophy and Good's (1986) summary report findings can be related to

Smith and Klein's (1991) criteria for knowledge synthesis as outlined in Table 3.

In this table, evidence is presented to support using the Brophy and Good report

as a knowledge synthesis upon which the system dynamics model may be based.

TABLE 3

INCLUSION CRITERIA BY SMITH AND KLEIN (1991) AS APPLIED TO

KNOWLEDGE SYNTHESIS REPORT BY BROPHY AND GOOD (1986)

1. Focus on normal school settings with normal populations. Exclude studies conducted in laboratories, industry, the armed forces, or special facilities for special populations.

2. Focus on the teacher as the vehicle of instruction. Exclude studies of programmed instruction, media, text construction, and so on.

3. Focus on process-product relationships between teacher behavior and student achievement. Discuss presage2 and context variables that qualify or interact with process-product linkages, but exclude extended discussion of presage-process or context-process research.

4. Focus on measured achievement gain, controlled for entry level. Discuss affective or other outcomes measured in addition to achievement gain, but exclude studies that did not measure achievement gain or that failed to control or adjust for students’ entering ability or achievement levels.

5. Focus on measurement of teacher behavior by trained observers, preferably using low-inference coding systems. Exclude studies restricted to teacher self-report or global ratings by students, principals and so forth, and experiments that did not monitor treatment implementation.

6. Focus on studies that sampled from well-described, reasonably coherent populations. Exclude case studies of single classrooms and studies with little control over or description of grade level, subject matter, student population, and so on.

7. Focus on results reported (separately) for specific teacher behaviors or clearly interpretable factor scores. Exclude data reported only in terms of typologies or unwieldy factors or clusters that combine disparate elements to mask specific process-outcome relationships, or only in terms of general systems of teacher behavior (open vs. traditional education, mastery learning, etc.).

After the knowledge synthesis was identified and described, a technology-

based tool was needed to create the simulation which would assist educators in

developing policy and for problem-solving exercises. The next step was to

determine what software was to be used to create such a tool.

2 Presage variables include "teacher characteristics, experiences, training, and other properties that influence teaching behavior" (Shulman, 1986, p.6).

Simulation Software

To create the simulation, the purchase of a software package was required.

Computer simulations have been in use by the military since World War II. One

of the first computer simulations combining feedback theory with decision-

making was developed in the early 1960's using a computer programming

language entitled DYNAMO (Forrester, 1961). These simulations were

considered complex due to "their numerous nonlinearities, capable of

endogenously shifting active structure as conditions change, [which] give these

models a decidedly nonmechanical, lifelike character" (Richardson, 1991, p. 160).

System Dynamics

Out of the simulation sessions by Forrester and others grew a field of

study entitled system dynamics, which has "academic and applied practitioners

worldwide, degree-granting programs at a few major universities, newsletters

and journals, an international society, and a large and growing body of

literature" (Richardson, 1991, p. 296).

System dynamics is a field of study

which includes a methodology for constructing computer simulation models to achieve better understanding and control of social and corporate systems. It draws on organizational studies, behavioral decision theory, and engineering to provide a theoretical and empirical base for structuring the relationships in complex systems. (Kim, 1995, p. 51)

Within the system dynamics paradigm is a structural representation of

components within the system in question. There are four major constructs

found in every system dynamics simulation that represent components in

systems: Forrester's (1968) system dynamics notation of stocks, flows,

converters, and connectors.

In Figure 1, a simulation sector entitled Student is defined using these four

constructs and is included here as an example to help describe how the modeling

process uses system dynamics notation.

FIGURE 1.

FOUR BASIC CONSTRUCTS OF SYSTEM DYNAMICS NOTATION

Stocks. Tyo (1995) describes the first of these four constructs as: "stocks,

which are comparable to levels" (p. 64). Lannon-Kim (1992) defines stocks as

accumulators, or "a structural term for anything that accumulates, e.g. water in a

bathtub, savings in a bank account, current inventory. In system dynamics

notation, a 'stock' is used as a generic symbol for anything that accumulates"

(p. 3). For example, in Figure 1 the identified stock represents the current level of

student achievement reported as an accumulation of a percentage score ranging

from 0 to 100, with 100 being the maximum percentage of achievement possible

over a given amount of time—in this case 180 school days. This level of student

achievement can fluctuate depending on certain teaching behaviors attributed to

changes in student achievement as modeled in the simulation.

Flows. The second construct in systems dynamics notation is called a

flow, or rate. This part of the model determines the "amount of change

something undergoes during a particular unit of time, such as the amount of

water that flows out of a tub each minute, or the amount of interest earned in a

savings account" (Lannon-Kim, 1992, p. 3). In Figure 1, the indicated flow

determines the amount and direction of change in achievement as reported in a

percentage score in the adjacent stock—either higher, lower or equal—and is

updated "daily" during the simulation run (i.e., 180 times each session). This

flow is represented mathematically in system dynamics notation with the

following equation:

Achievement Change = Behavior Impact - Current Student Achievement

This equation shows how the impact of teaching behaviors directly affects

outcomes in student achievement.

The relationship in Figure 1 between the stock Current Student

Achievement and the flow Achievement Change is represented in system

dynamics notation in the following equation:

Current Student Achievement(t) = Current Student Achievement(t-Dt) +

(Achievement Change) * Dt

This equation shows how student achievement changes over time depending on

the rate of change in achievement and the current level of achievement. To

clarify, suppose we said that Achievement Change = A, Behavior Impact = B,

Current Student Achievement = C, Time = t, and Change in Time = Dt. Then the

formulae would look like the following: A = B - C, and

C(t) = C(t - Dt) + A * Dt
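The stock/flow relationship can be run as a simple difference-equation loop. The following is a minimal Python sketch, not the ithink! implementation; the input values are hypothetical placeholders. The flow follows Achievement Change = Behavior Impact - Current Student Achievement, and the stock is kept within the model's stated 0-100% range over a 180-day school year:

```python
def run_achievement_stock(behavior_impact, initial_achievement,
                          days=180, dt=1.0):
    """Integrate C(t) = C(t - Dt) + A * Dt with flow A = B - C,
    clamping the Current Student Achievement stock to 0-100%."""
    c = initial_achievement            # stock: Current Student Achievement
    for _ in range(int(days / dt)):
        a = behavior_impact - c        # flow: Achievement Change
        c = c + a * dt                 # stock update for this time step
        c = max(0.0, min(100.0, c))    # keep within percentage bounds
    return c

# With dt = 1 the stock settles at the Behavior Impact level immediately.
```

Here Behavior Impact is held constant for simplicity; in the actual model it is recomputed each "day" from the sector outputs.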

Converters. The third construct, converters—sometimes called auxiliaries

and/or constants, are similar to formula cells in a spreadsheet. These values,

whether automatically and/or manually entered, are used to modify flows in the

model. In Figure 1, the selected converter represents the identified behaviors

which impact on the flow involving changes in student achievement. The

following equation illustrates the mathematical representation of the

interrelationships between all of the knowledge synthesis research findings in all

of the subsystems within the model as compiled in the converter Behavior

Impact:

Behavior Impact = MEAN (Level of Independence, Management of

Response Opportunities, Quality of Structuring, Quality of Teacher

Reactions, (Teacher Expectation * GRADE & EXPECTATIONS))

This equation states that the output from all of the model sectors are averaged

together to be used as a modifier in the flow Achievement Change3. Again, to

clarify, suppose we said that Behavior Impact = B, Level of Independence = L,

Management of Response Opportunities = M, Quality of Structuring = S, Quality

of Teacher Reactions = T, Teacher Expectation = E and GRADE &

EXPECTATIONS = G. Then the formula would look like the following:

B = (L + M + S + T + (E * G)) / 5

3 Teacher Expectation was first determined by the relationship between grade level and previous grade point average as described in the knowledge synthesis matrix findings by Brophy and Good (1986) and then averaged with the rest of the sector outcomes.
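That average can be computed directly, as in this Python sketch (the five sector outputs and the grade factor shown are placeholder values for illustration, not calibrated outputs of the dissertation's model):

```python
def behavior_impact(independence, response_mgmt, structuring,
                    teacher_reactions, expectation, grade_factor):
    """B = (L + M + S + T + (E * G)) / 5: the mean of the five sector
    outputs, with Teacher Expectation first scaled by the grade-level
    factor, as in the Behavior Impact converter."""
    sector_outputs = [independence, response_mgmt, structuring,
                      teacher_reactions, expectation * grade_factor]
    return sum(sector_outputs) / len(sector_outputs)

# Illustrative placeholder inputs:
# behavior_impact(80.0, 70.0, 75.0, 65.0, 90.0, 1.0) averages to 76.0
```

The result then serves as the B term in the Achievement Change flow described earlier.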

Connectors. The connector, or link, represents the fourth construct in

system dynamics and is a connection between converters, flows and stocks. In

Figure 1, the behavior impact converter and the achievement change flow are

tied together with the indicated connector. This link ties the two parts together

in a direct relationship, where behavior impacts directly on achievement.

A system dynamics approach, based on the four major constructs of

stocks, flows, converters, and connectors, allows computer models to "extract the

underlying structure from the 'noise' of everyday life" (Kim, 1995, p. 24). This

"noise" has been described as chaos, or "the science of seeing order and pattern

where formerly only the random . . . had been observed" (p. 24). The next step in

creating a system dynamics-based tool for educational reform was to determine

which computer application would be used to create the simulation exercise.

The following section describes requirements for the simulation, and a brief

review of various computer programs and their functions.

Computer Applications of System Dynamics

The first system dynamics computer simulations were written mainly in

the DYNAMO computer language on mainframe computers in the late fifties and

early sixties. These simulations were programmed by computer experts, and

researchers either had to learn complex programming languages or wait until the

computer programmers completed the task and reported the results. Today, this

type of simulation programming has its drawbacks due to the lack of portability

and the extremely long learning curve associated with mainframe programming

languages. A more ideal simulation software package would be developed and

run on laptop computers in a more user-friendly environment (e.g. operating

systems using graphical user interfaces such as Windows or Macintosh) by

educators and researchers in the field, not on mainframes by computer

programming experts.

In the search for a more ideal software package for the novice, a few

options were uncovered. Tyo (1995) and Kreutzer (1994) reviewed system

dynamics software packages, similar in design to DYNAMO, which are currently

available for inexpensive, personal computers. Based on these two distinct

software reviews, two types of system dynamics software were considered for

the purpose of creating the simulation: Powersim (ModellData AS., 1993), and

ithink! (High Performance Systems, Inc., 1994). The following is a brief summary

of those two reviews. Kreutzer (1994) described the ithink! package:

Because of its powerful features and ease of use, ithink! is one of the most popular system dynamics modeling tools. It allows you to draw stock-and-flow diagrams on the computer screen, completely mapping the structure of the system before you enter equations. You can add more detail and then group elements into submodels, zooming in for more detail in complex models. The manual is such a good introduction to system dynamics that we recommend it even if you use another program. (p. 3)

Stating that "ithink! is one of the most powerful simulation packages

reviewed", Tyo (1995, p. 64) goes on to describe how ithink! has "by far the best

tutorials and documentation, as well as a large number of building blocks"

(p. 64). Both reviewers agreed that ithink! provides good authoring support for

novice modelers as well as good support for sensitivity analysis so that the

models can be tested and retested with varying inputs. Tyo (1995) gave ithink! a

five star rating (out of a possible five).

Powersim was given high ratings due to its built-in workgroup support.

Tyo (1995) stated that "the multi-user game object lets several users run the

model concurrently to cooperate or compete against one another. This is

particularly useful for testing workgroups" (p. 64).

Both system dynamics packages, Powersim and ithink!, were purchased

for the purpose of creating and testing the simulation model. The final

simulation model was created using the ithink! system dynamics modeling

language, and was developed and simulated on an Apple Macintosh laptop

computer platform, model Duo 230. The ithink! package was chosen over

Powersim because there was a wealth of available documentation and support

material included in the ithink! package and because of the ease with which

programming was accomplished in the Macintosh computer operating system.

Simulation using ithink! software. To develop a simulation in the ithink!

environment a number of steps have to be followed to insure logical functioning

of the model. Those steps include: defining a high-level description of the entire

model; creating subsystems (i.e., sectors), which represent the separate parts of

the model; placing each variable in its respective sector for accurate visual

representation; and constructing connections (i.e., dependencies) between the

variables in and across sectors.


FIGURE 2

HIGH-LEVEL DESCRIPTION OF ENTIRE SIMULATION MODEL

The first step in the ithink! modeling process is to construct a high-level

description of the entire model. Figure 2 shows an example of how the entire

model can be defined in process frames, "each of which models one subsystem,


such as the rocket propellant in a space shuttle" (Tyo, 1995, p. 64). Process

frames are constructed to represent elements found in a particular system, such

as the elements of a teacher behavior/student achievement simulation as

described in the knowledge synthesis of Brophy and Good (1986).

Each arrow in the high-level description can represent the connection of

one or more research findings to the corresponding sectors. These arrows are

drawn to illustrate dependencies among sectors such as those identified in the

knowledge synthesis. According to Tyo (1995), "frames can be connected to one

another to show dependencies between subsystems. For example, if a company

incorrectly bills its customers, the number of calls to customer service will

increase" (p. 66). If there is doubt about how the sectors are connected in the

high-level description, there is a function in ithink! which allows connections to

be made at the subsystem level, where the interdependencies are sometimes

clearer to the user.

After the high-level frames are laid out, the model is further defined by

identifying submodels within each pertinent area. Tyo (1995) describes how "the

modeler steps down into each of the frames to add the necessary constructs for

each submodel" (p. 66).

To construct a submodel, the modeler uses the four basic components in

system dynamics notation previously mentioned—stocks, flows, converters, and

connectors. A map of the submodel, using the four components, is drawn on the

computer screen to visually represent the relationship(s) desired. Figure 3
shows an example of a submodel entitled Student Enrollment. The stock is

entitled Current Student Enrollment. Two flows entering and exiting the stock

are entitled New Students and Losing Students. Two converters with connectors

to the flows are entitled Student Growth Fraction and Student Loss Fraction.


The stock returns information via feedback loops to the flows in the form of

connectors.

FIGURE 3

EXAMPLE OF SUBMODEL

In this submodel a graphical representation was drawn to visually

describe the relationships between new students entering the school as well as

students leaving the school. The flow of students entering and leaving the school

is illustrated by the direction of the arrows on the flows themselves.

For each stock and flow diagram in the simulation, the software also

creates a set of generic equations (see Table 4). Tyo (1995) describes how

the modeler moves into 'modeling' mode to define the mathematical relationships among the stocks, flows and other constructs. . . ithink! presents the modeler with valid variables to use in defining mathematical relationships. (p. 66)

Johnston and Richmond (1994) describe the set of equations in simplified

terminology:


What you have now, is what you had an instant ago (i.e., 1 dt [delta time] in the past) + whatever flowed in over the instant, - whatever flowed out over the instant. The software automatically assigns + and - signs based on the direction of the flow arrowheads in relation to the stock. (p. 9)

TABLE 4

EXAMPLE OF EQUATIONS GENERATED FROM

A STOCK AND FLOW DIAGRAM

________________________________________________________________________

Current_Student_Enrollment(t) = Current_Student_Enrollment(t - dt) + (new_students - losing_students) * dt
INIT Current_Student_Enrollment = { Place initial value here… }
new_students = { Place right hand side of equation here… }
losing_students = { Place right hand side of equation here… }
student_growth_fraction = { Place right hand side of equation here… }
student_loss_fraction = { Place right hand side of equation here… }

________________________________________________________________________

The new student enrollment depends on the flow of new students as
affected by the student growth fraction. The number of

students leaving the school is dependent on the student loss fraction as it affects

the flow called losing students. Johnston and Richmond (1994) describe the
relationships further:

In order to simulate, the software needs to know 'How much is in each accumulation at the outset of the simulation'? It also needs to know 'What is the flow volume for each of the flows'? The answer to the first question is a number. The answer to the second may be a number, or it could be an algebraic relationship. (p. 9)
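The accumulation rule in these equations is a simple Euler update, and it can be sketched in ordinary code. The following Python fragment is a hypothetical illustration (the names mirror Table 4; it is not part of the ithink! package):

```python
# Euler update behind the generated stock equation:
# Stock(t) = Stock(t - dt) + (inflow - outflow) * dt
def step(stock, inflow, outflow, dt=1.0):
    """Advance a stock by one time step of length dt."""
    return stock + (inflow - outflow) * dt

# Illustrative answers to the software's two questions: the initial
# accumulation (a number) and the flow volumes (algebraic relationships).
enrollment = 100.0                      # INIT Current_Student_Enrollment
new_students = enrollment * 0.2         # inflow: student growth fraction
losing_students = enrollment * 0.02     # outflow: student loss fraction
enrollment = step(enrollment, new_students, losing_students)
```

With dt = 1, twenty students flow in and two flow out, so the stock moves from 100 to 118 in a single step.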


For example, let's assume that there are currently 100 students enrolled in

the school. To enter this information into the submodel, the modeler simply

"double-clicks" with the mouse on the Current Student Enrollment stock and a

new "window" appears on the screen (see Figure 4). Within this window are

numerous options to enter and change data requirements. The modeler types

the number 100 in the box entitled "INITIAL (Current__Student__Enrollment) =".

FIGURE 4

EXAMPLE OF INITIAL STOCK DATA ENTRY

After closing this window the modeler can then enter the student growth

fraction in the same manner: double-clicking on the converter icon, and typing in


the number desired. For this example it was assumed that for every 10 students

currently enrolled in the school, two new students entered as well. Thus, the

number ".2" was entered in the appropriate box within the window entitled

"student__growth__fraction = ..." (see Figure 5).

FIGURE 5

FIRST EXAMPLE OF CONVERTER DATA ENTRY

As with the student growth fraction, the student loss fraction was entered

in the same manner in the corresponding converter (see Figure 6). This time it

was assumed that only a few students were leaving the school (2 for every 100)

so the number ".02" was entered into the corresponding window entitled

"student__loss__fraction = ...".

FIGURE 6

SECOND EXAMPLE OF CONVERTER DATA ENTRY


After the ratio of student population growth to student population decline

was determined, the rate of each of the flows of incoming and outgoing

students had to be determined. By double-clicking on the flows "new students"

and "losing students", data was entered into corresponding new windows that

appeared on the screen (see Figure 7).

FIGURE 7

FIRST EXAMPLE OF INITIAL FLOW DATA ENTRY

To set the rate of flow an equation had to be entered into the window

entitled "new__students = ...". This equation, "Current__Student__Enrollment *

student__growth__fraction", describes the relationship between the current

number of students and the new incoming students. Every time the simulation

model completes one time step (i.e., dt = delta time), the level of current student

enrollment changes as well.


The data entry for the flow of losing students was similar to that for the
previous flow. The only difference was that the equation used the student loss
fraction rather than the student growth fraction previously described (see
Figure 8).

FIGURE 8

SECOND EXAMPLE OF INITIAL FLOW DATA ENTRY

Once the initial data entry is completed the modeler can return to the

stock and flow diagram, initiate a simulation "run", and watch the results as they

unfold on the screen. To represent the results a number of graphical options

exist within the software. For this example a simple graph was used to illustrate

the outcomes of the simulation "run" (see Figure 9).

The graph has X and Y axes, where X represents the number of

months during the school year and Y represents the number of students enrolled

in the school (from zero to a possible 1500 students maximum). In the initial


"day" of the school year, the number of students currently enrolled was 100,

the same as initially entered in the stock entitled Current Student Enrollment.

During the simulation "run" the number of students enrolled gradually

increased due to more new students enrolling (two to every 10) versus students

lost (two to every 100) so that by the end of the "year" the student population was

close to 800 students. With this type of simulation, an administrator could

predict needs for future facilities and when capacities were going to be met

during the year.

FIGURE 9

EXAMPLE OF SIMULATION OUTPUT
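The run shown in Figure 9 can be reproduced outside the package. The Python sketch below is a hypothetical stand-in for the ithink! run; it assumes dt is one month over a 12-step school year, which the text does not state explicitly:

```python
# Student Enrollment run: 100 students initially, growth fraction 0.2
# and loss fraction 0.02 applied at each time step (dt = 1 "month").
def simulate_enrollment(initial=100.0, growth=0.2, loss=0.02, months=12):
    enrollment = initial
    history = [enrollment]
    for _ in range(months):
        inflow = enrollment * growth       # new students this step
        outflow = enrollment * loss        # students leaving this step
        enrollment += inflow - outflow     # Euler update with dt = 1
        history.append(enrollment)
    return history

history = simulate_enrollment()
# Enrollment rises steadily and finishes in the 700-800 range,
# consistent with the "close to 800" curve described for Figure 9.
```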


After the simulation has been set up to run, sensitivity analysis is

necessary to insure that the model describes what is out there in the "real" world.

Tyo (1995) describes how:

ithink! lets users do sensitivity analysis on the model by running it repeatedly with varying inputs. The results of each run are written to a separate line on the output graph. For input, the user can set basic statistical distributions or use graphs. An ithink! model can be animated with the level of the stocks moving up and down as appropriate. (p. 66)

To do a sensitivity analysis the modeler needs to choose certain

specifications in the simulation software and run the simulation a number of

times until the outcomes match the desired results. For example, in the Student

Enrollment model, a desired outcome might be that the number of enrolled
students not exceed 800 (for reasons of facilities, etc.).

In the software, there is a function that allows the modeler to do a number of

sensitivity runs automatically until the desired output is achieved. To keep
enrollment down, fewer than two new students for every 10 currently enrolled
can be allowed into the school, or more than two students for every 100
currently enrolled can be made to leave. Thus, the modeler enters

differing values in the student growth or loss fraction converters and watches the

respective outcomes.

To initiate the sensitivity analysis the modeler selects an option entitled

Sensitivity Specifications and a window appears on the computer screen (see

Figure 10). In this window the modeler first selects the system dynamics

component to be analyzed—in this example the student growth fraction

converter was selected. Then the modeler selects the number of runs desired to

automate the simulation. For this example four runs were selected. The modeler
then chooses a range of data for test purposes. The example shows a


range of values from "0.05" (i.e., five for every 100 students currently enrolled) to

the initial "0.2" in the first simulation run (i.e., two for every 10 students currently

enrolled). This selected range is less than or equal to the original specifications

in the student growth fraction converter data entry to bring down the number of

new students enrolling in the school.

FIGURE 10

EXAMPLE OF SENSITIVITY ANALYSIS DATA ENTRY

After the data is entered the modeler selects the graph function in the

previously described window and runs the simulation. The simulation cycles

four times, each time using a different value for the student growth fraction

converter. The resulting output is drawn on the graph and the modeler can see

which, if any, output is desirable (see Figure 11). Then the modeler can go back

to the student growth fraction converter and change the data entry to the desired


fraction to "set" this component in the model, or run the analysis again with

different data sets to create a more desirable outcome.

FIGURE 11

EXAMPLE OF FOUR SENSITIVITY ANALYSES RUNS

Using sensitivity analysis, the modeler can insure that the simulation

describes a situation that reflects what is happening in the system in question.

The desired outcomes can be achieved with a logical application of the program.
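The four automated runs can be mimicked in plain code as well. This sketch only approximates what ithink! does internally; the even spacing of the tested growth fractions across the 0.05 to 0.2 range is an assumption:

```python
# Sweep the student growth fraction from 0.05 to 0.2 in four evenly
# spaced sensitivity runs, holding the loss fraction fixed at 0.02.
def sensitivity_runs(low=0.05, high=0.2, runs=4, months=12):
    results = {}
    for i in range(runs):
        growth = low + i * (high - low) / (runs - 1)
        enrollment = 100.0
        for _ in range(months):
            enrollment += enrollment * (growth - 0.02)  # net change, dt = 1
        results[round(growth, 2)] = round(enrollment, 1)
    return results

# Each run traces one line on the output graph; the modeler keeps the
# growth fraction whose final enrollment stays under the 800-student limit.
for growth, final in sensitivity_runs().items():
    print(f"growth fraction {growth}: final enrollment {final}")
```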

In conclusion, ithink! is a system dynamics package that can be run on
inexpensive computers, and its ease of use and practicality make it an ideal
program for creating a technology-based tool to be used in educational reform.


Limitations of the Study

This study is limited to the simulated observation of variables described in

the knowledge synthesis matrix based upon Brophy and Good's (1986) summary

of teacher behaviors and student achievement found in Table 2.


CHAPTER II

RESEARCH METHODOLOGY

Simulation Modeling

Modeling educational processes in a computer simulated environment

involved a number of ordered steps to complete the product in a logical,

sequential fashion. This sequence involved selection of the software; design and

construction of the model; and calibration and validation of the outputs from the

simulation.

The simulation exercise was constructed in a simulation and authoring

tool environment known as ithink! (High Performance Systems, Inc., 1994). To

identify the proper selection and magnitude of inputs for the simulation, a

matrix was developed using information reported in a knowledge synthesis

report of research-based conclusions of significant teacher behaviors that affect

student achievement (Brophy & Good, 1986). To insure the creation of a valid,

reliable simulation, the methodology for creating a computer simulation model

of the variables identified in the matrix was based upon recommended modeling

techniques reported by Whicker and Sigelman (1991). This sequential order

included the following components: model criteria, knowledge synthesis,

system definition, grouping of variables, data identification, mapping the model,

model translation, model calibration, and validation of the model.

The organization of this chapter is illustrated in Figure 12—a flow

diagram of the simulation design and construction process for each step of the

modeling process as well as pertinent subheadings.


FIGURE 12

FLOW DIAGRAM OF THE SIMULATION MODELING SEQUENCE

________________________ Source: Adapted from Whicker and Sigelman (1991).


Model Criteria & Knowledge Synthesis

Following the flow diagram of the modeling process, the first step was to

define the criteria of a simulation exercise of teacher behaviors which affect

student achievement. In the previous chapter the criteria for the model as well as

the knowledge synthesis were discussed and defined. The next section

describes an attempt to define the system itself.

System Definition

The actual student achievement/teacher behavior findings were reduced

to interactions among variables and factors. These interactions, described in

Figure 13, are known as feedback circle diagrams or "circles of influence rather than

straight lines" (Senge, 1990a, p. 75).

This feedback circle diagram was the first step in describing a school in

system dynamics terminology (Richardson, 1991). For example, the desired

academic achievement for a classroom setting influenced the teacher’s perception

of how great the gap in knowledge was at that time (e.g., standardized test

scores, previous grade reports). This perception, in turn, influenced behaviors

the teacher exhibited, using the Instructional Theory Into Practice (ITIP) model as

a conceptual framework for planning appropriate instruction strategies. These

behaviors, in turn, influenced the student performance which, in turn, influenced

current academic achievement. As the current academic achievement

approached the desired level of achievement, the perceived knowledge gap

decreased and, in turn, teacher behaviors changed.
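This circle of influence is the classic goal-seeking feedback structure of system dynamics. The short Python sketch below illustrates the dynamic only; the 0-to-100 achievement scale and the 0.3 adjustment constant are invented for the example and are not values from the model:

```python
# Goal-seeking feedback: the perceived knowledge gap drives teacher
# behaviors, which lift student performance, which closes the gap.
def goal_seeking(current=40.0, desired=100.0, response=0.3, steps=10):
    trace = []
    for _ in range(steps):
        gap = desired - current          # perceived knowledge gap
        current += response * gap        # behaviors move achievement
        trace.append(round(current, 1))
    return trace

# Achievement rises quickly at first, then levels off as the gap
# shrinks and the pressure on teacher behaviors eases.
```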


[Diagram: feedback circles linking Desired Academic Achievement, Perceived Knowledge Gap, ITIP Framework, Teacher Behaviors, Student Performance, and Current Academic Achievement]

FIGURE 13

FEEDBACK CIRCLE DIAGRAM OF HOW TEACHER BEHAVIORS

MAY AFFECT STUDENT PERFORMANCE

Grouping of Variables & Data Identification

The grouping of the variables was previously discussed and identified in

the knowledge synthesis (see Table 2). The data requirements of the model

described in this study were identified and recorded to provide initial values for

the variables as reported in the knowledge synthesis matrix in the previous

chapter.


Mapping the Model

The theoretical part of simulation design was followed by writing the

system dynamics application using research-based conclusions, or "mapping" the

model, with the available software tools found in ithink!. In this next section,

the four divisions of generic variables previously identified in the knowledge

synthesis matrix (Table 2)—giving information, questioning students, reacting to

student responses, and handling seatwork—were mapped into unique model

sectors, describing a structural framework based upon the Instructional Theory

Into Practice (ITIP) model (Table 1). A student sector and teacher sector

describing context-specific research-based conclusions completed the model

designed for validation of Brophy and Good's findings as they were illustrated in

a system dynamics environment. An optional sector, teacher burnout, was

added outside of the structural framework, to be introduced after validation was

completed.

Each process frame (i.e., sector) of the actual computer model is illustrated

in the following figures. In each of the following figures, there is a graphic

representation of the research-based systemic interactions between variables,

connectors, and stock-and-flow diagrams in each illustration as identified in the

knowledge synthesis matrix (see Table 2) and discussed in the review of the

literature. The sector names correspond directly to the category names as

outlined in the knowledge synthesis.

Sector 1—Student

The first sector was a description of the student in two context-specific

areas: socioeconomic status (SES), and grade level (Figure 14). The two context-


specific areas identified by Brophy and Good (1986) that are directly affected by

SES are student call-outs and teacher praise.

[Diagram: Current Student Achievement stock with achievement change flow; input choices for socioeconomic status (SES) and grade level; GRADE LEVEL, GRADE & EXPECTATIONS, and behavior impact converters]

FIGURE 14

STUDENT SECTOR

SES affects achievement in regard to student call-outs. In low SES

situations student call-outs correlate positively with achievement and vice versa

in high SES situations.

SES also affects achievement in regard to the level of teacher praise. Low

SES students need a great deal of praise as opposed to their high SES

counterparts who do not need as much praise.

In regard to grade level, four areas (Brophy & Good, 1986) are directly

affected in context-specific findings: vagueness in terminology from the teacher,

teacher's level of organizational skills, the impact of teacher praise, and teacher

expectations.

Vagueness in the teacher's delivery of instruction is more important in later
grades than in the earlier grades. The more vague the teacher is in her/his
delivery of instruction, the less achievement is realized from older students.

In the earlier grades, organizational skills of the teacher are more

important than in later grades. Students in earlier grades need more help in

following rules and procedures than their older counterparts.


Praise is more important in the earlier grades than in the later grades.

Praise increases student achievement more with younger students than with

older students.

In the later grades it is more important to be clear about teacher

expectations than in the earlier grades. To help them increase academic

achievement, older students need to know more about what, exactly, is expected.

Sector 2—Teacher

A description of the teacher in three research findings (Brophy & Good,

1986)—organizational skills, experience, and expectation—is illustrated in Figure

15.

[Diagram: Teacher Expectation and Average Management Level stocks; input choices for organizational skills, years of experience, and last year's G.P.A. (0.0 to 4.0); ACADEMIC POTENTIAL INDICATOR, MANAGEMENT IMPACT, and EXPERIENCE IMPACT converters]

FIGURE 15

TEACHER SECTOR

A teacher's organizational skills affect student achievement. Teachers

who are well organized have better success in increasing academic achievement

from the students, especially in the lower grades, than teachers who are less than

well organized. Students who are placed in environments where the teacher

controls student task behaviors experience more academic successes.


A teacher's experience directly affects student achievement as well.

Teachers who have more than three years teaching experience solve more

teaching problems and have better classroom management than their more

inexperienced counterparts.

Expectation plays a major role in student achievement. Teachers form

expectations about a student's ability and personality early in the year. If this

expectation is low the student's achievement is not maximized.

Sector 3—Giving Information

In this sector, a combination of the three variables as identified by Brophy

and Good (1986) was included: vagueness in terminology, degree of redundancy,

and question/interactions per period (Figure 16).

[Diagram: ENTHUSIASM and Quality of Structuring stocks; input choices for question/answer interactions per class period, degree of redundancy, and vagueness in terminology; GRADE & CLARITY, GRADE & ENTHUSIASM, GRADE & MANAGEMENT, CLARITY IMPACT, REDUNDANCY IMPACT, and ACADEMIC INTERACTIONS IMPACT converters]

FIGURE 16

GIVING INFORMATION

Vagueness in terminology has been shown to reduce student

achievement. This finding is especially noticeable in the upper grades.


Teachers who are redundant in their explanations of new content material

realize higher achievement in their students than those who do not repeat

content in their delivery of instruction. This is especially true in repeating and

reviewing general rules and key concepts.

Teachers who generate more question/interactions among their students

realize more achievement gains than their counterparts who have fewer

interactions with students. High-gain classes experience about 24

question/interactions per 50 minute period versus 8.5 question/interactions or

less in lower-gain classrooms.

Sector 4—Questioning the Students

The fourth sector was designed to include the following variables defined

by Brophy and Good (1986): student call-outs, higher-level questions, post-

question wait time, and student response rate (Figure 17).

[Diagram: Management of Response Opportunities stock; input choices for wait-time, higher-level questions, response rate, and student call-outs; WAIT TIME, COGNITIVE LEVEL, SUCCESS RATE, and STUDENT CALLOUTS impact converters]

FIGURE 17

QUESTIONING THE STUDENTS


Teachers who allow student call-outs in low SES classes notice greater

student achievement than those who do not allow call-outs. The opposite effect

is true if the students are from a high SES background.

The percentage of higher-level questioning techniques also affects student

achievement. A teacher who uses about 25% of questions in the higher-level
domain increases student achievement more than if a greater or smaller
proportion of these types of questions is used.

Postquestion wait time has been shown to increase student achievement if

optimally used. Three seconds seems to be the most efficient amount of time to

wait before calling on students after questioning.

The teacher who moves at a brisk enough pace to engage students during

questioning realizes more achievement gains from the students than teachers

who do not briskly cover the content area. A success rate of about 75% in
student responses to questions is sufficient for achievement to increase.

Sector 5—Reacting to Student Responses

This fifth sector was defined in three variables (Brophy & Good, 1986):

teacher praise, negative feedback, and positive feedback (Figure 18).

[Diagram: Quality of Teacher Reactions stock; input choices for acknowledgment of correct responses, teacher praise, and simple negation versus personal criticism; TEACHER PRAISE, CORRECT RESPONSE FEEDBACK, and INCORRECT RESPONSE impact converters]

FIGURE 18

REACTING TO STUDENT RESPONSES


Teacher praise impacts low SES students' achievement gains—the more

praise from the teacher, the more achievement realized from the student. Grade

level is another area that is sensitive to teacher praise. Younger students learn

more when praised for their effort than do their older counterparts.

Negative feedback is an important factor to consider when teaching.

When a teacher encounters an incorrect answer to his/her question, 99% of the

time the feedback from the teacher should be simple negation rather than

personal criticism for the wrong answer.

Positive feedback for correct responses from students also impacts student

achievement gains. Correct responses should be acknowledged almost all of the

time (90%)—if not for the respondent's sake, for the onlookers who are

wondering if the answer was correct.

Sector 6—Handling Seatwork

The sixth sector was described in Figure 19 as two variables from Brophy

and Good's knowledge synthesis (1986): independent seatwork, and amount of

available help.

[Diagram: Level of Independence stock; input choices for success in independent seatwork and help available during seatwork; SUCCESS RATE IMPACT and AVAILABLE HELP IMPACT converters]

FIGURE 19

HANDLING SEATWORK


Students who work independently after the lesson is taught need to be

successful nearly 100% of the time in their seatwork. Those students who need

help and immediately receive attention from the teacher will also benefit in their

academic achievement.

Sector 7—Teacher Burnout

This optional sector was defined using two variables: enthusiasm and

amount of teacher burnout (Figure 20).

FIGURE 20

TEACHER BURNOUT

Enthusiasm from the classroom teacher has been shown to affect

achievement gains in students. Brophy and Good (1986) report that "enthusiasm

often correlates with achievement" (p. 362).

Teacher burnout was one area that was not covered in the knowledge

synthesis. This variable was based on assumptions derived solely from the
personal experiences of the researcher and a number of colleagues.

After the individual sectors were defined, the entire model was integrated

into a large-scale simulation format by connecting the seven sectors together in

a logical structure.


This integration exercise added structure to the entire model by

graphically, as well as functionally, defining the places where there was feedback

between the individual sectors based on the research findings in the knowledge

synthesis. The next section describes how the sectors were connected using the

knowledge synthesis as a basis for integration.

Model Integration

The complete model of teacher behavior as it affects
student achievement was constructed by combining the seven sectors previously

described. The next step in mapping was to connect the sectors to each other by

way of logical intersections derived from the knowledge synthesis research

findings (see Figure 21).

The connections between sectors in the complete model brought together

all the research-based conclusions into one complete system. For example, the

socioeconomic status variable in Sector 1—Student was connected to the student

call-outs impact variable in Sector 4—Questioning the Students due to the

finding which stated that "student call-outs usually correlate positively with

achievement in low-SES classes but negatively in high-SES classes" (Brophy &

Good, 1986, p. 363). Connections were included in the model only if they were
based upon the findings in the knowledge synthesis matrix discussed in the
literature review in Chapter I.

Figure 21 is the actual graphic representation—the "map"—of the

simulation model. This map includes all of the 17 variables and seven divisions

of variables according to the conclusions previously stated in the knowledge

synthesis matrix (Brophy & Good, 1986).



FIGURE 21

SYSTEM DYNAMICS MODEL OF TEACHER BEHAVIOR

AS IT AFFECTS STUDENT ACHIEVEMENT


Model Translation

After this visual mapping of the model was created, the model was

defined in mathematical equations in the manner previously described in

Chapter I (see Table 4). The numerical equivalencies were taken from the

research and subsequently entered into formulae (see Appendix F).

Model Calibration

In this study, the sensitivity of the initial model was adjusted so that it behaved in the anticipated fashion when certain input values were altered. To do this, every identified variable (e.g., teacher behavior, classroom demographics, student achievement, etc.) was isolated in each of the separate sectors, and the simulation outputs from each variable in its individual simulation runs were compared against real-world outputs to ensure that the generated data corresponded to reality. For example, the wait time variable, as described in the knowledge synthesis, was isolated from the rest of the sector, with all other sectors in isolation as well, and sensitivity analysis was used to ensure that three seconds was, in fact, the amount of time which produced the highest level of achievement before this variable was connected back into the sector and the complete system.

To achieve this isolated calibration of each variable within every sector, a relationship was identified and assigned between each pertinent variable and the student achievement variable, based upon the research findings in the knowledge synthesis. For example, the aforementioned wait time variable was defined in terms of the impact of wait time on student achievement. Each such relationship must cause student achievement either to rise


or fall during the simulation. "Studies . . . have shown higher achievement when

teachers pause for about 3 seconds . . . after a question, to give the students time

to think before calling on one of them" (Brophy & Good, 1986, p. 363).

Using this conclusion, a wait time impact graph was plotted and sensitivity runs were used to determine the necessary relationship between post-question wait time and student achievement. This graphical function enabled

the simulation to reproduce "complex nonlinear relationships" (High

Performance Systems, Inc., 1994, p. 14) throughout the simulation model. Figure

22 illustrates the post-question wait time impact graph found in the Sector 4—

Questioning the Students component of the model.

FIGURE 22

POST-QUESTION WAIT TIME IMPACT


In Figure 22, there is a table of inputs and outputs as well as a graphical

description. When postquestion__wait__time (X axis) data is entered into the

simulation, a relative WAIT__TIME__IMPACT output (Y axis) corresponds to

that input. For example, if the number 3.00 is entered by a participant in a given simulation run, a corresponding 100 (i.e., the student achievement score equivalency) is plotted on the WAIT__TIME__IMPACT axis. Any other input returns a less than optimal output from the model. A three-second wait time is therefore ensured to impact student achievement more than any other value entered for this particular variable.
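A graphical function of this sort behaves like a lookup table with linear interpolation between the plotted points. The sketch below shows the mechanism; the anchor points are invented for illustration (the model's actual X/Y plots can be recreated from Appendix F), with the peak placed at three seconds as described above:

```python
# Sketch of a "graphical function": piecewise-linear interpolation over plotted
# (input, output) points. These anchor points are illustrative only; the model's
# actual plots can be recreated from Appendix F.
WAIT_TIME_POINTS = [(0.0, 40.0), (1.0, 60.0), (2.0, 85.0),
                    (3.0, 100.0), (4.0, 90.0), (6.0, 70.0)]

def graphical_function(points, x):
    """Interpolate linearly between anchor points, clamping beyond the ends."""
    if x <= points[0][0]:
        return points[0][1]
    if x >= points[-1][0]:
        return points[-1][1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

impact = graphical_function(WAIT_TIME_POINTS, 3.0)  # the three-second peak
```

Because the curve is defined entirely by its anchor points, recalibrating a relationship amounts to moving those points, which is how the sensitivity runs described above were carried out.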

The relationships between each variable and the student achievement

outcome variable were defined using graphical functions similar to the example

described above. A listing of X and Y axes plots from every graphical function in

the model can be recreated using the numerical outputs found in the System

Dynamics Computer Simulation Model Formulae in Appendix F.

Once each variable was calibrated in isolation, the individual sectors were

calibrated in isolation as well, to assure that they reacted to input in the

anticipated fashion as defined by the research. Each sector was isolated from the

model, run with differing data sets, and calibrated so that optimum results were

achieved when the ideal input, as reported in the research, was entered. For example, the giving information sector was calibrated so that only when (1) there was little or no vagueness from the teacher, (2) there was a high degree of redundancy, and (3) there was the optimum amount of question/answer interactions (i.e., 24 per 50-minute period) did the stock-and-flow diagram labeled Quality of Structuring indicate an output of ideal proportions (i.e., the number 100).
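A calibration check of this kind can be sketched as follows. The scoring formulae below are stand-ins, not the dissertation's actual equations; only the ideal input values (no vagueness, high redundancy, 24 interactions per 50-minute period) come from the research:

```python
# Illustrative calibration check for a sector like "giving information": the
# ceiling output of 100 should appear only at the ideal inputs reported in the
# research. The scoring formulae are stand-ins, not the model's actual equations.
def quality_of_structuring(vagueness, redundancy, interactions):
    vagueness_score = max(0.0, 100.0 - 25.0 * vagueness)        # 0 vagueness is ideal
    redundancy_score = 100.0 * min(redundancy, 1.0)             # 1.0 (high) is ideal
    interaction_score = max(0.0, 100.0 - abs(interactions - 24) * (100.0 / 24.0))
    return (vagueness_score + redundancy_score + interaction_score) / 3.0

# Only the ideal inputs reach the ceiling of 100.
ideal = quality_of_structuring(vagueness=0, redundancy=1.0, interactions=24)
```

Running such a check with differing data sets, as described above, confirms that any departure from the ideal inputs drops the sector's output below 100.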


Sector Integration

Each sector of the system dynamics model was integrated back into the

model by incorporating variables from other sectors with logical connecting

points. For example, the socioeconomic status (SES) variable in Sector 1—

Student was found to affect the impact of student call-outs in Sector 4—

Questioning the Students. "Student call-outs usually correlate positively with achievement in low-SES classes but negatively in high-SES classes" (Brophy & Good, 1986, p.

363). Therefore a connection was created between the two sectors, whereby SES

directly affected the impact of student call-outs on achievement.
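The sign flip in that connection can be sketched as follows; the magnitudes are hypothetical, and only the direction of the effect comes from the finding:

```python
# Sketch of the SES-to-call-outs connection: allowing call-outs helps achievement
# in low-SES classes and hurts it in high-SES classes (Brophy & Good, 1986).
# The +/-5.0 magnitudes are hypothetical; only the signs reflect the finding.
def call_out_impact(call_outs_allowed, high_ses):
    """Return an achievement adjustment whose sign depends on class SES."""
    if not call_outs_allowed:
        return 0.0
    return -5.0 if high_ses else 5.0
```

This is the general pattern of sector integration: a variable in one sector does not merely feed another sector a value, it can reverse the direction of another variable's effect.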

FIGURE 23

CURRENT STUDENT ACHIEVEMENT



A graphical representation of the stock-and-flow entitled Current Student

Achievement was created to illustrate the combined outcomes from all of the

sectors (see Figure 23). Tyo (1995) describes how ithink! "provides both time

series and scatter graphs to view the output of a simulation run" (p. 66). This

"scorecard" of student achievement was situated in Sector 1—Student as a logical

reflection of the results of manipulating the data entered during individual

simulation sessions. The range of scores ran from 0%, reflecting no student achievement, up to 100%, reflecting the maximum amount of student achievement.

"Steady State" Simulation

After the integration of every variable as it affected every other pertinent

variable, a number of simulation runs were performed to find the "steady state"

of the model. This "steady state" (i.e., ideal simulation output) was required to ensure that the individually-calibrated sectors, when combined in the completed model, did not adversely affect each other in ways not described in the research.

To ensure that a true "steady state" was attained, the simulation outcomes

were balanced until the ideal achievement level for the student occurred when

all of the research-based conclusions from each calibrated sector were

represented during the manipulation of data entry in each and every variable.

Numerical outputs from each variable as well as the accumulated results

illustrated in the Current Student Achievement stock were checked, and when all of the outcomes were as described in the knowledge synthesis, the model was considered to be at "steady state".
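Operationally, a "steady state" check of this kind amounts to confirming that successive outputs have stopped changing and sit at the ideal level. A minimal sketch, in which the tolerance and the ideal level of 100 are illustrative choices:

```python
# Sketch of a "steady state" check: the model is considered settled when
# successive outputs stop changing and hold at the ideal achievement level.
# The tolerance is an illustrative choice.
def is_steady_state(outputs, ideal=100.0, tolerance=1e-6):
    settled = all(abs(a - b) <= tolerance for a, b in zip(outputs, outputs[1:]))
    at_ideal = abs(outputs[-1] - ideal) <= tolerance
    return settled and at_ideal

steady = is_steady_state([100.0] * 10)            # ideal run holds at 100
not_steady = is_steady_state([90.0, 95.0, 98.0])  # still climbing
```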

Two complete sets of data were included in Table 5: one set required for a "steady state" simulation run and one set from an actual respondent in the validation studies. The comparison illustrates how differing sets of data can completely change the final outcome of the simulation run, as reflected in the Achievement graph in the Current Student Achievement stock.

TABLE 5

EXAMPLE OF "STEADY STATE" SIMULATION SESSION
VERSUS ACTUAL RESPONDENT DATA ENTRY

KNOWLEDGE SYNTHESIS VARIABLE                Study #1 - Resp. #6      "Steady State"
                                            DATA ENTRY               DATA ENTRY
amount of questions/interactions            5 (per 50 min. period)   24 (per 50 min. period)
degree of redundancy in instruction         1 (low degree)           1 (high degree)
teacher vagueness factor                    4 (near max. amount)     0 (least amount)
teacher help available                      3 (not available)        1 (readily available)
student success @ seatwork                  30 (% of time)           100 (% of time)
% of correct responses from student         5 (% of time)            75 (% of time)
% of higher-level questions from teacher    30 (% of time)           25 (% of time)
postquestion wait time from teacher         3 (seconds)              3 (seconds)
student call-outs                           1 (allowed)              3 (not allowed)
correct response feedback from teacher      50 (% of time)           100 (% of time)
incorrect response feedback                 50 (% of time)           100 (% of time)
teacher praise                              0 (much praise)          1 (little or no praise)
grade level of student                      5 (5th grade-Primary)    12 (senior-High School)
socio-economic status of student            0 (low SES)              1 (high SES)
student G.P.A. from previous year           1 (cum. G.P.A.)          4 (cum. G.P.A.)
teacher organizational skills               0 (unorganized)          1 (highly organized)
years experience of teacher                 3 (years experience)     10 (years experience)
Total Score accumulated in the
Current Student Achievement stock           53 (% achievement)       100 (% achievement)

If a modeler were to input the "steady state" data set from Table 5 into the

simulation, a Current Student Achievement score of 100 would be plotted on the

Achievement graph (see Figure 24).


FIGURE 24

EXAMPLE OF "STEADY STATE" OUTPUT

USING DATA SET FROM TABLE 5

If the modeler decided to enter the data set from respondent #6 found in

Table 5, the Current Student Achievement score would be plotted on the Achievement graph as shown in Figure 25. This function allows users to run ideal, "steady state" simulations as well as actual simulation runs from other sources for comparison purposes. A database of simulation runs could be developed for further study and discussion.


FIGURE 25

EXAMPLE OF RESPONDENT #6 OUTPUT

USING DATA SET FROM TABLE 5

This "steady state" did not include outcomes from Sector 7—Teacher Burnout, because that sector was experimental in design and not a "true" indicator of all of the combined variables as defined in the knowledge synthesis matrix.

One assumption in the calibration step was that there was no weighting of

the outputs from the stock-and-flow diagrams as they interrelated with each

other. Each of the 17 variables was given the exact same level of impact, or weight, as all of the others. For example, the output from Quality of Structuring

was given the same weight as the output from Management of Response

Opportunities as well as all the other outputs from the stock-and-flow diagrams.
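Under that equal-weighting assumption, the achievement score reduces to an unweighted mean of the variable outputs. A sketch, with the function name and output values invented for illustration:

```python
# Sketch of the equal-weighting assumption: each of the 17 variable outputs
# contributes identically to the Current Student Achievement score.
# The function name and output values are invented for illustration.
def current_student_achievement(sector_outputs):
    """Unweighted mean of the 17 outputs, each on a 0-100 scale."""
    if len(sector_outputs) != 17:
        raise ValueError("expected one output per knowledge-synthesis variable")
    return sum(sector_outputs) / len(sector_outputs)

ideal_score = current_student_achievement([100.0] * 17)  # the "steady state" case
```

A weighted variant would simply multiply each output by an importance factor before summing, which is one direction a future refinement of the model could take.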


Model Validation

The research-based conclusions gathered for this study, as identified by

Brophy and Good (1986) and illustrated in the knowledge synthesis matrix, were

used as baseline findings of teacher behaviors which affect student achievement.

In addition to reliance upon the knowledge synthesis matrix, the model was

demonstrated to a select group of practitioners—teachers, administrators and university professors—who helped to determine if the outputs from the model

were or were not similar to how real teacher behaviors affect student

achievement in the classroom.

Practitioners for the first validation study were selected through responses to a cover letter sent to the chief executive officers of 197 overseas American-type schools, whose office addresses were obtained from The ISS directory of overseas schools 1993-94 edition (International School Services, Inc., 1993). This validation study group, composed mainly of private, overseas, American-curriculum school administrators attending a recruitment conference, was self-selected and conveniently available for the first validation sessions.

Individual simulation sessions were conducted with each of the six

interested respondents at the Association for the Advancement of International

Education (AAIE) annual conference in New Orleans, Louisiana, February, 1995.

A pre-simulation questionnaire was used to identify and categorize practitioner

predictions of what data the model generated before the actual simulation was

run. A post-simulation questionnaire was used to gather participant reflections

about the outputs after the simulation had been demonstrated for a comparison

of the expected versus actual outputs.


A copy of the cover letter, reply form from the respondents, follow-up

letter, pre-simulation questionnaire, and post-simulation questionnaire are

included in Appendices A, B, C, D and E respectively.

Practitioners for the second validation study were drawn from a group of educational leadership graduate students attending a 1995 College of Education summer session course entitled Leadership and Policy Studies 7120—The Supervisory Process, at The University of Memphis, Memphis, Tennessee. This group, composed mainly of public school teachers from the Memphis area, was not self-selected, but was coerced by the professor to volunteer for participation in the validation study.

Individual simulation sessions were conducted with each of the six

interested respondents using the same pre- and post-simulation questionnaire to

gather participant reflections about the outputs after the simulation had been

demonstrated for a comparison of the expected versus actual outputs.

The limitations of the two small groups used for validation purposes are obvious and need to be addressed here. Neither group was randomly selected, nor was there sameness in populations between the groups. One group was self-selected, while the other was coerced to participate. One group was mainly private school administrators and the other mainly public school teachers. The groups were purposively selected for their availability, not for randomness. Results may have been different if the groups had been randomly selected and larger in size.


CHAPTER III

RESULTS

Simulation Model of Knowledge Synthesis

The purpose of this particular study was to create a system dynamics

simulation of research-based conclusions of teacher behaviors which affect

student achievement. Results regarding the research question—whether the

knowledge synthesis findings of teacher behaviors that affect student

achievement can be usefully modeled in a system dynamics computer

simulation—are presented in this chapter, based on observations from each of

the individual questions presented to the 12 respondents who participated in the

two validation sessions.

Validation Results

Two attempts to validate the simulation model, using practitioners in the field of education as session participants, were made to determine whether the model represented how teacher behaviors affect student achievement. The results from both validation studies comprise simulation and questionnaire data and are reported in this section.

In both validation studies the respondents participated in individual

simulation sessions using pre- and post-simulation instruments to gather data as

well as individual computer simulations using the system dynamics model

previously described. Each of the participants orally completed a pre-simulation questionnaire, witnessed a computer simulation session, and completed a written post-simulation questionnaire used to gather their observations. The complete

process lasted approximately 25-30 minutes for each simulation session,

including the pre-simulation questionnaire, simulation run, and post-simulation

questionnaire.

In the first validation study there were four males and two females in

attendance: one private school teacher, four private school administrators and

one university professor. In the second study there were three males and three

females: five public school teachers and one post-doctoral student.

Pre-simulation Questionnaire

The pre-simulation questionnaire solicited information regarding the

participant's choice of data to be entered into the computer model that would

reflect the highest achievement possible for the type of student chosen by the

participant (see Appendix D). This information, obtained by personal interview,

was entered into the simulation model before the participants watched the

computer sessions.

The participants from both studies were asked 17 questions derived from

the 17 knowledge synthesis findings. These questions were designed to translate the answers into numerical equivalencies in order to determine how separate variables would perform within the complete simulation. An eighteenth variable—

amount of hours worked per day—was entered to allow the participants to

observe how the model would function when the optional sector, teacher

burnout, was "switched on".

The pre-simulation questionnaire data collection results from the two

validation studies are described in Tables 6 and 7 respectively. These tables

illustrate how each respondent in the two studies answered the 17 questions.


TABLE 6

STUDY 1:
PRE-SIMULATION QUESTIONNAIRE DATA COLLECTION REPORT

 #  QUESTION                   resp.1  resp.2  resp.3  resp.4  resp.5  resp.6
 4  amount of interactions        24      10      12      24      10       5
 5  degree of redundancy           1       1       1       1       1       1
 3  vagueness factor               0       0       2       0       3       4
14  help available                 1       1       1       3       3       3
13  success @ seatwork           100    87.5      70     100      50      30
 8  % of correct response        100      80      60      85      40       5
 7  % of higher level ques.       75      35      35      20      30      30
 9  postquestion wait time        10      10       6       3       2       3
 6  student call-outs              3       3       3       3       3       1
12  corr. resp. feedback         100     100      80      50      40      50
11  incorr. resp. feedback       100       0      70      75      70      50
10  teacher praise                 0       1       1       1       1       0
 2  grade level                    8       7      10       5       6       5
 1  socio-econ. status             1       1       1       1       1       0
15  student G.P.A.                 3       4     2.9       3       2       1
17  organizat'l skills             1       1      .5       1      .5       0
16  years experience               8      10       5       7       1       3

Total Score reported in
Current Student Achievement      84%     74%     74%     93%     63%     53%


TABLE 7

STUDY 2:
PRE-SIMULATION QUESTIONNAIRE DATA COLLECTION REPORT

 #  QUESTION                   resp.1  resp.2  resp.3  resp.4  resp.5  resp.6
 4  amount of interactions        24      15      18      24      24      10
 5  degree of redundancy           0       1       0       1       1       1
 3  vagueness factor               0       1       1       1       0       0
14  help available                 1       1       3       1       1       1
13  success @ seatwork           100      85      85      75      80     100
 8  % of correct response         90      70      70      80      75      60
 7  % of higher level ques.       60      25      60      75      25      60
 9  postquestion wait time         3       3       3       3       3       3
 6  student call-outs              3       3       3       3       3       1
12  corr. resp. feedback         100     100      40     100     100     100
11  incorr. resp. feedback       100      92      50     100     100     100
10  teacher praise                 0       0       0       0       0       0
 2  grade level                    8       8       8       5       2       1
 1  socio-econ. status             1       0       1       1       1       1
15  student G.P.A.                 4       3       3       3       3       4
17  organizat'l skills             1      .5      .5       1       1       1
16  years experience               7       8       7       5       5       8

Total Score reported in
Current Student Achievement      94%     86%     73%     82%     86%     82%


Simulation Sessions

In individual simulation sessions, each of the 12 participants, together

with the author, observed how the simulation model reacted to her/his inputs.

The simulation results were dependent on choices of numerical input by the

participant according to answers from the pre-simulation questionnaire as

entered into the computer simulation. After the initial simulation run, the

participant observed a "steady state" simulation run, previously identified by the

author, whereby all of the outcomes were maximized (i.e., student achievement

was realized at 100%). From the outcomes generated by the "steady state"

computer simulation the participants then reviewed research-based conclusions

(recorded within the simulation) as to how each item in question realized its

maximum potential within the system. This second simulation run gave each

participant a chance to understand how the less than desirable outcomes

generated in his/her simulation run might be rectified in future sessions. This

type of comparison between ideal simulated outcomes and respondents'

simulated outcomes may be one way to "teach" desirable teacher behaviors in

future in-service activities.

After the two simulation runs, Sector 7—Teacher Burnout was included (i.e., "switched on") in a third simulation run to demonstrate to the participants

how one basic variable—in this case, hours worked per day—could completely

change simulation outputs throughout the representations of teacher behaviors

in the simulation. This chaotic function was described to the participants as an optional sector, based on assumptions rather than on knowledge synthesis findings.

Also explained to the participants was how other sectors could be constructed

and included at a later date, giving the simulation an open-ended, continuous

improvement quality—modifiable at any time.


Post-simulation Questionnaire

The post-simulation questionnaire was designed to solicit reflections from

each participant about the results from the individual simulation sessions. There

were four questions regarding demographic data of the participants and eight

questions regarding the knowledge synthesis/simulation outcomes (see

Appendix E).

The following section describes each question presented to the

respondents and the observation of the responses generated during the

validation sessions. The responses to the questions were described using the

following Likert-type scale, with the stem being neutral and the direction

provided in the response option:

5 = strongly agree    4 = agree    3 = neither agree nor disagree    2 = disagree    1 = strongly disagree

Using a scale of one to five, with five indicating that the respondent

strongly agreed with the question and one indicating that the respondent

strongly disagreed, a mean score was calculated to show differences between

responses to each question as well as directions in opinions and beliefs among

the respondents regarding the following statements.
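Each reported mean is simply the tally-weighted average of the 5-to-1 ratings. A sketch of the computation, using the study 1, question 1 tallies as a worked example (the function name is an illustrative choice):

```python
# How each reported mean score is computed from Likert tallies
# (5 = strongly agree ... 1 = strongly disagree).
def likert_mean(counts):
    """counts maps a rating (1-5) to the number of respondents who chose it."""
    total = sum(counts.values())
    return sum(rating * n for rating, n in counts.items()) / total

# Study 1, question 1: one "strongly agree", three "agree", two "neither".
study1_q1 = round(likert_mean({5: 1, 4: 3, 3: 2}), 2)  # 23/6, rounds to 3.83
```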

Question 1. Computer simulation of a classroom setting is very

important: The results from the first question were that three respondents

strongly agreed, seven respondents agreed, and two respondents neither agreed

nor disagreed. Mean score = 3.83 (study 1); Mean score = 4.33 (study 2).


Question 2. Direct instruction (active teaching) is very important: Eight

of the respondents strongly agreed and four respondents agreed. Mean score =

4.67 (study 1); Mean score = 4.67 (study 2).

Question 3. The computer simulation truly reflected what really happens

in the classroom: Two of the respondents strongly agreed, eight respondents

agreed, and two respondents neither agreed nor disagreed. Mean score = 4.00

(study 1); Mean score = 4.00 (study 2).

Question 4. I feel very confident that the research findings used in this

study were very accurately modeled in the computer simulation: Four

respondents strongly agreed, seven respondents agreed, and one respondent

neither agreed nor disagreed. Mean score = 4.00 (study 1); Mean score = 4.50

(study 2).

Question 5. The computer simulation session was very helpful in

identifying appropriate behaviors for a given classroom setting: One respondent

strongly agreed, ten respondents agreed, and one respondent neither agreed nor

disagreed. Mean score = 4.00 (study 1); Mean score = 4.00 (study 2).

Question 6. The computer simulation is a very good way to demonstrate

to teachers how certain behaviors function with one group of students and do

not function with another group: Five respondents strongly agreed, five respondents agreed, one respondent neither agreed nor disagreed, and one

respondent disagreed. Mean score = 4.00 (study 1); Mean score = 4.17 (study 2).

Question 7. The computer simulation helped me to better understand

the complexities of classroom teaching in general: Two respondents strongly

agreed, five respondents agreed, and five respondents neither agreed nor disagreed. Mean score = 3.83 (study 1); Mean score = 3.67 (study 2).


Question 8. I look forward to more computer simulations of this type:

Four respondents strongly agreed and eight respondents agreed. Mean score =

4.32 (study 1); Mean score = 4.33 (study 2).

Additional Findings

The data findings from the post-simulation questionnaires are reported in

Tables 8 and 9 respectively. One unsolicited comment from one of the

respondents was that the simulation seemed to be grounded upon a very

important instructional theory (i.e., Instructional Theory Into Practice (ITIP)

model for direct instruction by Hunter, 1967). This respondent thought that ITIP

was very effective as a framework for teaching. Another unsolicited comment

was from a respondent who wanted to receive a copy of the finished dissertation

project along with a run-time copy of the simulation as soon as it was published.

This participant, when asked, replied that it would be used in teacher in-service

activities.

One respondent commented that she believed higher achievement students (which she attributed to high SES) need fewer lower-level questioning techniques than reported by Brophy and Good (1986) in the knowledge synthesis findings. Another respondent was surprised by this same finding, which was not at all what the respondent had expected. Higher-level questioning technique was an issue brought up more than once during participant comments. Overall, the reactions to the simulation exercise were more positive than not.


TABLE 8

STUDY 1:
POST-SIMULATION QUESTIONNAIRE DATA COLLECTION REPORT (N=6)

Age range (under 30 / 30-50 / over 50): III III
Nature of work (teacher / administrator / other): I IV I
Grade level* (elementary / middle school / high school / university): III III III I
Highest degree (bachelor / master / doctor): IV II

MEAN SCORE   QUESTION   TALLIES (5 strongly agree / 4 agree / 3 neither / 2 disagree / 1 strongly disagree)
3.83         1.         I III II
4.67         2.         IV II
4.00         3.         II II II
4.00         4.         I IV I
4.00         5.         I IV I
4.00         6.         II III I
3.83         7.         II I III
4.32         8.         II IV

* Administrators may have been represented in more than one grade level due to the scope of their responsibilities.


TABLE 9

STUDY 2:
POST-SIMULATION QUESTIONNAIRE DATA COLLECTION REPORT (N=6)

Age range (under 30 / 30-50 / over 50): II IV
Nature of work (teacher / administrator / other): V I
Grade level (elementary / middle school / high school / university): IV II
Highest degree (bachelor / master / doctor): III II I

MEAN SCORE   QUESTION   TALLIES (5 strongly agree / 4 agree / 3 neither / 2 disagree / 1 strongly disagree)
4.33         1.         II IV
4.67         2.         IV II
4.00         3.         VI
4.50         4.         III III
4.00         5.         VI
4.17         6.         III II I
3.67         7.         IV II
4.32         8.         II IV


CHAPTER IV

SUMMARY, CONCLUSIONS, IMPLICATIONS, AND RECOMMENDATIONS

Summary

The purpose of this particular study was to create a system dynamics

simulation of research-based conclusions of teacher behaviors which affect

student achievement. Interpretations of and conclusions regarding the results

presented in the previous chapter are described here. Post-questionnaire

responses will be discussed along with observations and recommendations for

future research as well. It is evident from the small sample sizes (study #1: N = 6; study #2: N = 6) that conclusions from this study might mask particular results and that more validation sessions with larger samples would help to strengthen (or weaken) the following interpretations.

In light of the research question, the findings from this study tentatively support the position that knowledge synthesis conclusions can be usefully modeled in a system dynamics computer simulation. In two validation studies, educators predicted possible outcomes, witnessed the simulation, and recorded their observations about the computer-generated outcomes; their responses to the research question were positive. The studies also revealed that the respondents valued the use of computer simulations in education and believed that student/teacher interactions can be usefully simulated.


Conclusions

In chapters one and two the research question stated: can the knowledge

synthesis findings of teacher behaviors that affect student achievement be

usefully modeled in a system dynamics computer simulation? The results from

this study indicate that knowledge synthesis findings can, more likely than not,

be usefully modeled in a system dynamics environment. The following is a

description of the conclusions regarding the study.

The most important response recorded during the validation sessions was

in regard to the theoretical basis on which the simulation was founded. In

question number 2 it is apparent from the respondents' observations that the direct instruction approach to teaching was highly valued (Mean scores = study #1: 4.67 and study #2: 4.67). This indicates that the area of research on teaching

used in developing the simulation—Hunter's (1967) Instructional Theory Into

Practice (ITIP) model—was pertinent to all of the respondents and can be

considered important enough for future simulation studies. Another important

consideration is that 100% of the respondents look forward to such simulations

in the future (Mean scores = study #1: 4.32 and study #2: 4.33). Thus, it can be

concluded that simulations based on ITIP models are valuable exercises for

educators.

The validity of the simulation exercise appeared to be supported by the

fact that responses to questions 3 and 5—whether the simulation truly reflected

what really happens in the classroom, and that the session was very helpful in

identifying appropriate teacher behaviors—indicated a tentative agreement from

the majority of the respondents (Mean scores = study #1: 4.00 and study #2: 4.00; all mean scores are based on a possible maximum rating of 5). Responses to questions 1 and 6—importance of simulations and how good


they are at demonstrating behaviors to teachers—indicate that the respondents

generally valued these factors as well (Question #1 mean scores = study #1: 3.83

and study #2: 4.33; question #6 mean scores = study #1: 4.00 and study #2: 4.17)

with one disagreement from the participants. It can be concluded that, for the

most part, the simulation generally represented the research-based conclusions

found in the knowledge synthesis in an important, valid and understandable

manner.

In question 7 there was less agreement than on any of the other questions that simulations help educators to understand the complexities of teaching. Five of the respondents neither agreed nor disagreed, five agreed and

only two strongly agreed (Mean scores = study #1: 3.83 and study #2: 3.67). It

can be concluded that the complexities of teaching may not be as easily

simulated in a system dynamics simulation as the other elements addressed in

the study.

In light of the fact that the two validation study respondent groups were

diverse in many ways, and limited in size, one point must be addressed: the

results indicate that the observations from both of the groups are quite alike. The

diversity between groups may actually be considered a positive factor in

supporting the conclusion that the simulation seems to reflect what happens in

the classroom between teacher and student. Though there was opportunity to

strongly disagree with the results of the simulation sessions, most of the respondents indicated a positive response to every one of the questions in the survey instrument.


Implications

Some implications about the use of a computer simulation model such as

this have been identified. In the pages that follow, these implications are

categorized into four areas of education: teachers and instruction, administrators,

research, and schools as learning organizations. All of these areas can use

simulations similar to the one in this study as part of a learning laboratory to

practice and learn in "an environment that is risky, turbulent, and unpredictable"

(Kim, 1995, p. 46). This simulation exercise can provide educators with such an

environment.

Kim (1995) states that learning laboratories are designed to:

create an environment that is of operational relevance in which managers can step out of day-to-day demands to:

• reflect on their decision making

• develop a common language

• learn new tools for thinking systematically

• discuss operational objectives and strategies in an open forum

• test operating assumptions

• experiment with new policies and strategies

• and have fun. (p. 47)

The second element of learning laboratories—that a common language

can be developed among participants—is one area of education that needs to be

addressed here. For too long educators have lacked a common vocabulary with

which to describe events in schools. Simulations can help to formulate and apply

agreed-upon terminology which identifies certain behaviors/ideas/applications

that affect students, teachers, administrators and researchers in schools. After

playing the simulations, the participants can reflect on certain areas of concern,

and identify those elements by referring to the modeled research-based

conclusions in the simulation. Teachers and administrators can use the common


terminology to help better understand the complexities found in education and

to better understand their oftentimes differing perspectives.

Learning laboratories are used in business and industry as a way to train

employees away from the job site. There are university programs which deal

solely with this type of instructional strategy. For example, one of these

programs at MIT's Sloan School of Management uses learning laboratories to

provide leadership training by incorporating system dynamics simulations into

systems thinking courses (Senge, 1990a).

Teachers and Instruction

Teacher Training

In the area of teacher training, learning laboratories using a model such as

the one in this study may be used as a risk-free environment to help student

teachers experiment with strategies for increasing student achievement without

actually entering the classroom setting. The students can attempt to increase

their "scores" by playing the simulation, or isolated parts of it—experiencing the

research-based conclusions in sessions with or without their master

teacher/professor present. The sessions can be stopped at any time, allowing the

instructor or end-user to intervene, where appropriate/necessary.

Staff Development

In the area of staff development, the simulation may help to introduce or

reinforce the research-based conclusions of increasing student achievement to

veteran teachers. These experts can attempt to optimize the system for

increasing student achievement. One advantage of this model is that, during

simulation sessions, teachers may suspend the individual simulation runs at any


time during the "school year" to modify teaching behaviors if the immediate

results do not reflect sufficient increases in achievement. The teachers can reflect

on their choices and change mid-year without having to start the simulation

sessions all over again.

An argument against such simulations in staff development activities, or

any other educational realm for that matter, is that these kinds of exercises are

scientifically-based representations of teaching and there is no representation of

the "art" of teaching in simulation technology. In response to the question of art

versus science in simulations, artists tend to realize that only a small amount of

the world is important and they visualize those important details. Kim (1995)

suggests that by paying attention to those details, artists might fail in capturing

"the core structures that are important, we may be the unwitting producers of

our own chaos" (p. 25). Simulations allow educators the opportunity to know

more about the processes in question, broadening the scope of what is possible

and what is happening out there in "real" terms, not just based on partial

perception, artistically speaking or otherwise.

Administrators

Staff Evaluation

In the area of supervisor/administrator assessment, the simulation may be

used to ascertain whether principals have a sufficient research knowledge base to

effectively observe and identify effective teaching behaviors during the

evaluation process. For example, principals can run the simulation and observe

a specific group of teacher behaviors as well as the student achievement resulting

from this unique combination of behaviors from the teacher. After the session,

the principals can give recommendations for the "virtual" teacher to improve


their performance. Then the principals can view their recommendations in

action during the following session to see if they predicted correctly.

In-Service Program Development

Superintendents may use this model as a tool for training administrators

about research-based conclusions in non-threatening learning laboratories.

Using the simulation to compare the outcomes from participants' simulated

sessions with the "ideal" session is one way to reinforce research findings on the

effectiveness of teacher behaviors which affect student achievement. Such a

comparison between ideal and respondents' simulated outcomes can also

reinforce in-service programs which enhance desirable

teacher behaviors.

Research

Making Research Results Relevant

One implication for research in the field of education may be a process

for identifying gaps in the quality of existing research, revealed when

professors visualize, for the first time, the outcomes of a simulation model of

research-based conclusions. Forrester (1993) states that models can, and must,

help to organize information in a more understandable way. With these types of

simulations researchers can "try out" their findings to "see" the conclusions

modeled in a visual, dynamic environment. Once researchers "see" the big

picture they may be able to identify areas which seem counterintuitive and

observe seemingly chaotic functions in real time, giving them more clarity in

visualizing the separation of cause and effect across time and space.


Making Research Results Clear

When simulations are put together with findings from the research, the

large amount of prose describing the processes and products becomes distilled

into "sound bites" of numerical equivalencies and one sentence descriptors. The

need for these equivalencies helps to clarify to the user as well as the researcher

exactly what was found and the gaps in what else is needed to get the model to

function like the "real world."

Making Research Results Useful

Reading the wealth of research findings in education can be a tedious and

confusing ordeal. The chapter solely dedicated to teacher behavior and student

achievement (Brophy & Good, 1986) which was modeled in this simulation is

comprised of 47 pages of single-spaced text and 205 citations within that text.

The body of research in itself is a useful tool for educators who have a need to

study such findings, but in a 15-minute simulated session of the conclusions the

findings can be manipulated in a classroom setting by a complete novice (as were

all of the respondents).

Schools as Learning Organizations

Simulations such as the one presented here will allow schools to draw

closer to becoming what Senge (1990a) calls learning organizations. A learning

organization is one that recognizes and celebrates interdependencies within the

organization rather than defending the independence of individual parts of

that system, be they persons, departments, management, etc. Learning

organizations stop blaming individuals for systemic problems because the

organization is just that—a system, comprised of many interdependent parts,


and no one part is independent from the rest. Therefore, no one part or person in

the organization is to blame for a systemic problem as every part and all people

are involved in and affected by the system in one way or another.

The simulation presented in this study gives the individual or workgroup

a chance to see one subsystem in its entirety, where all the parts, as identified in

the research, are represented, for once, in a user-friendly, dynamic model of

teacher behavior as it affects student achievement. This chance to see the "big"

picture at one instant is necessary for persons in organizations to experience

interdependencies in training sessions for decision-making and policy analysis.

Business and industry use these kinds of simulations to forecast successful

events and learn about systemic structures and archetypes. Schools must break

the static, linear cause-and-effect mode of yesterday and begin to think in

systemic, circular, holistic ways as learning organizations, changing

together in interdependent "webs" of relationships such as teacher behaviors and

student achievement.

Recommendations

The following recommendations for further study are made based on the

results of this research:

1. It might be helpful to include more classroom teachers in the

sample population, alongside the respondent types previously mentioned, to ensure

that a more representative group is involved in the validation process.

2. It might be helpful to increase the sample size for statistical

purposes. The simulation sessions, including completion of the pre- and post-

questionnaires, were time consuming for both the respondents as well as the

researcher, but another study using the same simulation, questions, and


methodology but with a larger group of respondents could help to increase (or

decrease) the probability of significance.

3. It is inevitable that this model will be modified for continuous

improvement of how it behaves and predicts outcomes generated from the

strengths of and relationships among the variables. It is also necessary to

continuously validate the outcomes from this model after each attempt at

modification.

4. It might be helpful to develop a run-time version of the model for

distribution to end users which can be used without a facilitator present. This

might help the validation process by giving the respondent more liberty in

viewing and manipulating the model, lifting the pressures of time and/or the

personal interview.

5. It might be helpful to develop a similar simulation using another

type of instructional technique versus direct instruction (e.g., cooperative

learning, one-on-one instruction) and compare outcomes of additional

simulations. Some teaching techniques require implicit skills versus the explicit

ones described by Brophy and Good (1986). A comparison of those teaching

strategies might enable educators to better understand differences in teaching

styles and techniques.

6. A number of studies of other important areas in educational reform

(e.g., student behaviors, parent involvement, teacher training, school size,

financial and legal aspects of education, etc.) need to be connected to the existing

simulation in an attempt to investigate how factors inside and outside the

classroom setting affect student achievement. This type of study could continue

until a whole school district is described in system dynamics terms. Such a

simulation exercise could be a basis for a true technological tool for ongoing

systemic reform.


7. It might be helpful to develop changes in the model as the research-

based conclusions change. For example, a future version of the model might be

constructed to be sensitive to new developments in the field of education,

particularly as new research is received. Modeling programs such as ithink! can

be continually improved as all of the submodels within the system can be

changed at any time without affecting the structure of the model, only the output

from the differing equations and data entry.


REFERENCES

Alkin, M. C., Linden, M., Noel, J., & Ray, K. (Eds.). (1992). Encyclopedia of educational research (6th ed.). New York: MacMillan Publishing Company.

Bartz, D. E., & Miller, L. K. (1991). Twelve teaching methods to enhance student learning. Washington, DC: National Education Association.

Bell, T. (1993, April). Reflections one decade after A Nation At Risk. Phi Delta Kappan, 74, 592-597.

Briggs, J. (1992). Fractals: The patterns of chaos. New York: Touchstone.

Brophy, J., & Evertson, C. (1976). Learning from teaching: A developmental perspective. Boston: Allyn and Bacon.

Brophy, J. E., & Good, T. L. (1986). Teacher behavior and student achievement. In M. C. Wittrock (Ed.), Handbook of Research on Teaching (3rd ed., pp. 328-375). New York: MacMillan Publishing Company.

Corno, L. & Snow, R. E. (1986). Adapting teaching to individual differences among learners. In M. C. Wittrock (Ed.), Handbook of Research on Teaching (3rd ed., pp. 605-629). New York: MacMillan Publishing Company.

Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. New York: Harper & Row.

Darling-Hammond, L. (1993, June). Reframing the school reform agenda. Phi Delta Kappan, 74, 753-761.

Digate, G. A., & Rhodes, L. A. (1995, March). Building capacity for sustained improvement. The School Administrator, 52, 34-38.

Dunkin, M. J. (Ed.). (1987). The international encyclopedia of teaching and teacher education. New York: Pergamon Press.

Erikson, E. H. (1963). Childhood and society. New York: W. W. Norton & Company.

Forrester, J. W. (1961). Industrial dynamics. Cambridge, MA: MIT Press.

Forrester, J. W. (1968). Principles of systems. Cambridge, MA: MIT Press.

Forrester, J. W. (1993). System dynamics and the lessons of 35 years. In K. B. DeGreene (Ed.), Systems-based approach to policy-making. New York: Kluwer Academic Publishers.


Gagné, R. (1970). The conditions of learning. New York: Holt, Rinehart and Winston.

Gonzalez, J. J., & Davidsen, P. (1993). Integrating systems thinking and instructional science. Paper presented at the NATO Advanced Study Institute, July 12-23, Grimstad, Norway.

Good, T., & Grouws, D. (1979). The Missouri mathematics effectiveness project: An experimental study in fourth grade classrooms. Journal of Educational Psychology, 71, 355-362.

Hass, G. & Parkay, F. W. (1993). Curriculum planning: A new approach (6th ed.). Boston: Allyn and Bacon.

Hentschke, G. C. (1975). Management operations in education. Berkeley, CA: McCutchan Publishing Corporation.

High Performance Systems, Inc. (1994). Introduction to systems thinking and ithink!. Hanover, NH: Author.

Hiller, J., Fisher, G., & Kaess, W. (1969). A computer investigation of verbal characteristics of effective classroom lecturing. American Educational Research Journal, 6, 661-675.

Hunter, M., & Russell, D. (1981). Planning for effective instruction: Lesson design. In M. Hunter, Increasing your teaching effectiveness. Palo Alto, CA: The Learning Institute.

Hunter, M. (1967). Improved instruction. El Segundo, CA: TIP Publications.

International School Services, Inc. (1993). The ISS directory of overseas schools 1993-94 edition: The comprehensive guide to K-12 American and international schools worldwide. Princeton, NJ: Author.

Johnston, D. & Richmond, B. (1994). Getting started with ithink!: A hands-on experience. Hanover, NH: High Performance Systems, Inc.

Kim, D. (1995). System thinking tools: A user reference guide. Cambridge, MA: Pegasus Communications.

Kirst, M. W. (1993, April). Strengths and weaknesses of American education. Phi Delta Kappan, 74, 613-618.

Kreutzer, W. B. (1994). Creating your own management flight simulator: A step-by-step builder's handbook (with software reviews). In P. M. Senge, C. Roberts, R. B. Ross, B. J. Smyth, & A. Kleiner (Eds.), The fifth discipline handbook: Strategies and tools for building learning organizations. New York: Currency Doubleday.


Lannon-Kim, C. (1992, January). The vocabulary of systems thinking: A pocket guide. The Systems Thinker, 2(10), 3-4.

Lunenberg, F. C. & Ornstein, A. C. (1991). Educational administration: Concepts and practice. Belmont, CA: Wadsworth Publishing Company.

ModellData AS. (1993). Powersim: User’s guide and reference. Manger (Bergen), Norway: Author.

Nelson, J. O. (1993). School system simulation: An effective model for educational leaders. The AASA Professor, 16(1).

Norman, D. A. (1993). Things that make us smart: Defending human attributes in the age of the machine. Reading, MA: Addison-Wesley Publishing Company.

O’Toole, J. (1993). The executive’s compass: Business and the good society. New York: Oxford University Press.

Prigogine, I., & Stengers, I. (1984). Order out of chaos: Man’s new dialogue with nature. New York: Bantam Books.

Reilly, D. H. (1993, January). Educational leadership: A new vision and a new role within an international context. Journal of School Leadership, 3(1), 9-19.

Rhodes, L. A. (1994). Technology-driven systemic change. Paper presented at the annual International Conference on Technology and Education. March 27-30, London, England.

Richardson, G. P., & Pugh III, A. L. (1981). Introduction to system dynamics modeling with DYNAMO. Cambridge, MA: MIT Press.

Richardson, G. P. (1991). Feedback thought in social science and systems theory. Philadelphia, PA: University of Pennsylvania Press.

Roberts, N. H. (1974, September). A computer system simulation of student performance in the elementary classroom. Simulation & Gaming, 5, 265-290.

Schön, D. (1983). The reflective practitioner: How professionals think in action. New York: Basic Books.

Senge, P. (1990a). The fifth discipline: The art and practice of the learning organization. New York: Doubleday.

Senge, P. (1990b). The leader’s new work: Building learning organizations. Sloan Management Review, 32(1), 7-23.


Shulman, L. S. (1986). Paradigms and research programs in the study of teaching: A contemporary perspective. In M. C. Wittrock (Ed.), Handbook of Research on Teaching (3rd ed., pp. 3-36). New York: MacMillan Publishing Company.

Smith, C. B., & Klein, S. S. (1991). Synthesis research in language arts instruction. In J. Flood, J. M. Jensen, D. Lapp, & J. R. Squire (Eds.), Handbook of research on teaching the English language arts. New York: MacMillan Publishing Company.

Smith, L. & Land, M. (1981). Low-inference verbal behaviors related to teacher clarity. Journal of Classroom Interaction, 17, 37-42.

Solomon, D., & Kendall, A. (1979). Children in classrooms: An investigation of person-environment interaction. New York: Praeger.

Tobin, K. (1980). The effect of an extended teacher wait-time on science achievement. Journal of Research in Science Teaching, 17, 469-475.

Tobin, K., & Capie, W. (1982). Relationships between classroom process variables and middle-school science achievement. Journal of Educational Psychology, 74, 441-454.

Tyo, J. (1995). Simulation modeling tools. Informationweek, 535, 60-67.

Waddington, C. H. (1976). Tools for thought. St. Albans, England: Paladin.

Waldrop, M. M. (1992). Complexity: The emerging science at the edge of chaos and order. New York: Simon & Schuster.

War Manpower Commission. (1945). The training within industry report. Washington, DC: Bureau of Training.

Wheatley, M. J. (1992). Leadership and the new science: Learning about organization from an orderly universe. San Francisco: Berrett-Koehler Publishers.

Whicker, M. L. & Sigelman, L. (1991). Computer simulation applications: An introduction. (Applied Social Research Methods Series, Volume 25). Newbury Park, CA: Sage Publications.

Wittrock, M. C. (Ed.). (1986). Handbook of research on teaching (3rd ed.). New York: MacMillan Publishing Company.

APPENDIX A

Cover Letter To The Chief Executive Officer Of The School

APPENDIX B

Reply Form From Respondent

APPENDIX C

Follow-Up Letter To Respondent

APPENDIX D

Pre-Simulation Questionnaire To Be Used For Validation Purposes.

Please indicate your choice as to the statements given below. Choose the

answer that you feel will result in the highest overall achievement scores for the

grade level and socioeconomic level you have selected. The choices you pick will

be used as input for the computer simulation session. After the session, a post-

session questionnaire will be administered to record your reflections regarding

these inputs and the resulting score.

1. Choose socioeconomic status (SES)

Input: 0 = low-socioeconomic status (SES) or dependent/anxious

1 = high-SES or assertive/confident

2. Choose grade level

Input: Choose a grade level from 0 (Kindergarten) to 12

3. Choose amount of vagueness in terminology as used by the teacher

Input: Choose a number from 0 to 5, whereby 0 indicates the least amount of

vagueness in terminology and 5 indicates the most amount of vagueness in

terminology

4. Choose amount of question/answer interactions per class period

Input: Choose from 0 to 24 teacher question/student answer interactions per 50

minute class period


5. Choose degree of redundancy

Input: 0 = little or no degree of redundancy of information given to students

1 = high degree of redundancy of information given to students

6. Choose if student call-outs are or are not allowed by the teacher

Input: 1 = student call-outs are allowed by teacher

3 = student call-outs are not allowed by teacher

7. Choose amount of higher level questions (Bloom's taxonomy)

Input: Choose from 0% to 100% of higher order type questions used by the

teacher

8. Choose response rate

Input: Choose from 0% to 100% of the correct response rate from the student

before the teacher moves on to a new question

9. Choose amount of wait-time6

Input: Choose from 0 to 10 seconds of post question wait-time before teacher

calls on a student for response to the question

10. Choose amount of praise from the teacher

Input: 0 = great deal of praise from the teacher

1 = little or no praise from the teacher

11. Choose amount of time negative feedback is simple negation

versus personal criticism

Input: Choose from 0% to 100% of the time teacher should use simple negation

versus personal criticism for incorrect responses to questions

6 In an attempt to continually improve the model, in the second validation study this question was modified to read from 0 to 3 seconds.


12. Choose amount of time correct responses are acknowledged

Input: Choose from 0% to 100% of the time in which correct responses are

acknowledged by the teacher

13. Choose amount of success in independent seat work

Input: Choose from 0% to 100% of the time in which student experiences success

in seat work assignments

14. Choose amount of help available during seat work

Input: 1 = help is readily available

3 = help is not readily available

15. Choose student cumulative G.P.A. (from 0.0 to 4.0)

Input: Grade Point Average (GPA) from previous year

4 = A; 3 = B; 2 = C, 1 = D, 0 = F

16. Choose amount of teacher experience7

Input: Select amount of experience in years from 0 to 10

17. Choose amount of organizational skills

Input: Choose 1 for high amount of organization

Choose .5 for fair amount of organization

Choose 0 for not much organization

18. Choose amount of hours worked per day8

Input: Choose from 8 to 16 hours a day worked by the teacher

7 In an attempt to continually improve the model, in the second validation study this question was modified to read from 0 to 40 years.

8 Optional section based upon assumptions, not on knowledge synthesis (KS) findings.

APPENDIX E

Post-Simulation Questionnaire To Be Used For Validation Purposes.

age range: under 30 ___   30-50 ___   over 50 ___

nature of work: teacher ___   administrator ___   other ___

grade level: elementary ___   middle school ___   high school ___

degree of education: bachelor ___   master ___   doctor ___

Please indicate how important the following factors are to you in determining
your perception of the outcomes of the teacher/student simulation session. Each
statement below is rated on the same five-point scale:

______ ______ ______ ______ ______
strongly agree / agree / neither agree nor disagree / disagree / strongly disagree

1. Computer simulation of a classroom setting is very important.

2. Direct instruction (active teaching) is very important.

3. The computer simulation truly reflected what really happens in the classroom.

4. I feel very confident that the research findings used in this study were very accurately modeled in the computer simulation.

5. The computer simulation session was very helpful in identifying appropriate behaviors for a given classroom setting.

6. The computer simulation is a very good way to demonstrate to teachers how certain behaviors function with one group of students while not functioning with another group.

7. The computer simulation helped me to better understand the complexities of classroom teaching in general.

8. I look forward to more computer simulations of this type.

APPENDIX F

System Dynamics Computer Simulation Model Formulae

Burnout Sector

Burnout(t) = Burnout(t - dt) + (increase_in_burnout - dissipation) * dt

INIT Burnout = 0

DOCUMENT: Assuming more hours worked during months approaching each

semester break (final exams, report cards, programs, and parent conferences)

increase_in_burnout = IF (hours_worked_per_day ≤ 8) THEN (0)

ELSE (hours_worked_per_day - 8)

dissipation = Burnout*dissipation_frac

dissipation_frac = GRAPH(Burnout)

(0.00, 0.25), (10.0, 0.24), (20.0, 0.229), (30.0, 0.216), (40.0, 0.198), (50.0, 0.182), (60.0,

0.163), (70.0, 0.145), (80.0, 0.121), (90.0, 0.09), (100, 0.0175)

hours_worked_per_day = GRAPH(time)

(1.00, 8.08), (18.9, 8.92), (36.8, 10.3), (54.7, 14.2), (72.6, 13.4), (90.5, 10.0), (108, 9.44),

(126, 9.36), (144, 10.9), (162, 13.3), (180, 13.3)

DOCUMENT: Input:

Choose from 8 to 16 hours a day worked by the teacher

impact_of_burnout_on_enthusiasm = GRAPH(Burnout)


(0.00, 100), (10.0, 56.0), (20.0, 20.0), (30.0, 11.0), (40.0, 9.00), (50.0, 6.00), (60.0, 3.50),

(70.0, 3.00), (80.0, 2.50), (90.0, 1.00), (100, 0.00)
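The Burnout sector above is a standard stock-and-flow structure. As an illustration only (not code from the dissertation's model), the following Python sketch shows how such a stock can be advanced by Euler integration, with the ithink! GRAPH lookups approximated by linear interpolation; the dissipation table values are copied from the listing above, and the helper names (graph, step_burnout) are invented for this sketch.

```python
# Sketch only: the Burnout stock advanced by Euler integration, as system
# dynamics tools do internally.  GRAPH lookups are approximated here with
# linear interpolation; table values come from the listing above.

DISSIPATION_FRAC = [(0, 0.25), (10, 0.24), (20, 0.229), (30, 0.216),
                    (40, 0.198), (50, 0.182), (60, 0.163), (70, 0.145),
                    (80, 0.121), (90, 0.09), (100, 0.0175)]

def graph(points, x):
    """Piecewise-linear lookup clamped at the table ends (like ithink! GRAPH)."""
    if x <= points[0][0]:
        return points[0][1]
    if x >= points[-1][0]:
        return points[-1][1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

def step_burnout(burnout, hours_worked_per_day, dt=1.0):
    """One step of Burnout(t) = Burnout(t - dt) + (increase - dissipation) * dt."""
    increase = 0.0 if hours_worked_per_day <= 8 else hours_worked_per_day - 8
    dissipation = burnout * graph(DISSIPATION_FRAC, burnout)
    return burnout + (increase - dissipation) * dt

burnout = 0.0  # INIT Burnout = 0
for _ in range(30):
    burnout = step_burnout(burnout, hours_worked_per_day=12)
```

Run at a constant 12 hours per day, the stock rises toward the level where the dissipation outflow balances the 4-unit daily inflow.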

Giving Information

ENTHUSIASM(t) = ENTHUSIASM(t - dt) + (change_in_enthusiasm) * dt

INIT ENTHUSIASM = 75

DOCUMENT: "Enthusiasm . . . often correlates with achievement" (Brophy &

Good, 1986, p. 362)

change_in_enthusiasm = (impact_of_burnout_on_enthusiasm-ENTHUSIASM)

/adjustment_delay
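The change_in_enthusiasm equation is a classic first-order smoothing structure: each step, the stock closes a fraction of the gap to its target, governed by the adjustment delay. A minimal sketch, assuming an arbitrary target value standing in for impact_of_burnout_on_enthusiasm (the function name is invented for this illustration):

```python
# Sketch only: first-order adjustment of a stock toward a target, the
# structure used for ENTHUSIASM above.

def smooth(stock, target, adjustment_delay, dt=1.0):
    """stock(t) = stock(t - dt) + ((target - stock) / adjustment_delay) * dt"""
    return stock + ((target - stock) / adjustment_delay) * dt

enthusiasm = 75.0  # INIT ENTHUSIASM = 75
for _ in range(10):
    enthusiasm = smooth(enthusiasm, target=20.0, adjustment_delay=2)
```

With adjustment_delay = 2 and dt = 1, half of the remaining gap closes each step, so after ten steps the stock sits within a small fraction of a point of the target.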

Quality_of_Structuring(t) = Quality_of_Structuring(t - dt) +

(changes_in_structuring) * dt

INIT Quality_of_Structuring = 0

changes_in_structuring = MEAN((MANAGEMENT_IMPACT*GRADE_&_MANAGEMENT),

REDUNDANCY_IMPACT, (ENTHUSIASM*GRADE_&_ENTHUSIASM),

ACADEMIC_INTERACTIONS_IMPACT,

(CLARITY_IMPACT*GRADE_&_CLARITY)) - Quality_of_Structuring

adjustment_delay = 2

amount_of_interactions = 24

DOCUMENT: Input:

Choose from 0 to 24 teacher question/student answer interactions per 50 minute

class period


redundancy = 1

DOCUMENT: Input:

0 = little or no degree of redundancy of information given to students

1 = high degree of redundancy of information given to students

vagueness_factor = 0

DOCUMENT: Input:

Choose a number from 0 to 5, whereby 0 indicates the least amount of vagueness

in terminology and 5 indicates the most amount of vagueness in terminology

ACADEMIC_INTERACTIONS_IMPACT = GRAPH(amount_of_interactions)

(0.00, 0.00), (12.0, 61.0), (24.0, 100)

DOCUMENT: "About 24 questions were asked per 50 minute period in the high

gain classes, . . . In contrast, only about 8.5 questions were asked per period in

the low-gain classes . . . " (Brophy & Good, 1986, p. 343).

CLARITY_IMPACT = GRAPH(vagueness_factor)

(0.00, 100), (1.25, 67.5), (2.50, 40.0), (3.75, 20.5), (5.00, 1.50)

DOCUMENT: "Smith and Land (1981) report that adding vagueness terms to

otherwise identical presentations reduced student achievement in all of 10

studies in which vagueness was manipulated" (Brophy & Good, 1986, p. 355).

GRADE_&_CLARITY = GRAPH(GRADE_LEVEL)

(0.00, 0.8), (1.20, 0.865), (2.40, 0.915), (3.60, 0.955), (4.80, 0.985), (6.00, 1.00), (7.20,

1.00), (8.40, 1.00), (9.60, 1.00), (10.8, 1.00), (12.0, 1.00)

DOCUMENT: "In later grades, lessons are typically with the whole class and

involve applications of basic skills or consideration of more abstract content.


Overt participation is less important than factors such as . . . clarity of statements

and questions. . . " (Brophy & Good, 1986, p. 365).

GRADE_&_ENTHUSIASM = GRAPH(GRADE_LEVEL)

(0.00, 0.805), (1.20, 0.86), (2.40, 0.915), (3.60, 0.97), (4.80, 0.995), (6.00, 1.00), (7.20,

1.00), (8.40, 1.00), (9.60, 1.00), (10.8, 1.00), (12.0, 1.00)

DOCUMENT: "[Enthusiasm] often correlates with achievement, especially for

older students" (Brophy & Good, 1986, p. 362). "In later grades, lessons are

typically with the whole class and involve applications of basic skills or

consideration of more abstract content. Overt participation is less important than

factors such as . . . enthusiasm. . . " (p. 365).

GRADE_&_MANAGEMENT = GRAPH(GRADE_LEVEL)

(0.00, 1.00), (1.20, 1.00),

(2.40, 1.00), (3.60, 1.00), (4.80, 1.00), (6.00, 1.00), (7.20, 0.985), (8.40, 0.965), (9.60,

0.93), (10.8, 0.88), (12.0, 0.805)

DOCUMENT: "In the early grades, classroom management involves a great deal

of instruction in desired routines and procedures. Less of this instruction is

necessary in the later grades. . . " (Brophy & Good, 1986, p. 365).

REDUNDANCY_IMPACT = GRAPH(redundancy)

(0.00, 50.0), (1.00, 100)

DOCUMENT: "Achievement is higher when information is presented with a

degree of redundancy, particularly in the form of repeating and reviewing

general rules and key concepts" (Brophy & Good, 1986, p. 362).


Handling Seatwork

Level_of_Independence(t) = Level_of_Independence(t - dt) + (change_in

_independence) * dt

INIT Level_of_Independence = 100

change_in_independence = (SUCCESS_RATE_IMPACT-Level_of_Independence)

/AVAILABLE_HELP_IMPACT

AVAILABLE_HELP_IMPACT = help_available

help_available = 1

DOCUMENT: Input:

1 if help is readily available

3 if help is not readily available

success_rate = 90

DOCUMENT: Input:

Choose amount of time, from 0% to 100% of the time, in which student

experiences success in seatwork assignments

SUCCESS_RATE_IMPACT = IF (success_rate>89) THEN 100 ELSE 75

DOCUMENT: "For assignments on which students are expected to work on their

own, success rates will have to be very high—near 100%. Lower (although still

generally high) success rates can be tolerated when students who need help get it

quickly" (Brophy & Good, 1986, p. 364).
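The Level_of_Independence equations above form a first-order goal-seeking loop: the stock chases SUCCESS_RATE_IMPACT at a speed set by how quickly help is available. A minimal Python sketch of the Euler update follows; the dt value, step count, and function names are illustrative assumptions, not part of the STELLA listing.

```python
# Euler integration of the Level_of_Independence stock from the listing.
# dt and the step count are illustrative; STELLA sets its own DT.

def success_rate_impact(success_rate):
    # SUCCESS_RATE_IMPACT = IF (success_rate > 89) THEN 100 ELSE 75
    return 100 if success_rate > 89 else 75

def simulate_independence(success_rate=90, help_available=1,
                          init_level=100, dt=0.25, steps=48):
    level = init_level
    for _ in range(steps):
        # change_in_independence =
        #   (SUCCESS_RATE_IMPACT - Level_of_Independence) / AVAILABLE_HELP_IMPACT
        change = (success_rate_impact(success_rate) - level) / help_available
        level += change * dt
    return level
```

With help readily available (help_available = 1) the stock settles quickly; with help_available = 3 the same gap closes a third as fast per step, matching the input documentation above.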


Questioning the Students

Management_of_Response_Opportunities(t) =

Management_of_Response_Opportunities(t - dt) + (change_in_management) * dt

INIT Management_of_Response_Opportunities = 100

change_in_management = (IF (STUDENT_CALLOUTS_IMPACT=0) THEN

(questioning_factors) else ((questioning_factors*3)+STUDENT_CALLOUTS

_IMPACT)/4)-Management_of_Response_Opportunities

percent_of_correct_responses = 75

DOCUMENT: Input:

Choose from 0% to 100% of the correct response rate from the student before the

teacher moves on to a new question

percent_of_higher_level_questions = 25

DOCUMENT: Input:

Choose from 0% to 100% of higher order type questions used by the teacher

postquestion_wait_time = 3

DOCUMENT: Input:

Choose from 0 to 10 seconds of postquestion wait-time before teacher calls on a

student for response to the question

questioning_factors = MEAN(CLARITY_IMPACT, COGNITIVE_LEVEL

_IMPACT, SUCCCESS_RATE_IMPACT, WAIT_TIME_IMPACT)

student_call-outs = 1


DOCUMENT: Input:

1 if student call-outs are allowed by teacher

3 if student call-outs are not allowed by teacher

COGNITIVE_LEVEL_IMPACT = GRAPH(percent_of_higher_level_questions)

(0.00, 0.00), (5.26, 59.5), (10.5, 79.5), (15.8, 91.5), (21.1, 100), (26.3, 100), (31.6, 96.5),

(36.8, 92.5), (42.1, 85.5), (47.4, 79.5), (52.6, 72.0), (57.9, 65.0), (63.2, 56.0), (68.4, 44.5),

(73.7, 34.5), (78.9, 27.0), (84.2, 18.0), (89.5, 11.5), (94.7, 5.00), (100.0, 0.5)

DOCUMENT: ". . . the frequency of higher-level questions correlates positively

with achievement, the absolute numbers on which these correlations are based

typically show that only about 25% of the questions were classified as higher

level" (Brophy & Good, 1986, p. 363).

STUDENT_CALLOUTS_IMPACT = GRAPH(SES+student_call-outs)

(1.00, 100), (2.00, -100), (3.00, 0.00), (4.00, 0.00)

DOCUMENT: "Student call-outs usually correlate positively with achievement

in low-SES classes but negatively in high-SES classes" (Brophy & Good, 1986, p.

363).
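The change_in_management flow defined at the top of this section encodes a conditional blend: when the call-out lookup returns 0 (the neutral cases in its graph), it is ignored entirely; otherwise it is averaged in at one-quarter weight against the other questioning factors. A short Python paraphrase of that branch (the function name is an illustrative assumption; the logic mirrors the listing):

```python
def change_in_management(questioning_factors, callouts_impact, current_level):
    # IF (STUDENT_CALLOUTS_IMPACT = 0) THEN questioning_factors
    # ELSE (questioning_factors * 3 + STUDENT_CALLOUTS_IMPACT) / 4,
    # minus the current stock value (goal-seeking flow)
    if callouts_impact == 0:
        goal = questioning_factors
    else:
        goal = (questioning_factors * 3 + callouts_impact) / 4
    return goal - current_level
```

The 3:1 weighting keeps a single SES/call-out mismatch from swamping the clarity, cognitive-level, success-rate, and wait-time factors.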

SUCCCESS_RATE_IMPACT = GRAPH(percent_of_correct_responses)

(0.00, 0.00), (10.0, 21.0), (20.0, 41.0), (30.0, 62.5), (40.0, 81.0), (50.0, 90.0), (60.0, 96.5),

(70.0, 100), (80.0, 100), (90.0, 96.5), (100, 86.0)

DOCUMENT: "Optimal learning occurs when students move at a brisk pace but

in small steps, so that they experience continuous progress and high success rates

(averaging perhaps 75% during lessons when a teacher is present, and 90-100%

when the students must work independently)" (Brophy & Good, 1986, p. 341).


WAIT_TIME_IMPACT = GRAPH(postquestion_wait_time)

(0.00, 0.00), (0.273, 25.5), (0.545, 45.0), (0.818, 59.5), (1.09, 70.5), (1.36, 79.5), (1.64,

85.0), (1.91, 90.0), (2.18, 93.5), (2.45, 97.0), (2.73, 99.5), (3.00, 100)

DOCUMENT: "Studies . . . have shown higher achievement when teachers pause

for about 3 seconds (rather than 1 second or less) after a question, to give the

students time to think before calling on one of them" (Brophy & Good, 1986, p.

363).
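Every GRAPH(...) entry in this listing is a piecewise-linear lookup table: between listed points STELLA interpolates, and outside the listed range it clamps to the end values. A Python sketch of that lookup, applied to the WAIT_TIME_IMPACT points above (graph_lookup is a hypothetical helper, not part of the model):

```python
from bisect import bisect_right

def graph_lookup(points, x):
    """Piecewise-linear lookup, clamped at both ends, as STELLA's GRAPH is."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect_right(xs, x)  # first breakpoint strictly above x
    x0, x1 = xs[i - 1], xs[i]
    y0, y1 = ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# WAIT_TIME_IMPACT points copied from the listing
WAIT_TIME_POINTS = [(0.00, 0.00), (0.273, 25.5), (0.545, 45.0), (0.818, 59.5),
                    (1.09, 70.5), (1.36, 79.5), (1.64, 85.0), (1.91, 90.0),
                    (2.18, 93.5), (2.45, 97.0), (2.73, 99.5), (3.00, 100)]
```

Clamping explains why the input allows wait times up to 10 seconds while this table stops at 3: anything past 3 seconds simply holds the maximum impact of 100, consistent with the Brophy and Good finding that the gain comes from pausing about 3 seconds rather than 1 second or less.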

Reacting to Student Response

Quality_of_Teacher_Reactions(t) = Quality_of_Teacher_Reactions(t - dt) +

(change_in_reactions) * dt

INIT Quality_of_Teacher_Reactions = 0

change_in_reactions = (collective_responses-Quality_of_Teacher_Reactions)/

delay_in_adjusting

collective_responses = MEAN(CORRECT_RESPONSE_FEEDBACK_IMPACT,

INCORRECT_RESPONSE_IMPACT,(TEACHER_PRAISE_IMPACT * IMPACT

_PRAISE))

correct_response_feedback = 90

DOCUMENT: Input:

Choose amount of time, from 0% to 100% of the time, in which correct responses

are acknowledged by the teacher

incorrect_response_feedback = 100


DOCUMENT: Input:

Choose from 0% to 100% of the time teacher should use simple negation versus

personal criticism for incorrect responses to questions

teacher_praise = 0

DOCUMENT: Input:

0 = great deal of praise from the teacher

1 = little or no praise from the teacher

TEACHER_PRAISE_IMPACT = IF(SES+teacher_praise=0) OR

(SES+teacher_praise=2) THEN 100 else 75

DOCUMENT: "High-SES students . . . do not require a great deal of . . . praise.

Low-SES students . . . need more . . . praise for their work" (Brophy & Good,

1986, p. 365).

CORRECT_RESPONSE_FEEDBACK_IMPACT =

GRAPH(correct_response_feedback)

(0.00, 0.00), (4.76, 3.50), (9.52, 6.00), (14.3, 8.50), (19.0, 11.0), (23.8, 13.0), (28.6, 15.0),

(33.3, 17.5), (38.1, 21.0), (42.9, 25.5), (47.6, 31.0), (52.4, 38.5), (57.1, 46.5), (61.9, 56.0),

(66.7, 66.5), (71.4, 74.5), (76.2, 83.5), (81.0, 92.0), (85.7, 100), (90.5, 100), (95.2, 100),

(100.0, 93.5)

DOCUMENT: "Correct responses should be acknowledged as such, because

even if the respondent knows that the answer is correct, some of the onlookers

may not. Ordinarily (perhaps 90% of the time) this acknowledgement should

take the form of overt feedback" (Brophy & Good, 1986, p. 362).


delay_in_adjusting = GRAPH(collective_responses/Quality_of_Teacher

_Reactions)

(0.9, 0.35), (1.00, 2.00), (1.10, 10.0)

DOCUMENT: The graph as drawn makes adjustment sticky upward and

slippery downward.

IMPACT_PRAISE = GRAPH(GRADE_LEVEL)

(0.00, 1.00), (1.20, 1.00), (2.40, 1.00), (3.60, 1.00), (4.80, 1.00), (6.00, 1.00), (7.20,

0.985), (8.40, 0.965), (9.60, 0.93), (10.8, 0.88), (12.0, 0.795)

DOCUMENT: "Praise and symbolic rewards that are common in the early

grades give way to the more impersonal and academically centered instruction

common in the later grades" (Brophy & Good, 1986, p. 365).

INCORRECT_RESPONSE_IMPACT = GRAPH(incorrect_response_feedback)

(0.00, 0.00), (10.0, 0.00), (20.0, 8.00), (30.0, 11.5), (40.0, 16.5), (50.0, 23.5), (60.0, 29.5),

(70.0, 38.5), (80.0, 53.0), (90.0, 100), (100, 100)

DOCUMENT: "Following incorrect answers, teachers should begin by indicating

that the response is not correct. Almost all (99%) of the time, this negative

feedback should be simple negation rather than personal criticism, although

criticism may be appropriate for students who have been persistently

inattentive" (Brophy & Good, 1986, p. 364).
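The delay_in_adjusting lookup earlier in this sector makes the Quality_of_Teacher_Reactions stock adjust asymmetrically: the delay is short when the incoming collective_responses signal falls below the stock (ratio < 1) and long when it rises above it (ratio > 1). A hedged Python sketch of one Euler step, hard-coding the three lookup points as straight-line segments (the dt value and function names are assumptions):

```python
def delay_in_adjusting(ratio):
    # GRAPH points from the listing: (0.9, 0.35), (1.0, 2.0), (1.1, 10.0),
    # interpolated between points and clamped outside them
    if ratio <= 0.9:
        return 0.35
    if ratio <= 1.0:
        return 0.35 + (ratio - 0.9) / 0.1 * (2.0 - 0.35)
    if ratio <= 1.1:
        return 2.0 + (ratio - 1.0) / 0.1 * (10.0 - 2.0)
    return 10.0

def step_quality(quality, collective, dt=0.25):
    # change_in_reactions = (collective_responses - Quality) / delay
    # (assumes quality > 0; the listing's INIT of 0 would need special handling)
    delay = delay_in_adjusting(collective / quality)
    return quality + (collective - quality) / delay * dt
```

Starting from the same level, an upward pull moves the stock only slightly (delay near 10) while an equal downward pull moves it far (delay near 0.35): the "sticky upward, slippery downward" behavior noted in the listing.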


Student

Current_Student_Achievement(t) = Current_Student_Achievement(t - dt) +

(achievement_change) * dt

INIT Current_Student_Achievement = 0

achievement_change = behavior_impact-Current_Student_Achievement

behavior_impact = MEAN(Level_of_Independence, Management_of_Response

_Opportunities, Quality_of_Structuring, Quality_of_Teacher_Reactions,

(Teacher_Expectation*GRADE_&_EXPECTATIONS))

GRADE_LEVEL = 6

DOCUMENT: Input:

Choose a number from 0 (Kindergarten) to 12

SES = 0

DOCUMENT: Input:

0 = low-Socioeconomic status (SES) or dependent/anxious

1 = high-SES or assertive/confident

GRADE_&_EXPECTATIONS = GRAPH(GRADE_LEVEL)

(0.00, 0.8), (1.20, 0.875), (2.40, 0.925), (3.60, 0.96), (4.80, 0.99), (6.00, 1.00), (7.20,

1.00), (8.40, 1.00), (9.60, 1.00), (10.8, 1.00), (12.0, 1.00)

DOCUMENT: ". . . in the later grades . . . it becomes especially important to be

clear about expectations . . ." (Brophy & Good, 1986, p. 365).
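The Student sector ties the five teacher-behavior stocks together: achievement chases their unweighted mean, with Teacher_Expectation first scaled by the GRADE_&_EXPECTATIONS lookup. A brief Python sketch (names mirror the listing; the dt value is an illustrative assumption):

```python
def behavior_impact(independence, response_mgmt, structuring, reactions,
                    expectation, grade_expectations=1.0):
    # MEAN(Level_of_Independence, Management_of_Response_Opportunities,
    #      Quality_of_Structuring, Quality_of_Teacher_Reactions,
    #      Teacher_Expectation * GRADE_&_EXPECTATIONS)
    parts = [independence, response_mgmt, structuring, reactions,
             expectation * grade_expectations]
    return sum(parts) / len(parts)

def step_achievement(achievement, impact, dt=0.25):
    # achievement_change = behavior_impact - Current_Student_Achievement
    return achievement + (impact - achievement) * dt
```

Because the five inputs enter with equal weight, a shortfall in any one sector pulls the achievement goal down by at most one-fifth of that shortfall.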


Teacher

Average_Management_Level(t) = Average_Management_Level(t - dt) +

(change_in_managing) * dt

INIT Average_Management_Level = 0

change_in_managing = (EXPERIENCE_IMPACT-Average_Management_Level) *

organizational_skills

Teacher_Expectation(t) = Teacher_Expectation(t - dt) + (change_in_pcvd_ability)

* dt

INIT Teacher_Expectation = 0

DOCUMENT: "Achievement is maximized when teachers . . . expect their

students to master the curriculum" (Brophy & Good, 1986, p. 360).

"Early in the year teachers form expectations about each student's academic

potential and personality. . . If the expectations are low . . . the student's

achievement and class participation suffers" (Dunkin, 1987, p. 25).

change_in_pcvd_ability = ((ACADEMIC_POTENTIAL_INDICATOR*25)-

Teacher_Expectation)

ACADEMIC_POTENTIAL_INDICATOR = 4

DOCUMENT: Input:

Grade Point Average (GPA) from previous year

4=A; 3=B; 2=C; 1=D; 0=F


organizational_skills = 1

DOCUMENT: Input:

Choose 1 for high amount of organization

Choose .5 for fair amount of organization

Choose 0 for not much organization

years_experience = 10

DOCUMENT: Input:

Select amount of experience in years from 0 to 10

EXPERIENCE_IMPACT = GRAPH(years_experience)

(0.00, 0.00), (1.00, 3.50), (2.00, 13.5), (3.00, 27.0), (4.00, 52.5), (5.00, 71.0), (6.00, 83.0),

(7.00, 90.0), (8.00, 95.5), (9.00, 98.5), (10.0, 100)

DOCUMENT: "... the majority of teachers solved only 5 of the original 18

teaching problems of first-year teachers in fewer than 3 years. Several years may

be required for teachers to solve problems such as classroom management and

organization" (Alkin, 1992, p. 1382).

MANAGEMENT_IMPACT = GRAPH(Average_Management_Level)

(0.00, 0.5), (10.0, 40.0), (20.0, 61.0), (30.0, 73.5), (40.0, 83.0), (50.0, 87.0), (60.0, 91.0),

(70.0, 94.0), (80.0, 95.5), (90.0, 98.0), (100, 100)

DOCUMENT: "Students learn more in classrooms where teachers establish

structures that limit pupil freedom of choice, physical movement, and

disruption, and where there is relatively more teacher talk and teacher control of

pupils' task behavior" (Brophy & Good, 1986, p. 337).
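The Teacher sector's management stock can be sketched the same way: Average_Management_Level rises toward the experience lookup's value at a rate scaled by organizational_skills, so a value of 0 freezes the stock entirely. A minimal Python sketch using the listed EXPERIENCE_IMPACT points at whole-year inputs (the dt value and step count are illustrative assumptions):

```python
# EXPERIENCE_IMPACT points from the listing, keyed by whole years
EXPERIENCE_POINTS = {0: 0.0, 1: 3.5, 2: 13.5, 3: 27.0, 4: 52.5, 5: 71.0,
                     6: 83.0, 7: 90.0, 8: 95.5, 9: 98.5, 10: 100.0}

def simulate_management(years_experience=10, organizational_skills=1.0,
                        dt=0.25, steps=48):
    # change_in_managing =
    #   (EXPERIENCE_IMPACT - Average_Management_Level) * organizational_skills
    target = EXPERIENCE_POINTS[years_experience]
    level = 0.0  # INIT Average_Management_Level = 0
    for _ in range(steps):
        level += (target - level) * organizational_skills * dt
    return level
```

The multiplicative organizational_skills term means a disorganized teacher never converts experience into classroom management in this model, whereas in the other sectors the flows divide by a delay and always converge eventually.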


VITA

Jorge O. Nelson was born in Vancouver, WA, on September 28, 1957. He

attended elementary, junior and senior high school in Fremont, Nebraska,

graduating in 1975. He graduated from the Harry Lundberg School of

Seamanship, Piney Point, MD, in 1978 as an ordinary seaman. Following a short

tour of duty in the U.S. Merchant Marine, he entered Tacoma Community

College, Tacoma, WA, in 1980 and graduated with an Associate in Arts and

Sciences in 1983. He entered The Evergreen State College, Olympia, WA, in 1983

and graduated with a Bachelor of Arts, with a major in Elementary Education and a

minor in Drama, in 1985. He received his teaching credential from the University of

Puget Sound, Tacoma, WA, in 1985. In 1985 he started his teaching career in a

self-contained sixth grade class at the International School Bangkok, Thailand.

He was hired by the International School Islamabad, Pakistan, in 1987 to teach

middle and high school technology education. He enrolled in a degree program

through Michigan State University, MI, in 1985 and graduated with a Master of

Arts in Curriculum and Teaching in 1988. In 1990, he began his administrative

work at the American School of Asunción, Paraguay, as Assistant Director. In

1992, after receiving a doctoral fellowship sponsored by the Office of Overseas

Schools, U.S. Department of State and The University of Memphis, he moved to

Memphis, TN, and enrolled as a doctoral student in the College of Education,

Department of Leadership. He was awarded the Outstanding Student Award

for Scholarship, Professional Accomplishment, and Commitment in Educational

Leadership in 1994 and graduated with an Ed.D. in Administration and

Supervision in 1995.

He is presently employed as the Director of the American School of

Durango, México.