This article was downloaded by: ["University at Buffalo Libraries"] on 05 October 2014, at 06:59.
Publisher: Routledge. Informa Ltd, registered in England and Wales, Registered Number: 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.
International Journal of Science Education
Publication details, including instructions for authors and subscription information:
http://www.tandfonline.com/loi/tsed20
Preservice Elementary Teachers' Evaluations of Elementary Students' Scientific Models: An aspect of pedagogical content knowledge for scientific modeling
Michele M. Nelson and Elizabeth A. Davis, Department of Educational Studies, School of Education, University of Michigan, Ann Arbor, MI, USA
Published online: 27 Jul 2011.
To cite this article: Michele M. Nelson & Elizabeth A. Davis (2012) Preservice Elementary Teachers' Evaluations of Elementary Students' Scientific Models: An aspect of pedagogical content knowledge for scientific modeling, International Journal of Science Education, 34:12, 1931-1959, DOI: 10.1080/09500693.2011.594103
To link to this article: http://dx.doi.org/10.1080/09500693.2011.594103
Preservice Elementary Teachers’
Evaluations of Elementary Students’
Scientific Models: An aspect of
pedagogical content knowledge for
scientific modeling
Michele M. Nelson∗ and Elizabeth A. Davis
Department of Educational Studies, School of Education, University of Michigan, Ann Arbor, MI, USA
Part of the work of teaching elementary science involves evaluating elementary students’ work.
Depending on the nature of the student work, this task can be straightforward. However,
evaluating elementary students’ representations of their science learning in the form of scientific
models can pose significant challenges for elementary teachers. To address some of these
challenges, we incorporated a modeling-based elementary science unit in our elementary science
teaching methods course to support preservice teachers in gaining knowledge about and
experience in evaluating students’ scientific models. In this study, we investigate the approaches
and criteria preservice elementary teachers use to evaluate elementary student-generated
scientific models. Our findings suggest that with instruction, preservice elementary teachers can
adopt criterion-based approaches to evaluating students’ scientific models. Additionally,
preservice teachers make gains in their self-efficacy for evaluating elementary students’ scientific
models. Taken together, these findings indicate that preservice teachers can begin to develop
aspects of pedagogical content knowledge for scientific modeling.
Keywords: Pre-service; Modeling; Primary school
Introduction
Elementary school teachers are noted for being subject matter generalists. As part of
their everyday work, elementary teachers teach lessons in very different subject areas;
International Journal of Science Education
Vol. 34, No. 12, August 2012, pp. 1931–1959
∗Corresponding author. Horizon Research, Inc., 326 Cloister Ct., Chapel Hill, NC 27514, USA.
Email: [email protected]
ISSN 0950-0693 (print)/ISSN 1464-5289 (online)/12/121931–29
© 2012 Taylor & Francis
http://dx.doi.org/10.1080/09500693.2011.594103
from math to health, reading to social studies, and everything in between. To properly
prepare future elementary teachers for this range of responsibilities, teacher education
programs emphasize subject-specific pedagogies as part of required teaching methods
courses in the core academic subject areas: math, language arts, social studies, and
science. In elementary science teaching methods courses, preservice elementary tea-
chers learn principles and practices conducive to effective science teaching and thus,
student learning in science. One artifact that can be used to assess students’ science
learning is their scientific models. Here, models serve as visual representations of
the students’ understandings of the science being portrayed. Therefore, teachers
can gain valuable insights into student learning by carefully analyzing these forms
of student work. However, skillful evaluation of elementary students’ scientific
models is not an obvious or intuitive practice for preservice elementary teachers. In
this study, we examine four preservice elementary teachers’ developing scientific
model evaluation practices during the semester of their undergraduate elementary
science teaching methods course.
Theoretical Framework and Literature Review
Authentic scientific practices in teaching and learning science are central to ongoing
reforms within science education (American Association for the Advancement of
Science, 1993, 2009; Ford & Wargo, 2007; National Research Council, 1996,
2010). Recently, scientific modeling has received increasing attention as an instru-
mental pedagogy in the inquiry-oriented teaching and learning of science (Kenyon,
Schwarz, & Hug, 2008; Schwarz, 2009; Windschitl, Thompson, & Braaten,
2008a). In scientific modeling-based instructional approaches, students engage in
many authentic scientific practices by creating and using representations of their
understandings of science concepts in the form of scientific models. Scientific
models are two- or three-dimensional representations that highlight the central fea-
tures and key relationships among components of a simplified system or scientific
phenomena for the purposes of understanding, communicating, and/or generating
predictions about the system or phenomena in question (e.g., Gilbert & Boulter,
2001; Harrison & Treagust, 2000). After they construct models, students use the
models for communicative and sense-making purposes, evaluate the model’s fitness
according to a set of criteria related to its purpose, and revise the model to better
align with those evaluation criteria. Taken together, model construction, use, evaluation,
and revision comprise the elements of modeling practice. Modeling practice is a non-
linear, iterative approach to learning science content, in which students take an
active, evidence-based role in reshaping their own conceptual understandings of the
science content (Schwarz et al., 2009). At the same time, students gain experience
with multiple authentic procedural aspects of learning and doing science, including
making sense of data, generating and revisiting predictions, and engaging in scientific
argumentation, consistent with science education standards and reform-oriented
goals for students’ science learning. Thus, incorporating a focus on scientific model-
ing into a typical elementary science methods course provides rich opportunities for
1932 M. M. Nelson and E. A. Davis
preservice elementary teachers to engage in multiple authentic scientific practices
themselves, and to explore and develop proficiency in engaging children in those
same scientific practices. While not the only choice science educators might make,
adding an emphasis on scientific modeling provides numerous affordances for
supporting novices in becoming well-started beginning science teachers.
In order to teach science using often unfamiliar modeling-based approaches,
preservice teachers must learn what scientific modeling entails and learn how to
apply this knowledge for pedagogical purposes (Justi & van Driel, 2005; Kenyon,
Davis, & Hug, 2011; Windschitl, 2003; Windschitl & Thompson, 2006). Supporting
students’ active engagement in their own science learning using modeling-based ped-
agogies is demanding work, particularly for beginning teachers (Crawford & Cullin,
2004; De Jong, van Driel, & Verloop, 2005; Schwarz, 2009). In addition to helping
students understand the content, teachers must also help students develop under-
standings about the nature of scientific models and modeling practice. For
example, when working with a model of the solar system, students should recognize
that the three-dimensional model represents the order of the planets relative to the
sun, and they should also form understandings about the nature of this representation
as a scientific model—namely, that the model need not be an accurate scalar represen-
tation in order to be a useful learning tool, and that the model can be revised on the
basis of new understandings and scientific evidence, as was recently demonstrated by
the changed status of Pluto. In essence, then, teachers must help students understand
not only factual and conceptual aspects of the science content involved, but also help
students see how scientific models and modeling can be useful in developing and
enhancing their own science content understandings.
Another key element of effective, model-centered science teaching is teacher self-
efficacy (Bandura, 1997) for teaching using modeling-based pedagogies. In other
words, teachers need to believe that they are capable of successfully teaching
science using modeling-based pedagogies to achieve desired student learning out-
comes. Preservice elementary teachers often need extensive support in developing
self-efficacy for science teaching in general, for a variety of reasons including low
confidence in their science content knowledge and lack of familiarity with inquiry-
oriented science pedagogies (Bleicher, 2004, 2006; Howitt, 2007; Schoon &
Boone, 1998; Sherman & MacDonald, 2007; Watters & Diezmann, 2007). Scientific
modeling is one such inquiry-oriented science pedagogy that is typically new to
preservice teachers, but with carefully designed methods course experiences, they
can become more familiar with what scientific modeling and modeling-based
pedagogies entail (Crawford & Cullin, 2004; Justi & van Driel, 2005; Kenyon
et al., 2011; Schwarz, 2009; Windschitl & Thompson, 2006; Windschitl, Thompson,
& Braaten, 2008b).
Part of effectively using scientific modeling in a teaching capacity involves the ability
to interpret, evaluate, and assess student-generated scientific models. Understandings
about the nature and purposes of scientific models and modeling practice, collectively
termed ‘metamodeling knowledge’ (Schwarz & White, 2005), ground the critique of
student-generated scientific models. Teachers must therefore consider students’
Preservice Teachers’ Scientific Model Evaluations 1933
scientific models along several dimensions. For example, teachers must assess stu-
dents’ models for evidence of accurate content understanding as well as evidence of
student understandings about what constitutes a scientific model and whether the
model fulfills its intended purpose. In order to effectively evaluate students’
models, teachers must have an idea of the relevant criteria with which they are evalu-
ating model quality, including model clarity, salience, and consistency with empirical
evidence (Schwarz et al., 2009). They must also have an idea of the pertinent learning
goals, with respect to the science content and inquiry processes associated with scien-
tific models and modeling practice. Finally, teachers must also consider the student’s
model in light of what can reasonably be expected of the student developmentally. In
other words, teachers must be able to look at a student’s scientific model and be able
to simultaneously evaluate it along a variety of dimensions, including using scientific
model evaluation criteria, assessing progress toward desired learning outcomes, and
keeping in mind the guiding principles of metamodeling knowledge (Kenyon et al.,
2011; Schwarz et al., 2009a). Recognition, articulation, and successful execution of
scientific modeling practices in pedagogical capacities are foundational to further
development of specialized teacher knowledge and skills in these areas (Smithey,
2007).
This self-assured ability to evaluate learners’ scientific models is an aspect of a larger
construct that we call ‘pedagogical content knowledge for scientific modeling’, or
PCK for scientific modeling (Schwarz, Meyer, & Sharma, 2007). The notion of
PCK for scientific modeling builds upon the theoretical and practical work of many
others, both in science education and in education more broadly (e.g., Grossman,
1990; Magnusson, Krajcik, & Borko, 1999; Shulman, 1986; van Driel, De Jong, &
Verloop, 2002). Magnusson and colleagues have described pedagogical content
knowledge in science teaching as consisting of five components: orientations toward
science teaching, knowledge and beliefs about science curriculum, knowledge and
beliefs about students’ understanding of specific science topics, knowledge and
beliefs about assessment in science, and knowledge and beliefs about instructional
strategies for teaching science. Each component can be considered using scientific
modeling as a lens. Table 1 depicts the scientific modeling analogs of Magnusson
and colleagues’ five components of PCK for science teaching.
For example, PCK-SM entails knowledge and beliefs about when and how scientific
modeling-based instructional strategies are warranted and beneficial for teaching
science. Additionally, PCK-SM for modeling-based instructional strategies in science
entails the tacit component of self-efficacy for effectively employing these pedagogies
to promote student learning. Others have described teacher knowledge and beliefs
about science teaching involving scientific modeling with respect to several of Magnus-
son and colleagues’ five components of PCK in science teaching (Crawford & Cullin,
2004; Justi & Gilbert, 2002; Justi & van Driel, 2005; Schwarz et al., 2007; van Driel
& De Jong, 2001). Here, we build upon this knowledge base as we focus on four pre-
service teachers’ changing knowledge, scientific model evaluation skills, and self-
efficacy for evaluating scientific models as aspects of their developing PCK-SM for
evaluating students’ scientific models (indicated in bold on Table 1).
The following research questions guided the study’s design, execution, and data
analysis:
(1) How do preservice elementary teachers evaluate elementary students’ scientific
models?
and
(2) How do preservice elementary teachers’ knowledge, skills, and self-efficacy for model
evaluation change during the elementary science teaching methods semester?
Table 1. Components of PCK for science teaching and PCK for scientific modeling

Component of PCK for science teaching (from Magnusson et al., 1999): Orientations toward science teaching
PCK for scientific modeling (PCK-SM) analog: Orientations toward science teaching using scientific modeling
What this element of PCK-SM includes: Teacher-directed, learner-directed, activity-based, etc., orientations in the use of scientific modeling pedagogies

Component: Knowledge and beliefs about science curriculum
PCK-SM analog: Knowledge and beliefs about scientific modeling in science curriculum
Includes: Knowledge and beliefs about when scientific modeling is appropriate for the curriculum; how to incorporate scientific modeling into curricula (for which topics, using which modeling practices, etc.)

Component: Knowledge and beliefs about students’ understanding of specific science topics
PCK-SM analog: Knowledge and beliefs about students’ understanding of scientific modeling
Includes: Knowledge and beliefs about students’ understanding of science content represented in scientific models; students’ understanding of scientific modeling practices, epistemology, and the nature of scientific models (metamodeling knowledge); how to interpret and critique students’ scientific models

Component: Knowledge and beliefs about assessment in science
PCK-SM analog: Knowledge and beliefs about assessment in science using scientific modeling
Includes: Knowledge and beliefs about how to assess students’ content and scientific practice knowledge using modeling strategies; when to use scientific modeling to assess students’ understanding of content and scientific practices

Component: Knowledge and beliefs about instructional strategies in science
PCK-SM analog: Knowledge and beliefs about instructional strategies in science using scientific modeling
Includes: Knowledge and beliefs about when and how to use scientific modeling instructionally; when and how to support students in learning about scientific modeling
Methods
Context
The study was conducted as part of an elementary science teaching methods course
for undergraduates pursuing initial elementary teacher certification. As part of this
methods course, and to become familiar with scientific modeling, preservice teachers
experienced a modified version of an elementary science unit focussed on evaporation
and condensation using an inquiry-oriented, scientific modeling-based approach (see
Kenyon et al., 2008). Briefly, preservice teachers observed evaporation and conden-
sation in the context of a solar still made out of a modified 2-liter plastic bottle covered
with plastic wrap, containing a reservoir for the collection of purified condensate from
‘dirty’ water (see depictions in Figs. 1, 2, and 4). Using the solar still and other experi-
ences with the phenomena, preservice teachers created their own scientific models of
evaporation and condensation in the form of drawings. Preservice teachers were then
introduced to scientific model evaluation as a means to analyze and prompt revision of
initial models, with an emphasis on three core, inter-related criteria: consistency with
empirical evidence (i.e., does the model match the data?), sense-making (i.e., does the
model make sense to the user?), and communicative power (e.g., clarity, complete-
ness, etc.) of the models. Other model evaluation criteria, including generativity
and proper use of scientific terminology, were briefly presented as additional criteria
for consideration. Model evaluation using criteria was revisited and reinforced several
times in the methods course.
Participants
Preservice elementary teachers who participated in this study were in their third seme-
ster of a four-semester teacher preparation program at a large midwestern university
in the USA. Four preservice teachers were purposefully selected on the basis of their
initial ideas about scientific models (i.e., a range of responses to a pretest item) to par-
ticipate in all aspects of this study. These case study preservice teachers were typical of
the larger cohort of preservice elementary teachers in terms of age and ethnicity (i.e.,
they were Caucasian and approximately 21 years old), and in terms of the data in this
study. None majored or minored in science, and all had a language arts emphasis
(concentration or minor concentration) as part of their undergraduate teacher edu-
cation. Three of the four focal preservice teachers were placed in local second-
grade classrooms, and the other focal preservice teacher was placed in a local fifth-
grade classroom for the third-semester teaching practicum.
For one data source, all 35 preservice teachers enrolled in the undergraduate
elementary science teaching methods course contributed written data. These
data were included to illustrate cohort-wide patterns in the use of model evaluation
criteria. The first author taught scientific modeling elements of the course and
worked with preservice teachers in the field. The second author was the lead course
instructor, and instructed roughly half the 35-member cohort, but was not the instruc-
tor for any of the four case study teachers.
Data Sources
The primary data sources consisted of one-on-one interviews with the focal four pre-
service elementary teachers in this study and written mid-semester model evaluation
homework assignments from all 35 preservice teachers in the methods course, includ-
ing the four focal preservice teachers. Interviews with preservice elementary teachers
were conducted prior to and following methods course instruction about evaluation of
scientific models. In these interviews, preservice teachers were asked to carry out
‘think aloud’ evaluations of elementary student-generated scientific models (Ericsson
& Simon, 1980). Two elementary student-generated scientific models of evaporation
and condensation in a solar still were labeled ‘Model 1’ and ‘Model 2’ in both pre- and
post-interviews. These models are presented in Figures 1 and 2.
Figure 1. Student model 1 of evaporation and condensation in a solar still
Figure 2. Student model 2 of evaporation and condensation in a solar still
Figure 3 presents the student-generated model of germ transmission that was used
in the post-interview (Piko & Bak, 2006, p. 649), as described below.
For each student-generated model, preservice teachers were first asked to provide a
verbal evaluation of the student’s work and then reflect on the criteria used to evaluate
the student models. Evaporation and condensation models were used in both pre- and
post-interviews to establish comparable model evaluation data. To measure self-
efficacy for model evaluation, preservice teachers were also asked to share how they
felt about their model evaluations in initial and final interviews. The germ
transmission model was used solely in the post-interview and served as an evaluable
student-generated scientific model in a novel content area. Pre- and post-interview
protocols are provided in the appendix.
Written data consisted of a portion of a mid-semester homework assignment in
which preservice teachers were asked to evaluate the model shown in Figure 4
(with the instructions provided to them).
The models in Figures 1, 2, and 4 were selected from a set of elementary student-
generated models of evaporation and condensation in a solar still (Schwarz et al.,
2009). We chose these models because they offered visually very different examples
of student work that could be readily critiqued using model evaluation criteria dis-
cussed in the methods course.
Data Coding and Analysis
Data from interview transcripts and written model evaluations were coded to capture
scientific model evaluation criteria stressed in the methods course, as well as emergent
model evaluation criteria. Table 2 illustrates the codes used in coding these data.
A subset of the interview transcript and written model evaluation data was coded by
another researcher and the results were tested for inter-rater reliability. Initial com-
parisons indicated greater than 80% agreement in assigned codes, and discrepancies
in data coding were resolved through discussion (Porter, 2006). Due to the relatively
small sample sizes (n = 4 and n = 35), statistical analyses were not conducted.
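The percent-agreement check described above can be sketched in a few lines of code. This is an illustrative sketch only: the coder lists and criterion labels below are hypothetical placeholders, not data from the study.

```python
# Sketch of a simple percent-agreement computation between two coders.
# All segment labels below are hypothetical placeholders, not the study's data.

def percent_agreement(coder_a, coder_b):
    """Fraction of segments to which both coders assigned the same code."""
    if len(coder_a) != len(coder_b):
        raise ValueError("coders must rate the same segments")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

coder_a = ["communication", "sense-making", "aesthetics", "terminology", "generativity"]
coder_b = ["communication", "sense-making", "aesthetics", "evidence", "generativity"]

print(percent_agreement(coder_a, coder_b))  # 0.8
```

In the study, segments where the coders disagreed (like the fourth item here) were then resolved through discussion.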
Figure 3. Student model of germ transmission [taken from Piko & Bak (2006), p. 649]
Interview questions specific to preservice teacher self-efficacy for model evaluation
yielded data that were coded and analyzed comparatively for each preservice teacher.
Self-report statements of a preservice teacher’s confidence in her initial model evalu-
ations were compared with similar self-report statements from that preservice tea-
cher’s final model evaluations.
Coded data were analyzed to identify themes and trends within and across individ-
uals for the various data sources (Borman, Clarke, Cotner, & Lee, 2006; Yin, 2006).
For example, coded data from each preservice teacher’s written model evaluation
were compiled and compared in aggregate to identify trends in criterion usage
among preservice teachers. Similarly, coded data from each interview item for each
preservice teacher were compared with coded data for other interview items for the
same preservice teacher, as well as being compared with coded data for the same inter-
view item for other preservice teachers to identify intra- and inter-individual themes
and trends, respectively.
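The intra- and inter-individual aggregation described in this paragraph can be illustrated with a short sketch; the teacher names and coded excerpts below are hypothetical placeholders, not the study's data.

```python
from collections import Counter

# Sketch: tally criterion codes per preservice teacher (intra-individual),
# then aggregate across the group (inter-individual). Placeholder data only.
coded_data = {
    "Teacher A": ["communication", "aesthetics", "aesthetics", "sense-making"],
    "Teacher B": ["communication", "evidence", "communication"],
}

# Intra-individual trends: criterion frequencies for each teacher.
per_teacher = {name: Counter(codes) for name, codes in coded_data.items()}

# Inter-individual trends: frequencies aggregated across all teachers.
cohort = sum((Counter(codes) for codes in coded_data.values()), Counter())

print(per_teacher["Teacher A"].most_common(1))  # [('aesthetics', 2)]
print(cohort["communication"])                  # 3
```

Comparing each teacher's `Counter` against the aggregated `cohort` counts mirrors the within- and across-individual comparisons made in the analysis.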
Findings
The four focal preservice teachers varied in their knowledge, skills, and self-efficacy
with respect to scientific model evaluation. Overall, our findings suggest that each pre-
service teacher evolved uniquely in her model evaluation skills and criteria of empha-
sis. Despite these individual differences, we found that the four focal preservice
teachers broadened their understanding of scientific modeling-based pedagogies
and made gains in their self-efficacy for evaluating student-generated scientific
models. Additionally, we found that across the entire group of 35 preservice teachers,
model evaluations tended to stress model evaluation criteria that were emphasized
during methods course instruction. In the following narratives, we describe each
Figure 4. Methods course assignment instructions and student model of evaporation and
condensation in a solar still. As a final component of this assignment, consider the following
student model of evaporation and condensation and provide both an evaluation of the model and
any feedback you would give the student
preservice teacher’s changing knowledge, skills, and self-efficacy for evaluating
scientific models at several points during the elementary science teaching methods
course.
Arielle. Arielle was a language arts major and math minor in the School of Edu-
cation, and her practicum placement was a local second-grade classroom.
Initial verbal model evaluations. At the beginning of the semester, Arielle’s first
evaluation of a student-generated scientific model of evaporation and condensation
Table 2. Coding scheme for model evaluation criteria

Sense-making (a): Preservice teacher refers to the explanatory power of the model and/or mentions sense-making on the part of the model creator or audience. For example, ‘the model allows the user to make sense of what happens in the solar still’.

Communication (a): Preservice teacher mentions model clarity, completeness, and/or explicitly mentions communication in the model evaluation. For example, ‘the model clearly communicates the student’s ideas about evaporation’.

Consistency with evidence (a): Preservice teacher comments on the scientific accuracy of the model and/or how the model is supported by empirical data. For example, ‘this model does not show condensation on the plastic wrap, which the student observed in the experiment’.

Aesthetics and features: Preservice teacher mentions any or all of the following model criteria/components: neatness, artistic quality, arrows, labels, key, zoom view, title, and internal consistency. For example, ‘this model is neatly labeled’.

Generativity: Preservice teacher refers to model generalizability, model-based predictions, model applications, and/or compares the model with the actual phenomenon. For example, ‘this model could also be used to explain fog on car windows’.

Mechanism or process: Preservice teacher mentions change over time, process, mechanism, causality, and/or variables or influential factors that govern the phenomenon. For example, ‘the model doesn’t say why evaporation and condensation happen’.

Terminology: Preservice teacher cites the use of scientific terms and/or proper use of terminology. For example, ‘the student uses the term “evaporation” in his written description’.

Other: Preservice teacher refers to the salience of model components, the definition of a scientific model, the purpose of the model, and/or other metamodeling knowledge. For example, ‘different models convey similar ideas’.

(a) These were covered explicitly as model evaluation criteria in the methods course.
(Figure 1) was focussed on the student’s use of features and aesthetics, specifically
labels:
I think this is a good model because she clearly labels the sections . . . if I were to just look
at this without the labeling I would have no idea what this was but she even labels the
materials like the tops of the stick and the pure water . . . I would say this is a good
model . . . (Arielle, initial interview)
When presented with the second example of a student model of evaporation and
condensation (Figure 2), Arielle’s model evaluation reflected a comparative quality.
That is, student model 2 was juxtaposed with student model 1:
As far as the model itself, I like model 2 better—it’s more detailed and it shows the entire
process . . . But I really do like the key, and I really like how the model 2 student drew out
the steps because it’s important to know that there’s steps behind this . . . and that’s not
mentioned in model 1. (Arielle, initial interview)
Arielle went on to say:
. . . I think if I had gotten them both at once I would have said model 1 isn’t very good just
because this one is so detailed, there’s a key, it’s a cycle . . . I think if I were to say ‘this is a
model’ I would say model 2 . . . I probably wouldn’t have said as many good things about
model 1 if I had seen them at the same time. (Arielle, initial interview)
Here, Arielle explained that her preference for model 2 was based on its level of
detail and the model’s depiction of a process (‘cycle’), something that was not
included pictorially in the first model she evaluated. She also noted that her sense
of what constitutes a ‘good’ model relied heavily on a comparison of the two students’
work.
Mid-semester written model evaluation. In a written evaluation of a student model of
evaporation and condensation (Figure 4), Arielle again noted the importance of the
model’s aesthetics and features in the forms of labels and details:
The student labeled the plastic wrap, cap, water and stick in the model. I liked how this
student also included a written description in addition to the model to help viewers inter-
pret what is going on in the model. (Arielle, written model evaluation)
Final verbal model evaluations. Above, Arielle specifically noted the benefit of text
in helping others make sense of the model (communication). This additional focus on
the role of explanatory text accompanying the model was carried into her final model
evaluations (Figures 1 and 2) at the end of the semester:
. . . But I do think that . . . for any model you would feature a written description . . . for
them to write out a short description of the process helps me understand that they under-
stand what’s going on, rather than just drawing a picture and labeling. (Arielle, final
interview)
This quote suggests that Arielle was concerned with being able to understand what
the model’s creator understood about the phenomenon depicted in the model (sense-making). Arielle demonstrated a shift from a surface-level critique to a more in-depth
analysis of student understanding on the basis of what was communicated in the
model. She also had a better sense of what constitutes a scientific model, as the follow-
ing quote from her end-of-semester evaluation of evaporation/condensation model 1
(Figure 1) suggests:
I think that it would be better if the student had drawn it in more than one drawing and
showed in her drawing how it changes over time rather than writing it, but I still under-
stand what’s going on from her written information. (Arielle, final interview)
Here, Arielle approached model evaluation with an idea of what she expected to see
in a student model, something that was not present in her initial model evaluations.
Arielle’s end-of-semester evaluation of a student-generated model of germ trans-
mission (Figure 3) further illustrates this point:
. . . this is not a model, I would say. [MN: Why do you say that?] Because this is not . . . I
would say this is more just a drawing of a man coughing and obviously this is not in any
way scientific . . . (Arielle, final interview)
Ideas about scientific models and modeling. In a more general sense, Arielle also
acknowledged that her ideas about the roles of scientific models and modeling in
instruction had changed:
. . . well I used to think of a scientific model as just the visual of anything in science, it
could be in 3D or drawing, but now I realize . . . that children have to learn from the
process of drawing the model, there’s learning going on through the drawing of the
model . . . So I didn’t think about the process of making the model and the revisions to
the model and adding to the model before, but I think that’s really important. (Arielle,
final interview)
Here, we see that Arielle recognized the learning inherent in scientific modeling.
Self-efficacy for model evaluation. In addition to realizing the pedagogical benefits
of scientific modeling and adopting a more systematic approach to student model
evaluation, Arielle’s self-efficacy for student model evaluation improved, as she
indicated in the post-interview:
I feel more comfortable about it, I think about all the things you can include in a model . . .
before I didn’t really think about [that] . . . now if I were thinking about evaluating a model
I would know how to make a grading rubric and I would say does this include labels, is
this clear, does it show the process accurately, does it include all the factors that we
learned . . . but before I just kind of looked at it and was like ‘yep, this looks good’.
(Arielle, final interview)
Here, Arielle articulated specific features that would allow her to discern levels of
model quality when assessing student work. Arielle also implied that she had a
better sense of the bases for model revision and what the end product of model
revision should entail.
Summary for Arielle. In sum, Arielle’s model evaluation approach transitioned
from an initially comparative one to an approach that favored a standard set of
model evaluation criteria. Arielle focussed consistently on the use of labels and the
inclusion of details in students’ scientific models (model aesthetics and features),
and also became more attentive to whether the model depicted a process over time
and whether it was accompanied by a written explanation or description. She
expressed confidence in her developing ability to evaluate student-generated scientific
models in a more systematic manner.
Kali. Kali was a math major and language arts minor and also did her practicum
work in a local second-grade classroom. Unfortunately, she did not complete the
mid-semester written model evaluation homework assignment, and so those data
are lacking for Kali.
Initial verbal model evaluations. Like Arielle, Kali evaluated the two student models
of evaporation and condensation relative to one another in her initial, beginning-of-
semester interview. About model 1 (Figure 1), Kali said:
I just see that everything is labeled nicely . . . even the process of what’s happening, he has
words to describe the state that the water’s in . . . like he says ‘evaporating water’ . . . (Kali,
initial interview)
In contrast, about model 2 (Figure 2) Kali said:
Well this one’s really detailed . . . this one compared to that one, this one’s in steps so it’s a
little more clear to the viewer . . . this person has a lot more explanation of why the water
drips down the plastic wrap . . . and then he also has a key so he doesn’t have to write in
‘evaporation’ or ‘condensation’, . . . he also has what state the water’s in . . . so I feel like
this one gives you a more substantive process . . . but I don’t necessarily think it’s better
because that one [model 1] includes all the same aspects . . . I feel like your first instinct
would be to look at these and say this one [model 2] is better because it has so much to it,
but I think they in general have the same qualities or same aspects. (Kali, initial interview)
Here, we see that Kali was focussed on model aesthetics and features via the use of
labels and descriptors, as well as the depiction of a process.
Final verbal model evaluations. In the final statement from Kali’s evaluation of
model 2 above, we see a theme that resurfaces in her later model evaluations,
namely, that different models can represent the same information equally well.
Kali’s end-of-semester evaluations of these same two student-generated models
(Figures 1 and 2) reiterated this theme:
I think that both of these students maybe could have added too why is the water evapor-
ating . . . I think that I’d probably grade these both the same because they’re both just
missing a few of like the evidence things that they might have learned . . . but other
than that I think this one [model 2] just has more steps to it. (Kali, final interview)
As in her initial model evaluations, Kali mentioned the importance of a mechanism
or a process being depicted in the models, through her focus on why. In both pre- and
post-interviews, Kali indicated that she was looking for students’ science understand-
ing (i.e., sense-making) in their models, and recognized that similar conceptual
understandings may be represented in visually different models.
Ideas about scientific models and modeling. Kali perceived that her ideas about the
pedagogical significance of scientific models and modeling also changed during the
semester:
I didn’t ever think about having students draw models, I thought that the teacher just
showed a model, and I just never thought about the whole revision and all that stuff
and making it this process—I think that makes more sense because they learn more
through that . . . (Kali, final interview)
In this comment, Kali suggested that her initial understanding of scientific models
was as static teaching tools; that she had not considered modeling practices, or having
students engage in these as an approach to learning science content.
Self-efficacy for model evaluation. Also changed was Kali’s self-efficacy for evalu-
ation of models of evaporation and condensation, which improved during the
semester.
I felt more confident with these two [evaporation/condensation] models because we had
done that . . . but I feel more confident too just knowing that when we evaluated our
models, just the things we kept in mind and we had . . . different ways we wanted to rep-
resent different things in our models . . . we all had the same evidence . . . [W]ith this one
[germ transmission model] I didn’t feel as confident but I did feel like there needed to be a
little bit more and maybe just in terms of writing . . . showing more detail I guess. But I
definitely feel a little bit more confident than I did before . . . (Kali, final interview)
Kali attributed this shift in confidence largely to having experienced the evapor-
ation/condensation modeling unit in the methods course.
Summary for Kali. Overall, we see that Kali’s evaluations of student models of
evaporation and condensation became more thorough during the methods course.
She stressed the importance of having experienced the evaporation/condensation
unit as a factor in her increased confidence in evaluating students’ models. Kali
also maintained a ‘different but equal’ stance in her evaluation of visually very differ-
ent student models. Furthermore, Kali expressed some tentativeness about evaluating
a student model on germ transmission, which suggests that transfer of these model
evaluation skills to other science topics may require additional support.
Lezli. Lezli was a language arts major and social studies minor. Lezli’s teaching prac-
ticum took place in a local fifth grade classroom.
Initial verbal model evaluations. Lezli’s initial model evaluations reflected a focus on
the general concepts of evaporation and condensation over the particular aspects of
what was happening in the solar still example. About model 1 (Figure 1), Lezli said:
I think that the student understood . . . what was happening in the solar still . . . the thing
that kind of troubles me about it is it says ‘the dirty water’ evaporates, so I’m wondering if
they’re thinking only dirty water can evaporate . . . that it’s more of a purifying process
than it is part of the water cycle . . . (Lezli, initial interview)
Then, when shown model 2 (Figure 2), Lezli said:
. . . she really understands the practical process, like, the water goes up, sticks to the saran
wrap, falls in because of the marble, but I’m not really sure based on this picture if she
understands evaporation and condensation . . . like I think she understands it in the
context of the solar still . . . but would she understand it in a real-world context . . .
(Lezli, initial interview)
Lezli was concerned with whether the student understood the general processes of
evaporation and condensation instantiated in the solar still. She also indicated some
doubt as to the student’s ability to use the model generatively, or explain the phenom-
ena in other contexts.
Mid-semester written model evaluation. In her mid-semester written evaluation of a
student model of evaporation and condensation (Figure 4), Lezli stated:
The student does a good job of showing what happened in the solar still, while the
description is missing some key items . . . the model does a substandard job of communi-
cating. The drawn model and the description correspond beautifully, but the description
is inaccurate, thus communicating incorrect information. (Lezli, written model
evaluation)
Lezli’s evaluation of student work emphasized the communicative power of the
model and its scientific accuracy, two criteria for model evaluation explicitly discussed
in the methods course.
Final verbal model evaluations. In her end-of-semester model evaluations, Lezli
reiterated themes from previous model evaluations and introduced assessment as a
new purpose for scientific modeling. When evaluating model 1 (Figure 1), Lezli said:
. . . it’s their drawing, so it’s their understanding . . . [T]he explanation is really important
because looking at somebody’s drawing we may have no idea what they were talking about
. . . [T]hey’re using the vocabulary that was taught . . . [T]his might be toward the end of
the lesson, or it could be the teacher testing them to see what they do know about it to see
where he/she should go from there . . . (Lezli, final interview)
Then, when evaluating model 2, she noted:
. . . the key shows me that they know what they’re talking about. It is more artistically well-
made than this one [model 1] but I understand them both the same way. And I think both
students understand the concept of evaporation, through the solar still . . . (Lezli, final
interview)
Particularly in her evaluation of model 1, Lezli suggested that evaluation of student
models is useful for teachers in shaping future instruction to be responsive to learners’
understandings. Similarly to Kali, Lezli noted that both students may have
understood the concepts of evaporation and condensation but might have chosen to
represent those understandings in different, equally valid ways.
Sense-making, another model evaluation criterion addressed in the methods
course, was highlighted in Lezli’s evaluation of a student’s germ transmission model
(Figure 3):
I’m not really sure if the student knows about germ transmission . . . [T]here’s no mention
of what they know about sickness and how it gets transmitted . . . I wasn’t sure just looking
at this if the student made sense of germ transmission. (Lezli, final interview)
Here, Lezli also suggested that the model should include a mechanistic explanation
which would reveal the student’s understanding of the underlying science content.
Ideas about scientific models and modeling. When prompted to reflect on her under-
standings of models and modeling in the final interview, Lezli said that her conception
of scientific models and modeling and the role they play instructionally changed
during the semester. In the final interview, she stated:
I think that in the beginning . . . I wasn’t thinking of having them draw a picture or to get
their understanding or having them make a diagram or a graph or like those things didn’t
seem like models to me, but now they do . . . . I think I was just thinking of it as a very
broad, end-of-unit assessment thing . . . not something you can use at the beginning . . .
to start to gauge thinking about what kids do know or what they’re confused about . . .
I was just solely thinking of it as an end-of-unit thing, not something you could do
midway through and before and basically all the way through . . . (Lezli, final interview)
Like many other preservice teachers, Lezli expanded her initial ideas about scienti-
fic models as three-dimensional representations to include additional, more abstract
representations such as diagrams and graphs. Similarly, she recognized the purposes
that scientific models and modeling can serve at various points during a science unit,
rather than functioning solely as end-of-unit assessments.
Self-efficacy for model evaluation. Lezli offered the following comment on her own
growth in evaluating student models:
I think that I was looking at them in a broader sense, and I’m not so concerned about little
things, . . . I want to make sure that they get the concept, and if they can show it to me . . .
Not so much ‘it’s not pretty and it’s not this’ it’s the meat of what they need to know is
what I’m looking for. (Lezli, final interview)
In this comment, Lezli’s emphasis on purposeful analysis of student work reveals
her increased confidence in her own model evaluation skills.
Summary for Lezli. Unlike other preservice teachers in this study, Lezli indicated
that her end-of-semester approach to model evaluation was more holistic than it was
initially. In the above excerpts, Lezli cited the importance of student sense-making
underlying their models and the importance of communicating those understandings.
While this is consistent with others’ criterion-based approaches to model evaluation,
Lezli was unique in her explicit statement of preference for a gestalt appraisal of
student work despite her use of multiple, specific model evaluation criteria to deter-
mine students’ understandings of the underlying science concepts.
Madison. Madison was a social studies major and a language arts minor who
admitted in the first interview that ‘science isn’t my major area of interest’.
Madison did her practicum teaching in a local second-grade classroom and said
that she had come to enjoy science a little more after having seen it taught by her coop-
erating teacher.
Initial verbal model evaluations. Madison began her student model evaluations with
the idea that the model should explain the student’s understanding of the ‘why’
behind the phenomenon, in addition to demonstrating proper usage of the science ter-
minology. In response to the first model of evaporation/condensation (Figure 1),
Madison said:
I think the student has a good grasp on what’s happening . . . they use the words evaporate,
condenses . . . from both the words and the drawings I would say that the student under-
stands. I think that it’s a good explanation . . . (Madison, initial interview)
She continued with the following remarks about model 2 (Figure 2):
Wow, this one right away visually I like it . . . I don’t know that it says anything about evap-
oration . . . I guess in the key it says evaporation so they drew little arrows, so that’s kind of
neat that they have the key and everything, but the actual diagram doesn’t say evaporation
. . . but I just thought of something for both of them, I think I would want to know or
would want them to explain why the water is evaporating . . . (Madison, initial interview)
In her evaluation of model 2, Madison initially emphasized its aesthetic qualities. As
she gained more experience, Madison’s model evaluations became more nuanced, as
is seen in her mid-semester and final model evaluations.
Mid-semester written model evaluation. Madison’s written evaluation of a student
model of evaporation and condensation was highly structured and explicitly addressed
the three central criteria emphasized in the course: consistency with evidence/scien-
tific ideas, sense-making, and communication. To this, Madison added:
I am guessing that this model was created as an introduction to the evaporation and con-
densation unit, similar to how we used the solar still as our intro to the unit. My evalu-
ation would probably be slightly different if this model were created later in the unit
(because I feel that it would need more scientific ideas embedded into it). (Madison,
written model evaluation)
Here, Madison stated that in addition to considering the model in light of the three
core evaluation criteria, she would also evaluate the model within its context in the
unit.
Final verbal model evaluations. Madison’s end-of-semester evaluations of student
models echoed her initial emphases on mechanistic, aesthetic, and communicative
aspects of models. About model 1 (Figure 1), she said:
. . . I like that they have the labels . . . it’s a little messy or sloppy though, maybe they could
have a little key . . . the explanation beneath it is good, I like that they put what was hap-
pening into words . . . it uses ‘evaporate’ and ‘condenses’ but it doesn’t really talk about
how it’s evaporating or condensing . . . there’s no mention of what’s making it do that
. . . (Madison, final interview)
About model 2 (Figure 2), Madison said something similar:
So this one has a key, I like that . . . each drawing isn’t as messy or cluttered, I guess—they
don’t have to label every little thing because it’s in the key . . . I like that there’s the multiple
drawings of it . . . and they have a little explanation with each one . . . so it says condensing
but it never says anything . . . oh, I guess evaporation is in the key, but in the little expla-
nations there’s no mention of evaporation . . .. . . I kind of like this one better because it has
the change over time, I think that’s important. (Madison, final interview)
Unlike the other preservice teachers in this study, Madison’s explicitly criterion-
based evaluation of student work was maintained in her evaluation of the student
germ transmission model (Figure 3). She said,
. . . but there’s no explanation of what’s going on and . . . there’s no labels, no key or any-
thing, so it’s kind of hard to make sense of what exactly they are trying to get across . . . and
to know whether they know what they’re talking about or drawing . . . (Madison, final
interview)
Here, we also see Madison’s continued focus on student sense-making of the scien-
tific ideas conveyed in the model.
Ideas about scientific models and modeling. Madison acknowledged that her ideas
about scientific models and modeling had changed:
I guess my idea of a scientific model has expanded now from class . . . drawings, physical
models, charts, graphs, diagrams . . . a scientific model can be a lot of things that express
the idea or are representations of a phenomenon or scientific idea . . . I’ve learned a lot
more about how they can be useful for the students by just their construction, their learn-
ing about the ideas, and I didn’t say this in the beginning because I didn’t realize it but the
revision of it, going back and evaluating them, and re-looking at them and making new
models . . . I think that’s how students learn from it. (Madison, final interview)
Like Arielle, Madison highlighted the student learning that scientific modeling
promotes.
Self-efficacy for model evaluation. In addition to learning more about the pedagogi-
cal advantages of scientific modeling, Madison stated that her evaluation criteria
became more focussed, which contributed to her increased confidence in her model
evaluations.
I think my evaluations are ok, I think they’re probably better than at the beginning of the semester . . . I had some things to focus on in terms of evaluating . . . like, whether it makes sense,
and ideas are communicated, the scientific ideas—like I mentioned seeing the actual terms
and stuff like that . . . I think my evaluations are getting better. (Madison, final interview)
Here, Madison identified that having articulated model evaluation criteria contrib-
uted to her improvement in evaluating student models.
Summary for Madison. Unlike the other preservice teachers in this study, Madison
applied these same evaluation criteria to models representing different science topics,
which underscores the importance and relevance of having a set of criteria for her
practice of model evaluation. Specifically, Madison maintained an emphasis on the
importance of mechanism and the proper use of terminology in the student models
throughout the study.
Summary: Preservice teachers’ model evaluations in the interviews
Taken together, the results of focal preservice teachers’ evaluations of student models
indicate that some criteria, such as model aesthetics and features, were prominent
across preservice teachers, while other criteria, such as model generativity, were less
common. More importantly, the ways in which preservice teachers employed criteria
in evaluating student models differed in notable ways between the initial and the final
interviews. Arielle’s initial model evaluations centered on model aesthetics and fea-
tures at a surface level (model as drawing), whereas her final model evaluations
explored the purposes for model aesthetics and features in sense-making and com-
munication capacities (model as conceptual medium). Kali and Lezli underscored
the use of text and terminology as indicators of student understanding in their
initial model evaluations, whereas their final model evaluations emphasized student
sense-making and communication in a more global, less heavily language-dependent
manner. Finally, Madison’s attention to models conveying a mechanism or a process
changed from being an addendum in her initial model evaluations to a central focus
underlying her final model evaluations. In all cases, preservice teachers’ final model
evaluations demonstrated a deeper understanding of modeling-based pedagogy and
the principles underlying criteria-driven student scientific model evaluation.
All preservice teachers’ written model evaluations
To examine which model evaluation criteria were perhaps most tractable among the larger
set of preservice teachers, we analyzed written evaluations of an evaporation and conden-
sation model for the presence of model evaluation criteria. (The model was shown in
Figure 4 and was part of a homework assignment given in the methods course.) As this
was preservice teachers’ first experience providing written model evaluations, we
focussed mainly on the criteria cited in their responses, rather than the quality of those
responses. Figure 5 shows the aggregate data as percentages of preservice teachers who
used coded, grouped criteria (see Table 2 for coding scheme and example criteria).
Recall that criteria relating to sense-making, communication, and consistency with
scientific ideas and empirical evidence were three model evaluation criteria stressed
instructionally with the preservice teachers. Preservice teachers’ use of these three
model criteria was evident in their written evaluations of a student model (Figure 4),
with sense-making and communication occurring in 86 and 77%, respectively, of the
model evaluations (see Figure 5). Although the third criterion emphasized during
instruction—consistency with scientific ideas and empirical evidence—included state-
ments about the scientific accuracy of the model, it only appeared in roughly half the
model evaluations. Furthermore, as the data show, almost all preservice teachers men-
tioned features and aesthetics in their model evaluations. The preservice teachers noted
that model labels, keys, and neatness, for example, enabled the audience to make sense
of the model and improved the model’s communicative power. Thus, the utility of the
model’s features and aesthetics was to promote communication and sense-making.
Also noteworthy was the frequency of critiques of the ‘hows and whys’ behind the
student’s model of evaporation, captured in the ‘mechanism/process’ code. Many
preservice teachers (69%) took up the notion that scientific models should explain
the underlying processes associated with the phenomena (purely descriptive) and/or the mechanistic reasoning behind those processes (causal). The idea that a
model should convey procedural and/or mechanistic information was discussed on
several occasions during the methods course, although it was not explicitly and con-
sistently referenced as a major model evaluation criterion.
These teachers did not, however, tend to emphasize the issue of generativity—that
models should be able to be used to make explanations or predictions for other contexts.
Only 23% of the model evaluations incorporated evaluation along this criterion. During
the methods course, this criterion was discussed only briefly and was rarely invoked as a
model criterion. As this finding suggests, exploring preservice teachers’ perceptions of
model generativity would provide an interesting topic for further study.
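The whole-number percentages reported above (and plotted in Figure 5) follow directly from per-criterion mention counts over the 35 written evaluations. As a minimal sketch of that tally, the counts below are assumptions back-calculated from the reported percentages; the article does not publish the raw coded tallies:

```python
# Sketch of the criterion-frequency tally behind Figure 5.
# NOTE: the counts are ASSUMPTIONS inferred from the reported percentages
# (86%, 77%, "roughly half", 69%, 23%) with n = 35, not published data.
N = 35
criterion_counts = {
    "sense-making": 30,
    "communication": 27,
    "consistency with scientific ideas/evidence": 18,  # "roughly half"
    "mechanism/process": 24,
    "generativity": 8,
}

def percentages(counts, n):
    """Convert per-criterion mention counts to whole-number percentages."""
    return {criterion: round(100 * k / n) for criterion, k in counts.items()}

pcts = percentages(criterion_counts, N)
```

Under these assumed counts, rounding reproduces the reported figures (86%, 77%, 69%, and 23%), with the consistency criterion landing at 51%, i.e. roughly half of the evaluations.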
How do the focal preservice teachers compare with the entire cohort in their written
model evaluations? Focal preservice teachers tended to highlight the same model
Figure 5. Preservice teachers’ use of criteria when evaluating a student-generated model of
evaporation and condensation (n = 35)
evaluation criteria as their peers in their written model evaluations. For example, Lezli and
Madison invoked the evaluation criteria repeatedly emphasized in the methods course.
This likely influence of instruction upon preservice teachers’ student model evaluations,
along with other notable themes observed in the data, is elaborated upon next.
Discussion
When examining the data across individuals in this study, we identified several themes
relating to the two research questions framing this study: how do preservice elementary
teachers evaluate elementary students’ scientific models? and how do preservice elemen-
tary teachers’ knowledge, skills, and self-efficacy for model evaluation change during the
elementary science teaching methods semester? Of course, no study is without limit-
ations, and our relatively small sample size and case study approach precludes general-
izations about preservice elementary teachers’ scientific model evaluations. The study
does, however, provide insight into a novel area of research: the development of PCK
for scientific model evaluation. We briefly discuss three overarching and inter-related
themes and their implications for preservice elementary science teacher education
with regard to the elements of PCK-SM identified in bold in Table 1.
Theme 1: Changes in knowledge
In this study, preservice teachers acknowledged their own gains in knowledge, particu-
larly having to do with scientific models, modeling practice, and modeling-based peda-
gogy. Specifically, Kali and Lezli spoke of how their ideas of what constituted a scientific
model had expanded to include two-dimensional representations such as diagrams and
graphs, echoing results from similar studies (Crawford & Cullin, 2004; Justi & van
Driel, 2005; Schwarz, 2009; Windschitl & Thompson, 2006). Arielle, Kali, and
Madison explicitly mentioned that models of the solar still should include processes
or mechanisms of evaporation and condensation. And in the final interviews, all four
preservice teachers noted that they now viewed engaging in modeling practice as a
means by which students could learn science. Model evaluation and revision were high-
lighted by Madison, who felt that these practices were particularly important to student
learning. Madison also identified another aspect of her learning that was consistent with
other preservice teachers’ statements: gaining a set of articulated criteria for scientific
model evaluation. Thus, these teachers developed more sophisticated modeling and
metamodeling knowledge during the course of the semester.
In addition to gaining knowledge about scientific models, modeling practice, and
modeling-based pedagogy, preservice teachers in this study gained subject matter
knowledge about evaporation and condensation. Kali alluded to this point in her
statements about having experienced the evaporation and condensation unit from
the student’s point of view and having more confidence in the student solar still
model evaluations than in the evaluation of a student model of germ transmission
(a topic that was not covered in the methods course). Except for Madison, the interviewed preservice teachers gave much more detailed evaluations of the student
solar still models than they did of the student germ transmission model in the final interview, which suggests that they had a better idea of what they were hoping to see in
terms of science content in the solar still models. This was not surprising, as we did
not expect that preservice teachers would have extensive knowledge of germ trans-
mission mechanisms; however, we were interested in seeing whether preservice tea-
chers would apply the same set of model evaluation criteria to a novel student
model. Whereas the other preservice teachers evaluated the germ transmission
model roughly in terms of criteria discussed in the methods course, Madison was
the only participant who explicitly applied the same set of model evaluation criteria
to the germ transmission model evaluation as she did to the solar still models.
Knowledge relevant to both scientific model evaluation and science content is
evident in preservice teachers’ written evaluations of the student model of evaporation
and condensation. The majority of preservice teachers in this study applied model
evaluation criteria that were explicitly addressed in the methods course in their evalu-
ations of the student’s solar still model. In doing so, they simultaneously addressed the
scientific accuracy of the student’s representation of evaporation and condensation embodied in the model.
What does this mean for preservice teachers’ development of PCK for scientific model
evaluation? To effectively engage in evaluation of students’ scientific models, preservice
elementary teachers must develop content knowledge about the science represented in
the model, as well as knowledge of scientific models and modeling practices, including
knowledge of relevant model evaluation criteria. With appropriate supports in place,
development in these forms of modeling-related knowledge is possible (Crawford &
Cullin, 2004; Davis, Nelson, & Beyer, 2008; De Jong et al., 2005; Schwarz, 2009;
Windschitl et al., 2008b). While preservice teachers had an idea of the criteria with
which to evaluate a student-generated scientific model in a science topic not explored
in the methods course, their evaluations of this student work lacked the depth and rich-
ness of their evaluations of student models of evaporation and condensation, a very fam-
iliar topic for them. Additionally, developing preservice teachers’ understanding of the
science concepts supported systematic, criterion-based evaluation of students’ models,
as changes in the data after the initial interview suggest. Supporting preservice teachers’
developing modeling practice in a variety of science content areas is warranted and
likely to benefit their modeling-based instructional practice (Justi & van Driel, 2005;
Kenyon et al., 2011; Schwarz, 2009). By helping preservice teachers recognize opportu-
nities for and advantages of modeling-based pedagogies in ways that are connected to
understandings of modeling and scientific content, teacher educators help establish
the foundation for developing PCK for scientific model evaluation.
Theme 2: Changes in model evaluation skills
As the semester unfolded, preservice teachers’ evaluations of students’ scientific
models became more standardized and skillful. As many of the focal preservice tea-
chers noted, they gained a set of articulated model evaluation criteria as part of
their knowledge relating to models and modeling. This was evident in many of the
written model evaluations, which were structured to reflect the criteria emphasized in
the course. Our findings suggest that preservice teachers applied specific model evalu-
ation criteria in their mid-semester and end-of-semester model evaluations to a much
higher degree than they did in their initial, beginning-of-semester model evaluations.
These findings extend the body of work studying preservice teachers’ scientific mod-
eling knowledge- and skill-development by focussing on their knowledge, skills, and
self-efficacy around a very specific practice: evaluating elementary students’ scientific
models (Crawford & Cullin, 2004; Justi & van Driel, 2005; Schwarz, 2009; Schwarz
et al., 2009; Windschitl et al., 2008b). Arielle and Kali, in particular, shifted from
evaluating student models comparatively to evaluating student models individually,
applying a set of pre-determined evaluation criteria.
Changes in model evaluation skills relate closely to changes in knowledge and self-
efficacy. Increased knowledge of model evaluation criteria was evident in preservice tea-
chers’ mention of these criteria in their later, more highly skilled model evaluations.
Madison, in particular, explicitly noted the criteria she was applying in her end-of-
semester interview model evaluations and reflected upon her own improvement in
model evaluation attributable to knowing a set of articulated model evaluation criteria.
Kali noted that her confidence in her model evaluations increased, in part, due to
having experienced the evaporation and condensation unit, a sentiment that was
echoed by other preservice teachers in this study. As preservice teachers gained more
experience with evaporation/condensation model evaluation, their evaluations became
more focussed on what the model creator (the student) understood about evaporation
and condensation, and the preservice teachers themselves became less self-conscious about their own critiques of the model.
Overall, our findings with respect to preservice teachers’ increased self-efficacy for
content knowledge-based scientific model evaluation relate favorably with, and extend
upon, other empirical findings of methods-course-experience-mediated increases in preservice teacher knowledge and confidence (Palmer, 2006; Sherman & MacDonald,
2007; Windschitl & Thompson, 2006).
What does this mean for preservice teachers’ development of PCK for scientific model
evaluation? In the development of skills related to PCK for scientific model evaluation,
experience, practice, and self-efficacy play instrumental roles. Having multiple oppor-
tunities to engage in the work of scientific modeling—particularly evaluating and revis-
ing models of evaporation and condensation—reinforced for preservice teachers the
subject matter knowledge captured in the models and introduced them to novel model-
ing-based pedagogies (Kenyon et al., 2011). Furthermore, these multiple, scaffolded
exposures to modeling facilitated preservice teachers’ mindful, focussed, and repeated
work developing their own modeling skills, science subject matter knowledge, and con-
fidence in their abilities to effectively analyze student models. Such experiences position
preservice teachers to develop the beginnings of PCK, termed ‘PCK-readiness’
(Smithey, 2007), by helping them identify, articulate, and carry out modeling practices
and pedagogies. In addition, these experiences with scientific modeling promote con-
sideration of modeling as viable instructional methodology. As Kali noted, engaging
in an elementary science unit involving model evaluation was essential to developing
both skills and confidence in evaluating scientific models, the subject of theme 3.
Theme 3: Changes in self-efficacy for model evaluation
Preservice teachers’ self-efficacy for evaluating student-generated scientific models also
increased as they gained knowledge of and experience with scientific model
evaluation. All four preservice teachers in this study felt that their model evaluations had
improved over the course of the semester, for a variety of reasons, including increased
experience with model evaluation and additional knowledge of model evaluation criteria.
Preservice teachers’ self-efficacy for model evaluation appeared to be topic-specific and
related to each preservice teacher’s subject matter knowledge of the model topic (Palmer,
2006; Schoon & Boone, 1998). Kali alluded to this point in her statement about the role
of experiencing the evaporation and condensation unit in the methods course. Mean-
while, Arielle stated that the student’s germ transmission model was ‘not a model’,
but she did not elaborate on how this model was ‘not scientific’ in the same ways in
which she evaluated the scientific accuracy of a student’s solar still model. Madison pro-
vides an interesting counterexample as she was the only preservice teacher to evaluate
the germ transmission and evaporation/condensation models systematically, applying
the same criteria in each model evaluation. Overall, though, all four preservice teachers
demonstrated increased confidence in their model evaluations over the course of the
semester, which was reflected in their self-reports.
What does this mean for preservice teachers’ development of PCK for scientific
model evaluation? Part of developing PCK for scientific model evaluation lies in
developing confidence in one’s own knowledge and skills in model evaluation. To
what degree experience and knowledge each figure into self-efficacy for model evalu-
ation was not explored in this study and is likely to vary on an individual basis (Blei-
cher, 2009; Howitt, 2007; Palmer, 2006). However, supporting preservice elementary
teachers’ development of requisite knowledge to effectively evaluate scientific models
and providing multiple opportunities to engage in the work of model evaluation are
likely essential to bolstering confidence in their own model evaluation, and thus fos-
tering their development of PCK for scientific model evaluation.
Implications
Scientific modeling pedagogies provide rich opportunities for learners and teachers to
engage in multiple authentic scientific practices. For example, evaluating scientific
models entails critical examination of how well models represent data-derived under-
standings of scientific phenomena and consideration of the ways in which these under-
standings are made accessible to model audiences. In light of our findings about
preservice teachers’ evaluations of student-generated scientific models, we offer two
recommendations concerning preservice elementary teacher education in scientific
modeling.
First, teacher educators should incorporate multiple supports for developing preservice tea-
chers’ knowledge, skills, and self-efficacy in evaluation of student-generated scientific models.
As our data and others’ data suggest, preservice teachers’ modeling practice benefits
from instructional time dedicated to scientific modeling in a teaching methods course
(Crawford & Cullin, 2004; Kenyon et al., 2011; Windschitl & Thompson, 2006).
Preservice teachers’ model evaluations became more knowledge-based and skilled
over the methods course semester; the preservice teachers themselves were aware of
these changes and accordingly demonstrated higher levels of self-efficacy for student
model evaluation. As part of this instructional time spent on
scientific modeling, we recommend providing preservice teachers with opportunities
not only to learn about scientific modeling, but also to engage in scientific modeling in
ways that are grounded in specific science content. Part of these instructional experi-
ences with scientific model evaluation should entail the use of well-articulated,
specific criteria and support preservice teachers in applying those judiciously selected
criteria to the evaluation of scientific models in a range of content areas, with
emphases on how those model evaluation criteria relate to the science subject
matter in question (Kenyon et al., 2011). As our data show, those model evaluation
criteria that were repeatedly emphasized (such as sense-making) were present more
consistently in preservice teachers’ model evaluations than criteria that featured
in the course less prominently (such as generativity). This has implications for
teacher educators’ choices of which model evaluation criteria to stress. Preservice
teachers benefit from (a) carefully scaffolded use of criteria in evaluating lesson
plans (Beyer, 2009; Davis, 2006; Schwarz et al., 2008) and (b) explicitly guided
work with scientific modeling (Kenyon et al., 2011; Nelson & Davis, 2009;
Schwarz, 2009; Sherman & MacDonald, 2007; Windschitl et al., 2008b). Similar
supportive measures regarding scientific model evaluation should help preservice
teachers develop this element of modeling-based pedagogy.
Second, teacher educators should familiarize themselves with modeling-based pedagogies and
develop their own skills and PCK for teaching preservice teachers how to incorporate modeling-
based pedagogies in elementary science instruction. For many teacher educators, this may
necessitate learning new approaches to the teaching of science (Windschitl et al.,
2008a). Often, it will require teacher educators to support preservice teachers in analyz-
ing and modifying existing curriculum materials to infuse a modeling focus. Here, we
describe one aspect of our initial attempts to support preservice elementary teachers
in acquiring the knowledge, skills, and self-efficacy necessary to effectively incorporate
modeling-based instructional approaches in the elementary science classroom. In
doing this work, we have gained an awareness of our own limitations regarding model-
ing-based pedagogies. Therefore, we recommend that time and effort in elementary
teacher education be devoted to scientific modeling—for preservice teachers as well as
their instructors, which means that teacher educators need to have sound understand-
ings of and experience with modeling-based pedagogies in science and how to teach
others to do this work. Thus, science teacher educators too must develop PCK for teach-
ing preservice science teachers (Abell, Rogers, Hanuscin, Lee, & Gagnon, 2009; Smith,
2000), and we propose that PCK for modeling-based pedagogies be an important part of
this PCK for teaching preservice science teachers (Kenyon et al., 2011).
In closing, we hope that this work will inform future studies of preservice elementary teachers’ development of PCK with respect to scientific modeling. As our findings
suggest, it is an area ripe for further study.
Acknowledgements
This research was funded by the National Science Foundation under grant ESI-
0628199 to Northwestern University for the MoDeLS project. The opinions
expressed herein are those of the authors and not necessarily those of the NSF. We
thank the entire MoDeLS research group, at eight institutions, for their help in think-
ing about these ideas. We also thank the preservice teachers with whom we work,
especially those who agreed to participate in this study.
References
Abell, S.K., Rogers, M.A.P., Hanuscin, D.L., Lee, M.H., & Gagnon, M.J. (2009). Preparing the
next generation of science teacher educators: A model for developing PCK for teaching
science teachers. Journal of Science Teacher Education, 20(1), 77–93.
American Association for the Advancement of Science. (1993). Benchmarks for science literacy.
Retrieved February 26, 2009, from http://www.project2061.org/
publications/bsl/online/ch11/c-als11b.htm
Bandura, A. (1997). Self-efficacy: The exercise of control. New York: Freeman.
Beyer, C. (2009). Using reform-based criteria to support the development of preservice elementary teachers’
pedagogical design capacity for analyzing science curriculum materials, Dissertation, University of
Michigan, Ann Arbor.
Bleicher, R.E. (2004). Revisiting the STEBI-B: Measuring self-efficacy in preservice elementary
teachers. School Science and Mathematics, 104(8), 383–391.
Bleicher, R.E. (2006). Nurturing confidence in preservice elementary science teachers. Journal of
Science Teacher Education, 17(2), 165–187.
Bleicher, R.E. (2009). Variable relationships among different science learners in elementary science-
methods courses. International Journal of Science and Mathematics Education, 7(2), 293–313.
Borman, K.M., Clarke, C., Cotner, B., & Lee, R. (2006). Cross-case analysis. In J.L. Green, G.
Camilli, & P.B. Elmore (Eds.), Handbook of complementary methods in education research
(pp. 123–139). Mahwah, NJ: Lawrence Erlbaum Associates.
Crawford, B., & Cullin, M. (2004). Supporting prospective teachers’ conceptions of modeling in
science. International Journal of Science Education, 26, 1379–1401.
Davis, E.A. (2006). Preservice elementary teachers’ critique of instructional materials for science.
Science Education, 90(2), 348–375.
Davis, E.A., Nelson, M.M., & Beyer, C.J. (2008). Using educative curriculum materials to support teachers
in developing pedagogical content knowledge for scientific modeling. Paper presented at the Annual
Conference of the National Association for Research in Science Teaching, Baltimore, MD.
De Jong, O., van Driel, J.H., & Verloop, N. (2005). Preservice teachers’ pedagogical content knowledge of
using particle models in teaching chemistry. Journal of Research in Science Teaching, 42(8), 947–964.
Ericsson, K.A., & Simon, H.A. (1980). Verbal reports as data. Psychological Review, 87(3), 215–251.
Ford, M.J., & Wargo, B.M. (2007). Routines, roles, and responsibilities for aligning scientific and
classroom practices. Science Education, 91(1), 133–157.
Gilbert, J., & Boulter, C. (Eds.). (2001). Developing models in science education. Dordrecht: Kluwer.
Grossman, P.L. (1990). The making of a teacher: Teacher knowledge and teacher education. New York:
Teachers College Press.
Harrison, A., & Treagust, D. (2000). A typology of school science models. International Journal of
Science Education, 22(9), 1011–1026.
Howitt, C. (2007). Pre-service elementary teachers’ perceptions of factors in an holistic methods course
influencing their confidence in teaching science. Research in Science Education, 37(1), 41–58.
Justi, R., & Gilbert, J. (2002). Science teachers’ knowledge about and attitudes towards the use of models
and modeling in learning science. International Journal of Science Education, 24(12), 1273–1292.
Justi, R., & van Driel, J. (2005). The development of science teachers’ knowledge on models and
modelling: Promoting, characterizing, and understanding the process. International Journal of
Science Education, 27(5), 549–573.
Kenyon, L., Davis, E.A., & Hug, B. (2011). Design approaches to support teachers in modeling
practices. Journal of Science Teacher Education, 22(1), 1–21.
Kenyon, L., Schwarz, C., & Hug, B. (2008). The benefits of scientific modeling: Constructing, using,
evaluating, and revising scientific models helps students advance their scientific ideas, learn to
think critically, and understand the nature of science. Science and Children, 46(2), 40–44.
Magnusson, S., Krajcik, J., & Borko, H. (1999). Nature, sources, and development of pedagogical
content knowledge for science teaching. In J. Gess-Newsome & N. Lederman (Eds.), Examin-
ing pedagogical content knowledge: The construct and its implications for science education
(pp. 95–132). The Netherlands: Kluwer Academic Publishers.
National Research Council. (1996). National Science Education Standards. Washington, DC:
National Academy Press.
National Research Council. (2010). A framework for science education. Preliminary public draft.
National Research Council of the National Academies. Retrieved from http://www.aapt.org/
Resources/upload/Draft-Framework-Science-Education.pdf
Nelson, M.M., & Davis, E.A. (2009). Preservice elementary teachers’ science lesson plan analyses and
modifications concerning scientific modeling: Insights into teacher knowledge application. Paper pre-
sented at the Annual Conference of the National Association for Research in Science Teaching,
Garden Grove, CA.
Palmer, D.H. (2006). Sources of self-efficacy in a science methods course for primary teacher edu-
cation students. Research in Science Education, 36(4), 337–353.
Piko, B.F., & Bak, J. (2006). Children’s perceptions of health and illness: Images and lay concepts in
preadolescence. Health Education Research, 21(5), 643–653.
Porter, A.C. (2006). Curriculum assessment. In J.L. Green, G. Camilli, & P.B. Elmore (Eds.),
Handbook of complementary methods in education research (pp. 141–159). Mahwah, NJ: Lawrence
Erlbaum Associates.
Schoon, K.J., & Boone, W.J. (1998). Self-efficacy and alternative conceptions of science of preser-
vice elementary teachers. Science Education, 82(5), 553–568.
Schwarz, C. (2009). Developing preservice elementary teachers’ knowledge and practices through
modeling-centered scientific inquiry. Science Education, 93(4), 720–744.
Schwarz, C., Meyer, J., & Sharma, A. (2007). Technology, pedagogy, and epistemology: Opportu-
nities and challenges of using computer modeling and simulation tools in elementary science
methods. Journal of Science Teacher Education, 18(2), 243–269.
Schwarz, C.V., Gunckel, K.L., Smith, E.L., Covitt, B.A., Bae, M., Enfield, M., & Tsurusaki, B.
(2008). Helping elementary preservice teachers learn to use curriculum materials for effective
science teaching. Science Education, 92(2), 345–377.
Schwarz, C.V., Reiser, B.J., Davis, E.A., Kenyon, L., Acher, A., Fortus, D., . . . Krajcik, J. (2009).
Developing a learning progression for scientific modeling: Making scientific modeling accessi-
ble and meaningful for learners. Journal of Research in Science Teaching, 46(6), 632–654.
Schwarz, C.V., & White, B.Y. (2005). Metamodeling knowledge: Developing students’ understand-
ing of scientific modeling. Cognition and Instruction, 23(2), 165–205.
Sherman, A., & MacDonald, L. (2007). Pre-service teachers’ experiences with a science education
module. Journal of Science Teacher Education, 18(4), 525–541.
Shulman, L.S. (1986). Those who understand: Knowledge growth in teaching. Educational
Researcher, 15(2), 4–14.
Smith, D.C. (2000). Content and pedagogical content knowledge for elementary science teacher
educators: Knowing our students. Journal of Science Teacher Education, 11(1), 27–46.
Smithey, J. (2007). Preservice elementary teachers’ development of PCK-readiness about learners’ science
ideas, Dissertation, University of Michigan, Ann Arbor.
van Driel, J.H., & De Jong, O. (2001). Investigating the development of preservice teachers’ pedagogical
content knowledge. Paper presented at the Annual Meeting of the National Association for
Research and Science Teaching, St. Louis, MO.
van Driel, J.H., De Jong, O., & Verloop, N. (2002). The development of preservice chemistry tea-
chers’ pedagogical content knowledge. Science Education, 86(4), 572–590.
Watters, J.J., & Diezmann, C.M. (2007). Multimedia resources to bridge the praxis gap: Modeling
practice in elementary science education. Journal of Science Teacher Education, 18(3), 349–375.
Windschitl, M. (2003). Inquiry projects in science teacher education: What can investigative experi-
ences reveal about teacher thinking and eventual classroom practice? Science Education, 87(1),
112–143.
Windschitl, M., & Thompson, J. (2006). Transcending simple forms of school science investigation:
The impact of preservice instruction on teachers’ understandings of model-based inquiry.
American Educational Research Journal, 43(4), 783–835.
Windschitl, M., Thompson, J., & Braaten, M. (2008a). Beyond the scientific method: Model-based
inquiry as a new paradigm of preference for school science investigations. Science Education,
92(5), 941–967.
Windschitl, M., Thompson, J., & Braaten, M. (2008b). How novice science teachers appropriate
epistemic discourses around model-based inquiry for use in classrooms. Cognition and Instruc-
tion, 26(3), 310–378.
Yin, R.K. (2006). Case study methods. In J.L. Green, G. Camilli, & P.B. Elmore (Eds.), Handbook
of complementary methods in education research (pp. 111–122). Mahwah, NJ: Lawrence Erlbaum
Associates.
Appendix. Interview protocols
To give me a better sense of your background, can you first tell me a little bit about
your experiences as a science learner? Do you (or did you) enjoy learning science?
(initial interview only)
How do you feel about teaching science at the elementary level? What are the most
important aspects of teaching science for you, as a teacher? What are you hoping that
your future students will get out of learning science in your classroom? Why are these
aspects important to you? (initial interview only)
Table A1. Interview questions and probes

Initial interview: When I say ‘scientific model’ what do you think of? Can you give me some examples of scientific models? Why would you consider these scientific models? (contrast with something that is not a scientific model?)
Final interview: When I say ‘scientific model’ what do you think of? Can you give me some examples of scientific models? Why would you consider these scientific models? Do you think your ideas about scientific models have changed over the course of the semester?

Initial interview: How do you think scientific models can be useful for teachers? for learners? (refer to answer from pretest) Can you say more about the answers you gave in the journal exercise?
Final interview: How do you think scientific models can be useful for teachers? for learners? Do you think your ideas about this have changed over the course of the semester?
Initial interview: Let us take a look at your answers to the pretest question about firefly luminescence . . . Can you tell me about your thought process when you were answering this question?
Final interview: (not asked in final interview)

Think aloud #1:

Initial interview: For the next part of the interview, I am also going to ask you to think out loud as you answer the next few questions. OK—here is the scenario: you are an elementary teacher who is taking a look at some student work. The students have been asked to draw a picture model of evaporation/condensation in science. Here is one student’s model of evaporation/condensation:
Final interview: For the next part of the interview, I am also going to ask you to think out loud as you answer the next few questions. OK—here is the scenario: you are a fourth-grade teacher who is taking a look at some student work. The students have been asked to draw a picture model of evaporation/condensation in science. Here is one student’s model of evaporation/condensation:

Both interviews: Now imagine that you are evaluating this student’s model. Please ‘think aloud’ and tell me how you would analyze and evaluate this student’s model of evaporation/condensation

Both interviews: What types of considerations did you take into account when judging the quality of this model? (What criteria did you use when judging the quality of this model?)

Final interview: After: How do you feel about your analysis of this student’s model?

Think aloud #2 (both interviews): same idea, different student’s model of evaporation/condensation

Both interviews: Now imagine that you are evaluating this student’s model. Please ‘think aloud’ and tell me how you would evaluate this student’s model of evaporation/condensation

Both interviews: What types of considerations did you take into account when judging the quality of this model?

Final interview: After: How do you feel about your analysis of this student’s model? In hindsight, is there anything you would change about what you said about either the first or the second student’s models?

Both interviews: If I asked you to compare the two models, what would you say?

Final interview only: Germ Transmission Model: now, imagine you are evaluating a fourth grader’s model of germ transmission. Again, please ‘think aloud’ and tell me how you would evaluate this student’s work

Final interview only: How do you feel about your evaluations of these student models? Is this different from the first interview when I asked you to evaluate student models? If so, how?