
Journal of Mathematical Behavior 32 (2013) 377–396

Contents lists available at ScienceDirect

The Journal of Mathematical Behavior

journal homepage: www.elsevier.com/locate/jmathb

Multiple levels of metacognition and their elicitation through complex problem-solving tasks

Young Rae Kim a,∗, Mi Sun Park a, Tamara J. Moore a, Sashank Varma b

a STEM Education Center, University of Minnesota, 320 Learning & Environmental Science Building, 1954 Buford Avenue, St. Paul, MN 55108, United States
b Department of Educational Psychology, 165 Education Sciences Building, University of Minnesota, Minneapolis, MN 55455, United States

Article history: Available online 25 June 2013

Keywords: Metacognition; Definition building; Operationalizing definitions; Model-eliciting activities

Abstract

Building on prior efforts, we re-conceptualize metacognition on multiple levels, looking at the sources that trigger metacognition at the individual level, the social level, and the environmental level. This helps resolve the paradox of metacognition: metacognition is personal, but it cannot be explained exclusively by individualistic conceptions. We develop a theoretical model of metacognition in collaborative problem solving based on models and modeling perspectives. The theoretical model addresses several challenges previously found in the research of metacognition. This paper illustrates how metacognition was elicited, at the environmental level, through problems requiring different problem-solving processes (definition building and operationalizing definitions), and how metacognition operated at both the individual level and the social level during complex problem solving. The re-conceptualization of metacognition has the potential to guide the development of metacognitive activities and effective instructional methods to integrate them into existing curricula that are necessary to engage students in active, higher-order learning.

© 2013 Elsevier Inc. All rights reserved.

1. Introduction

In the first overview of the new science of learning, Bransford, Brown, and Cocking (2000) emphasized the critical role of metacognition in successful learning. Metacognition is the process in which students monitor, assess, and modify their own learning progress. It can help students develop their knowledge for teaching themselves and improve positive learning transfer to new settings and events. This has been demonstrated in numerous studies across multiple disciplines (e.g., Bielaczyc, Pirolli, & Brown, 1995; Borkowski, Carr, & Pressely, 1987; Muir, Beswick, & Williamson, 2008; Rasekh & Ranjbary, 2003; Schraw, 1998; White & Frederickson, 1998). These studies demonstrate the need for instructional approaches to help students become more metacognitive about their learning. However, more needs to be understood about the mechanisms of metacognition, how to effectively encourage students' metacognition in problem solving, and how to promote the development of students' metacognitive abilities – a mechanism that enables one efficiently to organize, monitor, and regulate what one knows to reach a goal successfully.

Metacognition has traditionally been defined at the individual level, as thinking about one's own thinking (Flavell, 1976). In the research presented here, we re-conceptualize the construct of metacognition on multiple levels, considering thinking about thinking at the individual level, the social level, and the environmental level.

∗ Corresponding author at: Department of Curriculum and Instruction, STEM Education Center, University of Minnesota, 320 Learning & Environmental Science Building, 1954 Buford Avenue, St. Paul, MN 55108, United States. Tel.: +1 612 807 0951; fax: +1 612 626 0993.
E-mail addresses: [email protected] (Y.R. Kim), [email protected] (M.S. Park), [email protected] (T.J. Moore), [email protected] (S. Varma).

0732-3123/$ – see front matter © 2013 Elsevier Inc. All rights reserved. http://dx.doi.org/10.1016/j.jmathb.2013.04.002


At the individual level, a student has access to internal sources to monitor or regulate her/his cognitive processes. However, this raises what we term the paradox of metacognition: metacognition is personal, but it cannot be explained exclusively by individualistic conceptions (e.g., Iiskala, Vauras, & Lehtinen, 2004; Iiskala, Vauras, Lehtinen, & Salonen, 2011; Vauras, Iiskala, Kajamies, Kinnunen, & Lehtinen, 2003). For example, individuals may experience a time when they are "stuck." If individuals only have access to their own internal thinking to help them resolve their obstacle – thinking which caused them to get stuck in the first place – how can they make progress?1 Another example might be individuals lacking good self-regulation abilities. How can they detect their own (false) cognition when it goes awry, and adapt their thinking? One way to resolve this paradox is to observe that in addition to their internal psychological resources, individuals also have access to external sources that trigger metacognitive thinking. These external sources include both social triggers that come from other people interacting with the individual, and environmental triggers that come from interacting with the environment in which one is learning. This access to external sources of metacognition informs how metacognitive failures, such as an absence of checking behavior (Stacey, 1992), metacognitive blindness, metacognitive vandalism, and metacognitive mirage (Goos, 2002), can be prevented or resolved. The research reported here shows how metacognition, operating at multiple levels (individual, social, and environmental), functions during complex collaborative problem solving to overcome the paradox of metacognition, and ultimately informs instructional practice.

1 There are anecdotal examples of mathematicians and natural scientists being "stuck" on hard problems and then becoming "unstuck" on their own, following an "incubation" period lasting many days, weeks, or months (Hadamard, 1954). Indeed, the Gestalt tradition made much of the role of "incubation" in problem solving (Duncker, 1945). However, modern psychological studies have found little evidence that "incubation" plays a fundamental role in problem solving (Kaplan & Simon, 1990). Thus, the proposal that there are problems that people can solve on their own (i.e., given internal knowledge sources) and additional problems they can solve in collaboration with others (i.e., given external knowledge sources) merits investigation.

We consider metacognition in the context of solving complex problems of the kind found in mathematics and science classrooms. In particular, we focus on two processes required by complex problem solving, definition building and operationalizing definitions. These processes are strongly dependent upon (1) whether or not problems involve directed information, including clear definitions and unique solution paths, to accomplish well-defined goals; and (2) the degree to which problems are directed in the conceptualization and planning of the problem-solvers' final argument. Non-triviality and complexity in definition building and operationalizing definitions are the representative sources that trigger metacognition at the environmental level. They hold potential for revealing metacognition at the individual, and especially the social, levels (e.g., Efklides, 2006; Iiskala et al., 2011; Prins, Veenman, & Elshout, 2006). The central research questions of this study are:

(1) How is metacognition elicited through the definition building and operationalizing definitions processes during complex collaborative problem solving?

(2) How does metacognition operate during complex collaborative problem solving at the individual, social, and environmental levels?

The complex problems we consider here are Model-Eliciting Activities (MEAs). MEAs are team-oriented, interdisciplinary, and realistic problem-solving tasks that reveal participants' thinking (Chamberlin & Moon, 2005; Diefes, Moore, Zawojewski, Imbrie, & Follman, 2004; Lesh & Doerr, 2003; Lesh, Hoover, Hole, Kelly, & Post, 2000; Moore & Diefes-Dux, 2004; Moore & Hjalmarson, 2010; Moore, Diefes-Dux, & Imbrie, 2006). They were initially created by mathematics educators as a research tool for exploring students' conceptual understanding and problem-solving strategies (Lesh et al., 2000; Lesh & Lamon, 1992). Thus, they work well as an authentic method for verbal protocol analysis, since students are required to verbalize their thoughts while working on MEAs in teams in natural classroom settings. Here, the Paper Airplane Contest MEA (described in Section 3.3) was used to explore how metacognition on multiple levels can be fostered by complex collaborative problem solving.

2. Multiple levels of metacognition

In this section, we review a subset of the expansive literature on metacognition, focusing on the distinction among the individual, social, and environmental levels. We also review the literature suggesting the utility of MEAs for studying the development of metacognition on multiple levels as authentic methodological tools and sources of metacognition at the environmental level.

This research study has adopted the Models and Modeling Perspectives (MMP) in order to study metacognition. MMP has been expanded and applied to the teaching and learning of various subjects, in particular in STEM (Science, Technology, Engineering, and Mathematics) education. The focus of MMP is on how students develop conceptual systems of interpretation that include the complexity of their daily lives and their knowledge and experiences of various content domains in collaborative modeling problem-solving settings (Lesh & Doerr, 2003). These perspectives provide new insights into metacognition compared with traditional viewpoints of metacognition (Lesh & Zawojewski, 2007; Lesh, Lester, & Hjalmarson, 2003). One focus of MMP research is on individuals' conceptual systems, including both cognitive (e.g., understanding, skills) and metacognitive (e.g., beliefs, awareness) components. The MMP assumes that both cognitive and metacognitive components within the holistic conceptual systems interactively and bi-directionally influence each other. For example, as students increase their understanding, their metacognition is triggered effectively, because encouraging individual metacognition helps students develop better understanding (Lesh & Zawojewski, 2007; Lesh et al., 2003). The MMP assumes that thinking becomes metacognitive when the individuals shift from "thinking WITH" the cognitive components to "thinking ABOUT" them, by monitoring, controlling, and regulating them (Lesh et al., 2003). The MMP assumes that metacognition could be developed along dimensions similar to cognitive development. For example, when observing the development of students' metacognitive abilities in problem solving, we would expect to observe gradual patterns of metacognitive behaviors, such as external toward internal, concrete toward abstract, simple toward complex, and so on (e.g., Lesh et al., 2003).

Another focus of MMP research is on the situated, environmental, and social nature of metacognition. The MMP assumes that metacognition is closely related to particular content and situations because it draws on individuals' interpretations of the content and situations based on their conceptual systems. By valuing the situated and environmental nature of metacognition, the MMP assumes that the productivity of metacognitive functions often varies across problems, and even the stages of problem solving. This is due to the focus of (sub)tasks being different across problems and changed during the problem-solving processes. For example, brainstorming may be more productive in the early stages of problem-solving processes rather than at later stages, such as assessment (Lesh et al., 2003). The social nature of metacognition is also a research focus of the MMP. Research from the MMP often investigates problem solving while students are engaged in collaborative teams, and compares teams with individuals. The MMP suggests that investigations focusing on a team often provide a useful way to inform how one individual's mind works in context, in particular how one individual's thinking becomes metacognitive (Lesh et al., 2003; Magiera & Zawojewski, 2011).

Fig. 1. Diagram showing how the different internal and external triggers affect cognition and metacognition.

The three potential triggers of metacognition – individual, social, and environmental – affect the individual's cognition and metacognition. Fig. 1 shows the framework used in this study, based on the MMP view of metacognition and a re-conceptualization of metacognitive triggers. The diagram in the top half of Fig. 1 indicates the extended sources to which an individual has access for triggering metacognition. The sources of metacognition, as starting places for triggering metacognition, are categorized as internal and external. The internal sources are individuals' conceptual systems that include both cognitive and metacognitive components. The external sources are partitioned into (1) others' conceptual systems in collaborative activities and (2) environmental sources. The function machine in the bottom half of Fig. 1 shows how individuals' thinking becomes metacognitive. For example, when an individual who works in a group uses a mathematical concept of average to solve a problem without any evaluation, he/she thinks WITH the cognitive component "knowledge of the average" from herself/himself (IC) or from group members (OC). On the other hand, when an individual evaluates whether the concept of average is proper for the problem situation, he/she thinks ABOUT the cognitive component, and her/his thinking becomes metacognitive. In Fig. 1, the use of the Venn diagram emphasizes that the sources of metacognition are not active agents but rather starting places for the metacognitive activities. The use of the function machine representation emphasizes our focus on the functions of metacognition that an individual as the unique agent of metacognition triggers.

The following sections review literature supporting the multiple levels of metacognition conceptualized for this study.

2.1. Metacognition at the individual level

Metacognition commonly refers to thinking about our own thinking, but this definition is not useful enough to study the metacognitive behavior of children in mathematical problem solving and in learning mathematics (Schoenfeld, 1987; Wilson & Clarke, 2002). Research on metacognition frequently presents knowledge of cognition and regulation of cognition as the two main aspects of metacognition (e.g., Flavell, 1976; Garofalo & Lester, 1985; Schoenfeld, 1987). However, there have been different views on the main aspects of the construct of metacognition, how to categorize these main aspects, and how to establish the relationship between them. These discrepancies have caused confusion over the term metacognition (Garofalo & Lester, 1985; Wilson & Clarke, 2002). Schoenfeld (1987), for example, separated beliefs as a distinct category from the knowledge of cognition, while others consider them a type of subjective knowledge (Efklides, 2006; Garofalo & Lester, 1985; Norman, 1981).

Focusing on the functions of metacognition itself, there is agreement upon an operational definition of metacognition: metacognition as manifestations of the monitoring and regulatory functions. For example, to define the construct of metacognition, Wilson and Clarke (2002) emphasized two non-regulatory functions of metacognition: "awareness individuals have of their own thinking" (Awareness) and "their evaluation of that thinking" (Evaluation). These monitoring functions are distinguished from the regulatory function of metacognition, "their regulation of that thinking" (Regulation). Similarly, Efklides (2006) presented three facets of metacognition distinguished by their manifestations as functions of monitoring and control. The monitoring functions are metacognitive knowledge (knowledge about one's own cognition) and metacognitive experiences (metacognitive judgments and assessments based on monitoring the features or outcomes of the task at hand). The control function of metacognition is metacognitive skills, the knowledge of the procedures needed to control cognition (Efklides, 2006).

However, the existing literature has paid less attention to the sources of metacognition, which are not active agents, but rather starting places for metacognitive activities. For example, the knowledge and attributes that individuals have are the sources to which they have access for eliciting metacognition. That is why the current study takes the MMP as a theoretical framework for the construct of metacognition. As mentioned above, a primary focus of MMP research is on individuals' conceptual systems that include both cognitive and metacognitive components. At the individual level, the sources of metacognition are the individual's conceptual systems that are accumulated from prior knowledge and experiences. Individual differences play a critical role because people possess differing conceptual systems.

For this study, we merged the viewpoint of the MMP with a portion of Goos' (2002) metacognitive constructs framework, which she developed based on the episode-based frameworks of Schoenfeld (1985) and Artzt and Armour-Thomas (1992). We will use our framework to identify students' specific metacognitive behaviors while collaboratively solving a complex mathematical task. In Goos' (2002) framework for analyzing verbal protocols in a collaborative problem-solving setting, she considered the monitoring and regulatory functions of metacognition that would be appropriate and expected at each stage of the problem-solving processes. In particular, the monitoring function involves assessment of one's own thinking: assessment of knowledge, assessment of understanding, assessment of strategy appropriateness, assessment of progress toward goal, assessment of strategy execution, and assessment of the accuracy or sense of a result. The regulatory function of metacognition is triggered based on these assessing processes: identifying new (alternative) information (strategy), reinterpreting problem, changing strategy, correcting errors, and so on. The current study adopted this classification of metacognitive activities as manifestations of the monitoring and regulatory functions. However, Goos presented this framework as linear, because she adopted a linear progression viewpoint of problem solving. MMP considers problem-solving processes to be iterative and inherently tied to the prior knowledge of learners. Therefore, problem solving is not linear, nor can we expect metacognitive processes (monitoring and regulating) to be so. This has led to a redesign of the operational definition of metacognition (Fig. 2). For our representation of metacognitive processes, we have rearranged Goos' (2002) framework to represent the MMP view of how problem solving occurs.

Fig. 2. An MMP view of metacognitive activities during problem solving.

2.2. Metacognition at the social level

All manifestations of metacognition, however, cannot be reduced to the individual level only (Iiskala et al., 2004, 2011). The paradox of metacognition posits that metacognition is personal, but it is not sufficient to completely explain metacognition by exclusively drawing on individualistic conceptions (Iiskala et al., 2004, 2011; Jost, Kruglanski, & Nelson, 1998; Vauras et al., 2003; Volet, Vauras, & Salonen, 2009). For example, it is difficult for individuals who have little metacognitive ability to monitor and evaluate their own learning and understanding. Therefore, how do these individuals overcome temporary metacognitive failures? Questions such as this one have led researchers to call increased attention to the social nature of metacognition (Efklides, 2006, 2008; Goos, Galbraith, & Renshaw, 2002; Iiskala et al., 2004, 2011; Jost et al., 1998; Magiera & Zawojewski, 2011; Vauras et al., 2003).

One way to resolve this paradox is to consider a dual agent organization of metacognition, an individual and a group in which the individual is engaged (Iiskala et al., 2004, 2011; Vauras et al., 2003; Volet, Summers, & Thurman, 2009; Volet, Vauras, et al., 2009). In this perspective, a group is considered as a whole agent of "metacognition at the social level." Several researchers have conceptualized metacognition at the social level using language such as socially shared metacognition (Iiskala et al., 2004, 2011), shared-regulation (Vauras et al., 2003), and co-regulation (Volet, Summers, et al., 2009). A group as a social system (Salomon & Globerson, 1989) is recognized as an entity pursuing a common goal. The metacognition triggered by the group members operates as a whole to jointly monitor and regulate a cognitive process toward a common consensual goal (Iiskala et al., 2011; Vauras et al., 2003; Volet, Vauras, et al., 2009). Thus, a unit for analyzing verbal protocols is often an episode that is a set of cognitive or metacognitive activities triggered by the members in a group as a whole entity, not individual activities (Iiskala et al., 2004, 2011; Vauras et al., 2003; Volet, Summers, et al., 2009). This perspective emphasizes metacognition as social processes that are not context variables that facilitate metacognition as individual processes (Iiskala et al., 2011).

Collaborative problem-solving settings have provided a useful window into research on metacognition through the lens of individual cognitive theories, as well as through the lens of social cognitive theories (Lesh et al., 2003). However, the main agent of metacognition is still an individual, regardless of whether the individual engages in collaborative teams or works independently. Metacognition itself is a mental process within an individual drawing on the individual's conceptual systems (Lesh et al., 2003). Iiskala et al. (2011) criticized this individualistic conception of metacognition. However, we consider an individual as the unique agent of metacognition. The situated, environmental and social nature of metacognition needs to be explored by keeping the focus on the unique agent of metacognition. Metacognition is not an autonomous entity over which an individual has no control; thus, it can be practiced and developed (Lesh et al., 2003; Schoenfeld, 1987).

Re-conceptualizing metacognition on multiple levels, focusing on the sources that trigger metacognition at both the individual and social levels, is a different way to resolve the paradox of metacognition because metacognition is supported by external sources, in addition to internal ones. The agent of metacognition is an individual who has access to the sources of metacognition at both the individual and social levels.

At the social level, one source is the description of the development of conceptual systems of other individuals, such as peers or teachers. One person talking aloud about the development of her or his thinking around a particular conceptualization of a problem can provide feedback for another person's thinking about her or his own conceptualization. Considering metacognitive feedback from one's mind to another's helps resolve the paradox of metacognition, and is an important consequence of having students solve problems collaboratively (Lesh et al., 2003; Lesh & Zawojewski, 2007). Interactions with peers and teachers are the main sources that encourage individuals to retest their own learning process, to monitor their current level of understanding, and to detect and repair their misconceptions (Carr & Biddlecomb, 1998; Goos & Galbraith, 1996; Goos et al., 2002; Goos, 1994, 2002; Kramarski & Mevarech, 2003; Lesh & Zawojewski, 2007; Lesh et al., 2003; Schraw, 1998).

Social sources influencing metacognition are important variables in changing students' attitudes and beliefs toward themselves as mathematical problem-solvers, toward mathematical problems, and toward mathematics itself, all of which are important components in mathematical metacognition (Schoenfeld, 1987). For example, Wilkins and Ma (2003) found that social sources, such as high expectations from teachers and influence from peers, have a significant impact on the development of positive attitudes toward mathematics. By contrast, individual sources, such as prior achievement and educational aspirations, are not significantly related to changes in attitude, and in fact, can have a negative impact.

2.3. Sources of metacognition at the environmental level

The external sources of metacognition are not limited to interactions among participants in collaborative activities (Goos, 1994, 2002; Goos et al., 2002). Interactions between a person (or persons) and the learning environment are also important sources that trigger metacognition (Iiskala et al., 2011; Magiera & Zawojewski, 2011; Volet, Vauras, et al., 2009). Interactions with the learning environment are potential sources encouraging students to develop metacognitive ability. These interactions help students unpack misconceptions and repair them through metacognitive processes operating at both the individual and social levels.

The learning environment, as an external source, supports metacognition through classroom activities and through specific problem-solving tasks (Lesh et al., 2003; Lesh & Zawojewski, 2007). Each activity or task involves a different focus of problem solving, such as analyzing and creating, which directly affects the focus of metacognition (Lesh et al., 2003; Stacey, 1992). Problems requiring different levels of conceptual and cognitive demands in the problem-solving processes can produce differing metacognitive functions involving different focuses of metacognition within problem-solvers (Lesh et al., 2003). Task complexity is another important factor in the elicitation of metacognition. Metacognition is triggered more during difficult problems (Helms-Lorenz & Jacobse, 2008; Iiskala et al., 2004, 2011; Prins et al., 2006; Stahl, Pieschl, & Bromme, 2006; Vauras et al., 2003).

The following sections describe in more detail the sources of metacognition at the environmental level. In particular, we use the primary data collection tools of MMP, Model-Eliciting Activities (MEAs). We use MEAs in this study for two functions: as an authentic methodological tool for studying metacognition on multiple levels, and as an environmental source for supporting metacognition.

2.3.1. The learning environment and metacognition

Traditional lectures and individual problem-solving tasks, such as textbook word problems, do not encourage or elicit metacognition, particularly at the social and environmental levels, because they do not require higher-order thinking. Therefore, these environments are insufficient for investigating our expansion of the metacognition construct. Teacher-centered approaches to instruction allow limited metacognitive processes. However, this is highly dependent on the knowledge of the teacher because these approaches are associated chiefly with the transmission of knowledge. Brown (2003) argued that teachers in direct instruction environments retain control over students' learning and focus more on content rather than on students' processing. It is not expected that students will develop broad and flexible metacognitive ability in this static environment (Lesh et al., 2003). For example, traditional lectures include traditional word problems that encourage students to apply learned processes in a rote way, but not to develop or practice their own ways of thinking (Cardelle-Elawar, 1995).

Traditional word problems provide students with the information necessary to solve problems. Students only have to identify the proper procedure and the correct inputs to accomplish concrete, well-defined goals. These types of problems encourage students to maintain a "right path" by removing obstacles in the path. To encourage metacognition through problem solving, different types of problem-solving tasks are required. In student-centered approaches, students are encouraged to build their own strategies and enhance their learning through monitoring and evaluating the processes and products of social activities (Bransford et al., 2000). Social activities can be emphasized through many active learning environments using pedagogical constructs such as MEAs, cooperative learning, and project-based learning.

2.3.2. Problem-solving activities and the focus of metacognition

The literature classifies problem solving into two distinct classes. Common labels for these classes are well-defined versus ill-defined (Kitchener, 1983; Schraw, Dunkle, & Bendixen, 1995) and well-structured versus ill-structured (Jonassen, 1997; Shin, Jonassen, & McGee, 2003). Well-defined problems are completely specified by the information given (initial state), the end goal of the problem (goal state), the methods at hand (operators), and the space of possibilities (problem space) (Newell & Simon, 1972). Examples of well-defined problems include proving logical theorems and completing puzzles such as the Tower of Hanoi. A characteristic of well-defined problems is the existence of a "correct" or "optimal" path between the initial and goal state. The focus of such problems is to make progress toward the goal state through sequentially applying the operators to the initial inputs. Students' metacognitive processes (monitoring and regulating) focus on moving along the path without deviation, including removing obstacles. Thus, the focus of metacognition is "to maximize positive characteristics within initially (adequate) ways of thinking" (Lesh et al., 2003, p. 387).

In contrast, in ill-defined problems, some of the information present in well-defined problems is missing. This can be the initial state, the goal state, the operators for moving between states, and methods for achieving goals and subgoals (such as removing obstacles). An important part of solving an ill-defined problem is finding or defining this missing information. Thus, solving an ill-defined problem requires extensive metacognition to formalize the informal, and to evaluate the resulting formalizations. Ill-defined problems typically involve complex real-world contexts, whereas well-defined problems tend to look like traditional textbook word problems.

Table 1
Principles for guiding MEA development (Lesh et al., 2000).

Reality: Requires the activity to be posed in a realistic mathematical context and to be designed so that the students can interpret the activity meaningfully from their different levels of mathematical ability and general knowledge.
Model construction: Ensures the activity requires the construction of an explicit description, explanation, or procedure for a mathematically significant situation.
Model documentation: Ensures that the students are required to create some form of documentation that will reveal explicitly how they are thinking about the problem situation.
Self-assessment: Ensures that the activity contains criteria the students can identify and use to test and revise their current ways of thinking.
Generalizability: Also known as the Model Share-Ability and Re-Usability Principle. Requires students to produce solutions that are shareable with others and modifiable for other closely related engineering situations.
Effective prototype: Ensures that the model produced will be as simple as possible, yet still mathematically significant for learning purposes (i.e., a learning prototype, or a "big idea" in mathematics).

Of course, many problems in everyday life may not be divide-able into the dichotomous categories indicated above, and may involve different levels of complexity. The MEAs considered for this study do not completely fit into the categories of well-defined and ill-defined. They involve characteristics of both, a consequence of the six design principles to which every MEA must adhere (Lesh et al., 2000; Moore & Diefes-Dux, 2004) (Table 1). The focus of problem solving for MEAs is to develop an effective method or model satisfying specific criteria for success and quality, rather than to identify the proper procedure already existing (the focus of well-defined problems) or to find or define only one part of missing information (the focus of ill-defined problems). Thus, the focus of metacognition when solving MEAs is "to minimize negative characteristics of current (inadequate) ways of thinking and develop beyond them" (Lesh et al., 2003, p. 388).

2.3.3. Model-eliciting activities

The environmental considerations articulated in the prior section led us to consider MEAs for studying metacognition on multiple levels. This section provides more details about MEAs and how they support the study of metacognition by addressing several criticisms of self-report methods commonly used in research on metacognition, including: accessibility, veridicality, retrieval issues, completeness, and reactivity due to artificial setting (Ericsson & Simon, 1980; Garofalo & Lester, 1985; Goos & Galbraith, 1996; Schoenfeld, 1985; Wilson & Clarke, 2002).

MEAs are the most common type of problem-solving activity within the MMP (Lesh & Zawojewski, 2007). They are complex, open-ended problems in which problem-solvers need to define constructs for solving the task at hand (definition building), then mathematize those definitions and develop arguments for why these mathematical constructs meet the needs of the situation (operationalizing definitions). These types of problems encourage problem-solvers to think about thinking, both their own and others', and to monitor or regulate possible alternative processes.

MEAs are problem-solving tasks related to real-world situations and involve the development or design of mathematical models, where models are conceptual systems that describe, explain, or represent an experience, a complex series of experiences, or another system for a purpose (English, 2008; Hamilton, Lesh, Lester, & Brilleslyper, 2008; Lesh & Doerr, 2003; Lesh & Harel, 2003). An important attribute of MEAs is that they require students to develop a procedure, or model, to do something for someone else, thus requiring students to document their thinking through speaking and writing (thus addressing the accessibility of students' mental processes). Students reveal their thinking about the given context based on their existing knowledge or experiences and their development of new ideas throughout participation in MEAs (thus addressing retrieval issues: what students are saying is what they are thinking, rather than what they were thinking) (Diefes et al., 2004; English, 2008; Hamilton et al., 2008; Lesh & Doerr, 2003; Lesh et al., 2000). This enables tracing the conceptual process of students, which provides opportunities to detect and repair students' misconceptions (Hamilton et al., 2008).

MEAs require that student teams go through multiple cycles of revision, usually referred to as "express-test-revise" cycles, where teams express their current ways of thinking about the problem solution, test those ideas through information provided by the problem or by other team members, and revise their thinking based on these tests. The final models that students develop are required to be transportable, reusable, and sharable (Hamilton et al., 2008; Lesh & Doerr, 2003), an attribute that aids in the necessity of the revision cycles.

MEAs are fundamentally social, team-oriented tasks. In natural classroom settings (thus reducing the problem of reactivity), three to four students in a group share their multiple perspectives to develop a model satisfying the needs of the situation. This also addresses the issue of veridicality, because students in this environment spontaneously verbalize their thoughts to achieve a common goal, rather than by request (English, 2008; Hamilton et al., 2008; Moore & Diefes-Dux, 2004). Throughout the problem-solving processes in MEAs, students have to make several agreements. This collaborative aspect of MEAs assures much more verbalization of students' thoughts, which helps with completeness (Schoenfeld, 1985). Here, students reveal their cognitive processes much more than in most learning environments due to the need to communicate their thinking to one another in order to solve the problem. This provokes considerable metacognitive functions, with students monitoring and regulating their own and each other's thinking (Goos & Galbraith, 1996).

Working in small groups, students also develop collaboration skills (Diefes et al., 2004; English, 2008; Hamilton et al., 2008; Moore & Diefes-Dux, 2004; Moore et al., 2006), which help teachers address the need for communication in mathematics (NCTM, 2000). In this regard, MEAs are thought-revealing and collaborative activities (Lesh & Doerr, 2003; Lesh et al., 2000), making them excellent research sites in which to study our re-conceptualized definition of metacognition.

2.3.4. MEAs as the sources of metacognition at the environmental level

The individual, and especially the social, levels of metacognition can potentially be revealed through solving complex problems such as MEAs (Iiskala et al., 2004, 2011; Prins et al., 2006). MEAs are useful tools allowing students to engage in and develop their metacognition because they are purposefully designed to provide enough information to allow students to self-assess their understanding (individual) and require students to work in teams where multiple perspectives will be presented (social) (Hamilton et al., 2008; Lesh et al., 2003; Magiera & Zawojewski, 2011).

MEAs are often designed to require defining nebulous constructs and operationalizing those definitions. Because of the complexity and undirected nature of these processes, a team's first solution attempt will likely be suboptimal. Therefore, multiple cycles of revision, where students criticize each other's thinking, are typically necessary. These cycles reveal how students monitor and modify their own and others' thinking in an active and dynamic way over the course of problem solving (Lesh & Doerr, 2003; Lesh et al., 2003). The productivity of metacognitive functions provoked during these modeling cycles varies because the focus of subtasks is different (Lesh et al., 2003). For example, during earlier cycles, brainstorming may be more productive than assessment. The whole course of problem solving provides students with a place to practice and develop various metacognitive abilities (Lesh et al., 2003).

Students often have difficulties evaluating their own solutions, but peers can help with this evaluation. When students work in teams, they evaluate each other's ideas, serving a metacognitive role for one another (e.g., Goos et al., 2002; Goos, 2002; Hurme, Merenluoto, & Järvelä, 2009). Multiple perspectives are typically useful when solving MEAs (Moore & Diefes-Dux, 2004). MEAs lead to multiple solution attempts by students, and more sources for the development of metacognition on the social level as team members criticize solution attempts that do not meet their own standards of goodness.

2.3.5. Task complexity, definition building and operationalizing definitions

Several research studies have identified task difficulty in terms of task complexity as an important factor in the elicitation of metacognition (Efklides, 2006; Helms-Lorenz & Jacobse, 2008; Iiskala et al., 2004, 2011; Prins et al., 2006; Vauras et al., 2003). They suggest that metacognition tends to emerge more frequently in difficult versus easy tasks. Task difficulty draws on both the conceptual and cognitive demands of a task (Efklides, 2006; Stahl et al., 2006). Stahl et al. (2006) distinguished task complexity from task difficulty to refer to the cognitive demands of a task. For example, they argued that while "memorizing very difficult facts" is simpler than "applying an easy formula" in terms of task complexity, the former could be much more difficult than the latter because of the higher level of cognitive demand. During problem solving, students can monitor task complexity and regulate their goal setting and planning accordingly (Stahl et al., 2006).

We focus on task complexity as an environmental source for triggering metacognition. We assume that task complexity involves both the conceptual and cognitive demands of a task. The cognitive demands of a task seem to be more contextual than the conceptual demands of a task, which are "a function of one's developmental level and/or of domain-specific knowledge" (Efklides, 2006, p. 6), and therefore draw on the individual conceptual systems. Research often classifies the complexity of tasks in terms of their processing complexity: how many steps are involved to get to a goal state, such as one-step versus four-step addition, subtraction, multiplication and division problems (Iiskala et al., 2011). To classify task complexity, Stahl et al. (2006) used the revised Bloom's taxonomy (Anderson & Krathwohl, 2001), which provides a comprehensive set of classifications for learners' cognitive processes of different complexity: remember, understand, apply, analyze, evaluate, and create.

However, the levels of complexity in the MEAs used for the current study cannot be classified in terms of their processing complexity or the hierarchical taxonomy of cognitive processes. This is because, as shown in Section 2.3.3, multiple cycles of revision ("express-test-revise") are typically necessary for students to create transportable, reusable, and sharable models (Hamilton et al., 2008; Lesh & Doerr, 2003; Lesh et al., 2003). Thus, we consider task complexity to refer to the different levels in conceptual and cognitive demands of the problem-solving processes, "definition building" and "operationalizing definitions," required in MEAs. To differentiate the levels of complexity in solving problems, we look to the degree to which problems require students to (1) build definitions and (2) operationalize those definitions. This analysis parallels the view of metacognition as a management issue (Schoenfeld, 1987).

Definition building is the process by which problem-solvers interpret given contexts and conceptualize problems, building meanings based on existing conceptual systems (Lesh & Doerr, 2003; Lesh & Harel, 2003). Definition building comes to the forefront by putting students into situations requiring them to define fuzzy ideas such as "Which is the most rough?" or "Who is the best volleyball player?" These constructs are ill-defined. "Roughness" depends on the context in which you are working; for example, blacktop used to pave streets may be rough to the touch, but may be smooth when driving on it. The best volleyball player depends on multiple variables such as ability to serve, spike, jump, move quickly, and so on. These examples illustrate situations where students must define qualitative constructs before proceeding to solve problems.

Operationalizing definitions is the process by which a qualitative construct is made measurable in order to formalize the goal of a problem. This process requires identifying evaluative criteria for what counts as a "good" or "better" solution. Definition building is inherently tied to this process, but these are separate ideas in problem solving. The process of operationalizing definitions involves subtasks such as the following:

(1) quantifying qualitative information;
(2) converting all information to a homogeneous form of representational media;
(3) choosing variables that are consistent with the definition built;
(4) sampling from data to represent the given context; and
(5) choosing mathematical operations that are consistent with the definition built.

In business and industry, understanding and designing complex systems mainly involves the operationalizing of definitions that are not straightforwardly mathematized.

In this study, we chose to use the terms "definition building" and "operationalizing definitions" in order to emphasize the action involved compared to the more common terms of "concept definitions" and "operational definitions" (Harel & Koichu, 2010). For example, students in this study were required to define the concepts for themselves within a particular context; by contrast, "concept definitions" typically indicates that definitions are provided for students.

So far, we have described the theoretical model of metacognition in complex collaborative problem-solving activities developed in the current study, and the rationale for using MEAs to study metacognition. The theoretical model of metacognition helps make clear the distinction of cognitive and metacognitive behaviors, and the distinction of metacognitive functions in the domain of problem solving. It is also expected to work as an effective basis to identify and interpret metacognitive activities in collaborative modeling activities. Using the theoretical model as a framework and MEAs, a case study is designed to explore how metacognition functions during complex collaborative problem solving, and how it operates at the individual, social, and environmental levels.

3. Methodology

3.1. Case study research design

A single-case naturalistic case study method was used to investigate metacognitive processes (Darke, Shanks, & Broadbent, 1998; Yin, 2004). The purpose of the case study research was to explore in depth one team's individual and social metacognition: how students' thinking develops metacognitively while working within a group in a natural classroom setting, in an MEA where both the definition building and operationalizing definitions processes were needed. Because of these descriptive, naturalistic, and inductive characteristics of the study, qualitative research methods meet the needs (Bogdan & Biklen, 2003; Creswell, 2006). Data were collected from audio recordings of the students in one team working on the Paper Airplane Contest MEA. In an effort to establish rigor and credibility for the study, parameters such as triangulation of data sources and multiple-researcher analysis were employed as described by Darke et al. (1998).

3.2. Setting

This study took place in a girls' middle school located in a central part of South Korea during the 2008–2009 school year. Two researchers implemented the Paper Airplane Contest MEA in a class of 32 female 8th grade students who had mixed-level mathematics abilities. Eighteen of the students scored 20 or more points out of the total 30 on the 2009 National Middle School Students Mathematics Exam, and four students did not pass the exam (the cut-off score is 13). Five of the students failed in both the Computation and Word Problem sections. Two students did not reach the cut-off in the Function section and seven failed the Geometry Problem section.

The participants in this study were four team-members in the class. This team was chosen for this study because they adopted positive attitudes during their session and verbalized their conceptualization of problems over the course of problem solving. The students worked on the MEA for about 135 min, including the time for a break and for filling out the consent form. First, the students read a newspaper article that served as an advanced organizer and that engaged them in the problem context. They also discussed a set of warm-up questions about the newspaper article for 20 min as a whole class. Then, they worked on the problem statement involving the best floater problem and the most accurate problem for 70 min. Finally, they made short group presentations to share their solutions to the whole class for 20 min. The implementation was videotaped and each group of four students was audio recorded to explore student conversations throughout the MEA group activity period. The group conversations of the four students in this study were translated from Korean into English for analysis.

3.3. Paper airplane contest MEA

This study used an MEA entitled "Paper Airplane Contest" that was translated from English into Korean. This MEA provided students with a complex problem-solving experience in which multiple levels of metacognitive functions were invoked. We explore how the different sub-tasks within the MEA required different levels of conceptual and cognitive demands of the two problem-solving processes, "definition building" and "operationalizing definitions," and therefore how the sub-tasks functioned differently as the sources at the environmental level for eliciting students' metacognition at the individual and social levels. The following sections contain an overview of the problem.

Fig. 3. Landing points for four paper airplanes thrown by three pilots.

3.3.1. Math-rich newspaper article as the individual warm-up

The students were first asked individually to read a math-rich newspaper article that (a) described how to make and toss a variety of different types of paper airplanes and (b) provided information about the paper airplane contest. The article described several problems that occurred at last year's paper airplane contest. Some flight characteristics that were tested were: (a) how far the planes flew, and (b) how long the planes stayed in the air. However, it proved difficult to judge some of these characteristics because the planes' performance depended on which "pilots" tossed them. For this reason, one year later, the organizers of the paper airplane contest decided that three pilots should fly each plane, and that the same three pilots should fly all of the planes. Then students individually answered several warm-up questions about the newspaper article and discussed their answers as a whole class.

3.3.2. Teams creating a judging scheme for a paper airplane contest

Teams of four students then worked on the MEA problem statement. They were asked to write a letter to the judges of the paper airplane contest. The letter needed to provide a procedure which would allow the judges to decide which airplane was (a) the most accurate flier and (b) the best floater. We refer to these as the most accurate problem and the best floater problem, respectively. The teams were given Table 2, a sample of data from a trial contest, to develop and test their procedure. Fig. 3 is a graph of the data in Table 2, separating out the landing points for each plane regardless of pilot and the landing points for each pilot regardless of plane. Both the table and the figure show the results from the trial contest in which three pilots flew four different paper airplanes. The "pilot" stood at a point S(0, 0) on the floor, and their goal was to toss the planes so that they came as close as possible to the point X(25, 25), which was the "target" for the flights.
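As a concrete reading of the derived columns that appear in Table 2 below, the following Python sketch (not taken from the article) computes distance from start, distance to target, and angle from target for a single flight, assuming that distance to target is the straight-line distance from the landing point to X(25, 25) and that angle from target is the signed deviation of the throw direction from the line S to X. The helper name and the sample landing point are hypothetical.

    import math

    START = (0.0, 0.0)     # point S where the pilot stands
    TARGET = (25.0, 25.0)  # point X, the target for the flights

    def flight_measures(x, y):
        """Derive Table 2 style measures from a landing point (x, y); illustrative only."""
        distance_from_start = math.hypot(x - START[0], y - START[1])
        distance_to_target = math.hypot(x - TARGET[0], y - TARGET[1])
        # Signed angle between the actual throw direction and the S-to-X direction.
        angle_actual = math.degrees(math.atan2(y, x))
        angle_ideal = math.degrees(math.atan2(TARGET[1], TARGET[0]))  # 45 degrees
        angle_from_target = angle_actual - angle_ideal
        return distance_from_start, distance_to_target, angle_from_target

    # Example with a made-up landing point (30, 20):
    print(flight_measures(30.0, 20.0))  # approximately (36.06, 7.07, -11.31)

Measures like these are only raw ingredients; deciding how to combine them into a judgment of "most accurate" or "best floater" is precisely the operationalizing-definitions work the MEA leaves to the students.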

3.4. Analysis methods

The resulting verbal protocols were parsed into the problem-solving processes "definition building" and "operationalizing definitions" for each sub-task within the MEA. A finer-grained analysis of conversational statements was then carried out to identify each member's behaviors, using a coding scheme consistent with the framework developed in Figs. 1 and 2. For example, the codes NI (new idea) and IC (individual cognitive components) were given when potentially useful information or an alternative strategy was mentioned based on the individual's prior knowledge or experience. While coding the data, some conversational statements fit into two or more categories, so cross-indexing was used. For example, if a student assessed her understanding of the problem and revealed a way of potentially useful thinking, and if the assessment and idea were made based on her prior knowledge and experience but prompted by others' way of thinking, then her response would be coded AU (assessment of understanding)/NI (new idea), and IL (individual level) & SL (social level). Table 3 shows the coding scheme used in this study for deciding the distinction between "thinking WITH" and "thinking ABOUT" and the distinction among the multiple levels. It also includes more detailed examples related to the distinctions and several notes for the coding decisions made in this study.
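As an illustration only, the cross-indexing described above can be thought of as attaching two lists of codes to each transcribed statement. The sketch below uses hypothetical field names and a hypothetical utterance; it is not the authors' coding instrument, just one way the record implied by the AU/NI and IL & SL example could be represented.

```python
from dataclasses import dataclass, field

@dataclass
class CodedStatement:
    """One conversational statement with cross-indexed codes."""
    number: int
    speaker: str
    text: str
    behaviors: list = field(default_factory=list)  # e.g., ["AU", "NI"]
    levels: list = field(default_factory=list)     # e.g., ["IL", "SL"]

# Hypothetical example: an assessment of understanding plus a new idea,
# grounded in prior knowledge (individual) but prompted by others (social).
example = CodedStatement(
    number=0,                      # hypothetical statement number
    speaker="S1",
    text="(hypothetical utterance)",
    behaviors=["AU", "NI"],
    levels=["IL", "SL"],
)
print(example)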

To reduce coding errors, two researchers carefully read the transcript several times. Each researcher coded the data based on the coding schemes described above, which were agreed upon before data analysis. Cohen's kappa coefficient of inter-rater agreement for coding the data was 0.94, a value within a range that indicates an acceptable level of reliability (Altman, 1991; Fleiss, 1981; Landis & Koch, 1977). Coding discrepancies were resolved via discussion and consensus so that 100% agreement was reached.
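For readers unfamiliar with the statistic, a minimal sketch of how Cohen's kappa is computed from two coders' labels follows; the labels and function are our illustration, not the authors' analysis script.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' labels on the same items:
    kappa = (p_o - p_e) / (1 - p_e)."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n        # observed agreement
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    p_e = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)   # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes from two raters, for illustration only.
a = ["NI", "AU", "NI", "IC", "AU", "NI"]
b = ["NI", "AU", "IC", "IC", "AU", "NI"]
print(round(cohens_kappa(a, b), 2))
```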


Table 2
Information about four paper airplanes flown by three different pilots. For each pilot, the four values are: distance from start, time in flight, distance to target, and angle from target.

Plane A, Flight 1:  Pilot F: 22.4, 1.7, 15.2, 16    Pilot G: 30.6, 1.6, 14.5, 23    Pilot H: 39, 1.8, 7.5, −10
Plane A, Flight 2:  Pilot F: 26.3, 1.7, 16.7, 26    Pilot G: 31.1, 1.6, 11.9, 19    Pilot H: 36.3, 1.7, 4.3, −6
Plane A, Flight 3:  Pilot F: 31.6, 1.7, 7.1, 10     Pilot G: 26.7, 2.2, 8.9, −4     Pilot H: 35.9, 2.2, 9, −14
Plane B, Flight 1:  Pilot F: 32.1, 1.9, 7.6, −11    Pilot G: 35.9, 1.9, 14.3, −23   Pilot H: 43.7, 2.0, 9.5, 6
Plane B, Flight 2:  Pilot F: 42.2, 2.0, 9.2, −9     Pilot G: 39, 2.1, 11.1, 16      Pilot H: 29, 2.0, 7.6, 7
Plane B, Flight 3:  Pilot F: 27.2, 2.1, 10.2, −11   Pilot G: 25.6, 2.0, 11.7, 12    Pilot H: 36.9, 1.9, 12.4, 19
Plane C, Flight 1:  Pilot F: 19.2, 1.8, 16.6, −8    Pilot G: 42.9, 2.0, 9.8, 9      Pilot H: 35.1, 1.6, 2.8, 4
Plane C, Flight 2:  Pilot F: 28.7, 1.9, 9.3, 11     Pilot G: 44.6, 2.0, 9.3, −1     Pilot H: 37.2, 2.2, 2, −1
Plane C, Flight 3:  Pilot F: 23.6, 2.1, 17.3, −25   Pilot G: 35.7, 2.2, 3.2, −5     Pilot H: 42, 2.1, 9.8, 10
Plane D, Flight 1:  Pilot F: 28.1, 1.5, 8.9, 9      Pilot G: 37.2, 2.1, 20.2, −32   Pilot H: 41.7, 2.2, 10.1, 11
Plane D, Flight 2:  Pilot F: 31.6, 1.6, 14.8, −24   Pilot G: 46.6, 2.0, 11.4, −2    Pilot H: 48, 1.9, 14.1, −8
Plane D, Flight 3:  Pilot F: 39.3, 2.3, 9.1, 12     Pilot G: 34.7, 1.8, 22.2, −36   Pilot H: 44.7, 1.7, 11.5, −9


Table 3
Coding scheme for deciding the distinction between "thinking WITH" and "thinking ABOUT" and the distinction among the multiple levels.

Cognitive activities: Think WITH. Without any evaluation, an individual thinks WITH a cognitive component (e.g., given a set of numeric data, student S1 thinks with "knowledge of the average" to solve a problem).
- Individual Level: Thinking WITH the cognitive component due to oneself (e.g., student S1's thinking with her own "knowledge of the average").
- Social Level: Thinking WITH the cognitive component due to others (e.g., student S1's thinking with "knowledge of the average" is triggered by another student's idea to use the average).
- Environmental Level: Thinking WITH the cognitive component due to a learning environment (e.g., student S1's thinking with her own "knowledge of the average" due to something in the problem that indicates that concept, such as "What is the average of the numbers?").

Metacognitive activities: Think ABOUT. With evaluation, an individual thinks ABOUT a cognitive or metacognitive component (e.g., given a set of numeric data, student S1 thinks about "knowledge of the average" as to whether it is proper to solve a problem).
- Individual Level: Thinking ABOUT the cognitive or metacognitive component due to oneself (e.g., student S1's thinking about her own "knowledge of the average" is triggered by her own realization of a mistake).
- Social Level: Thinking ABOUT the cognitive or metacognitive component due to others (e.g., student S1's thinking about her own "knowledge of the average" is triggered by another student pointing out a mistake).
- Environmental Level: Thinking ABOUT the cognitive or metacognitive component due to a learning environment (e.g., student S1's thinking about her own "knowledge of the average" is triggered by something in the problem that makes her original solution not viable, such as competing variables in the problem).

Notes on how the coding scheme is implemented:
1. The unit for analyzing verbal protocols is each comment made by an individual.
2. The coding decision for each comment is made on the basis of the overall scenario of students' dialogs within the group rather than on the basis of each individual statement made.
   Ex. (1) When student S4 led a problem-solving process by suggesting a new idea with the comment, "We need to compare the speed to distance. . .don't we?" (Statement 60), this comment could be coded at the individual level not because of the subject "we," but due to the overall scenario.
   Ex. (2) When student S1 made the comment, ". . .we need to calculate this again. . ." (Statement 61), this comment could be coded as "thinking WITH" because, based on the overall scenario, her thinking was still stuck in calculation without any evaluation.
   Ex. (3) When student S1 made the comment, "Why is this furthest?" (Statement 48), this comment could be coded at both the individual and social levels because her thinking ABOUT a way of thinking (the individual level) occurred due to thinking ABOUT S2's way of thinking (the social level).
3. Non-verbal and verbal cues are considered as important factors in making coding decisions.
   Ex. When students made comments with some cues of evaluation, such as "I think. . ." (Statement 1), "um. . ." (Statement 33), ". . .don't we?" (Statement 60), "Ah. . .compare. . ." (Statement 61), and so on, the comments could be coded as "thinking ABOUT" because they indicate an evaluation occurring.


The conversational statements presented here are assigned consecutive numbers. According to the coding results, they are annotated to indicate students' problem-solving behaviors, their monitoring and regulatory activities, and the levels of sources that triggered them.

4. Results and discussion

Recall that the two research questions are:

(1) How is metacognition elicited through the definition building and operationalizing definitions processes during complex collaborative problem solving?

(2) How does metacognition operate during complex collaborative problem solving at the individual, social, and environmental levels?

We first address research question (1) by exploring how the complexity of a task (in this case, a subtask) can encourage students' metacognition, especially focusing on definition building and operationalizing definitions. We then address research question (2) by focusing on students' metacognition on multiple levels as they participated in the MEA, particularly how students monitored or evaluated their own thinking processes within the group, and how students' thinking processes were transferred to others through the team interactions.

4.1. Effect of task complexity on the construct of metacognition at the environmental level

This section focuses on the effect of task complexity on metacognition. It provides data from student interactions to address research question (1). Recall that the MEA involves solving two problems, with regard to definition building and operationalizing those definitions. The most accurate problem involves less complexity than the best floater problem, and thus less social metacognition. This is perhaps because the phrase "most accurate" is somewhat clearer than the phrase "best floater" for students to conceptualize based on their existing knowledge and experiences. For example, the word "accurate" might remind students of words like "target" or "aim", which bring to mind games of accuracy, and this elicits prior experience. By contrast, the word "floater" could be associated in students' minds with words such as "levitate" or "hover", which are much more nebulous for students to define. In addition to the different levels of conceptual demands of the problem-solving processes, another possible reason why the most accurate problem involves less complexity than the best floater problem is the graphical representations with the target point X (Fig. 3). This might reduce the cognitive demand of the problem-solving processes for the most accurate problem. These speculations about the complexity of the different sub-tasks within the MEA were supported by the following description of the students' problem-solving session with transcript excerpts.

4.1.1. The most accurate problem as a source of metacognition at the environmental level

For the most accurate problem, students exhibited relatively clear goals, such as defining the most accurate flier as "closest to target," which in turn led to a common solution path without many iterations needed to meet the goal. For the process of definition building, they made a tacit agreement without much argument. All the students merely looked at the data to find which flier was most accurate without a specific discussion of what "most accurate" means (Statements 12, 14, 15, and 16 below). The students partially revealed their definition of "most accurate" as "closest to the target" in the conversations for testing the accurate flier (Statements 18, 20, and 22). (In contrast, they clearly discussed the definition of the best floater amongst themselves at the beginning of solving that problem, as we will see below.) The tacit agreement of what "most accurate" means was illustrated as follows:

1. S1: I think first we need to know what our tasks are. (Assessment–understanding/New idea: Thinking ABOUT a way of thinking, Individual Level)
2. S2: Tasks?
3. S1: I mean. . .we need to know what they (the judges) want us to do?
4. S2: Right, we should find them. We first need to find which flight is the accuracy. (Assessment–understanding: Thinking ABOUT S1's way of thinking, Social Level)
5. S1: We need to find this. . .

Definition building for the most accurate problem:
6. S2: What is the accuracy?
[Students then worked for a few minutes to try to figure out their overall goals for the task. The students also discussed what a definition of best floater should be.]

Operationalizing definition for the most accurate problem:


12. S1: The criteria of the accuracy. . .we should know the criteria first. . .so, I think we can use the graphs (Fig. 3) for that. (New idea: Thinking ABOUT a way of thinking, Individual Level)
13. S2: Really!
14. S1: First, let's find the accurate flier. . .then. . .what should be the criteria of (the accurate) flier?
15. S3: The criteria of (the accurate) flier. . .well. . . (Assessment–strategy appropriateness: Thinking ABOUT S1's way of thinking, Social Level)
16. S2: The criteria are. . .? (Assessment–strategy appropriateness: Thinking ABOUT S1's way of thinking, Social Level)
17. S3: What are they (the graphs in Fig. 3) saying?
18. S1: . . .um. . .Let's see. . .this has accurate. . .arrival point and flight path. . . (New idea: Thinking ABOUT a way of thinking, Individual Level)
19. Other Ss: Oh. . .yes. (Assessment–strategy appropriateness: Thinking ABOUT S1's way of thinking, Social Level)
20. S1: (After) deciding the range. . .and we can see the flight in the range. . . (New idea: Thinking ABOUT a way of thinking, Individual Level)
[Students defined an "in bounds" and "out-of-bounds" range and were only considering the in-bounds flights for accuracy.]
21. S3: What to do next? . . . (Assessment–strategy appropriateness: Thinking ABOUT S1's way of thinking, Social Level & Assessment–progress toward goal, Individual Level)
22. S1: Then, in the graphs, we should find the flight nearby the target. . . (Assessment–progress toward goal, Individual Level & Social Level)
23. Other Ss: Ok. . .let's see. . .
24. S1: By the pilot. . .
25. S3: By the pilots. . .
26. S1: When we see this graph (Landing points for the four planes, A, B, C and D in Fig. 3), closest to X (target) is C. . .C is three times in the range. . . (Assessment–strategy execution, Individual Level)
27. S2: No. . . (Assessment–strategy execution: Thinking ABOUT S1's way of thinking, Social Level)
28. S1: Here. . .C, C, C. . .Three times. . .
29. S2: Aha! (Assessment–strategy execution: Thinking ABOUT S1's way of thinking, Social Level)
[Students spent a few minutes exploring the graphs.]
30. S4: Let's just calculate C (average of distance to target). (New idea: Thinking ABOUT a way of thinking, Individual Level)
31. S1: But. . .when we see the next graph (Landing points for the three pilots, F, G and H, in Fig. 3). . .the most accurate (that is) the closest (to target) is H and F, and the smallest angle is F. . .F and G. (Assessment–strategy execution, Individual Level)
[One student (S1) reminded the group what the best floater means, and then the students went back to working on calculations for the most accurate.]
34. S1: The accuracy. . .
35. S2: The accuracy is. . .we decide as H and C. . .
36. S1: Right. H is the floater (who threw the accurate flier). . .and C is most accurate. . .because the accuracy is the closest to target. (Assessment–accuracy of result: Thinking ABOUT S2's way of thinking, Social Level)
37. S3: We are done for one question.

The students did not question what "most accurate" meant at all. They quickly tried to find the most accurate flier without defining what "accuracy" in this context means (Statements 12, 15, and 16 above). The students had a clear common goal in approaching this part of the problem, and they felt confident and certain of the solution path (e.g., Statements 20, 22, 26, 30, etc.). In other words, this problem, in which outcome goals seem to be apparent, gives students confidence to monitor their own problem solving.
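To make the group's tacit "closest to the target" criterion concrete, the following sketch applies one plausible operationalization, the mean distance to target per plane over all nine recorded flights in Table 2; the aggregation choice is ours for illustration, not a procedure reported by the students.

```python
# Distance-to-target values from Table 2 (three pilots x three flights per plane).
dist_to_target = {
    "A": [15.2, 16.7, 7.1, 14.5, 11.9, 8.9, 7.5, 4.3, 9.0],
    "B": [7.6, 9.2, 10.2, 14.3, 11.1, 11.7, 9.5, 7.6, 12.4],
    "C": [16.6, 9.3, 17.3, 9.8, 9.3, 3.2, 2.8, 2.0, 9.8],
    "D": [8.9, 14.8, 9.1, 20.2, 11.4, 22.2, 10.1, 14.1, 11.5],
}

# "Most accurate" read as the plane with the smallest mean distance to the target.
mean_dist = {plane: sum(vals) / len(vals) for plane, vals in dist_to_target.items()}
most_accurate = min(mean_dist, key=mean_dist.get)
print(mean_dist, most_accurate)
```

Under this particular aggregation, plane C comes out with the smallest mean distance to the target, which matches the group's conclusion in Statements 35 and 36.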

4.1.2. The best floater problem as a source of metacognition at the environmental level

This section illustrates how the best floater problem differentially worked as an environmental source for effectively encouraging students' metacognition at both the individual and social levels, compared to the most accurate problem. For the notion of "the best floater," each student had different definitions, such as "the longest time" (Statement 8 below), "the furthest distance" (Statement 9), and "the slowest flight" (Statements 32 and 33). This is evidence of differences in the individual conceptual systems based on prior knowledge and experience. The different level of complexity in the definition building process for the best floater versus the most accurate problem elicited multiple perspectives from students. One student (S1) made an initial definition of "the best floater" as "the slowest flight" (Statement 32) with group members' passive agreement in the beginning of the problem solving (Statement 33). The group had a concrete definition of the problem at that time because no members argued about it anymore. The excerpts below show how the students reached agreement on their first definition of the best floater.

Definition building for the best floater problem:

7. S3: We also need to know the best floater.
8. S4: We. . .um. . .how about find the longest time? (New idea: Thinking ABOUT a way of thinking, Individual Level)
9. S1: The furthest distance from the start and the shortest time in flight. . . (New idea: Thinking ABOUT a way of thinking, Individual Level)
10. S4: Don't we find the longest time? (Assessment–strategy appropriateness: Thinking ABOUT S1's way of thinking & Thinking ABOUT her own way of thinking, Social Level & Individual Level)
11. S1: Ah. . .the longest time in flight. . . (Assessment–strategy appropriateness, Social Level & Individual Level)
[Team spent about 10 min talking about the most accurate problem]

Definition building for the best floater problem:

32. S1: We already decided the best floater as the slowest flight. . .right? (Assessment–progress toward goal, Individual Level)
33. S2: um. . .the slowest flight. . . (Assessment–understanding: Thinking ABOUT S1's way of thinking, Social Level/Assessment–strategy appropriateness, Social Level, & Assessment–progress toward goal, Social Level)

[Students returned to calculating average of distances to target.]

The initial definition of the best floater subsequently caused some controversy among the students in the group. They repeatedly monitored and evaluated their conceptualization of the problem as individuals and against others' feedback or criticism. The following excerpt illustrates this.


Operationalizing definition for the best floater problem:
38. S2: Best floater?
39. S1: First, let's write down our decision. C floater and H pilot (for the accuracy). . .best floater. . .the furthest distance and the longest time in flight. . .distance divided by time. . .(best floater) that speed is smallest. . . (Assessment–progress toward goals, Individual Level, & Thinking WITH knowledge of speed)
40. S2: To get the speed. . .they gave us the calculator. . . (Assessment–strategy appropriateness: Thinking ABOUT S1's way of thinking, Social Level, & Environmental Level)
41. Other Ss: Right. . .right. . . (Assessment–strategy appropriateness: Thinking ABOUT S2's way of thinking, Social Level)
42. S1: Distance divided by time. . . (Thinking WITH knowledge of speed)
43. S2: Distance divided by time. . .? (Assessment–knowledge: Thinking ABOUT S1's knowledge of speed, Social Level)
44. S3: Lowest speed (for the best floater)? (Assessment–understanding: Thinking ABOUT understanding of the definition, Individual Level)
45. S1: Yes, the lowest speed. . . (Assessment–understanding: Thinking ABOUT S3's understanding of the definition, Social Level)
46. S1: So, we need to first calculate the average of these three (distances from start and times in flight for each plane). (Assessment–progress toward goals: Thinking ABOUT Ss' way of thinking, Social Level)
[Calculating average and then speed]
47. S2: Something strange. . .strange. (Assessment–sense of result, Individual Level)
48. S1: Why is this furthest? (Assessment–sense of result: Thinking ABOUT S2's way of thinking led thinking ABOUT a way of thinking, Social Level & Individual Level)
49. S2: We are correct. But, (something's) strange. . . (Assessment–accuracy or sense of result, Individual Level)
50. S3: What's the problem? (Assessment–understanding: Thinking ABOUT S2's way of thinking, Social Level)
[Students tried to figure out what caused the problem. They spent about 5 min on re-calculation of speed.]

Operationalizing definition for the best floater problem:
60. S4: We need to compare the speed to distance. . .don't we? (New idea: Thinking ABOUT a way of thinking, Individual Level)
61. S1: Ah. . .compare. . .we need to calculate this again. . . (Assessment–strategy appropriateness: Thinking ABOUT S4's way of thinking, Social Level, & Thinking WITH a strategy, calculation)
62. S3: C has the longest time but. . . (Assessment–sense of result, Individual Level)
63. S2: Speed is. . .
64. S4: We need to compare the speed to distance. . .how about that? (Assessment–progress toward goals, Individual Level)
65. S1: Right. . . (Assessment–progress toward goals: Thinking ABOUT S4's way of thinking, Social Level)
66. S2: We need to calculate distances again. (Thinking WITH a strategy, calculation of average)
67. S4: No, we already have distances on the data. (Assessment–strategy appropriateness: Thinking ABOUT S2's strategy of average, Social Level)
[For a few minutes the group discusses their initial definition of best floater.]

Operationalizing definition for the best floater problem:
72. S1: But, in this case, the shorter distance is, the smaller speed is. . .uh. . .what's the problem? (Assessment–sense of result, Social Level & Individual Level)
73. S3: The larger a denominator is. . . (Assessment–knowledge: Thinking ABOUT knowledge of fraction, Individual Level)
74. S2: The longest time is. . .
75. S4: The less time, the faster speed is. (Assessment–sense of result: Thinking ABOUT S1's way of thinking, Social Level, & Assessment–knowledge: Thinking ABOUT knowledge of speed, Individual Level)
76. S1: But, if the time is longest and the distance is shortest, then speed is lowest. . .but distance is shortest. . .then we need to decide an arbitrary distance for this. . .how about this? [paused for a time] Let's find speed with the arbitrary distance (a constant) after dividing all of them, but how to decide the arbitrary distance? (Assessment–knowledge: Thinking ABOUT knowledge of speed, Social Level & Individual Level, & New idea: Thinking ABOUT a way of thinking, Individual Level)
77. S4: Is it possible? (Assessment–strategy appropriateness: Thinking ABOUT S1's way of thinking, Social Level)
78. S2: We have to have a confidence for our strategy. (Assessment–strategy appropriateness: Thinking ABOUT S1's way of thinking, Social Level)

The students kept their own conceptualizations of the problem, such as "the longest time in air," "the furthest distance from the start," and "the slowest," during problem solving. Based on these conceptualizations, they tried to operationalize definitions to develop mathematical models for evaluating criteria for the best floater (e.g., Statements 39, 46, 60, 76, etc. above). As a result, the students repeatedly came into conflict with each other in the processes of definition building and operationalizing definitions (e.g., Statements 47, 50, 62, 77, etc.). However, feedback and criticism from each other encouraged the students to monitor and evaluate their thoughts during problem solving (e.g., Statements 47, 48, 49, 50, etc.). This case is a good exemplar of how MEAs can encourage students' metacognition on the social level by promoting interactions with peers, encouraging them to go beyond their current ways of thinking (e.g., Statements 60, 72, 75, 76, etc.).
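The kind of computation the group gravitated toward in Statements 39 and 46 (average the distances from start and the times in flight for each plane, then form speed as average distance divided by average time) can be sketched as follows. The data come from Table 2, but whether the "best" floater is the plane with the smallest or the largest such speed is exactly the conceptual issue the group could not settle, so the sketch only reports the values.

```python
# Distance-from-start and time-in-flight values from Table 2
# (three pilots x three flights per plane), used here only to illustrate
# the kind of computation the group discussed in Statements 39 and 46.
flights = {
    "A": {"dist": [22.4, 26.3, 31.6, 30.6, 31.1, 26.7, 39.0, 36.3, 35.9],
          "time": [1.7, 1.7, 1.7, 1.6, 1.6, 2.2, 1.8, 1.7, 2.2]},
    "B": {"dist": [32.1, 42.2, 27.2, 35.9, 39.0, 25.6, 43.7, 29.0, 36.9],
          "time": [1.9, 2.0, 2.1, 1.9, 2.1, 2.0, 2.0, 2.0, 1.9]},
    "C": {"dist": [19.2, 28.7, 23.6, 42.9, 44.6, 35.7, 35.1, 37.2, 42.0],
          "time": [1.8, 1.9, 2.1, 2.0, 2.0, 2.2, 1.6, 2.2, 2.1]},
    "D": {"dist": [28.1, 31.6, 39.3, 37.2, 46.6, 34.7, 41.7, 48.0, 44.7],
          "time": [1.5, 1.6, 2.3, 2.1, 2.0, 1.8, 2.2, 1.9, 1.7]},
}

def mean(xs):
    return sum(xs) / len(xs)

# Speed as average distance divided by average time, per plane.
avg_speed = {p: mean(d["dist"]) / mean(d["time"]) for p, d in flights.items()}
for plane, speed in sorted(avg_speed.items(), key=lambda kv: kv[1]):
    print(plane, round(speed, 2))
```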

4.1.3. The sources at the environmental level for eliciting students' metacognition

In sum, for the most accurate problem, students quickly approached the goal because they had a relatively clear common goal without much doubt. When monitoring their own problem solving, students might expect an easy route to the goal with tacit agreement. Thus, each student might depend on the individual level of metacognition, rather than welcoming or needing social metacognition. This supports Stacey's (1992) warning about group behaviors in collaborative problem-solving settings. She cautioned that in the group situation the attraction of simplicity is strong: students who see simple ideas produce simple answers quickly, without careful consideration of them. The most accurate problem is relatively simplistic for definition building in comparison to the best floater problem, and therefore gives students confidence to monitor their own problem solving. However, sometimes this confidence may prevent students from taking the opportunity to engage in metacognitive processes on the social level. This is in line with previous metacognition research (Artzt & Armour-Thomas, 1997; Stacey, 1992), which has shown that self-confident attitudes contribute to a lesser degree of metacognitive behavior in collaborative problem-solving settings. There is less tendency for students to unpack their multiple perspectives and make an argument-based negotiation, which may negate the need for social metacognition even though the problem solving is team-oriented.

However, for the best floater problem, each student had a perspective derived from their own conceptualization of the problem, and the conflict around their conceptualizations encouraged students to come to an agreement. The first attempt the group made was far from optimal, and therefore students criticized each other's thinking. The best floater problem involves more complexity in the problem-solving processes of definition building and operationalizing definitions, and consequently it provided a sufficient source for metacognition at the social level, in addition to metacognition at the individual level.

Finally, one unexpected but interesting finding related to the sources of metacognition is the possibility of a negative role of calculators as an environmental source for eliciting metacognition (see Statement 40). We provided calculators to each student group as a complementary tool, even though they are not allowed in a regular math class in South Korea. The presence of calculators in the classroom setting may reinforce students' belief that mathematics problems call for formal computation. Schoenfeld (1987) argued that this mistaken belief about "what mathematics is all about" has a very strong negative effect on students' mathematical behavior.

4.2. An analysis of how one student’s metacognition operated at the multiple levels

This section focuses on how one student's interactions with the problem and with other group members became a catalyst for thinking metacognitively at the individual level and the social level. Student S1 actively verbalized her thoughts throughout much of the MEA, so she was chosen as the unit of analysis for this section. Student S1 took the lead in the activity to reach solutions in both parts of the problem. However, she revealed a misconception about 'speed' that was expected to be used as the group's model to judge the best floater. Her misconception was that "the furthest distance and the longest time in flight result in the slowest speed" (Statement 39 below). She was unable to monitor this misconception in her own problem solving until other students repeatedly criticized her thinking (e.g., Statements 47, 49, 54, etc.). This is illustrated in the following excerpt:

32. S1: We already decided the best floater as the slowest flight. . .right? (Assessment–progress toward goal, Individual Level)
33. S2: um. . .the slowest flight. . . (Assessment–understanding: Thinking ABOUT S1's way of thinking, Social Level/Assessment–strategy appropriateness, Social Level, & Assessment–progress toward goal, Social Level)
[In the middle of working on the most accurate problem, S1 tried to remind the other group members of her definition of the best floater. Students then returned their attention to calculations for the most accurate.]
[After a few minutes]
38. S2: Best floater?
39. S1: First, let's write down our decision. C floater and H pilot (for the accuracy). . .best floater. . .the furthest distance and the longest time in flight. . .distance divided by time. . .(best floater) that speed is smallest. . . (Assessment–progress toward goals, Individual Level, & Thinking WITH knowledge of speed)
[Calculating average and then speed]
47. S2: Something strange. . .strange. (Assessment–sense of result, Individual Level)
48. S1: Why is this furthest? (Assessment–sense of result: Thinking ABOUT S2's way of thinking led thinking ABOUT a way of thinking, Social Level & Individual Level)
49. S2: We are correct. But, (something's) strange. . . (Assessment–accuracy or sense of result, Individual Level)
50. S3: What's the problem? (Assessment–understanding: Thinking ABOUT S2's way of thinking, Social Level)
51. S2: We need to calculate it again? (Assessment–progress toward goal: Thinking ABOUT Ss' way of thinking, Social Level)
52. S3: No, there are differences from each result. . . (Assessment–accuracy of result: Thinking ABOUT S2's way of thinking, Social Level)
53. S2: Is it a problem? (Assessment–accuracy of result: Thinking ABOUT S3's way of thinking, Social Level)
54. S4: No. . .5.7. . .it is right. . .something strange. (Assessment–accuracy or sense of result, Social Level & Individual Level)
55. S1: Ah, this is the problem! It is 17.21. . .So, D is the best floater. . .and. . .H. . . (Assessment–accuracy of result, Social Level)
56. Other Ss: Why? Why? Why?
57. S1: Because it is the slowest speed. The best floater is the. . .slowest (Student S2 talked simultaneously). . .speed. . .um. . . (Assessment–understanding, Social Level)
58. S3: Isn't (the best floater) the fastest floater. . . (Assessment–understanding, Social Level)
59. S2: (The best floater is) Slowest and farther floater. . . (Assessment–understanding, Social Level)
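A small worked example makes the arithmetic behind S1's misconception concrete: with the time in flight held fixed, a farther flight is faster, not slower, because speed is distance divided by time, so "the furthest distance and the longest time" cannot jointly produce the slowest speed. The numbers below are hypothetical, not values from Table 2.

```python
# Speed = distance / time: for a fixed time in flight, the farther flight is faster.
time_in_flight = 2.0  # hypothetical value
for distance in (30.0, 40.0):  # hypothetical distances from start
    print(f"distance {distance} -> speed {distance / time_in_flight}")
# The slowest speed requires a short distance together with a long time in flight,
# which is why the two parts of S1's definition pull in opposite directions.
```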

Student S1 slowly developed her thinking in reaction to other students' conceptualizations (Statements 48 and 57 above; Statements 61, 72, 76, and 84 below), even though she did not thoroughly evaluate her thinking, and could not correct the misconception by the end of the activity (Statements 85 and 87 below). This is illustrated in the following excerpt:

9. S1: The furthest distance from the start and the shortest time in flight. . . (New idea: Thinking ABOUT a way of thinking, Individual Level)
10. S4: Don't we find the longest time? (Assessment–strategy appropriateness: Thinking ABOUT S1's way of thinking & Thinking ABOUT her own way of thinking, Social Level & Individual Level)
11. S1: Ah. . .the longest time in flight. . . (Assessment–strategy appropriateness, Social Level & Individual Level)
[The team spent about 10 min talking about the most accurate problem, and then spent about 13 min working on the best floater problem.]
60. S4: We need to compare the speed to distance. . .don't we? (New idea: Thinking ABOUT a way of thinking, Individual Level)
61. S1: Ah. . .compare. . .we need to calculate this again. . . (Assessment–strategy appropriateness: Thinking ABOUT S4's way of thinking, Social Level, & Thinking WITH a strategy, calculation)
[After calculating them again]
62. S3: C has the longest time but. . . (Assessment–sense of result, Individual Level)
63. S2: Speed is. . .
64. S4: We need to compare the speed to distance. . .how about that? (Assessment–progress toward goals, Individual Level)
65. S1: Right. . . (Assessment–progress toward goals: Thinking ABOUT S4's way of thinking, Social Level)
66. S2: We need to calculate distances again. (Thinking WITH a strategy, calculation of average)

67. S4: No, we already have distances on the data. (Assessment–strategy appropriateness: Thinking ABOUT S2's strategy of average, Social Level)
68. S1: Our definition of the best floater is the farthest distance and the longest time. . . (Assessment–progress toward goals, Individual Level)
69. S2: Right! (Assessment–progress toward goals: Thinking ABOUT S1's way of thinking, Social Level)
70. S1: So, that's the slowest speed. . .because distance divided by time. (Assessment–strategy appropriateness, Social Level & Individual Level)
71. S2: Yes. . .then. . .flight C. . . (Assessment–strategy appropriateness, Social Level & Individual Level)
72. S1: But, in this case, the shorter distance is, the smaller speed is. . .uh. . .what's the problem? (Assessment–sense of result, Social Level & Individual Level)
73. S3: The larger a denominator is. . . (Assessment–knowledge: Thinking ABOUT knowledge of fraction, Individual Level)
74. S2: The longest time is. . .
75. S4: The less time, the faster speed is. (Assessment–sense of result: Thinking ABOUT S1's way of thinking, Social Level, & Assessment–knowledge: Thinking ABOUT knowledge of speed, Individual Level)
76. S1: But, if the time is longest and the distance is shortest, then speed is lowest. . .but distance is shortest. . .then we need to decide an arbitrary distance for this. . .how about this? [paused for a time] Let's find speed with the arbitrary distance (a constant) after dividing all of them, but how to decide the arbitrary distance? (Assessment–knowledge: Thinking ABOUT knowledge of speed, Social Level & Individual Level, & New idea: Thinking ABOUT a way of thinking, Individual Level)
77. S4: Is it possible? (Assessment–strategy appropriateness: Thinking ABOUT S1's way of thinking, Social Level)
78. S2: We have to have a confidence for our strategy. (Assessment–strategy appropriateness: Thinking ABOUT S1's way of thinking, Social Level)
79. S1: C (flier) by F pilot.
80. S4: Let's see that. . .
81. S1: Look back at the graph. . .
82. S4: C is the accurate. . .the most accurate flier equals to the best floater. Right? (Assessment–sense of result, Individual Level)
83. S2: We can do like that. . . (Assessment–sense of result: Thinking ABOUT S4's way of thinking, Social Level)
84. S1: Let's think more. . .
[Students were writing the letter to the judges]
85. S1: The best floater is the problem. . .the highest speed is fine. . . (Assessment–sense of result, Individual Level)
86. S3: Longest time in flight. . . (Assessment–sense of result, Social Level & Individual Level)
87. S1: Ok. . .we can say that the best floater is measured by longest time and smallest speed. (Assessment–sense of result, Social Level & Individual Level)

Slight but meaningful changes in student S1's thinking during problem solving revealed how metacognition on the social level developed her thinking about speed in this context (Statement 76 above). This case illustrates how one person talking aloud about their conceptualization of a problem can provide metacognitive feedback for another person's conceptualization.

In sum, student S1’s thinking about her misconception exemplifies how metacognition on both the individual and theocial level can be used to change current ways of thinking. First, her talking aloud about her conceptualization of theroblem demonstrated how she repeatedly monitored and evaluated her thinking by herself, at the individual level, duringhe best floater problem solving. Second, her talking aloud about her conceptualization of a problem also provided feedbackor another student’s conceptualization. In this way, she served as a source for the development of others’ metacognitiont the social level. Finally, she was also encouraged to develop her own metacognition at the social level through feedbackrom others’, especially from student S4 (e.g., Statements 60 and 75). To summarize, she was able to monitor and regulateer thinking processes in the best floater problem at both the individual and social level, in ways that would likely not occururing a less complex problem.

5. Implications and future directions

This micro-lensed study of metacognition addresses the sources for developing students' metacognition. We re-conceptualized metacognition at multiple levels, looking at the sources that trigger metacognition at the individual, social, and environmental levels. By drawing on this re-conceptualization of metacognition and adopting a portion of Goos' (2002) framework, a theoretical model of metacognition in collaborative problem solving was developed based on the models and modeling perspectives (Figs. 1 and 2).

This study made three potentially important contributions to research in metacognition. First, it provided a coherent model of metacognition with the potential to clarify the distinction between cognitive (thinking WITH cognitive components) and metacognitive (thinking ABOUT cognitive or metacognitive components) behaviors, and the distinction among metacognitive functions in the domain of problem solving, both of which are challenges for research on metacognition (Garofalo & Lester, 1985; Wilson & Clarke, 2002). While Goos' (2002) framework adopted a linear progression viewpoint of metacognition, the model of metacognition developed in this study adopted a multi-dimensional progression viewpoint, representing the MMP view of how problem solving occurs.

Second, this study addressed the need for theories that can bridge the cognitive and social realms to inform how an individual's mind works in context (Schoenfeld, 1999). In particular, the re-conceptualization of metacognition on multiple levels helps resolve the paradox of metacognition: It can explain how individuals who lack self-regulation transcend their conceptual limitations when cognition goes awry by drawing on the conceptual systems of others.

Finally, this study addressed the lack of an authentic method to observe and analyze people's mental processes, which is another obstacle in the study of metacognition (e.g., Garofalo & Lester, 1985; Wilson & Clarke, 2002). In this study, we used a Model-Eliciting Activity as an authentic method for verbal protocol analysis, as well as an environmental source for triggering metacognition. This study showed how MEAs could be substituted for self-report methods in research on metacognition, in particular in collaborative problem solving, addressing several criticisms of self-report methods, including accessibility, veridicality, retrieval issues, completeness, and reactivity due to artificial settings (e.g., Ericsson & Simon, 1980; Goos & Galbraith, 1996).

This case study investigated how two subtasks of an MEA with different demands on definition building and operationalizing definitions fostered the development of students' metacognition on multiple levels. The results suggest the importance of different levels of complexity in problem-solving tasks in designing metacognitive learning environments. On one hand, problems that do not require complex negotiation to define constructs, but that provide room for multiple conceptions, give students the confidence to monitor their own problem solving. Put differently, even though the problem-solving situation is team-oriented, it may require metacognition only on the individual level. On the other hand, problems requiring students to grapple with nebulous constructs and negotiate their individual and social meaning encourage metacognitive processes on both the individual and social levels. Such problems require students to share their multiple perspectives, promoting multiple solution paths, and in turn, requiring multiple solution attempts to meet goals with team agreement. Thus, problem solving may involve several iterative cycles of revision, demonstrating that more complex metacognitive functions are being evoked.

In addition, this case study illustrated how one student's (student S1's) participation in an MEA could work as a catalyst for the development of individual and social metacognition within a team. This demonstrates that the problem of providing metacognitive feedback can be off-loaded from one individual's mind to another's, and therefore points to the importance of social sources, such as interactions with peers, for improving metacognitive learning environments. Social sources enable one to go beyond the individual's knowledge or regulation of cognition, which may support only limited metacognitive processing, thus increasing opportunities to develop metacognition. These findings enrich our understanding of how to design instruction that fosters the development of metacognition. For example, it is important to consider problem situations requiring students to define qualitative constructs and negotiate their individual and social meanings in order to create metacognitive learning environments.

A limitation of the current study is that it is a case study; thus, the findings are not generalizable. Another limitation is that a small number of metacognitive functions (i.e., assessment and new ideas) were present within the case study. Further research, such as multiple case studies spanning a range of problems, is required for investigating additional metacognitive functions. Further research is also needed to understand metacognition on multiple levels, particularly what types of environments encourage metacognition on the social level. One smaller-scale qualitative research question is, How do different types of teaching strategies and styles (based on a traditional view versus a constructivist view) encourage students' metacognition on multiple levels? Another is, How do different external sources (teachers, peers, class artifacts, and technology) encourage students' metacognition on multiple levels? Answering these questions would set the stage for larger-scale research on how different problems with different levels of complexity support the development of metacognitive abilities in problem solving. In addition, we expect that the theoretical model of metacognition on multiple levels would allow further research to explore developmental patterns of students' metacognitive activities within and across several problem-solving sessions, dimensions along which students' metacognitive abilities develop, and critical events that facilitate or interfere with students' metacognition.

6. Conclusion

At the individual level, students are limited in their ability to self-monitor and self-evaluate their problem solving. This is due, in part, to the fact that their individual conceptual systems are based on prior knowledge and experiences. By having students work together in a social setting, students in this study were able to access richer sources to potentially overcome their individual limitations through feedback and criticism from others. The nature of the problem (i.e., an MEA rather than a well-defined problem) also served as a metacognitive catalyst by requiring students to define constructs and operationalize those definitions. The results suggest that re-conceptualizing metacognition on multiple levels (the individual level, the social level, and the environmental level) is important for developing metacognitive activities and incorporating them into school curricula. Finally, we hope that our theoretical model of metacognition provides a window into how an individual's mind works in context, in particular the nature of students' metacognitive behavior in collaborative problem solving.

Acknowledgements

This material is based in part upon work supported by the National Science Foundation under Grant No. 0717529. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

References

Altman, D. G. (1991). Practical statistics for medical research. London: Chapman and Hall.
Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Addison Wesley Longman.
Artzt, A. F., & Armour-Thomas, E. (1992). Development of a cognitive-metacognitive framework for protocol analysis of mathematical problem solving in small groups. Cognition and Instruction, 9(2), 137–175.


Artzt, A. F., & Armour-Thomas, E. (1997). Mathematical problem solving in small groups: Exploring the interplay of students' metacognitive behaviors, perceptions and ability levels. Journal of Mathematical Behavior, 16(1), 63–74.
Bielaczyc, K., Pirolli, P., & Brown, A. L. (1995). Training in self-explanation and self-regulation strategies: Investigating the effects of knowledge acquisition activities on problem solving. Cognition and Instruction, 13, 221–252.
Bogdan, R., & Biklen, S. (2003). Qualitative research for education: An introduction to theory and methods (4th ed.). Needham Heights, MA: Allyn & Bacon.
Borkowski, J., Carr, M., & Pressely, M. (1987). Spontaneous strategy use: Perspectives from metacognitive theory. Intelligence, 11, 61–75.
Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How people learn: Brain, mind, experience, and school. Washington, D.C.: National Academy Press.
Brown, K. L. (2003). From teacher-centered to learner-centered curriculum: Improving learning in diverse classrooms. Education, 124(1), 49–54.
Cardelle-Elawar, M. (1995). Effects of metacognitive instruction on low achievers in mathematics problems. Teaching & Teacher Education, 11(1), 81–95.
Carr, M., & Biddlecomb, B. (1998). Metacognition in mathematics: From a constructivist perspective. In D. J. Hacker, J. Dunlosky, & A. C. Graesser (Eds.), Metacognition in educational theory and practice. Mahweh, NJ: Lawrence Erlbaum Associates.
Chamberlin, S. A., & Moon, S. M. (2005). Model-eliciting activities as a tool to develop and identify creatively gifted mathematicians. Journal of Secondary Gifted Education, 17(1), 37–47.
Creswell, J. (2006). Qualitative inquiry and research design: Choosing among five traditions (2nd ed.). Thousand Oaks, CA: Sage.
Darke, P., Shanks, G., & Broadbent, M. (1998). Successfully completing case study research: Combining rigour, relevance, and pragmatism. Information Systems Journal, 8, 273–289.
Diefes, H. A., Moore, T. J., Zawojewski, J., Imbrie, P. K., & Follman, D. (2004). A framework for posing open-ended engineering problems: Model-eliciting activities. In Proceedings of the 34th annual ASEE/IEEE frontiers in education conference, Savannah, GA, October 20–23.
Duncker, K. (1945). On problem solving. Psychological Monographs, 58, 1–113.
Efklides, A. (2006). Metacognition and affect: What can metacognitive experiences tell us about the learning process? Educational Research Review, 1(1), 3–14.
Efklides, A. (2008). Metacognition: Defining its facets and levels of functioning in relation to self-regulation and co-regulation. European Psychologist, 13(4), 277–287.
English, L. D. (2008). Introducing complex systems into the mathematics curriculum. Teaching Children Mathematics, 15(1), 38–47.
Ericsson, K. A., & Simon, H. (1980). Verbal reports as data. Psychological Review, 87(3), 215–251.
Flavell, J. H. (1976). Metacognitive aspects of problem solving. In L. Resnick (Ed.), The nature of learning (pp. 231–236). Hillsdale, NJ: Erlbaum.
Fleiss, J. L. (1981). Statistical methods for rates and proportions (2nd ed.). New York: John Wiley & Sons.
Garofalo, J., & Lester, F. K. (1985). Metacognition, cognitive monitoring, and mathematical performance. Journal for Research in Mathematics Education, 16(3), 163–176.
Goos, M. (1994). Metacognitive decision making and social interactions during paired problem solving. Mathematics Education Research Journal, 6(2), 144–165. Retrieved 30.05.11, from http://www.merga.net.au/documents/MERJ 6 2 Goos.pdf
Goos, M. (2002). Understanding metacognitive failure. Journal of Mathematical Behavior, 21(3), 283–302.
Goos, M., & Galbraith, P. (1996). Do it this way! Metacognitive strategies in collaborative mathematical problem solving. Educational Studies in Mathematics, 30(3), 229–260.
Goos, M., Galbraith, P., & Renshaw, P. (2002). Socially mediated metacognition: Creating collaborative zones of proximal development in small group problem solving. Educational Studies in Mathematics, 49(2), 193–223.
Hadamard, J. (1954). An essay on the psychology of invention in the mathematical field. New York: Dover.
Hamilton, E., Lesh, R., Lester, F., & Brilleslyper, M. (2008). Model-Eliciting Activities (MEAs) as a bridge between engineering education research and mathematics education research. Advances in Engineering Education, 1(2), 1–25.
Harel, G., & Koichu, B. (2010). An operational definition of learning. Journal of Mathematical Behavior, 29(3), 115–124.
Helms-Lorenz, M., & Jacobse, A. E. (2008). Metacognitive skills of the gifted from a cross-cultural perspective. In M. F. Shaughnessy, M. V. Veenman, & C. K. Kennedy (Eds.), Metacognition: A recent review of research, theory, and perspectives (pp. 3–43). Happauge, NY: Nova Publications.
Hurme, T.-R., Merenluoto, K., & Järvelä, S. (2009). Socially shared metacognition of pre-service primary teachers in a computer-supported mathematics course and their feelings of task difficulty: A case study. Educational Research and Evaluation, 15(5), 503–524.
Iiskala, T., Vauras, M., & Lehtinen, E. (2004). Socially-shared metacognition in peer learning? Hellenic Journal of Psychology, 1(2), 147–178. Retrieved 30.05.11, from http://www.pseve.org/journal/UPLOAD/Iiskala1b.pdf
Iiskala, T., Vauras, M., Lehtinen, E., & Salonen, P. (2011). Socially shared metacognition of dyads of pupils in collaborative mathematical problem-solving processes. Learning and Instruction, 21(3), 379–393.
Jonassen, D. H. (1997). Instructional design models for well-structured and ill-structured problem-solving learning outcomes. Educational Technology Research and Development, 45(1), 65–94.
Jost, J. T., Kruglanski, A. W., & Nelson, T. O. (1998). Social metacognition: An expansionist review. Personality and Social Psychology Review, 2(2), 137–154.
Kaplan, C. A., & Simon, H. A. (1990). In search of insight. Cognitive Psychology, 22, 374–419.
Kitchener, K. S. (1983). Cognition, metacognition, and epistemic cognition: A three-level model of cognitive processing. Human Development, 4, 222–232.
Kramarski, B., & Mevarech, Z. R. (2003). Enhancing mathematical reasoning in the classroom: The effects of cooperative learning and metacognitive training. American Educational Research Journal, 40(1), 281–310.
Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33(1), 159–174.
Lesh, R., & Doerr, H. M. (2003). Foundations of a models and modeling perspective on mathematics teaching, learning, and problem solving. In R. Lesh, & H. M. Doerr (Eds.), Beyond constructivism: Models and modeling perspectives on mathematics problem solving, learning, and teaching (pp. 3–33). Mahwah, NJ: Lawrence Erlbaum Associates.
Lesh, R., & Harel, G. (2003). Problem solving, modeling, and local conceptual development. Mathematical Thinking and Learning, 5(2 & 3), 157–190.
Lesh, R., Hoover, M., Hole, B., Kelly, A., & Post, T. (2000). Principles for developing thought-revealing activities for students and teachers. In A. Kelly, & R. Lesh (Eds.), Research design in mathematics and science education (pp. 591–646). Mahwah, NJ: Lawrence Erlbaum and Associates.
Lesh, R., & Lamon, S. (1992). Assessing authentic mathematical performance. In R. Lesh, & S. J. Lamon (Eds.), Assessment of authentic performance in school mathematics (pp. 17–62). Washington, D.C.: American Association for the Advancement of Science.
Lesh, R., Lester, F. K., & Hjalmarson, M. (2003). A models and modeling perspective on metacognitive functioning in everyday situations where problem solvers develop mathematical constructs. In R. Lesh, & H. M. Doerr (Eds.), Beyond constructivism: Models and modeling perspectives on mathematics problem solving, learning, and teaching (pp. 383–403). Mahwah, NJ: Lawrence Erlbaum Associates.
Lesh, R., & Zawojewski, J. (2007). Problem solving and modeling. In F. K. Lester Jr. (Ed.), Second handbook of research on mathematics teaching and learning (pp. 763–804). Reston, VA: National Council of Teachers of Mathematics.
Magiera, M. T., & Zawojewski, J. (2011). Characterizations of social-based and self-based contexts associated with students' awareness, evaluation, and regulation of their thinking during small-group mathematical modeling. Journal for Research in Mathematics Education, 42(5), 486–520.
Moore, T. J., & Diefes-Dux, H. A. (2004). Developing model-eliciting activities for undergraduate students based on advanced engineering context. In Proceedings of the 34th annual ASEE/IEEE frontiers in education conference, Savannah, GA, October 20–23.
Moore, T. J., Diefes-Dux, H. A., & Imbrie, P. K. (2006). Assessment of team effectiveness during a complex mathematical modeling task. In Proceedings of the 36th annual ASEE/IEEE frontiers in education conference, San Diego, CA, October 28–31.
Moore, T. J., & Hjalmarson, M. A. (2010). Developing measures of roughness: Problem solving as a method to document student thinking in engineering. International Journal of Engineering Education, 26(4), 820–830.
Muir, T., Beswick, K., & Williamson, J. (2008). I'm not very good at solving problems: An exploration of students' problem solving behaviors. Journal of Mathematical Behavior, 27(3), 228–241.


National Council of Teachers of Mathematics. (2000). Principles and standards for school mathematics. Reston, VA: The National Council of Teachers of Mathematics, Inc.
Newell, A., & Simon, H. A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice Hall.
Norman, D. A. (1981). Twelve issues for cognitive science. In D. A. Norman (Ed.), Perspectives in cognitive science (pp. 265–295). Norwood, NJ: Ablex.
Prins, F. J., Veenman, M. V. J., & Elshout, J. J. (2006). The impact of intellectual ability and metacognition on learning: New support for the threshold of problematicity theory. Learning and Instruction, 16(4), 374–387.
Rasekh, Z. E., & Ranjbary, R. (2003). Metacognitive strategy training for vocabulary learning. Electronic Journal for English as a Second Language, 7(2), 1–15.
Salomon, G., & Globerson, T. (1989). When teams do not function the way they ought to. Journal of Educational Research, 13(1), 89–99.
Schoenfeld, A. H. (1985). Mathematical problem solving. Orlando: Academic Press.
Schoenfeld, A. H. (1987). What's all the fuss about metacognition? In A. Schoenfeld (Ed.), Cognitive science and mathematics education (pp. 189–215). Hillsdale, NJ: Erlbaum.
Schoenfeld, A. H. (1999). Looking towards the 21st century: Challenges of educational theory and practice. Educational Researcher, 28(7), 4–14.
Schraw, G. (1998). Promoting general metacognitive awareness. Instructional Science, 26(1–2), 113–125.
Schraw, G., Dunkle, M. E., & Bendixen, L. D. (1995). Cognitive processes in well-defined and ill-defined problem solving. Applied Cognitive Psychology, 9, 523–538.
Shin, N., Jonassen, D. H., & McGee, S. (2003). Predictors of well-structured and ill-structured problem solving in an astronomy simulation. Journal of Research in Science Teaching, 40(1), 6–33.
Stacey, K. (1992). Mathematical problem solving in groups: Are two heads better than one? Journal of Mathematical Behavior, 11(3), 261–275.
Stahl, E., Pieschl, S., & Bromme, R. (2006). Task complexity, epistemological beliefs and metacognitive calibration: An exploratory study. Journal of Computing Research, 35(4), 319–338.
Vauras, M., Iiskala, T., Kajamies, A., Kinnunen, R., & Lehtinen, E. (2003). Shared-regulation and motivation of collaborating peers: A case analysis. Psychologia: An International Journal of Psychology in the Orient, 46, 19–37.
Volet, S., Summers, M., & Thurman, J. (2009). High-level co-regulation in collaborative learning: How does it emerge and how is it sustained? Learning and Instruction, 19(2), 128–143.
Volet, S., Vauras, M., & Salonen, P. (2009). Self- and social regulation in learning contexts: An integrative perspective. Educational Psychologist, 44(4), 215–226.
White, B. Y., & Frederickson, J. R. (1998). Inquiry, modeling and metacognition: Making science accessible to all students. Cognition and Instruction, 16(1), 3–117.
Wilkins, J. L. M., & Ma, X. (2003). Modeling change in student attitude toward and beliefs about mathematics. Journal of Educational Research, 97(1), 52–63.
Wilson, J., & Clarke, D. (2002). Monitoring mathematical metacognition. In Paper presented at the annual meeting of the American educational research association, New Orleans, LA, April 1–5.
Yin, R. K. (2004). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage Publications.