
Computers in Human Behavior 23 (2007) 1126–1144
www.elsevier.com/locate/comphumbeh
doi:10.1016/j.chb.2006.10.006

Promoting self-regulated learning in web-based learning environments

Susanne Narciss *, Antje Proske, Hermann Körndle

Psychology of Learning and Instruction, Institute of Psychology IV, Dresden University of Technology, 01062 Dresden, Germany

* Corresponding author. Tel.: +49 351 4633 6059; fax: +49 351 4633 7294. E-mail address: [email protected] (S. Narciss).

Available online 17 November 2006

Abstract

Self-regulated learning with the Internet or hypermedia requires not only cognitive learning strategies, but also specific and general meta-cognitive strategies. The purpose of the Study2000 project, carried out at the TU Dresden, was to develop and evaluate authoring tools that support teachers and students in web-based learning and instruction. This paper presents how the authoring tools of the Study2000 project can implement psychologically sound measures to promote (a) active and elaborated learning activities and (b) meta-cognitive activities in a web-based learning environment. Furthermore, it describes a study involving 72 university students in the use of such a web-based learning environment in a self-regulated learning setting at the university level. Results show that students spent almost 70% of their study time with texts, 11% with learning tasks and 12% with the active and elaborated learning tools, whereas the meta-cognitive aids were hardly used (<1%).
© 2006 Elsevier Ltd. All rights reserved.

Keywords: Self-regulated learning; Web-based learning environments; Learning strategies

1. Promoting self-regulated learning in web-based learning environments

The Internet is an open information system in which various sources of information, media and materials (i.e., texts, images, video sequences) can be linked together in diverse ways to form so-called hypertext or hypermedia environments. Thus, it offers new possibilities to structure, represent, adapt and integrate various learning contents and materials.



Furthermore, due to its interactivity, learners can process the material in accordance with their individual preferences and strategies at any time and from any place, provided an Internet connection is available. They may select from a large pool of information and examine only those pieces necessary to meet their learning objectives. Hence, the potential of the Internet as an instructional hypermedia system is considered to be very high.

However, universal access to multiple sources of information, as well as the non-linear structure and interactivity of open information systems, creates additional demands on learners in all phases of the learning process; the amount and structure of the learning materials can pose great difficulties for learners. Various connections between materials from different sources of information need to be established. These various sources of information have to be located, examined and evaluated. This process involves risks due to the non-linearity of web-based documents and the learner's own knowledge deficiencies with regard to content and strategy.

Among such risks are the flood of information, the incoherency of some of the documents provided, the inconsistency of technical functions, and attractive yet psychologically unsound multimedia applications (see also Mayer & Moreno, 2002; Rouet & Levonen, 1996). Thus, learners might be distracted from their learning objectives or lose their way in hyperspace; they may work on too much irrelevant information or only consume important information cursorily (e.g., Salomon & Almog, 1998). Some may not even start the learning activities, as taking even the first step onto this "mountain of information" can be perceived as an insurmountable task. Hence, even though web-based learning environments provide approaches to self-regulated learning (e.g., problem-based learning, blended learning, project-based learning), they impose numerous new demands on learners. Moreover, research on self-regulated learning reveals that learners are often not able to cope with these demands (e.g., Pressley & Ghatala, 1990; Simons & de Jong, 1992).

Efficient learning is only guaranteed if learners actively engage in processing the learning material. The most important task instructional designers and teachers have to solve is to develop strategies which encourage, prime and guide learners in actively processing web-based material (e.g., Mayer, 2003; Mayer & Moreno, 2002). In the Study2000 project, we developed generic authoring tools to support designers and teachers in mastering these important instructional tasks. These authoring tools permit the creation of web-based learning environments which provide several study tools for fostering learners' active information processing in web-based learning (Narciss & Körndle, 1998, 1999, 2001; http://studierplatz2000.tu-dresden.de).

The purposes of this paper are to (a) present psychologically sound approaches to promoting self-regulated learning with web-based learning environments, (b) illustrate how instructional designers and teachers can use the tools of the Study2000 project to integrate such approaches into web-based learning environments and (c) report the results of a study on how students use such a web-based learning environment in a self-regulated learning setting.

2. Challenges of self-regulated web-based learning

An important prerequisite to the promotion of self-regulated multimedia learning is awareness of its essential challenges. Based on a wide body of psychological research, the present analysis aims to draw attention to important factors contributing to the high demands of self-regulated learning with web-based learning environments (e.g., Boekaerts, 1997; Mayer & Moreno, 2002; Simons & de Jong, 1992).

Self-regulated learning refers to a learning situation in which learners, in addition to setting their learning objectives, plan, conduct, regulate and evaluate the learning process independently (Boekaerts, 1997, 1999; Zimmerman, 2001). Consequently, they must master a large number of demands, for example preparing learning activities which match their learning objectives, conducting and monitoring these activities, as well as searching for feedback on their learning progress (Winne, 2001; Winne & Hadwin, 1998). Coping with these demands efficiently requires not only task- or content-related cognitive strategies, but also specific meta-cognitive strategies, such as the selection and activation of appropriate learning and working strategies. Furthermore, the independent management and regulation of learning processes require the application of general meta-cognitive strategies which enable the learner to monitor and regulate the learning process (e.g., Boekaerts, 1997, 1999; Pintrich, 2004; Pressley, Borkowsky, & Schneider, 1989).

Hypermedia systems, such as the Internet, impose additional demands due to (a) the extensive amount of information available, (b) their non-linear structure and (c) technological inconsistencies and limitations. Table 1 summarizes general and media-specific requirements related to self-regulated learning with multimedia. These requirements refer to the phases of the learning process as well as to the different levels of cognitive and meta-cognitive strategies.

A further factor should also be taken into consideration. Attractive multimedia platforms containing, for example, interesting but irrelevant material can tempt learners to consume the presented information passively and superficially (coherence principle, see Mayer, 2001). Moreover, the computer as a medium itself induces trial-and-error-like rather than problem- or goal-oriented behavior. Indeed, studies on the "lost-in-hyperspace" phenomenon reveal that learners tend to "skip" between the documents of a hypermedia system without respecting their semantic or logical relations (e.g., Rouet, 1990; Rouet & Levonen, 1996; Salomon & Almog, 1998). Ignoring these relations makes learning and comprehension much more difficult and results in inefficient learning activities. Hence, in order to monitor and regulate their learning processes, learners must develop special strategies and heuristics; in other words, they have to develop a meta-strategy.

Concerning the evaluation of learning processes, it is essential to monitor learning activities and their separate phases (Pintrich, 2004) and, if necessary, to ask for (external) feedback (e.g., Aleven, Stahl, Schworm, Fischer, & Wallace, 2003). A large number of computer-based learning systems offer task or question modules for evaluating the learning process (Hoskins & van Hooff, 2005; Nadolski, Kirschner, & van Merriënboer, 2006). However, the absence of such modules, for instance in open, networked information systems, requires learners to develop strategies for specifying their learning objectives in such a way that they can be used as criteria for examining learning success.

In addition to the above-mentioned cognitive and meta-cognitive demands, self-regulated learning also imposes high demands on the regulation of motivation and attention (cf. Boekaerts, 1997). Motivational factors determine, for example, which tasks are selected and how much effort is invested in solving them. Furthermore, they are especially important if learners encounter hurdles whose successful mastering requires (additional) learning activities.

Table 1
Overview on requirements of self-regulated learning through web-based hypermedia systems (levels of cognitive regulation by phase of the learning process: Preparation, Process, Evaluation)

General, meta-cognitive requirements
Preparation – Setting goals and specifying sub-goals:
• What do I know already?
• What do I want to know?
• Which organizational constraints have to be taken into account?
Process – Monitoring and regulating learning activities:
• Is the selected material useful?
• Have I chosen an appropriate learning strategy?
• How can I overcome obstacles?
• How can I correct mistakes?
Evaluation – Evaluating learning progress:
• What do I know now?
• Have the goals been set appropriately within the organizational constraints?
• What sources of information proved to be valuable?

Specific meta-cognitive requirements
Preparation – Planning the learning process:
• What sources of information are relevant for the goals?
• Where can I find these sources?
• How long does it take to obtain material from these sources?
Process – Selecting and activating learning strategies:
• What learning strategies are relevant?
• When and how can I use these learning strategies?
• How do I cope with material and media for which I do not know a learning strategy?
Evaluation – Planning the evaluation of the learning progress:
• How can learning goals be transformed into criteria for evaluating learning outcomes?
• Are there any tests for the given domain of knowledge?
• Are there any case studies to check knowledge transfer?
• How can I obtain external feedback?

Task- and content-related cognitive requirements
Preparation – Organizing learning material and media:
• Obtaining
• Scanning
• Exploring
• Selecting
• Structuring
• Documenting
Process – Processing learning material and media:
• Reading and comprehending texts
• Studying pictures and graphics
• Listening to and looking at audio-visual media
• Exploring simulations
• Conducting and evaluating experiments
Evaluation – Retrieving and applying acquired knowledge:
• Answering questions
• Solving learning tasks
• Working on case studies
• Transferring feedback

3. Study2000 – promoting self-regulated hypermedia learning

Research in the field of self-regulated learning reveals that many learners fail to control and regulate their learning activities with hypermedia systems due to a deficit in the skills necessary to comply with the above-mentioned demands (see e.g., Azevedo, 2002; Chen & Rada, 1996; Dillon & Gabbard, 1998; Simons & de Jong, 1992). An analysis of the challenges posed by self-regulated hypermedia learning shows that various instructional interventions are necessary to promote learning with hypermedia systems. These can be developed and examined for the various phases of the learning process, taking into consideration different cognitive and meta-cognitive strategies.

Hadwin, Winne, and Nesbitt (2005) identified two broad categories of instructional interventions: (a) tools that deliver instruction and (b) tools that guide and tutor learning. According to Friedrich and Mandl (1997), the first might also be referred to as direct interventions and the second as indirect interventions. These interventions can be either embedded or non-embedded in the learning environment (Clarebout & Elen, 2006).

Embedded instructional interventions are integrated into the learning environment and thus learners are forced to consider them. Meta-cognitive guiding, such as regularly prompting monitoring strategies (e.g., Winne & Stockley, 1998; Bannert, 2004), is an example of an embedded direct intervention. Orientation and navigation support, such as giving a well-structured overview of the available documents in the form of a hierarchically organized table of contents (e.g., Dee-Lucas, 1996; Rouet & Levonen, 1996), can be considered an embedded indirect instructional intervention.

The use of non-embedded instructional interventions depends on the learner's initiative. These interventions are provided within the learning environment, but learners can decide whether or not to use them. As such, self-regulated learning with hypermedia can be promoted with non-embedded indirect interventions such as psychologically sound active learning tools, for example highlighting or note-taking facilities (van Oostendorp, 1996). Furthermore, learning tasks of varying complexity can support the monitoring and evaluation of learning progress and likewise represent a non-embedded indirect intervention (see Proske, Körndle, & Narciss, 2004).

Providing informative tutoring feedback (Narciss, 2004, 2006) within these learning tasks is a non-embedded combination of a direct and an indirect intervention. Informative tutoring feedback provides strategically useful information that guides the learner stepwise towards successful task completion, thereby assisting multiple solution attempts. On the one hand, informative tutoring feedback delivers instruction on how to solve the task successfully; on the other hand, the information provided guides and tutors the learning process. Furthermore, learners have to act in order to receive these instructions: they have to work on a task and, in case of a mistake, mindfully use the provided feedback information.

In the Study2000 project we developed, implemented and evaluated generic authoring tools to facilitate the ergonomically and psychologically sound design of web-based learning environments (see http://studierplatz2000.tu-dresden.de). These tools include:

• The s2w-compiler (Study-to-Web Compiler), which supports instructors in combining and integrating multiple learning materials and media into an integrated interface that provides direct and efficient access to all materials and media (accessible at http://studierplatz2000.tu-dresden.de/s2w).


• The EF-Editor (Exercise Format Editor), which facilitates the construction and implementation of interactive learning tasks (for a detailed description see Proske et al., 2004; http://studierplatz2000.tu-dresden.de/efb). In contrast to test exercises, interactive learning tasks are solved interactively, with the additional aid of multiple-try strategies and informative tutoring feedback if required (Narciss, 2004, 2006; Narciss & Huth, 2004). A hypothetical sketch of how such content could be organized follows below.
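Neither the s2w-compiler's content format nor the EF-Editor's internal data structures are documented in this paper. Purely as an illustration, the following TypeScript sketch shows one plausible way a Study Desk chapter could be organized; all type and field names are hypothetical assumptions, not the actual Study2000 format.

```typescript
// Hypothetical content manifest for one Study Desk chapter.
// All names below are illustrative assumptions.
interface LearningTask {
  id: string;
  prompt: string;
  maxAttempts: number; // multiple-try strategy
  hints: string[];     // informative tutoring feedback steps
}

interface Chapter {
  title: string;
  textPages: string[];     // URLs of text units
  tasks: LearningTask[];   // interactive learning tasks (EF-Editor)
  resources: {             // elaboration resources
    wwwLinks: string[];
    transparencies: string[];
    videos: string[];
    experiments: string[];
    researchPapers: string[];
  };
  subChapters: Chapter[];  // hierarchical table of contents
}

// A minimal example entry, loosely mirroring the "Classical
// conditioning" Study Desk described later in the paper.
const classicalConditioning: Chapter = {
  title: "Classical conditioning",
  textPages: ["texts/cc-01.html"],
  tasks: [{
    id: "cc-t1",
    prompt: "Define the unconditioned stimulus.",
    maxAttempts: 3,
    hints: ["Think of Pavlov's food powder."],
  }],
  resources: { wwwLinks: [], transparencies: [], videos: [],
               experiments: [], researchPapers: [] },
  subChapters: [],
};
```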

We call a web-based learning environment designed with these authoring tools a Studierplatz (in English: Study Desk), that is, a working space for learning and studying. In contrast to web-based learning components which deliver isolated content, Study Desks are designed to complement instruction in different learning contexts, such as lectures, seminars or project-based courses. They present multiple materials and information on a specific topic. Thus, students are not only able to prepare for lessons, but can also repeat and elaborate knowledge in a self-regulated manner. A Study Desk employs the following instructional interventions to promote self-regulated learning in a hypermedia environment:

• embedded indirect intervention: orientation and navigation support;
• non-embedded indirect interventions: tools for active and elaborated learning – learning tools and elaboration resources;
• non-embedded indirect interventions: tools for monitoring and evaluation – monitoring tools and interactive learning tasks;
• non-embedded combined intervention: informative tutoring feedback.

3.1. Orientation and navigation – the Study Desk-interface

A Study Desk offers access to various learning materials (e.g., texts, transparencies, experiments, exercises, WWW-links and references). To support orientation and navigation, students should be informed (a) of their current location within the learning environment and the steps they have taken to reach this location; (b) which parts of the learning environment they have already processed and in which ways they can process information; and (c) which additional information sources are available (Conklin, 1987). More specifically, the Study Desk interface offers the following information (see Fig. 1):

1. Information on the content structure is offered by a hierarchically structured table of contents. As scientific topics tend to be complex, this table of contents initially shows only the main chapters, thus offering a general overview. A mouse click on one of the entries gives a detailed view of its respective sub-chapters.

2. A running-title line above the actual working area gives information on the current (sub-)chapter.

3. Labeled user buttons in the bottom frame of the screen provide information on the availability of multiple multimedia resources (e.g., references, links to relevant web pages, videos, and learning tasks, here labeled exercises). A button is blue if the resource is available in the current chapter and grey if it is unavailable (see the sketch at the end of this section).

Fig. 1. Interface of a Study Desk – navigational support.


4. Information for monitoring learning activities is presented in the table of contents by marking already treated chapters, and is additionally provided in the form of a progress protocol which offers an overview of all running and accomplished learning activities (see also Fig. 3).

In terms of the overview in Table 1, offering this information aims at supporting the planning and monitoring of the learning process, and the efficient processing of learning materials.
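The resource-button behavior described in point 3 and the progress protocol of point 4 amount to simple availability and completion checks. The following minimal sketch illustrates this logic under an assumed data shape; the names are illustrative and do not come from the Study2000 source.

```typescript
// Assumed per-chapter data; not the actual Study Desk format.
type ResourceType = "references" | "links" | "videos" | "exercises";

interface ChapterState {
  title: string;
  available: Set<ResourceType>; // resource types present in this chapter
  completed: boolean;           // used for the progress protocol (point 4)
}

// A resource button is blue if the current chapter offers that
// resource type, grey otherwise (point 3).
function buttonColor(ch: ChapterState, r: ResourceType): "blue" | "grey" {
  return ch.available.has(r) ? "blue" : "grey";
}

// Progress protocol: chapters already treated vs. still open.
function progressReport(chapters: ChapterState[]): string {
  const done = chapters.filter((c) => c.completed).length;
  return `${done} of ${chapters.length} chapters completed`;
}
```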

3.2. Active elaborated learning – active learning and elaboration tools

From a cognitive view, active, elaborated information processing is an important condition for the efficient and successful acquisition of knowledge (see Lockhart & Craik, 1990, for a summary). However, as mentioned above, in attractive multimedia learning environments learners employ activities such as surfing, scanning and trial-and-error-like exercise completion more often than activities of deep information processing. In order to counter these superficial learning activities, a Study Desk provides tools for active learning and elaboration which enable learners to mark certain sections, make notes, and integrate material they consider to be of interest or importance into an individual dossier. These tools permit the application of widely used conventional study methods with which students are familiar.


The learning tools can be activated by clicking on the related buttons in the bottom frame of the screen. To highlight interesting or important words, sentences, paragraphs or pictures, the learner activates the marking tool, chooses a color, clicks on the first word or element of interest, and then clicks on its last word or element. The span between these words or elements is then highlighted in the selected color. When taking notes, the learner activates the note-taking tool, clicks on the word or element of the working space to which the note refers, and writes the note into the note-taking window. These notes are saved and flagged in the working space by a small yellow tag (see Fig. 2). To document and organize material, learners activate the integrator tool and save all materials they consider important (e.g., transparencies, web sites, pictures, etc.), together with their individual notes, to an individual dossier, which can play the collected material as a slide show. Furthermore, learners have constant access to a glossary containing the relevant terms of the corresponding Study Desk, their definitions and synonyms.
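As a minimal sketch of the span logic just described, the following TypeScript fragment models the marking and note-taking tools; the word-index representation and all names are assumptions for illustration only.

```typescript
// A highlight spans everything between two clicked words (inclusive).
interface Highlight { from: number; to: number; color: string }

function markSpan(firstWord: number, lastWord: number, color: string): Highlight {
  // Accept the two clicks in either order, as a forgiving UI would.
  const [from, to] = firstWord <= lastWord
    ? [firstWord, lastWord]
    : [lastWord, firstWord];
  return { from, to, color };
}

// A saved note is anchored to one word or element and flagged in the
// working space (the small yellow tag described above).
interface Note { anchorWord: number; text: string; flagged: true }

function takeNote(anchorWord: number, text: string): Note {
  return { anchorWord, text, flagged: true };
}
```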

Elaboration resources provide additional information on the subject of the corresponding Study Desk. In addition to the learning text, Study Desks provide audio-visual materials such as lecture transparencies, videos and experiment simulations. Various commented Internet resources as well as references, suggested readings and original research papers are made available, so that learners have the opportunity to process information on the particular topic from various perspectives.

Fig. 2. Tools for active, elaborated learning integrated into Study Desk.


3.3. Monitoring and evaluation – learning tasks with feedback, progress report

Monitoring and evaluating the learning progress are essential for successful self-regulated learning (Boekaerts, 1999; Pintrich, 2004). A Study Desk supports monitoring by offering access to a protocol of all learning activities (progress and task report). Thus, learners may check which chapters they have already completed, the amount of material and media still at their disposal, as well as the number of accomplished and unaccomplished learning tasks (see Fig. 3).

A core component for fostering learners' evaluation is an interactive learning task module containing multiple tasks of diverse complexity together with informative tutoring feedback and/or tutoring instructions (e.g., Nadolski et al., 2006).

The term informative tutoring feedback (ITF) refers to feedback types which provide strategically useful information for successful task completion (Narciss, 2004, 2006). ITF applies a combination of program and learner control. The strategic information is presented so as to tutor students in detecting errors, overcoming obstacles, and applying more efficient strategies for solving learning tasks. After learners fail on their first response to a learning task, the program delivers immediate feedback indicating that a mistake has been made. They then receive a prompt to use the available hint information (program control). To receive this information, learners have to take action, that is, they click on the button "hint" (learner control). After the next incorrect attempt the system (a) gives an evaluation of the overall performance and (b) marks correctly answered parts of the question green,1 while incorrectly answered question parts are labeled red. Again the learner is prompted to use the hint information. Hence, working on interactive learning tasks may enable learners to find knowledge gaps, correct mistakes and regulate the further learning process independently (Gao & Lehman, 2003). Fig. 4 illustrates an interactive learning task with ITF components created with the EF-Editor.

Fig. 3. Example of a protocol of task processing.

Fig. 4. Example of a complex learning task with informative tutoring hint.
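The ITF cycle described above can be summarized as a small state machine: knowledge-of-result feedback plus a hint prompt after the first failed attempt, and additional green/red part marking from the second failed attempt on. The following sketch mirrors only this prose description; it is not the EF-Editor's actual code, and all names are assumptions.

```typescript
// Feedback returned by the sketched ITF cycle for one attempt.
interface ItfResponse {
  correct: boolean;
  message: string;
  partColors?: ("green" | "red")[]; // per question part, from attempt 2 on
  promptHint: boolean;              // learner must click "hint" (learner control)
}

function evaluateAttempt(
  attempt: number,
  partResults: boolean[], // correctness of each question part
): ItfResponse {
  const correct = partResults.every(Boolean);
  if (correct) {
    return { correct, message: "Task solved.", promptHint: false };
  }
  if (attempt === 1) {
    // Immediate knowledge-of-result feedback plus a prompt to request a hint.
    return { correct, message: "A mistake has been made.", promptHint: true };
  }
  // From the second failed attempt on: overall evaluation plus part marking.
  const solved = partResults.filter(Boolean).length;
  return {
    correct,
    message: `${solved} of ${partResults.length} parts are correct so far.`,
    partColors: partResults.map((ok) => (ok ? "green" : "red")),
    promptHint: true,
  };
}
```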

4. How students use Study Desk tools

We conducted a study examining if and how students use the learning tools offered by a Study Desk. Its main purpose was to investigate to what extent the above-described tools succeeded in initiating task- and content-related learning activities (marking, note-taking and elaboration) and meta-cognitive activities (monitoring and evaluating the learning process and outcomes).

1 For interpretation of the references to color in Fig. 3, the reader is referred to the web version of this article.


4.1. Method

4.1.1. Participants

Ninety-one university students attending an introductory lecture on general psychology participated in the study. Most participating students were in their second year of study. As some students worked less than 5 min and others encountered technical problems with the Study Desks, a total of 72 individual log-files could be used for the data analyses (48 women, 24 men; mean age 22 years; SD = 2.4).

4.1.2. Material and procedure

Students had 5 Study Desks on different learning theories at their disposal over the period of one university semester. The contents of these Study Desks complemented the learning-theory curriculum of the introductory lecture on general psychology (see Table 2).

Successful completion of the course required students to pass a test at the end of the semester. Students were free to work on the 5 Study Desks as often and as long as desired. Furthermore, there were no restrictions regarding the learning objectives and the selected topics. Hence, students had to self-regulate their learning activities with the 5 Study Desks.

4.1.3. Measures

Each time students worked with a Study Desk they logged in with an individual user name and password, so that all learning activities of each student were program-tracked in an individually coded log-file. Frequency and time of each learning activity were automatically summarized in each log-file. The measure of total working time represents the sum of time on all learning activities. Yet, there was high variability in the amount of available material in the 5 Study Desks (see Table 2), in the selection of the Study Desks, and in the time spent on the selected Study Desks. In order to account for these differences, the measure of time on a particular learning activity was standardized using percentages. As such, the time on a particular learning tool, elaboration resource or monitoring tool is expressed as a percentage of total working time (% time-on-text; % time-on-learning tasks; % time-on-tools for active learning and elaboration; % time-on-tools for monitoring and evaluation).

All interactive learning tasks implemented in the Study Desks provide multiple-try strategies and informative tutoring feedback. Informative tutoring feedback guides the learner towards successful task completion.
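The standardization just described is a simple normalization of logged durations. As a minimal sketch, assuming a hypothetical log-entry shape, the percentage measures could be computed as follows:

```typescript
// Assumed log-entry shape; the actual log-file format is not documented.
interface LogEntry { activity: string; seconds: number }

// Percentage of total working time spent on each activity category.
function percentTimeOnActivity(log: LogEntry[]): Map<string, number> {
  const total = log.reduce((sum, e) => sum + e.seconds, 0);
  const perActivity = new Map<string, number>();
  for (const e of log) {
    perActivity.set(e.activity, (perActivity.get(e.activity) ?? 0) + e.seconds);
  }
  const percentages = new Map<string, number>();
  for (const [activity, secs] of perActivity) {
    percentages.set(activity, total > 0 ? (100 * secs) / total : 0);
  }
  return percentages;
}
```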

Table 2
Topics, number of text pages, learning tasks and elaboration resources of the 5 Study Desks; number of students working with the particular Study Desk; and mean number of logins per student

Topic of the Study Desk               Main chapters   Text pages   Learning tasks   Elaboration resources   Students   Logins M (SD)
Introduction to learning theories     3               17           21               61                      58         2.74 (2.16)
Classical conditioning                5               28           42               30                      50         2.94 (2.24)
Operant conditioning                  9               136          75               128                     47         2.70 (2.53)
Purposive behaviorism                 4               26           16               9                       30         2.23 (1.17)
Socio-cognitive theory of learning    4               15           26               27                      40         2.58 (2.17)

In order to assess the quality of students' learning activities while working with the 5 Study Desks, the percentage of learning tasks correctly solved on a first attempt was taken as a performance measure. This performance measure relates the number of correctly solved learning tasks to the number of learning tasks students had at their disposal within their selected Study Desks.
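Expressed as a formula, this performance measure is simply 100 × (tasks solved correctly on the first attempt) / (tasks available in the selected Study Desks). A minimal sketch, with an assumed record shape:

```typescript
// Assumed per-task record; field names are illustrative.
interface TaskRecord { solvedOnFirstAttempt: boolean }

// Performance: first-attempt successes relative to available tasks.
function performance(records: TaskRecord[], tasksAvailable: number): number {
  const correctFirst = records.filter((r) => r.solvedOnFirstAttempt).length;
  return tasksAvailable > 0 ? (100 * correctFirst) / tasksAvailable : 0;
}
```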

Finally, students' perceived acceptance of working with the Study Desks was measured by a shortened version of the IsoMetrics inventory (Gediga, Hamborg, & Düntsch, 1999). All students who worked at least 30 min with the Study Desks were asked to answer 20 items on a rating scale ranging from 1 (= I do not agree at all) to 6 (= I agree completely). These items addressed usability features such as ease of learning, efficiency of use, memorability and controllability, error frequency and severity, and subjective satisfaction (Cronbach's α = 0.92). Furthermore, students had the opportunity to leave open comments and feedback on usability.

4.2. Results

The number of students who worked with each Study Desk and the frequency of their use were analyzed first. As indicated in Table 2, 80% of the students worked with the introductory Study Desk, 65–69% with the Study Desks on conditioning, 55% with the Study Desk on the socio-cognitive theory of learning, and 42% used the purposive behaviorism Study Desk. The mean number of individual logins per Study Desk varied between two and three.

4.2.1. Total working time

The analysis of total working time revealed a large variability in the time students spent studying with the Study Desks (M = 194.5 min, SD = 229.9). Some students spent only a few minutes, while others worked for hours. Due to this high variability, the sample was divided into three groups by means of total working time for the analysis of students' learning activities (the assignment rule is sketched after the list):

(a) 23 students working less than 40 min (<40 group; M = 18 min, SD = 10.1);
(b) 25 students working between 40 and 180 min (40–180 group; M = 87 min, SD = 39.5);
(c) 24 students working more than 180 min (>180 group; M = 475 min, SD = 188.0).
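The grouping above amounts to a pair of thresholds at the reported cut-offs of 40 and 180 min; as a sketch (the data shape is assumed):

```typescript
// Assumed student record; only total working time is needed here.
interface Student { id: string; totalMinutes: number }

// Assign each student to one of the three working-time groups.
function assignGroup(s: Student): "<40" | "40-180" | ">180" {
  if (s.totalMinutes < 40) return "<40";
  if (s.totalMinutes <= 180) return "40-180";
  return ">180";
}
```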

4.2.2. Learning activities

An analysis addressing the amount of time the students of these three groups spent using the tools for elaborated learning provided the results presented in Table 3. Students of all groups spent most of their total working time processing learning texts (65.3–72.4%). Learning tasks were processed within 7.2–15.1% of the total working time. A Kruskal–Wallis test provided evidence that the differences in the relative time on task processing between the three groups are statistically significant (χ²(2, 72) = 8.89; p = 0.01). Students of the >180 group spent significantly more time on task processing than students of the other groups (see Table 3).

Table 3
Descriptive statistics for learning activities for three groups of students

                                   Groups defined by working time in minutes(a)
Learning activity                  <40 group        40–180 group     >180 group
                                   M       SD       M       SD       M        SD
Total working time (min)           17.8    5.6      87.3    43.4     475.6    188.0
Text processing(b)                 65.3    27.1     72.4    23.7     69.0     21.9
Learning task processing(b)        11.6    23.6     7.2     15.9     15.1**   15.6
% Time on learning tools           6.1     11.5     10.1    15.4     8.4      11.9
  Marking(b)                       2.4     7.2      6.9     14.9     4.9      9.6
  Note-taking(b)                   0.1     0.6      0.2     1.0      0.01     0.1
  Glossary(b)                      1.1     4.3      1.8     4.0      2.6*     6.1
  Integrator(b)                    2.5     8.2      1.2     4.9      0.8      2.5
% Time on elaboration resources    5.8     11.4     2.7     4.8      2.7      4.0
  WWW-links(b)                     1.1     3.3      0.2     0.6      0.4      1.4
  Transparencies(b)                1.2     3.3      1.4     3.6      1.7      3.5
  Videos(b)                        3.6     8.0      0.7     1.2      0.5      1.2
  Experiments(b)                   0       0        0.4     1.4      0.1      0.3
  Research papers(b)               0       0        0       0        0        0
% Time on monitoring tools         1.7     4.9      0.3     0.5      0.3      1.2
  Progress report(b)               0.3     0.9      0.1     0.1      0.01     0.01
  Learning task report(b)          0.1     0.2      0.04    0.1      0.03     0.06
  Material overview(b)             1.4     4.3      0.2     0.3      0.3      1.2

Note: Time spent on technical help functions and on questionnaires is not listed here.
(a) <40 = 23 students working less than 40 min with the 5 Study Desks; 40–180 = 25 students working 40–180 min; >180 = 24 students working more than 180 min.
(b) Measures represent percentages of the total working time.
* p < 0.05. ** p < 0.01.


The tools for active learning were used within 6.1–10.1% of the total working time. As shown in Table 3, the marking tool was used more often than the note-taking tool, the glossary or the integrator. It is worth noting that the >180 group spent significantly more time querying terms in the glossary than the other groups (χ²(2, 72) = 5.97; p = 0.05). The percentage of time on elaboration resources varied between 2.7% and 5.8%. As indicated in Table 3, WWW-links, transparencies and videos were used by some students. However, the experiments and original research papers were hardly used. The same is true for the monitoring tools.

4.2.3. Quality of learning activities – performance

The numbers of processed texts and learning tasks were analyzed in greater detail, as students spent the most time on these activities (see Table 4). This analysis revealed that the >180 group processed 58.4% of the texts, while the 40–180 group and the <40 group processed 35.4% and roughly 24% of the texts, respectively. These differences are statistically significant (χ²(2, 72) = 22.1; p < 0.01).

By relating the number of learning tasks students processed to the number of learning tasks they had at their disposal, it was found that the >180 group worked on 23.8% of the learning tasks, whereas the students of the other groups worked on approximately 5% of the available tasks (χ²(2, 72) = 13.20; p < 0.01). Furthermore, the >180 group solved 21.1% of the tasks correctly, whereas the students of the <40 group and the 40–180 group correctly solved only 3.2% and 4.0% of the tasks, respectively (χ²(2, 72) = 10.08; p < 0.01). Kruskal–Wallis tests reveal that these differences are statistically significant. It is worth noting that the relative ratio of correctly solved tasks is also higher for the >180 group (66.8%) than for the 40–180 (59.9%) and <40 groups (52.4%), even though one might expect that working on more tasks makes errors more likely.

Table 4
Quantity and quality of text and learning task processing for three groups of students

                                                         Groups defined by working time in minutes(a)
                                                         <40              40–180           >180
Percentage of texts processed                            23.96 (19.14)    35.43 (27.54)    58.43* (21.36)
Percentage of learning tasks processed                   4.44 (11.21)     5.52 (13.26)     23.80* (28.46)
Percentage of correctly solved learning tasks (total)    3.22 (9.18)      4.03 (10.25)     21.05* (28.06)
Percentage of correctly solved learning tasks
  (relative to the ratio of processed tasks)             52.4 (43.1)      59.9 (38.3)      66.8 (40.8)

Note: Values in parentheses represent standard deviations.
(a) <40 = 23 students working less than 40 min with the 5 Study Desks; 40–180 = 25 students working 40–180 min; >180 = 24 students working more than 180 min.
* p < 0.01.

To analyze how tool use was related to learners' performance, we conducted a correlation analysis for those subjects of the >180 group who had worked on learning tasks (n = 17; for the two other groups a correlation analysis was not reasonable, because only n = 6 students of the <40 group and n = 8 of the 40–180 group worked on learning tasks). This analysis yielded only two significant correlations: first, time on learning tasks and performance were positively correlated (r = 0.54, p < 0.05); second, time on glossary and performance were negatively correlated (r = -0.51, p < 0.05).

4.2.4. Acceptance

Perceived acceptance was assessed after students had worked at least 30 min with the Study Desks. However, some technical difficulties in delivering the acceptance questionnaire were experienced. Hence, only four students of the <40 group (M = 4.44, SD = 0.60), 14 students of the 40–180 group (M = 3.88, SD = 0.76), and 23 of the >180 group (M = 4.06, SD = 0.95) received this questionnaire and were able to answer it (n = 41). All students rated the usability of the Study Desks as rather high (M = 4.03, SD = 0.86). The open comments also reflected rather high usability of the Study Desk interface. However, the students would have liked to have additional paper versions of the texts and tasks at their disposal.

5. Discussion

This article described the implementation of instructional interventions in a web-based learning environment, known as Study Desk. The instructional interventions implemented, whether embedded, non-embedded (Clarebout & Elen, 2006), or a combination of embedded and non-embedded tools, were intended to foster self-regulated learning in web-based learning environments.

The results of our study show that there is high variability in total working time. Whereas some students studied for only a few minutes, others spent more than 7 h with the Study Desks. However, the results of the analysis of students' learning activities when studying with the 5 Study Desks indicate that students employ largely the same learning strategies in web-based learning environments as they do with printed textbooks. On average, students spent 70% of their total working time processing texts. They spent at most 15% of total working time processing learning tasks and at most 13% using the active learning and elaboration tools. Only a very small number of students used the monitoring tools. Furthermore, only a few students used the experiments, whereas the possibility of downloading original research papers was not used at all.

Within the time spent with the 5 Study Desks, students who worked more than 3 h processed, on average, 58% of the available texts and 24% of the available learning tasks. Furthermore, they succeeded in solving 21% of the learning tasks correctly. Moreover, the more time students worked on learning tasks, the higher their performance. Students working less than 3 h processed a significantly lower percentage of texts and tasks and achieved significantly poorer scores. Despite the variability in total working time, performance, and time spent on different learning activities, all students rated the usability of the Study Desks as rather high.

There might be several reasons for the minimal use of the active learning, elaboration and monitoring tools. First, the given instructional context (i.e., an introductory lecture with optional, complementary web-based learning) may be associated in particular with text processing. However, for university students one would have wished that they had used the tools providing original experiments and research papers in order to elaborate the material presented in the main lecture. Furthermore, the large amount of material in the 5 Study Desks would have required the application of meta-cognitive activities such as planning, monitoring and evaluating their learning activities. Unfortunately, the present data do not explain why these tools remained virtually unused. Thus, future research should address this issue.

Second, the participants of the study were in their second year of study. Consequently, the majority of their prior lectures were at an introductory level. Hence, the students may not have had the opportunity to develop strategies for active elaborated knowledge construction and self-regulation. Furthermore, inexperienced learners might not be able to translate their usual learning behavior into efficient web-based learning activities (see also Seufert, Jänen, & Brünken, 2006). In order to protect such students from being overloaded by the demands of efficient self-regulated web-based learning, they could be persuaded to use the provided tools by direct interventions (Kester & Paas, 2005). While such direct interventions are useful for inexperienced learners, they can hinder the learning process of those with greater experience. Thus, future research should, on the one hand, focus on comparing students' usual study strategies and learning activities with those in web-based learning. On the other hand, it should answer the question of how the direct interventions of a web-based learning environment can be adapted to the level of learners' expertise.

Third, the quality of the available texts, the quality of the Study Desk interface, and the technical quality of the specific tools may have contributed to the present results. (a) The texts presented in the Study Desks were developed respecting guidelines for writing comprehensible web-based texts. Each web-based text unit was thus concise, coherent and easy to understand. Perhaps some students therefore considered it unnecessary to highlight or to make notes. (b) As described above, the Study Desk interface indicates which chapters and learning tasks have been processed by marking them. The specific monitoring of text and task processing is thus possible without using the progress or learning task reports. Thus, the minimal use of the monitoring tools does not reveal whether students engaged in specific meta-cognitive activities or not. It does, however, indicate that students did not engage in general meta-cognitive activities (e.g., planning, monitoring and evaluating learning activities within the available time). General meta-cognitive activities can only be applied if one checks the overall amount of available material, the amount of successfully treated material, as well as the material remaining to be processed. (c) There were several technical limitations in applying the tools which provide access to audio-visual material and media (e.g., videos, experiments, and WWW-links). The efficient usage of these tools required fast and powerful Internet access, which not all students had. Whereas these technical limitations might explain why only a few students used these elaboration tools, they do not explain why the monitoring tools and the tool for downloading original research papers remained virtually unused.

An important issue related to the disappointing result that students left the non-embedded intervention tools unused is the question of how to prepare users of web-based learning environments for self-regulated learning with Internet resources. This issue is not limited to non-embedded intervention tools, as it can be argued that even embedded direct interventions can be ignored or processed superficially by students (cf. Aleven, McLaren, & Koedinger, 2006). Prior research on self-regulated learning with computer-aided instruction shows that direct and indirect instructional interventions should be combined in order to foster self-regulated learning both with and without modern information technologies (e.g., Simons & de Jong, 1992; Winne & Stockley, 1998). Furthermore, investigations should be carried out to determine (a) which tools are appropriate for attaining specific learning goals and (b) the conditions under which learners consult these tools (see Clarebout & Elen, 2006; see also Munneke, Andriessen, Kanselaar, & Kirschner, 2006).

Another issue for future research and practice is the question of how individual variables may determine the way students learn with web-based learning environments. Studies on cognitive and personality variables reveal that academic ability and attitude (e.g., Hiltz, 1993), multimedia comprehension skill (Maki & Maki, 2002) and openness to experience (Maki & Maki, 2003) are significantly related to performance when learning with web-based learning environments. Furthermore, a study by Joo, Bong, and Choi (2000) provided insight into the role of self-efficacy in web-based learning performance. However, to date there has been little research into how individual differences in learning strategies and styles, learners' goals and motivational orientations, and learners' meta-cognitive skills contribute to differences in studying in web-based learning environments. Certainly, these issues deserve further investigation.

Moreover, in the present study, the Study Desks fostered the coverage of general meta-cognitive requirements (e.g., monitoring and regulating learning activities, evaluating learning progress; see Table 1) as well as task- and content-related requirements (e.g., processing learning material and media, retrieving and applying acquired knowledge; see Table 1). Tools that foster specific meta-cognitive requirements, such as planning the learning process or selecting and activating learning strategies, were not included. Thus, future studies should investigate whether the integration of planning and goal-setting tools, with which students can determine their own learning activities independently, enhances the use of active learning, elaboration and monitoring tools.

References

Aleven, V., McLaren, B. M., & Koedinger, K. R. (2006). Toward computer-based tutoring of help-seeking skills. In S. A. Karabenick (Ed.), Help seeking in academic settings: Goals, groups, and contexts (pp. 259–296). Mahwah, NJ: Lawrence Erlbaum Associates.
Aleven, V., Stahl, E., Schworm, S., Fischer, F., & Wallace, R. (2003). Help seeking and help design in interactive learning environments. Review of Educational Research, 73, 277–320.
Azevedo, R. (2002). Beyond intelligent tutoring systems: computers as metacognitive tools to enhance learning? Instructional Science, 30, 31–45.
Bannert, M. (2004). Designing metacognitive support for hypermedia learning. In H. M. Niegemann, D. Leutner, & R. Brünken (Eds.), Instructional design for multimedia learning (pp. 19–30). Münster: Waxmann.
Boekaerts, M. (1997). Self-regulated learning: a new concept embraced by researchers, policy makers, educators, teachers and students. Learning and Instruction, 7, 161–186.
Boekaerts, M. (1999). Self-regulated learning: where we are today. International Journal of Educational Research, 31, 445–475.
Chen, C., & Rada, R. (1996). Interacting with hypertext: a meta-analysis of experimental studies. Human–Computer Interaction, 11, 125–156.
Clarebout, G., & Elen, J. (2006). Tool use in computer-based learning environments: towards a research framework. Computers in Human Behavior, 22, 389–411.
Conklin, J. (1987). Hypertext: an introduction and survey. IEEE Computer, 20(9), 17–41.
Dee-Lucas, D. (1996). Effects of overview structure on study strategies and text representations for instructional hypertext. In F. Rouet, J. J. Levonen, A. Dillon, & R. J. Spiro (Eds.), Hypertext & cognition (pp. 73–107). Mahwah, NJ: Lawrence Erlbaum Associates.
Dillon, A., & Gabbard, R. (1998). Hypermedia as an educational technology: a review of the quantitative research literature on learner comprehension, control, and style. Review of Educational Research, 68, 322–349.
Friedrich, H. F., & Mandl, H. (1997). Analyse und Förderung selbstgesteuerten Lernens [Analyzing and promoting self-regulated learning]. In F. E. Weinert & H. Mandl (Eds.), Enzyklopädie der Psychologie: Themenbereich D Praxisgebiete, Serie 1 Pädagogische Psychologie, Band 4 Psychologie der Erwachsenenbildung (pp. 237–293). Göttingen: Hogrefe.
Gao, T., & Lehman, J. D. (2003). The effects of different levels of interaction on the achievement and motivational perceptions of college students in a web-based learning environment. Journal of Interactive Learning Research, 14, 367–386.
Gediga, G., Hamborg, K.-C., & Düntsch, I. (1999). The IsoMetrics usability inventory: an operationalisation of ISO 9241-10. Behaviour and Information Technology, 18, 151–164.
Hadwin, A. L., Winne, P. H., & Nesbitt, J. C. (2005). Roles for software technologies in advancing research and theory in educational psychology. British Journal of Educational Psychology, 75, 1–24.
Hiltz, S. R. (1993). Correlates of learning in a virtual classroom. International Journal of Man–Machine Studies, 39, 71–98.
Hoskins, S. L., & van Hooff, J. C. (2005). Motivation and ability: which students use online learning and what influence does it have on their achievement? British Journal of Educational Technology, 36, 177–192.
Joo, Y. J., Bong, M., & Choi, H. J. (2000). Self-efficacy for self-regulated learning, academic self-efficacy and Internet self-efficacy in web-based instruction. Educational Technology Research & Development, 48(2), 5–17.
Kester, L., & Paas, F. (2005). Instructional interventions to enhance collaboration in powerful learning environments. Computers in Human Behavior, 21, 689–696.
Lockhart, R. S., & Craik, F. I. (1990). Levels of processing: a retrospective commentary on a framework for memory research. Canadian Journal of Psychology, 44, 87–112.
Maki, W. S., & Maki, R. H. (2002). Multimedia comprehension skill predicts differential outcomes of web-based and lecture courses. Journal of Experimental Psychology: Applied, 8(2), 85–98.
Maki, R. H., & Maki, W. S. (2003). Prediction of learning and satisfaction in web-based and lecture courses. Journal of Educational Computing Research, 28, 197–219.
Mayer, R. E. (2001). Multimedia learning. New York: Cambridge University Press.
Mayer, R. E. (2003). The promise of multimedia learning: using the same instructional design methods across different media. Learning and Instruction, 13, 125–139.
Mayer, R. E., & Moreno, R. (2002). Aids to computer-based multimedia learning. Learning and Instruction, 12, 107–119.
Munneke, L., Andriessen, J., Kanselaar, G., & Kirschner, P. (2006). Supporting interactive argumentation: influence of representational tools on discussing a wicked problem. Computers in Human Behavior, doi:10.1016/j.chb.2006.10.003.
Nadolski, R. J., Kirschner, P. A., & van Merriënboer, J. J. G. (2006). Process support in learning tasks for acquiring complex cognitive skills in the domain of law. Learning and Instruction, 16, 266–278.
Narciss, S. (2004). The impact of informative tutoring feedback and self-efficacy on motivation and achievement in concept learning. Experimental Psychology, 51, 214–228.
Narciss, S. (2006). Informatives tutorielles Feedback. Entwicklungs- und Evaluationsprinzipien auf der Basis instruktionspsychologischer Erkenntnisse [Informative tutoring feedback]. Münster: Waxmann.
Narciss, S., & Huth, K. (2004). How to design informative tutoring feedback for multimedia learning. In H. M. Niegemann, D. Leutner, & R. Brünken (Eds.), Instructional design for multimedia learning (pp. 181–195). Münster: Waxmann.
Narciss, S., & Körndle, H. (1998). Problems and perspectives for the development of multimedia tools for teaching and learning in the Internet. European Psychologist, 3, 219–226.
Narciss, S., & Körndle, H. (1999). Studierplatz 2000. Vernetzte Informationssysteme in der universitären Lehre – Einsatzmöglichkeiten, Grenzen und Perspektiven [Study 2000. Networked information systems in university instruction – potentials, limitations and prospects]. Medienpsychologie, 11, 38–55.
Narciss, S., & Körndle, H. (2001). Förderung des lustvollen selbstständigen Lernens mit dem Internet [Fostering self-regulated learning with the Internet]. Wissenschaftliche Zeitschrift der TU Dresden, 50, 74–79.
Pintrich, P. R. (2004). A conceptual framework for assessing motivation and self-regulated learning in college students. Educational Psychology Review, 16, 385–407.
Pressley, M., Borkowsky, J. G., & Schneider, W. (1989). Good information processing: what it is and what education can do to promote it. International Journal of Educational Research, 13, 857–867.
Pressley, M., & Ghatala, E. S. (1990). Self-regulated learning: monitoring learning from text. Educational Psychologist, 25(1), 19–33.
Proske, A., Körndle, H., & Narciss, S. (2004). The Exercise Format Editor: a multimedia tool for the design of multiple learning tasks. In H. Niegemann, D. Leutner, & R. Brünken (Eds.), Instructional design for multimedia learning (pp. 149–164). Münster: Waxmann.
Rouet, F. (1990). Interactive text processing by inexperienced (hyper-)readers. In A. Rizk, N. Streitz, & J. André (Eds.), Hypertexts: Concepts, systems, and applications (pp. 250–260). Cambridge, England: Cambridge University Press.
Rouet, F., & Levonen, J. J. (1996). Studying and learning with hypertext: empirical studies and their implications. In F. Rouet, J. J. Levonen, A. Dillon, & R. J. Spiro (Eds.), Hypertext and cognition (pp. 9–23). Mahwah, NJ: Lawrence Erlbaum Associates.
Salomon, G., & Almog, T. (1998). Educational psychology and technology: a matter of reciprocal relations. Teachers College Record, 100, 222–241.
Seufert, T., Jänen, I., & Brünken, R. (2006). The impact of intrinsic cognitive load on the effectiveness of graphical help for coherence formation. Computers in Human Behavior, doi:10.1016/j.chb.2006.10.002.
Simons, P. R.-J., & de Jong, F. P. (1992). Self-regulation and computer-aided instruction. Applied Psychology: An International Review, 41, 333–346.
van Oostendorp, H. (1996). Studying and annotating electronic text. In F. Rouet, J. J. Levonen, A. Dillon, & R. J. Spiro (Eds.), Hypertext and cognition (pp. 137–147). Mahwah, NJ: Lawrence Erlbaum Associates.
Winne, P. H. (2001). Self-regulated learning viewed from models of information processing. In B. J. Zimmerman & D. H. Schunk (Eds.), Self-regulated learning and academic achievement: Theoretical perspectives (2nd ed., pp. 153–189). Mahwah, NJ: Lawrence Erlbaum Associates.
Winne, P. H., & Hadwin, A. (1998). Studying as self-regulated learning. In D. J. Hacker, J. Dunlosky, & A. Graesser (Eds.), Metacognition in educational theory and practice (pp. 277–304). Hillsdale, NJ: Lawrence Erlbaum Associates.
Winne, P. H., & Stockley, D. B. (1998). Computing technologies as sites for developing self-regulated learning. In D. H. Schunk & B. J. Zimmerman (Eds.), Self-regulated learning: From teaching to self-reflective practice (pp. 106–136). New York: The Guilford Press.
Zimmerman, B. (2001). Theories of self-regulated learning and academic achievement: an overview and analysis. In B. J. Zimmerman & D. H. Schunk (Eds.), Self-regulated learning and academic achievement: Theoretical perspectives (2nd ed., pp. 1–37). Mahwah, NJ: Lawrence Erlbaum Associates.