The Role of Students' Cognitive Engagement in Online Learning


American Journal of Distance Education
Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/hajd20

The Role of Students' Cognitive Engagement in Online Learning
Jennifer C. Richardson & Tim Newby
Published online: 07 Jun 2010.

To cite this article: Jennifer C. Richardson & Tim Newby (2006) The Role of Students' Cognitive Engagement in Online Learning, American Journal of Distance Education, 20:1, 23-37, DOI: 10.1207/s15389286ajde2001_3

To link to this article: http://dx.doi.org/10.1207/s15389286ajde2001_3


The Role of Students' Cognitive Engagement in Online Learning

Jennifer C. Richardson and Tim Newby
College of Education

Purdue University

This study investigated the degree to which students cognitively engage with their online courses. Cognitive engagement was defined as the integration and utilization of students' motivations and strategies in the course of their learning. Given this, the study utilized J. B. Biggs's (1987a) Study Process Questionnaire to measure motivations and strategies in general, rather than for a specific task. Statistically significant findings were observed for program focus, gender, age, and prior online experience in accordance with students' learning strategies and motivations. Specifically, the findings indicate that as students gain experience with online learning, they come to take more responsibility for their own learning. The findings have implications for how instructors facilitate online courses as well as how designers organize online courses.

It has become evident among researchers of online learning environments that it is no longer enough to compare these types of learning experiences to traditional, face-to-face classroom environments. For better or for worse, each environment holds unique qualities. Many researchers have started to focus more specifically on how learners learn in online environments in an attempt to better understand unique learner needs.

For example, recent research of online environments has focused on learning effectiveness (Shea et al. 2002; Swan 2003), the impact of social aspects of online learning environments (Picciano 2002; Richardson and Swan 2003), student satisfaction (Shea, Pickett, and Pelz 2003), and learning strategies (Brown, Myers, and Roy 2003). However, very little research has looked at how students engage with their online courses, especially in terms of learning strategies and motivations. Cognitive engagement, as a construct, allows us to go beyond the anecdotes and course grades to gain knowledge about how students "go about learning" while taking into account their unique individual experiences that shape them and their learning (Biggs 1987a).

Correspondence should be sent to Jennifer C. Richardson, College of Education, Purdue University, BRNG 3142, 100 North University Street, West Lafayette, IN 47907-2067. E-mail: [email protected]

Given that students learn differently, it would be useful for online instructors and designers to have a better understanding of how learning strategies, motivation, and prior experiences intersect within the online environment. Moreover, it would be useful in the design of instruction if they understood which of these attributes, if any, is especially desirable in terms of student achievement. The research questions addressed by this study include (1) What strategies and motivations are students utilizing (Deep, Surface, or Achieving) in their online courses? and (2) Is cognitive engagement affected by factors such as program focus (engineering or education), gender, age, prior online experience, and employment status?

Defining Cognitive Engagement

Corno and Mandinach (1983) first coined the term cognitive engagement in research that examined classroom learning from the perspective of learning, motivation, and instruction. Their viewpoint portrayed students in the classroom as actively engaging in a variety of cognitive interpretations, including interpretations of their environments and themselves. Cognitive engagement influenced the amount and kind of effort the students expended on classroom tasks and was used as a common measure of motivated behavior. To Corno and Mandinach, self-regulated learning is considered to be one form of cognitive engagement, "the highest form … [in which a student] derive[s] solutions on his or her own" (90). Ultimately, the researchers believe that students must "learn to learn" and be encouraged to "become adroit at strategy shifts across tasks, and even within certain complex tasks" (106).

Cognitive engagement has been utilized in fields varying from literacy (Guthrie 1996; Guthrie et al. 1996) to multimedia (Bangert-Drowns and Pyke 2001; Stoney and Oliver 1999) to mathematics (Henningsen and Stein 2002; Marks 2000).

A review of the research on cognitive engagement finds that studies and surveys typically look to students' cognitive abilities, cognitive and affective motivations, and experiences as a means of defining the construct. For example, studies at the National Reading Research Center have examined student engagement with literacy and how it affects achievement. Guthrie et al. (1996) defined student engagement as "the integration of motivations and strategies in literacy activities" (306). Research stemming from the National Reading Research Center cites motivation as the underlying definition for the outcome of literate learners, or learners who "generate their own literacy learning opportunities" (Guthrie 1996, 433).

Greene and Miller (1996) found that students' perceived ability and learning goals were positively correlated with meaningful cognitive engagement. Their path model indicated that achievement was positively affected by perceived ability and learning goals, but the effects of those variables were indirect, operating through meaningful cognitive engagement activities.

Stoney and Oliver's (1999) work has implications for the design of online learning environments. In their study of the effects of interactive multimedia on motivation and engagement, they determined that when a student is engaged, it can be assumed that "students' prior learning will act in concert with the instruction to determine the types of cognitive engagement they exhibit, such as attention to specific information, analyses and synthesis of information, visualization and ability to distinguish between relevant and irrelevant information" (2). Their research found that a well-designed microworld program led to increased learner cognitive engagement, as exemplified by greater levels of higher order thinking (e.g., planning/strategy, predicting/imposing meaning, and multiple perspectives).

For the purpose of this research study, cognitive engagement was defined as the integration and utilization of students' motivations and strategies in the course of their learning. As such, the study utilized John Biggs's (1987a) Study Process Questionnaire (SPQ) to measure students' motivations and strategies in an online learning environment.

The SPQ and Student Approaches to Learning

Biggs's (1987a) SPQ includes scales that measure students' approaches to learning by evaluating their learning strategies and motivations in order to create a learner profile. The scored scales place students in categories related to their motivation and strategy levels, including (1) Surface, Deep, or Achieving strategies and (2) Surface, Deep, or Achieving motivation. Biggs (1999) described two distinct groups of learners: those who learn for the sake of knowledge acquisition and those who learn to gain a passing grade or qualification. The first group typically learns by using a Deep approach and is highly engaged. They study to learn and are motivated to go beyond the basic requirements for passing. Surface learning, on the other hand, involves only as much as is necessary to get a passing grade; learners who use this approach are less cognitively engaged than their counterparts. The motive and strategy approaches created by Biggs (1987b) are described in Table 1.

Biggs's SPQ has been used in prior studies to examine students' learning strategies in traditional, higher education environments for purposes as diverse as identifying at-risk students (Peng and Bettens 2002), evaluating study skills program interventions (Tang et al. 2003), examining the relation between cultural values and learning (Matthews 2001), and conducting a longitudinal study that examined how students' study skills changed over time (Zeegers 1999). To date, however, Biggs's work has not been utilized in online learning environments, which arguably appeal to a less traditional student population and use activities and methods that differ from traditional classrooms. This descriptive research study was designed to help researchers examine the implications and significance of students' cognitive engagement, defined as learning strategies and motivations, related to their learning in online courses. As such, this study and its results may provide implications not only for designing online activities and methods but also for providing a profile of students who are more likely to succeed in this type of learning environment.


Table 1. Student Learning Motive and Strategy Approaches

SA: Surface
  Motive: Surface motive (SM) is to meet requirements minimally; a balancing act between failing and working more than is necessary.
  Strategy: Surface strategy (SS) is to limit target to bare essentials and reproduce them through rote learning.

DA: Deep
  Motive: Deep motive (DM) is intrinsic interest in what is being learned; to develop competence in particular academic subjects.
  Strategy: Deep strategy (DS) is to discover meaning by reading widely, interrelating with previous relevant knowledge, etc.

AA: Achieving
  Motive: Achieving motivation (AM) is to enhance ego and self-esteem through competition; to obtain highest grades, whether or not material is interesting.
  Strategy: Achieving strategy (AS) is to organize one's time and working space; to follow up all suggested readings, schedule time, behave as "model student."

Reprinted from Student Approaches to Learning and Studying: Study Process Questionnaire Manual, by J. B. Biggs, © 1987. Used by permission of the Australian Council for Educational Research.


Method

Participants

Participants for this research (N = 121) came from two distinct groups of students. At a large university in the midwestern United States, postbaccalaureate students (n = 68) were enrolled in online courses in an engineering-focused program. Participants from a large northeastern university were enrolled in an online master's degree program with an education focus (n = 53). Of the participants, sixty-three were female (thirty-seven in education, twenty-six in engineering), and fifty-eight were male (sixteen in education, forty-two in engineering). Other demographic data collected included age (60% were in the 25–35 age range, 26% were in the 18–24 age range), schooling (85% had a bachelor's degree, 13% had a master's degree), employment (76% worked full time), online experience (40% had no previous experience), and purpose for enrolling (24% specified their reasons as personal; 21% specified that their employer reimbursed their educational expenses; and 19% specified their purpose as monetary, e.g., they wished to earn a higher salary upon degree/course completion). A limitation of this study is that the participants were all volunteers, and therefore data relating to students who elected not to volunteer may not be represented.

Instrument and Data Collection

The instrument utilized in this study consisted of two distinct parts: (1) participant background and demographics information and (2) the SPQ, which examines students' approaches to learning and studying at the university level. The SPQ consists of forty-two items and contains six subscales (with seven items each), with a range of seven to thirty-five for each subscale. Subscales were then converted to levels for each of the learning strategies and motivations (see Table 1 for a description of each) as per the SPQ scoring manual (1 = below average, 2 = average, 3 = above average). All data were collected via an online survey system in 2003.
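To make the scoring logic concrete, the following is a minimal sketch of how a forty-two-item instrument with six seven-item subscales could be summed and binned into the three levels described above. It assumes a contiguous item ordering and placeholder cutoff values; the actual item order and norm tables come from the SPQ scoring manual and are not reproduced here.

```python
# Minimal SPQ-style scoring sketch: 42 items, six 7-item subscales,
# each subscale summed (range 7-35) and mapped to a level
# (1 = below average, 2 = average, 3 = above average).
# The cutoffs below are placeholders, NOT the published SPQ norms.

SUBSCALES = ["Surface Motive", "Surface Strategy",
             "Deep Motive", "Deep Strategy",
             "Achieving Motive", "Achieving Strategy"]

def score_spq(responses, cutoffs=(18, 26)):
    """responses: 42 item scores (1-5), assumed ordered so that items
    0-6 belong to the first subscale, 7-13 to the second, and so on.
    cutoffs: (low, high) boundaries used to bin a raw sum into a level."""
    assert len(responses) == 42
    profile = {}
    for i, name in enumerate(SUBSCALES):
        raw = sum(responses[i * 7:(i + 1) * 7])  # raw subscale score, 7-35
        level = 1 if raw < cutoffs[0] else 2 if raw <= cutoffs[1] else 3
        profile[name] = {"raw": raw, "level": level}
    return profile

# Example: answering "3" to every item gives raw = 21 on each subscale,
# i.e. level 2 under the placeholder cutoffs.
print(score_spq([3] * 42))
```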

Reliability for the SPQ has been established through multiple studies (Hattie and Watkins 1981; O'Neil and Child 1984). Reliability coefficients were reported as Surface Motive (.61), Surface Strategy (.66), Deep Motive (.65), Deep Strategy (.75), Achieving Motive (.72), and Achieving Strategy (.77).


Data Analyses and Results

Initial data analyses included a summary of demographic items, as reported in the Participants section. Independent samples t tests were conducted to examine differences in the learning strategies and motivations engaged during the online course experience based on (a) prior online experience, (b) program area focus, and (c) gender. In addition, one-way analyses of variance (ANOVAs) were conducted to examine differences in the learning strategies and motivations by age and employment status.

Of particular interest were the learning strategies and motivations that, according to their definitions, could be considered more desirable for online learners to use—for example, the Achieving Strategy (organize one's time and working space, behave as a "model student")—given the learning environment and context. In addition, preferred motives and strategies for learners in any environment would include the Deep Motive (e.g., intrinsic interest) and Deep Strategy (e.g., to discover meaning by reading widely, interrelating with previous relevant knowledge).

Independent samples t tests were used to identify significant differences in the learning strategies and motivations used by participants as they engaged in their online course based on prior experience, program area focus, and gender. Specifically, the t tests were run on the Deep Strategy and the Deep Motive, as by definition they are the motives/strategies considered to be more engaging. The independent samples t test was also run on the Achieving Strategy, as by definition it looks to self-regulatory practices.

An independent samples t test was conducted to evaluate the hypothesis that there is no significant difference between students' previous experience (no prior experience = 1, prior experience = 2) with online courses and the learning strategies and motives in which they engage during online courses. For prior experience, both Deep Strategy, t(119) = –2.01, p = .05, and Achieving Strategy, t(110.47) = –1.946, p = .05, were found to be statistically significant (see Table 2, Part A). The eta-squared index indicated that 5% of the variance for Deep Strategy, and 8% of the variance for Achieving Strategy, was accounted for by whether a student had prior online experience; both favored students with online experience.
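As an illustration of this type of analysis (not the authors' actual computation or data), the sketch below runs an independent samples t test on two hypothetical groups and derives an eta-squared effect size from the t statistic; the group values, sample sizes, and variable names are invented for the example.

```python
# Illustrative independent samples t test with an eta-squared effect size;
# the data below are made up and do not reproduce the study's results.
import numpy as np
from scipy import stats

# Hypothetical Deep Strategy levels (1-3) for two groups of learners.
no_prior_experience = np.array([2, 2, 1, 3, 2, 2, 1, 2, 3, 2])
prior_experience    = np.array([3, 2, 3, 2, 3, 3, 2, 2, 3, 3])

t, p = stats.ttest_ind(no_prior_experience, prior_experience)  # pooled-variance t test
df = len(no_prior_experience) + len(prior_experience) - 2
eta_squared = t**2 / (t**2 + df)  # conventional eta-squared estimate from t and df

print(f"t({df}) = {t:.2f}, p = {p:.3f}, eta^2 = {eta_squared:.2f}")
```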

Similarly, an independent samples t test was conducted to evaluate the hypothesis that there is no significant difference between the program focus area (education = 1, engineering = 2) and the learning strategies and motivations in which students engage during online courses (see Table 2, Part B). For program area focus, Surface Motive was significant, t(119) = 2.60, p = .05, with 7% of the variance for Surface Motive accounted for by students' program focus, favoring engineering students. The t test for Surface Strategy was significant, t(119) = 3.08, p = .05, with 8% of the variance for Surface Strategy accounted for by students' program focus, favoring engineering students. The t test for Deep Strategy was significant, t(119) = –2.80, p = .05, with 6% of the variance for Deep Strategy accounted for by students' program focus, favoring education students.

In addition, an independent samples t test was conducted to evaluate the hypothesis that there is no significant difference between gender (female = 1, male = 2) and the learning strategies and motivations in which students engage during online courses (see Table 2, Part C). The t test for Achieving Motive was significant, t(112.12) = –2.57, p = .05, with 5% of the variance accounted for by students' gender in favor of male students.

Table 2. Overall Means and Independent Samples t-Test Results for Prior Experience, Program Focus, and Gender

Part A: Prior Experience
  Strategy/Motivation   Overall (n = 121)   No Prior Experience (n = 49)   Prior Experience (n = 72)   t test
  Deep Strategy         2.17                2.00                           2.28                        t(119) = –2.01*
  Achieving Strategy    2.22                2.06                           2.33                        t(110.47) = –1.946*

Part B: Program Focus
  Strategy/Motivation   Overall (n = 121)   Education (n = 53)   Engineering (n = 68)   t test
  Surface Motive        2.27                2.09                 2.41                   t(119) = 2.60*
  Surface Strategy      2.26                2.04                 2.44                   t(119) = 3.08*
  Deep Strategy         2.17                2.38                 2.00                   t(119) = –2.80*

Part C: Gender
  Strategy/Motivation   Overall (n = 121)   Females (n = 63)   Males (n = 58)   t test
  Achieving Motive      2.57                2.43               2.72             t(112.12) = –2.57*

*p = .05.

A one-way ANOVA was conducted to evaluate the relation between each of the learning strategies and motivations and age, for a total of six ANOVAs. The independent variable, age, included five levels: (a) 18 through 24 years of age, (b) 25 through 35 years of age, (c) 36 through 45 years of age, (d) 46 through 55 years of age, and (e) 56 through 65 years of age. The dependent variable for each of the ANOVAs conducted was the learning strategy or motivation (Surface Motive, Surface Strategy, Achieving Motive, Achieving Strategy, Deep Motive, Deep Strategy) (see Table 3, Part A).

The result of the one-way ANOVA for age was found to be significant for Surface Motive, F(4, 116) = 2.47, p = .05, for all age groups, with the exception of the fifty-six through sixty-five age group, as indicated by the confidence intervals (–4.85, 7.85). The strength of the relationship between age and Surface Motive was medium, with age accounting for 8% of the variance. Further examination of the means reveals that the younger the participant, the more likely he or she was to use the Surface Motive.

The result of the one-way ANOVA for age was also found to be significant for Surface Strategy, F(4, 116) = 2.78, p = .05, for all age groups, with the exception of the fifty-six through sixty-five age group, as indicated by the confidence intervals (–4.85, 7.85). The strength of the relation between age and Surface Strategy was medium, with age accounting for 9% of the variance. An examination of the means for age indicates that the younger the participant, the more likely he or she was to use the Surface Strategy.
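For readers who want to see the mechanics of this kind of analysis, the sketch below is a generic one-way ANOVA with an eta-squared effect size computed from sums of squares; the group data, group labels, and sample sizes are hypothetical and do not reproduce the study's results.

```python
# Generic one-way ANOVA sketch with eta-squared from sums of squares.
# The data are invented for illustration and are not the study's data.
import numpy as np
from scipy import stats

# Hypothetical Surface Motive levels (1-3) for three of the five age groups.
age_groups = {
    "18-24": np.array([3, 3, 2, 3, 2, 3]),
    "25-35": np.array([2, 3, 2, 2, 2, 3]),
    "36-45": np.array([2, 1, 2, 2, 1, 2]),
}

samples = list(age_groups.values())
f_stat, p_value = stats.f_oneway(*samples)

pooled = np.concatenate(samples)
grand_mean = pooled.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in samples)
ss_total = ((pooled - grand_mean) ** 2).sum()
eta_squared = ss_between / ss_total  # proportion of variance explained by age group

print(f"F = {f_stat:.2f}, p = {p_value:.3f}, eta^2 = {eta_squared:.2f}")
```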

Table 3. Analyses of Variance (ANOVAs) for Learning Strategies and Motivations by Age and Employment Status

Part A: ANOVAs for Learning Strategies and Motivations by Age
  Surface Motive
    Between groups: Sum of Squares = 4.406, d.f. = 4, Mean Square = 1.102, F = 2.47, p = .048*
    Within groups:  Sum of Squares = 51.594, d.f. = 116, Mean Square = 0.445
    Total:          Sum of Squares = 56.00, d.f. = 120
  Surface Strategy
    Between groups: Sum of Squares = 5.735, d.f. = 4, Mean Square = 1.434, F = 2.781, p = .030*
    Within groups:  Sum of Squares = 59.802, d.f. = 116, Mean Square = 0.516
    Total:          Sum of Squares = 65.537, d.f. = 120

Part B: ANOVAs for Learning Strategies and Motivations by Employment Status
  Achieving Motive
    Between groups: Sum of Squares = 5.845, d.f. = 3, Mean Square = 1.948, F = 4.932, p = .003*
    Within groups:  Sum of Squares = 45.432, d.f. = 115, Mean Square = 0.395
    Total:          Sum of Squares = 51.277, d.f. = 118

*p = .05.


Dunnett's C tests were unable to differentiate between the age groups for (a) the ANOVA involving age and Surface Motive and (b) the ANOVA involving age and Surface Strategy; this may be caused in part by the breakdown of the age groups.

Finally, a one-way ANOVA was conducted to evaluate the relation between each of the learning strategies and motivations and employment status, for a total of six ANOVAs. The independent variable, employment status, included four levels: 1 = full time, 2 = part time, 3 = full-time student, and 4 = other. The dependent variable for each of the ANOVAs conducted was the learning strategy or motivation (see Table 3, Part B).

The result of the one-way ANOVA for employment status was significant for Achieving Motive, F(3, 115) = 4.93, p = .01, for all levels of employment status, with the exception of "other," as indicated by the confidence intervals (–0.48, 4.48). The strength of the relationship between employment status and Achieving Motive was medium, with employment status accounting for 11% of the variance. Examination of the means reveals that when employment status was categorized as "full-time student" (M = 2.81), participants were more likely to utilize the Achieving Motive.

Dunnett's C tests were unable to differentiate between the employment status categories for the ANOVA involving Achieving Motive; this may be caused in part by a potential overlap of categories where students felt that more than one response applied to them.

Discussion

This research investigated whether demographic and self-selected variables (e.g., program area) have an effect on students' use of the six learning motivations and strategies described in Table 1: Surface, Deep, and Achieving motivations and Surface, Deep, and Achieving strategies. Specifically, the variables examined included prior online experience, program area focus, gender, age, and employment status.

The analyses, which looked to establish whether students demonstrate the use of learning strategies and motivations that are more engaging (Deep Motive, Deep Strategy) and perhaps self-regulatory (Achieving Strategy) as they experience multiple online courses, revealed statistical significance for several learning strategies. As students begin and proceed to additional online courses, their strategies begin to differ, with students who have more online experience utilizing the Deep Strategy, making connections across courses and content. Similarly, students who have online experience appear to be more self-regulatory in their strategies: a statistically significant difference was found for students' prior online experience and Achieving Strategy.

The findings related to prior online experience yield some valuable information as well as validation for those involved in best practices for online learning. As a student gains additional experience with online courses, he or she tends to decrease the use of Achieving Motive while increasing the use of Achieving Strategy, exemplifying a shift in the role of students. In other words, students showed a decrease in the need to obtain the highest grade while showing an increase in organizing their time and following up on resources in accordance with an increase in online learning experience. This shift for online learners has been discussed, but to date there has been little evidence to support it. The shift implies that students are in fact learning to take more responsibility for their own learning in online learning environments, perhaps even becoming more self-directed, which would be the expectation as online instructors alter their role from the traditional "sage on the stage" to the "guide on the side" approach (see Higher Education Program and Policy Council of the American Federation of Teachers 2001). Moreover, interventions can be designed that allow students to progress in this shift at a quicker pace, such as introductory materials for approaches to online learning, which in turn may affect student retention in online courses.

The authors were also interested to know whether students who enrolled in online courses coming from different program focus areas demonstrated different learning strategies and motivations. This question was considered after they observed within their own courses what can be defined as "cultural" differences among students from different program areas. The results of this research indicate that this phenomenon may not in fact be imagined. Instead, this research demonstrated that there was a statistically significant difference between program focus areas (education vs. engineering) for three of the six learning strategies and motivations, namely, Surface Motive, Surface Strategy, and Deep Strategy. As shown in Table 2, Part B, there is a statistically significant difference between the programs in regard to Surface Strategy (e.g., limit target to bare essentials and reproduce them through rote learning) and Surface Motive (e.g., meet requirements minimally), in favor of the engineering students. There is also a statistically significant difference in regard to Deep Strategy (e.g., discover meaning by reading widely, interrelating with previous relevant knowledge), in favor of the education students. There may be several potential reasons for this, such as the type of skills taught at the undergraduate level within program areas or even the self-selection process students use when choosing a program focus area. If one assumes that students currently enrolled in courses that have one focus or the other have come from similar undergraduate programs (e.g., a student in an engineering-focused program has a background in engineering), then one may also be able to go one step further and assume that students may engage at different levels because they are asked to engage at different levels in accordance with domain or content areas and the types of activities and expected outcomes that are conventional in those areas. In either case, there is reason to believe that the varying learning strategies and motivations can be taught to students in either program focus area as well as additional program areas to help them learn at a higher level (e.g., Deep Strategy, Deep Motive) or, at the very least, in a more effective, self-regulatory manner (e.g., Achieving Strategy) for an online environment.

Although the results for gender did not yield as much in terms of statistical significance, this can be seen as an encouraging outcome. When gender was examined with the learning strategies and motivations, the results indicated that gender accounted for 5% of the variance on the Achieving Motive, in favor of male students—which may in fact be related to a larger cultural or societal issue that cannot as easily be remedied by encouraging the broadening of skill sets, as it is linked to motive. However, this finding needs to be researched further.

An additional finding relates to the implication of the observed result that the younger the age of the student, the more likely he or she was to make use of the Surface Strategy (e.g., limit target to bare essentials) and the Surface Motive (e.g., meet requirements minimally). This could most reasonably be explained by educational experience overall. For example, younger students may not have had the training or exposure to higher level strategies or motivations or may not have found their niche yet and therefore have not considered learning for the sake of learning. Or, more simply, it could be that younger students are still learning to adjust to the balancing act that can exist between school and other aspects of their lives. Again, this is a finding that can be remedied in part by an intervention, such as provision of resources for approaches to online learning and the encouragement of the broadening of skill sets (e.g., beyond Surface Strategy), while it also demonstrates to students reasons for higher level learning, thereby potentially altering the Surface Motive tendency of younger students.

Finally, employment status was found to be a significant variable for Achieving Motive, with the exception of the "other" category. In fact, according to the means, full-time students were more likely to utilize the Achieving Motive, which, as defined by Biggs (1987b), is "to obtain highest grades, whether or not [the] material is interesting" (3). This finding is somewhat unanticipated, as one would like to think that full-time students will learn with experience to balance their commitments and have chosen to be full-time students because they wish to "learn for the sake of learning," which may eventually be the case given additional time and coursework.

Ultimately, it is the mission of all programs and instructors to cognitively engage their students through their coursework, if by cognitively engaging one means that students have the ability to "learn to learn" (Corno and Mandinach 1983) and look to internal motivations that can lead to long-term literacy—as opposed to motivation related to compliance, an external motivator that may not have the same long-term effect (Guthrie 1996). Given this, and the nature of this research, it would be ideal if students were rated more highly for Deep Motive than for Achieving Motive or Surface Motive, as Deep Motive can be considered to be more cognitively engaging. This also ties into Guthrie's (1996) definition, which looks to internal motivations such as those tied to Deep Motive, rather than Surface Learning, which, as Biggs (1999) pointed out, involves only as much as is necessary to achieve a passing grade. More likely, however, a balanced combination of learning strategies and motivations is necessary for students to be cognitively engaged while maintaining a balance across courses. The findings of Greene and Miller (1996) show that students who scored high on meaningful engagement strategies were also high on shallow engagement. One possible theory for this result is that "students who are high on meaningful cognitive engagement strategies may be able to strategically employ both types of processing, whereas those students guided by performance goals were more likely to utilize shallow strategies regardless of the needs of the learning situation" (Greene and Miller 1996, 189).

How can online learning environments and their developers alter these predispositions? One intervention could include taking into account the activities, expected outcomes, and instructional practices of domain areas and working specifically with those in mind as courses are developed. More specifically, when developing online courses, instructors and designers need to take into account the characteristics of a population, such as program focus area, age of the learner, and amount of prior online experience. An additional intervention would expose students to strategies and motivations with which they may not be as familiar through the inclusion of resources that allow them to become acquainted with the skills, learning strategies, and even motivations that can make them successful in online learning environments (e.g., Peng and Bettens 2002; Tang et al. 2003). Although this may seem to be an unlikely path for many programs and instructors, if we are to have cognitively engaged students in our online programs, we need to address this issue, as the nature of the learning environment differs in accordance with the nature of the interactions, learning aids and tools, and even motivational levels necessary for completion of coursework.

Instructors and designers of online learning environments can use these findings as guidance when they look at components such as building interaction among students or between instructor and student. These findings can also facilitate better planning for the use of scaffolds or other learning aids in online courses, as they relate to specific learning strategies or motives, much as Stoney and Oliver's (1999) research found that a well-designed microworld program leads to increased learner cognitive engagement, such as greater levels of higher order thinking.

Future Research

Because one focus of this research was to consider how students engage with online courses to allow educational professionals to better design appropriate courses and online activities, future research should explore factors that go beyond the demographic and self-selecting factors explored here and examine additional factors that teachers can influence more directly, such as learning designs and scaffolding strategies. Additional analyses should also look at potential relations between learning strategies and motivations in relation to students' perceptions of community in online learning environments to see whether particular strategies and motivations are better suited to various levels of the course-related online community.

References

Bangert-Drowns, R. L., and C. Pyke. 2001. A taxonomy of student engagement with educational software: An exploration of literate thinking with electronic text. Journal of Educational Computing Research 24 (3): 213–234.

Biggs, J. B. 1987a. Student approaches to learning and studying. Melbourne: Australian Council for Educational Research.

———. 1987b. Student approaches to learning and studying: Study process questionnaire manual. Hawthorn: Australian Council for Educational Research.

———. 1999. What the student does for enhanced learning. Higher Education Research and Development 18 (1): 57–75.

Brown, G., C. Myers, and S. Roy. 2003. Formal course design and the student learning experience. Journal of Asynchronous Learning Networks 7 (3). Available online at http://aln.org/publications/jaln/v7n3/v7n3_myers.asp

Corno, L., and E. B. Mandinach. 1983. The role of cognitive engagement in classroom learning and motivation. Educational Psychologist 18 (2): 88–108.

Greene, B. A., and R. B. Miller. 1996. Influences on achievement: Goals, perceived ability, and cognitive engagement. Contemporary Educational Psychology 21 (2): 181–192.

Guthrie, J. T. 1996. Educational contexts for engagement in literacy. The Reading Teacher 49 (6): 432–445.

Guthrie, J. T., P. Van Meter, A. McCann, and A. Wigfield. 1996. Growth of literacy engagement: Changes in motivations and strategies during concept-oriented reading instruction. Reading Research Quarterly 31 (3): 306–323.

Hattie, J., and D. Watkins. 1981. Australian and Filipino investigations of the internal structure of Biggs' new Study Process Questionnaire. British Journal of Educational Psychology 51 (2): 241–244.

Henningsen, M. A., and M. K. Stein. 2002. Supporting students' high-level thinking, reasoning, and communication in mathematics. In Research, reflection, and practice, ed. J. Sowder and B. Schappelle, 46–61. Reston, VA: National Council of Teachers of Mathematics.

Higher Education Program and Policy Council of the American Federation of Teachers. 2001. Distance education: Guidelines for good practice. USDLA Journal 15 (11). Available online at http://www.usdla.org/html/journal/NOV01_Issue/article03.html

Marks, H. 2000. Student engagement in instructional activity: Patterns in the elementary, middle, and high school years. American Educational Research Journal 37 (1): 153–184.

Matthews, B. 2001. The relationship between values and learning. International Education Journal 2 (4): 223–232. Available online at http://ehlt.flinders.edu.au/education/iej/articles/v2n4/MATTHEWS/PAPER.PDF

O'Neil, M. J., and D. Child. 1984. Biggs' SPQ: A British study of its internal structure. British Journal of Educational Psychology 54: 228–234.

Peng, L. L., and R. Bettens. 2002. NUS students and Biggs' Learning Process Questionnaire. CDTL Brief 5 (7), article 0001a. Available online at http://www.cdtl.nus.edu.sg/brief/v5n7/sec2.htm

Picciano, A. 2002. Beyond student perceptions: Issues of interaction, presence, and performance in an online course. Journal of Asynchronous Learning Networks 6 (1). Available online at http://www.aln.org/publications/jaln/v6n1/v6n1_picciano.asp

Richardson, J. C., and K. Swan. 2003. Examining social presence in online courses in relation to students' perceived learning and satisfaction. Journal of Asynchronous Learning Networks 7 (1). Available online at http://www.aln.org/publications/jaln/v7n1/v7n1_richardson.asp

Shea, P. J., A. M. Pickett, and W. E. Pelz. 2003. A follow-up investigation of "teaching presence" in the SUNY Learning Network. Journal of Asynchronous Learning Networks 7 (2). Available online at http://aln.org/publications/jaln/v7n2/v7n2_shea.asp

Shea, P. J., K. Swan, E. E. Fredericksen, and A. M. Pickett. 2002. Student satisfaction and reported learning in the SUNY Learning Network. In Elements of quality online education, vol. 3, ed. J. Bourne and J. C. Moore. Olin and Babson Colleges: Sloan Center for Online Education.

Stoney, S., and R. Oliver. 1999. Can higher order thinking and cognitive engagement be enhanced with multimedia? Interactive Multimedia Electronic Journal of Computer-Enhanced Learning 2 (7). Available online at http://imej.wfu.edu/articles/1999/2/07/index.asp

Swan, K. 2003. Learning effectiveness: What the research tells us. In Elements of quality online education, practice and direction, ed. J. Bourne and J. C. Moore, 13–45. Needham, MA: Sloan Center for Online Education.

Tang, W., A. Kwok, K. Lau, and W. Lee. 2003. The "role" of an effective learner programme (ELP) on "approaches of studying" of first-year students in the Hong Kong Polytechnic University. Paper presented at the 7th Pacific Rim, First Year in Higher Education Conference, July, Brisbane, Australia.

Zeegers, P. 1999. Student learning in science: A longitudinal study using Biggs' SPQ. Paper presented at the HERDSA Annual International Conference, July, Melbourne, Australia.
