
1

Instructional Strategies to Improve Learning in Computer Games

Harold F. O'Neil and Hsin-Hui Chen, University of Southern California/CRESST

AERA v.5, Chicago, Illinois, April 10, 2007

2

What Is a Game?

• A computer game consists of four key components:
  – Settings that are real or imaginary
  – Roles or agendas for the participants
  – Rules (real life vs. imaginative)
  – Scoring, recording, monitoring, or other kinds of systematic measurement

• Motivation comes from challenge, complexity, fantasy

3

CRESST Model of Learning

[Diagram: "Learning" at the center, connected to five families of learning]

• Content Understanding
• Problem Solving
• Self-Regulation
• Communication
• Collaboration/Teamwork

4

Assessing Problem Solving Via Games

[Diagram: problem solving comprises three components]

• Content Understanding
• Domain-Dependent Problem-Solving Strategies
• Self-Regulation
  – Metacognition: Planning, Self-Monitoring
  – Motivation: Effort, Self-Efficacy

5

Research Questions

• Will games increase players’ problem solving?

• Will adding effective instructional strategies to commercial off-the-shelf games improve problem solving?

• What is the trade-off between developing a new game and selecting an existing one?

6

The Specification of What We Are Teaching Is Essential

• From the goal/objective of teaching (e.g., leadership, situational awareness, decision making, tactical problem solving):

– The instructional strategies follow

• Nature of feedback, timing of feedback, take-home packages, instructor training, homework assignments, etc.

– The type of assessment follows

• Different assessment measures, after-action reviews

7

Do Games Train? — Literature

• The research indicates that computer games are potentially useful for instructional purposes and are hypothesized to provide multiple benefits

– Promotion of motivation; improvement of knowledge and skills; facilitation of metacognition

• Limited empirical research on games has been published in journals (19 studies, 1990–2005)

– Adults, empirical (qualitative/quantitative)

– PsycINFO, Education Abs, SocSciAbs

• In 2006, DOD technical report literature added 4 additional reports

– Only one relevant empirical study on massive multiplayer games

8

Different Mental Models

Developers                    | Educators/Trainers
------------------------------|---------------------------------------------------------
Type of game platform         | Learning objectives
Players                       | Learners
Contractor                    | Type of learning¹ (e.g., collaborative problem solving)
Genre² (e.g., strategy game)  | Type of feedback³ (e.g., implicit vs. explicit)
Commercial success            | Formative evaluation

¹ Content understanding, problem solving, self-regulation, communication, team skills.
² Action, role playing, adventure, strategy games, god games, team sports, individual sports (Laird & VanLent, 2001).
³ Implicit vs. explicit: during play or after (after-action review, AAR).

9

Check Validity of Instructional Strategy

• Embedded in game

– Usually an inductive discovery approach

– Usually doesn't result in learning (Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41, 75–86.)

• What Works in Distance Learning

– Good instructional practices that can be applied to games

10

Selection of Game for Research

• Off-the-shelf games lack learning objectives and assessment of learning

• Use wrap-around instructional and assessment strategies, as there is no access to source code

11

SafeCracker

• Puzzle-solving game
  – Example of problem solving

• No special background knowledge, motor skills, or extraordinary visual-spatial ability required

• Adult-oriented
• Single-player game
• Pacing controlled by players
• Not popular

12

Common Methodology

• Participants:
  – Young adults who were experienced game players but had no experience playing SafeCracker

• Measures:
  – Knowledge mapper
  – Retention and transfer questions analogous to Mayer's
  – Trait self-regulation questionnaire

13

Knowledge Mapper

14

An Example of Scoring a Map

Concept 1 | Link        | Concept 2 | Expert 1 | Expert 2 | Expert 3
Key       | is used for | Safe      | 1        | 1        | 1
Safe      | requires    | Key       | 1        | 1        | 0
Catalog   | contains    | Clue      | 1        | 0        | 1
Safe      | contains    | Clue      | 0        | 0        | 1

Final score = total score ÷ number of experts = 8 ÷ 3 = 2.67
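To make the scoring arithmetic concrete, here is a minimal Python sketch, not the actual CRESST knowledge mapper: each learner proposition earns one point for every expert map that contains it, and the total is divided by the number of experts. The proposition tuples and expert sets below are illustrative reconstructions of the table above.

```python
# Minimal sketch (not the CRESST scorer): score a learner's knowledge map
# against several expert maps. A proposition is a (concept, link, concept)
# triple; it earns one point for each expert map containing it, and the
# total is divided by the number of experts, as in the example above.

def score_map(learner_props, expert_maps):
    """Return the expert-referenced score for a learner's propositions."""
    total = sum(
        sum(1 for expert in expert_maps if prop in expert)
        for prop in learner_props
    )
    return total / len(expert_maps)

# The four propositions from the example slide:
learner = [
    ("key", "is used for", "safe"),
    ("safe", "requires", "key"),
    ("catalog", "contains", "clue"),
    ("safe", "contains", "clue"),
]

# Hypothetical expert maps reproducing the 1/0 pattern in the table.
experts = [
    {("key", "is used for", "safe"), ("safe", "requires", "key"),
     ("catalog", "contains", "clue")},
    {("key", "is used for", "safe"), ("safe", "requires", "key")},
    {("key", "is used for", "safe"), ("catalog", "contains", "clue"),
     ("safe", "contains", "clue")},
]

print(score_map(learner, experts))  # 8 / 3 = 2.67
```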

15

Problem-Solving Strategy Measure

• Domain-specific problem-solving strategies measured by open-ended questions

• Modifications of measures used by previous researchers (Mayer, 2001; Mayer & Moreno, 1998; Moreno & Mayer, 2004)

16

Problem-Solving Strategy Measure

• Retention Question

  List how you solved the puzzles in the rooms.

• Transfer Question

  List some ways to improve the fun or challenge of the game.

17

Scoring of Retention Question and Transfer Question

• Scored as the number of predefined major idea units correctly stated by a participant, regardless of wording.
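As an illustration of this counting rule, here is a minimal Python sketch; the idea units, keyword lists, and matching approach are all hypothetical and are not the study's actual rubric.

```python
# Illustrative sketch only: one way idea-unit scoring could be approximated.
# The rubric counts predefined idea units stated in any wording; here each
# idea unit is a set of keyword alternatives, and a response gets credit for
# a unit if any alternative appears in it. The units below are invented.

IDEA_UNITS = {
    "search_rooms_for_clues": {"searched the rooms", "looked for clues"},
    "use_keys_to_open_safes": {"used the key", "found keys"},
    "apply_trial_and_error": {"trial and error", "tried combinations"},
}

def score_response(response: str) -> int:
    """Count how many predefined idea units appear in a free-text response."""
    text = response.lower()
    return sum(
        any(phrase in text for phrase in phrases)
        for phrases in IDEA_UNITS.values()
    )

answer = "I looked for clues in each room and tried combinations until a safe opened."
print(score_response(answer))  # 2 idea units credited
```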

18

Measurement of Self-Regulation

• Trait self-regulation questionnaire (O’Neil & Herl, 1998).

– planning

– self-checking

– self-efficacy

– effort

19

Studies I, II, and III

• Study I
  – Without effective instructional strategies

• Study II
  – With worked examples

• Study III
  – With navigational aids

20

Purpose of Study I

To evaluate a computer game (SafeCracker) with regard to its effectiveness for improving problem solving

21

Formative Evaluation (O'Neil's Framework, 2002)

• Measures design and tryout

• Checking the validity of instructional strategies embedded in the game against the research literature

• Feasibility review

• Revisions implemented

22

Data Analysis

             | Knowledge Map          | Retention Test          | Transfer Test
Pretest (M)  | 2%                     | 8%                      | 7%
Posttest (M) | 4%                     | 15%                     | 12%
             | t(29) = 4.32, p < .01  | t(29) = 12.66, p < .01  | t(29) = 7.05, p < .01
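The t(29) values above imply a pretest-to-posttest comparison with 30 participants. As context, here is a minimal Python sketch, assuming NumPy and SciPy are available, of how such a paired-samples t-test is computed; the scores are fabricated and the variable names are illustrative, so it does not reproduce the study's data or results.

```python
# Illustrative only: a paired-samples t-test on fabricated pre/post scores,
# the kind of analysis implied by the reported t(29) values (n = 30, df = 29).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pretest = rng.normal(loc=0.08, scale=0.03, size=30)             # simulated pretest proportions
posttest = pretest + rng.normal(loc=0.07, scale=0.02, size=30)  # simulated posttest gain

t_stat, p_value = stats.ttest_rel(posttest, pretest)
print(f"t({len(pretest) - 1}) = {t_stat:.2f}, p = {p_value:.4f}")
```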

23

Discussion/Implications

• There was an increase in problem solving, but it was small.

• Existing instructional strategies (discovery learning) in the game were not effective.

• More research is needed on games designed with effective, research-based instructional strategies

– Worked examples (Danny Shen)

– Pictorial aids (Richard Wainess)

– Just-in-Time Worked Examples (Joan Lang)

– After-Action Review

24

Discussion/Implications (cont.)

• This study provides a research environment with reliable and valid measures of problem solving:
  – knowledge maps
  – retention and transfer questions
  – trait self-regulation questionnaire

• Used in RSOE/USC game research

25

Study II

Wrap-Around Instructional Strategy (Shen & O'Neil, in press)

Will participants in the worked example group increase their problem solving in a game-based task (i.e., SafeCracker) after studying worked examples, compared to the control group?

26

Worked Examples

• Worked examples are procedures that focus attention on problem states and associated operators (i.e., solution steps), enabling students to induce generalized solutions or schemas (Sweller, 1998).

• Many researchers have investigated worked examples in classroom and computer-based instruction and have provided evidence of their effectiveness (Cooper & Sweller, 1987; Mayer & Mautone, 2002; Ward & Sweller, 1990).

• No prior research had used worked examples in a game-based environment.

27

A Sample of a Worked Example

Room 5: Constructor Office
Goal: Open the Liberty Safe
Step 1: Recognize the buttons, the lights, and the handle.

[Screenshot with labels: Buttons, Lights, Handle]

28

Results

Worked example instruction produced a significant increase in content understanding compared to the control group.

Group          | Measure     | Mean (%) | SD (%)
Control        | Pretest     | 4.97     | 2.87
               | Posttest    | 5.74     | 3.42
               | Improvement | 0.77     | 3.03
Worked example | Pretest     | 4.08     | 2.85
               | Posttest    | 6.81     | 4.89
               | Improvement | 2.73     | 3.16

29

Worked example instruction produced a significant increase on the retention problem-solving strategy question compared to the control group.

Group          | Measure     | Mean (%) | SD (%)
Control        | Pretest     | 10.52    | 4.43
               | Posttest    | 13.89    | 4.07
               | Improvement | 3.52*    | 0.52*
Worked example | Pretest     | 7.74     | 4.87
               | Posttest    | 13.69    | 6.07
               | Improvement | 5.81*    | 0.52*

* Adjusted value

30

Worked example instruction produced a significant increase on the transfer problem-solving strategy question compared to the control group.

Group          | Measure     | Mean (%) | SD (%)
Control        | Pretest     | 5.18     | 5.23
               | Posttest    | 7.32     | 5.88
               | Improvement | 2.38*    | 0.62*
Worked example | Pretest     | 7.95     | 5.58
               | Posttest    | 12.37    | 8.46
               | Improvement | 4.19*    | 0.62*

* Adjusted value

31

Results (cont.)

Alternative Problem-Solving Measure: worked-example safes opened

Group          | Measure  | Mean          | SD
Control        | Posttest | 1.03 (34.33%) | 1.08 (36.07%)
Worked example | Posttest | 2.53 (84.33%) | 0.88 (29.26%)

Inferential statistics: t(70) = 6.46, p < .01
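The t(70) statistic above implies an independent-samples comparison of the two groups (72 participants in total, 36 per group). A minimal sketch of that test, again assuming SciPy and using fabricated counts of opened safes:

```python
# Illustrative only: an independent-samples t-test on fabricated counts of
# opened safes, the analysis implied by t(70) with two groups of 36.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
control = rng.integers(0, 3, size=36)          # simulated safes opened, control group
worked_example = rng.integers(1, 4, size=36)   # simulated safes opened, worked example group

t_stat, p_value = stats.ttest_ind(worked_example, control)
df = len(control) + len(worked_example) - 2
print(f"t({df}) = {t_stat:.2f}, p = {p_value:.4f}")
```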

32

Discussion/Implications

• The worked example group improved significantly more than the control group in content understanding and problem-solving strategies. However, the improvement was small.

• This study provided evidence that worked examples can be an effective instructional method for facilitating adults' problem solving with a commercial off-the-shelf computer game.

• To obtain greater improvement, future studies could add to the worked example instruction:
  – Just-in-time presentation
  – A fading procedure
  – Self-monitoring

33

General Research Results

• Study I
  – Problem solving increased somewhat after game playing.

• Study II
  – Problem solving increased significantly more with worked examples.

• Study III
  – Navigation maps did not affect problem solving.

34

What Are Continuing R&D Issues?

• Can we leverage game technology for training?

– Embedded instructional and assessment strategies

– Wrap-around instructional and assessment strategies

35

Walk-Away Issues

• How are games currently used effectively for adults?

– Limited evaluation data (qualitative or quantitative) to answer this question

– There is little empirical work in the literature on effectiveness of games for training of adults

• Analytically, would you predict that commercial off-the-shelf games should teach?

– No

36

Walk-Away Issues (cont.)

• What support and guidance would help training game developers to do a better job?

– Alignment with What Works in Distance Learning

• Instructional strategies that could work

– Wrap-around or embedded instructional and assessment strategies

37

CRESST Web Site

http://www.cresst.org or any search engine: type CRESST

[email protected]

38

Back-Up

39

[Figure: SafeCracker Expert Map III — a knowledge map whose concepts include code, safe, clue, tool, room, trial-and-error, combination, desk, searching, key, direction, floor plan, compass, book, and catalog, connected by links such as "contains," "requires," "used for," "leads to," "part of," and "prior to."]