Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002)

Dr. Curtis J. Bonk, President, CourseShare.com; Associate Professor, Indiana University (http://php.indiana.edu/~cjbonk, [email protected])
Dr. Vanessa Paz Dennen, Assistant Professor, San Diego State University ([email protected])

TRANSCRIPT

Page 1:

Session P16 Evaluating Online Learning: Frameworks and Perspectives
(Workshop: Sunday Feb 17th, Training 2002)

Dr. Curtis J. Bonk
President, CourseShare.com
Associate Professor, Indiana University
http://php.indiana.edu/~cjbonk, [email protected]

Dr. Vanessa Paz Dennen
Assistant Professor, San Diego State University
[email protected]
http://edweb.sdsu.edu/people/vdennen

Page 2:

Workshop Overview

• Part I: The State of Online Learning
• Part II: Evaluation Purposes, Approaches, and Frameworks
• Part III: Applying Kirkpatrick’s 4 Levels
• Part IV: ROI and Online Learning
• Part V: Collecting Evaluation Data & Online Evaluation Tools

(Time: 8:30-11:30; 12:30-3:30)

Page 3:

Part I. The State of Online Learning

Survey of Corporate Settings
• What’s Going On?
• And How Are We Evaluating It?

Page 4:

Free Corporate Reports

1. Corporate E-Learning: Exploring a New Frontier, Hambrecht and Co. (2000, March) http://www.wrhambrecht.com/research/coverage/elearning/ir/ir_explore.pdf (95 pages)

2. Training Magazine Special Issue, September 2000, 37(9), The State of Online Learning

3. Fortune Special Issue, 142(13), Nov. 27, 2000, Special Insert: E-learning strategies for executive education and corporate training. http://www.fortuneelearning.com/topics/

Page 5:

Survey of 201 Trainers, Instructors, Managers, Instructional Designers, CEOs, CLOs, etc.

Page 6:

Among the Key Goals

1. To identify the resources, tools, and activities desired in e-learning.
2. To document gaps between tools and resources deemed useful and actual use.
3. To survey commitment to e-learning.
4. To document practices related to e-learning training and support.
5. To document pedagogical practices and motivational techniques supported in e-learning.

Page 7:

Survey Limitations

• Sample pool—e-PostDirect
• The Web is changing rapidly
• Lengthy survey, low response rate
• No password or keycode
• Many backgrounds—hard to generalize
• Does not address all issues (e.g., ROI calculations, how trained & supported, specific assessments)

Page 8:

Figure 1. Respondents' Organizational Roles

[Pie chart: User and Decision-Maker 57%; Decision-Maker 20%; User or Facilitator 17%; Neither a User nor Decision-Maker 6%]

Page 9:

Figure 2. Size of Respondent Organizations

[Bar chart; x-axis = Number of Employees (1 to 30, 31-100, 101 to 500, 501 to 1,000, 1,001 to 5,000, 5,001 to 10,000, 10,001 to 100,000, More than 100,000); y-axis = Percent of Respondents]

Page 10:

Page 11:

Figure 4. Focus of Respondent Organizations

[Bar chart; y-axis = Percent of Respondents]

Page 12:

Primary Job Function

84% = Training (e.g., trainers, training managers, training directors, or training evaluators)
– 30% Instructors or Trainers
– 27% Training Managers
– 20% Training Evaluators
– 14% Training Directors
45% = Instructional Designers & Program Devel.
5% = Human Resources; 5% Performance Managers; and 4% CLOs

Page 13:

Categorized Job Titles

26% Trainers, Educators, or Instructors
20% Managers (e.g., Training, IT Programs, Instructional Designers, or Quality Assurance)
19% Directors (Director of Corp Education, E-Learning, Professional Development, etc.)
13% Instructional Designers or Technologists
13% High Ranking Administrators (CEO, President, CLO, CTO)
9% Consultants

Page 14:

Professional Reading Interests

• 80% read magazines or journals related to e-learning.
• Nearly 100% read training-related publications.

Page 15:

Figure 12. Methods Used to Deliver Training in Organization

[Bar chart; bars = Instructor-Led Classroom, Internet/Intranet, Multimedia, Videotape, Paper-Based Correspondence, Other; x-axis = Percent of Respondents]

Page 16:

Figure 14. Interest in Web Learning by Industry Type

[Bar chart by industry (Consulting, Fin Services/Insurance, Info Tech, Health Services, Education, Industrial, Government); y-axis = Percent of Respondents; series = Agree/Strongly Agree, Unsure, Disagree/Strongly Disagree]

Page 17:

Figure 15. Commitment to Web Learning by Industry Type

[Bar chart by industry (Fin Services/Insur, Education, Info Tech, Consulting, Industrial, Government, Health Services); y-axis = Percent of Respondents; series = Agree/Strongly Agree, Unsure, Disagree/Strongly Disagree]

Page 18:

Figure 17. Reasons Interested in Web-Based Learning

[Bar chart; y-axis = Percent of Respondents]

Page 19:

Why Interested in E-Learning?

• Mainly cost savings
• Reduced travel time
• Greater flexibility in delivery
• Timeliness of training
• Better allocation of resources, speed of delivery, convenience, course customization, lifelong learning options, personal growth, greater distribution of materials

Page 20:

Why Interested in E-Learning?

“Exploit the technology to deliver our intellectual capital.”

“Reduce time to learn, reduce time to productivity.”

“Cost reduction (write once, publish on different platforms).”

“Invest less in expensive trips to train for 3 days without apparent results.”

Page 21:

Figure 19. Purpose of Web-Based Learning in Organization

[Bar chart; bars = Sole source of learning, Supplement traditional, Follow-up to traditional, Alternative to traditional, Other; y-axis = Percent of Respondents]

Page 22:

Blended Approach Is Most Common
(Ganzel, May 2001, Online Learning Magazine)

Use Blended Approaches (Live plus Online):
[Pie chart: Yes 67%; No 33%]

Page 23:

Corporate Web Integration Continuum

Level 1: Blended course—self-paced
Level 2: Entire course online—self-paced
Level 3: Tutored or mentored course
Level 4: Blended course—instructor led
Level 5: Entire course online—synchronous
Level 6: Entire course online—asynchronous
Level 7: Entire course online—synchronous and asynchronous
Level 8: Certificate program online
Level 9: Degree online
Level 10: Corporate university online

Page 24:

Figure 20. Types of Training Respondent Organizations Offer Online

[Bar chart; bars = Computer Apps/Software, Technical, Job Related, Communication Skills, Systems/Programming, Management, Personal, Customer Service, Sales/Marketing, Exec Education; y-axis = Percent of Respondents]

Page 25:

Figure 22. Aspects of Web-Based Training Developed In-House

[Bar chart; bars = Content, Delivery System, Training Implementation, Evaluation; y-axis = Percent of Respondents]

Page 26:

Current Courseware System: Negatives

“Slow development time.”
“Not interactive.”
“Low interactivity, boring.”
“…lack of bookmarking, tracking, evaluation, etc.”
“Don’t support the instructional design process—are course management systems.”
“XYZ,…, presents obstacles in moving course content from one server to another.”

Page 27:

Current Courseware System: Negative and Positive

“…does provide a number of excellent features, yet development time is very clumsy…it is not very intuitive.”

“XYZ is powerful and intuitive. It is not always reliable.”

“Fairly reliable, but not always. At times have had to stop training and go back to the beginning to start again as it seizes up.”

“From a cost posture, they are, quite simply, unbeatable. Limitations: Can’t save whiteboard presentations developed in virtual classroom.”

Page 28:

Current Courseware System: Positives

“It is comprehensive, scalable, and intuitive.”
“…seems to be flexible.”
“XYZ is simple to use & clean in design.”
“modify to suit individual course needs.”
“It’s reasonably inexpensive, there is a Web-based template to design customized courses…easily added to existing courseware.”

Page 29:

Delivery System

• 17% developed own systems or tools
• 15% did not know what system they were using
• 30% used Internet application tools (e.g., Designer’s Edge, Dreamweaver, Authorware)
• 35% used presentation tools (e.g., Astound, WebEx)
• Many used existing courseware systems and tools (e.g., WebBoard, Learning Space)

Page 30:

What Vendors Select & Why?

• Standardization vs. Innovation

Standard Tool Advantages: training easier, jump-started, common framework, fixed costs

Disadvantages: tools do not fit all needs, need technical training, lose control

Page 31:

Web-Based Content

Capella, Click 2 Learn, Colleges/Universities, Digital Think, Docent, Inc., Eduprise, Element K, eMind.com, eSocrates, ExecuTrain, Freeskills.com, Headlight.com, Jones International University, KnowledgeNet, Knowledge Planet, Mentergy (includes LearnLinc products), Microsoft Training and Service, Netg, Prime Learning, Saba, Smart Force, ThinQ (i.e., Trainingnet), TrainSeek, Vcampus, Viviance New Education, Walden Univ./Institute

Page 32:

Figure 24. Aspects of Web-Based Training Outsourced

[Bar chart; bars = Content, Delivery System, Training Implementation, Evaluation; y-axis = Percent of Respondents]

Page 33:

Figure 25. Percent of Respondent Organizations Conducting Formal Evaluations of Web-Based Learning

[Pie chart: No 59%; Yes 41%]

Page 34:

Why Evaluate?

• Cost-savings
– Becoming a less important reason to evaluate as more people recognize that the initial expense is balanced by long-term financial benefits
• Performance improvement
– A clear place to see the impact of online learning
• Competency advancement

Page 35:

Pause: How are costs calculated in online programs?

Page 36:

The Cost of E-learning

• Brandon-hall.com estimates that an LMS (learning management system) for 8,000 learners costs $550,000

• This price doesn’t include the cost of buying or developing content

• Bottom line: getting started in e-learning isn’t cheap
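
A rough per-learner check on those figures: $550,000 / 8,000 learners ≈ $69 per learner for the platform alone, before any content or staffing costs.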

Page 37:

Evaluation Process

• Can be likened to the ADDIE instructional design model:
– ANALYSIS is needed to determine the purpose of the evaluation
– A DESIGN is needed to guide the process
– Instruments must be DEVELOPED
– Without IMPLEMENTATION you have no data
– In the end, the data are analyzed and EVALUATED

Page 38:

A Few Assessment Comments

Page 39:

Level 1 Comments: Reactions

“We assess our courses based on participation levels and online surveys after course completion. All of our courses are asynchronous.”

“I conduct a post course survey of course material, delivery methods and mode, and instructor effectiveness. I look for suggestions and modify each course based on the results of the survey.”

“We use the Halo Survey process of asking them when the course is concluding.”

Page 40:

Level 2 Comments: Learning

“We use online testing and simulation frequently for testing student knowledge.”

“Do multiple choice exams after each section of the course.”

“We use online exams and use level 2 evaluation forms.”

Page 41:

Level 3 Comment: Job Performance

“I feel strongly there is a need to measure the success of any training in terms of the implementation of the new behaviors on the job. Having said that, I find there is very limited by our clients in spending the dollars required…”

Page 42:

More Assessment Comments: Multiple-Level Evaluation

“Using Level One Evaluations for each session followed by a summary evaluation. Thirty days post-training, conversations occur with learners’ managers to assess Level 2” (actually Level 3).

“We do Level 1 measurements to gauge student reactions to online training using an online evaluation form. We do Level 2 measurements to determine whether or not learning has occurred…”

“Currently, we are using online teaching and following up with manager assessments that the instructional material is being put to use on the job.”

Page 43:

Who is Evaluating Online Learning?

• 59% of respondents said they did not have a formal evaluation program

• At Reaction level: 79%

• At Learning level: 61%

• At Behavior/Job Performance level: 47%

• At Results or Return on Investment: 30%

Page 44:

Figure 26. How Respondent Organizations Measure Success of Web-Based Learning

[Bar chart; x-axis = Kirkpatrick's Evaluation Level (Learner satisfaction, Change in knowledge/skill/attitude, Job performance, ROI); y-axis = Percent of Respondents]

Page 45:

Assessment Lacking or Too Early

“We are just beginning to use Web-based technology for education of both associates and customers, and do not have the metric to measure our success. However, we are putting together a focus group to determine what to measure (and) how.”

“We have no online evaluation for students at this time.”

“We lack useful tools in this area.”

Page 46:

Limitations with Current System

“I feel strongly there is a need to measure the success of any training in terms of the implementation of the new behaviors on the job. Having said that, I find there is very limited by our clients in spending the dollars required…”

“We are looking for better ways to track learner progress, learner satisfaction, and retention of material.”

“Have had fairly poor ratings on reliability, customer support, and interactivity…”

Page 47:

Pause…How and What Do You Evaluate…?

Page 48:

What else did the corporate training survey show?

Page 49:

Figure 27. Organizational Ownership of Online Courses and Materials

[Bar chart; bars = Clear Guidelines, Property of Organization; y-axis = Percent of Respondents; series = Agree/Totally Agree, Unsure, Disagree/Strongly Disagree]

Page 50:

Figure 28. Organizational Interest in Knowledge Objects

[Pie chart: Agree 44%; Strongly Agree 25%; Unsure 17%; Disagree 11%; Strongly Disagree 3%]

Page 51:

Figure 29. Percent of Organizations Valuing Online Certificates and Degrees as Much as Those From Traditional Programs

[Bar chart; bars = Value Online Certificates, Value Online Degrees; y-axis = Percent of Respondents; series = Agree/Totally Agree, Unsure, Disagree/Strongly Disagree]

Page 52:

Figure 31. Course Tools with Growth Potential

[Bar chart; bars = Course Evaluations, Courseware, Quizzes/Tests, File Up/Download, Cases or Problems, Databases; x-axis = Percent of Respondents Indicating High Usefulness for a Particular Tool or Resource But Not Currently Using It]

Page 53:

Figure 42. Percent of Instructional Time Spent Training via the Web in the Next Decade

[Stacked bar chart; x-axis = 1 Year, 2 Years, 5 Years, 10 Years; segments = 0%, 1-25%, 26-50%, 51-75%, 76-100% of instructional time]

Page 54:

Figure 44. Freelance or Adjunct Instructor Web-Based Training

[Stacked bar chart; x-axis = Past Experience, Future Interest; segments = Yes, No]

Page 55:

Page 56:

Figure 45. Cultural and Organizational Reasons Limiting the Adoption of Web-Based Learning

[Bar chart; bars = Perceived High Cost, Instructor Prep Time, Cultural Resistance, Lack of Org Support, Difficult to Measure ROI, Lack Web Training, Lack of Interest, Lack Time to Learn, Instructor Delivery Time, Learner Time; x-axis = Percent of Respondents]

Page 57:

Sample Reasons for Obstacles

• “Skepticism on the benefits within the Healthcare environment.”

• “Ignorance about the advantages of using the Internet to save money.”

• “Generation gap and bias against anything not face to face.”

• “Poor support from IT managers to support organizational goals.”

• “Lack of foresight in the industry/no ability to see the big pic!”

Page 58:

Figure 46. Technological Reasons Limiting the Adoption of Web-Based Learning

[Bar chart; bars = Bandwidth, Tech Support, Firewalls, Hardware, Lack of Standards, Classroom Resources, Lack Interactivity, Software; x-axis = Percent of Respondents]

Page 59:

Just Why is Bandwidth So Darn Important???

Page 60:

Page 61:

Obstacles: Technology Comments

“Lack of hardware to efficiently use Web-based technology.”
“Systems infrastructure.”
“Huge diversity in hardware.”
“Reliable Web access of our training audiences.”
“Caught up in the tech not the instruction!”

Page 62:

Page 63:

Page 64:

Obstacles: Problems in Delivery Methods

“Students needs hands on.”
“High rate of change in IT materials—never mature.”
“Effectiveness of this method.”
“Some courses are better delivered in traditional classrooms.”

Page 65:

Figure 47. Types of Training Provided to Personnel for Designing and Developing Web-Based Courses

[Bar chart; bars = Topical Conferences, Topical Workshops, Expert Access, Outside Consultants or Company, Vendor Supported, Web Courses; x-axis = Percent of Respondents]

Page 66:

Figure 48. Percent of Organizations Where Design and Development Training Leads to Certification

[Pie chart: Yes 22%; No 63%; Don't Know 15%]

Page 67:

Figure 49. Location Where Learners Access Web-Based Training

[Bar chart; bars = Office, Home, Road, Other; x-axis = Percent of Respondents]

Page 68:

Page 69:

Figure 50. Support Resources Provided for E-Learners

[Bar chart; bars = E-mail support, Desktops, Online tutorials, Online help, Laptops, Computer labs, 24 hour phone support, None; x-axis = Percent of Respondents]

Page 70:

Page 71:

Figure 52. Number of Languages in Which Respondent Organizations Currently Offer Web-Based Courses

[Bar chart; x-axis = Number of Languages (1, 2, 3, 4 to 6, 7 to 10, 10+, NA); y-axis = Percent of Respondents]

Page 72:

Figure 53. Learner Completion Rate in Web-Based Courses

[Bar chart; x-axis = Learner Completion Rate (0-25%, 26-50%, 50-59%, 60-69%, 70-79%, 80-89%, 90-99%, 99-100%); y-axis = Percent of Respondents]

Page 73:

Figure 54. Reasons Learners Fail to Complete Web-Based Courses

[Bar chart; bars = Time, Lack of incentives, Poorly designed instruction, Costs; x-axis = Percent of Respondents]

Page 74:

Page 75:

Figure 55. Incentives for Successful Completion of Web-Based Learning

[Bar chart; bars = None, Inc Job Responsibility, Public Recognition, Awarding Credits to Degree, Inc Job Security, Salary, Promotion; x-axis = Percent of Respondents]

Page 76:

Issues Raised in Survey

• Increases in Web instruction anticipated

• Better tools needed

• Perceived high cost

• Need clearer vision & management support

• Lots of money being spent

• Low course completion rates

• Limited organizational support

Page 77:

So, any questions about the state of things?

Page 78:

What do we need???

Part II: Evaluation Purposes, Approaches, and Frameworks

Page 79:

One Area in Need of Frameworks is Evaluation of Online Learning

Page 80:

What is Evaluation???

“Simply put, an evaluation is concerned with judging the worth of a program and is essentially conducted to aid in the making of decisions by stakeholders.” (e.g., does it work as effectively as the standard instructional approach).

(Champagne & Wisher, in press)

Page 81:

But who are the evaluators?

The level of evaluation will depend on articulation of the stakeholders. Stakeholders of evaluation in corporate settings may range from…???

Page 82:

What is assessment?

• Assessment refers to “…efforts to obtain info about how and what students are learning in order to improve…teaching efforts and/or to demo to others the degree to which students have accomplished the learning goals for a course.” (Millar, 2001, p. 11).

• It is a way of using info obtained through various types of measurement to determine a learner’s performance or skill on some task or situation (Rosenkrans, 2000).

Page 83:

Why Evaluate?

Page 84:

Evaluation Purposes

• Assessing learner progress
– What did they learn?
• Assessing learning impact
– How well do learners use what they learned?
– How much do learners use what they learn?

Page 85:

Evaluation Purposes

• Efficiency
– Was online learning more effective than another medium?
– Was online learning more cost-effective than another medium/what was the return on investment (ROI)?
• Improvement
– How do we do this better?

Page 86:

Evaluation Purposes

• An evaluation plan can evaluate the delivery of e-learning, identify ways to improve the online delivery of it, and justify the investment in the online training package, program, or initiative (Champagne & Wisher, in press).

Page 87:

Evaluation Purposes

• Evaluation can help quantify the return on investment allowing one to compare the costs of acquiring, developing, and implementing e-learning to actual savings, revenue impact, and other competitive advantages that are translatable into monetary values.

Page 88:

Contextual Factors

• Learner progress, impact of training, and efficiency may all be affected by other contextual factors
• Contextual factors unique to online learning:
– Technology breakdowns
– Inadequate computer systems (learners can’t access multimedia components—and don’t know that they’re missing anything)

Page 89:

Evaluation Plans

Does your company have a training evaluation plan?

Page 90:

Formal Evaluation Programs

• Most training evaluation data are not used for evaluation or performance improvement purposes.

• Why? There is no plan for using the data and no one has the time.

• Why does it matter in online learning? Need to be sure that the development expense is justified.

Page 91:

Steps to Developing an OL Evaluation Program

• Select a purpose and framework
• Develop benchmarks
• Develop online survey instruments
– For learner reactions
– For learner post-training performance
– For manager post-training reactions
• Develop a data analysis and management plan

Page 92:

What Are Your Evaluation Questions?

• What does your employer want to know about online learning’s impact?

• How interested is your employer in evaluation results?

Page 93:

Formative Evaluation

• Formative evaluations focus on improving the online learning experience.

• A formative focus will try to find out what worked or did not work.

• Formative evaluation is particularly useful for examining instructional design and instructor performance.

Page 94:

Formative Questions

• How can we improve our OL program?
• How can we make our OL program more efficient?
• More effective?
• More accessible?

Page 95:

Summative Evaluation

• Summative evaluations focus on the overall success of the OL experience (should it be continued?).

• A summative focus will look at whether or not objectives are met, the training is cost-effective, etc.

Page 96:

What Can OL Evaluation Measure?

Categories of Evaluation Info (Woodley and Kirkwood, 1986):
• Measures of activity
• Measures of efficiency
• Measures of outcomes
• Measures of program aims
• Measures of policy
• Measures of organizations

Page 97:

Typical Evaluation Frameworks for OL

• Commonly used frameworks include:
– CIPP Model
– Objectives-oriented
– Marshall & Shriver’s 5 levels
– Kirkpatrick’s 4 levels (plus a 5th level)
– AEIOU
– Consumer-oriented

Page 98:

CIPP Model Evaluation

• CIPP is a management-oriented model
– C = context
– I = input
– P = process
– P = product
• Examines the OL within its larger system/context

Page 99:

CIPP & OL: Context

• Context: Addresses the environment in which OL takes place.

• How does the real environment compare to the ideal?

• Uncovers systemic problems that may dampen OL success.

Page 100:

CIPP & OL: Input

• Input: Examines what resources are put into OL.

• Is the content right?

• Have we used the right combination of media?

• Uncovers instructional design issues.

Page 101:

CIPP & OL: Process

• Process: Examines how well the implementation works.

• Did the course run smoothly?

• Were there technology problems?

• Was the facilitation and participation as planned?

• Uncovers implementation issues.

Page 102:

CIPP & OL: Product

• Product: Addresses outcomes of the learning.

• Did the learners learn? How do you know?

• Does the online training have an effect on workflow or productivity?

• Uncovers systemic problems.

Page 103:

Objectives-Oriented Evaluation

• Examines OL training objectives as compared to training results
• Helps determine if objectives are being met
• Helps determine if objectives, as formally stated, are appropriate
• Objectives can be used as a comparative benchmark between online and other training methods

Page 104:

Evaluating Objectives & OL

• An objectives-oriented approach can examine two levels of objectives:
– Instructional objectives for learners (did the learners learn?)
– Systemic objectives for training (did the training solve the problem?)

Page 105:

Objectives & OL

• Requires:
– A clear sense of what the objectives are (always a good idea anyway)
– The ability to measure whether or not objectives are met
• Some objectives may be implicit and hard to state
• Some objectives are not easy to measure

Page 106:

Marshall & Shriver's 5 Levels of Evaluation

• Performance-based evaluation framework

• Each level examines a different area of performance

• Requires demonstration of learning

Page 107:

Marshall & Shriver's 5 Levels

• Level I: Self (instructor)
• Level II: Course Materials
• Level III: Course Curriculum
• Level IV: Course Modules
• Level V: Learning Transfer

Page 108:

Kirkpatrick’s 4 Levels

• A common training framework.

• Examines training on 4 levels.

• Not all 4 levels have to be included in a given evaluation.

Page 109:

The 4 Levels

• Reaction

• Learning

• Behavior

• Results

Page 110:

A 5th Level

• Return on Investment is a 5th level

• It is related to results, but is more clearly stated as a financial calculation

• How to calculate ROI is the big issue here
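
For reference, the standard training-ROI arithmetic (net benefits divided by program costs) is simple once the inputs are agreed upon; the hard part this slide alludes to is deciding what counts as a benefit. A minimal sketch in Python, with purely hypothetical dollar figures:

    # Standard training-ROI formula: net benefits / costs x 100.
    # All figures below are hypothetical placeholders.
    program_costs = 100_000      # development + delivery + evaluation
    monetary_benefits = 150_000  # savings/revenue attributed to the training

    roi_percent = (monetary_benefits - program_costs) / program_costs * 100
    print(f"ROI = {roi_percent:.0f}%")  # -> ROI = 50%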

Page 111:

Is ROI the answer?

• Elise Olding of CLK Strategies suggests that we shift from looking at ROI to looking at time to competency.

• ROI may be easier to calculate since concrete dollars are involved, but time to competency may be more meaningful in terms of actual impact.

Page 112:

Example: Call Center Training

• Traditional call center training can take 3 months to complete

• Call center employees typically quit within one year

• When OL was implemented, the time to train (time to competency) was reduced

• Benchmarks for success: time per call; number of transfers

Page 113:

Example: Circuit City

• Circuit City provided online product/sales training

• What is more useful to know:– The overall ROI or break-even point?

– How much employees liked the training?

– How many employees completed the training?

– That employees who completed 80% of the training saw an average increase of 10% in sales?

Page 114:

A 6th Level? Clark Aldrich (2002)

• Adds a Level 6, which relates to the budget and stability of the e-learning team:
– Just how respected and successful is the e-learning team?
– Have they won approval from senior management for their initiatives?

Aldrich, C. (2002). Measuring success: In a post-Maslow/Kirkpatrick world, which metrics matter? Online Learning, 6(2), 30 & 32.

Page 115:

And Even a 7th Level? Clark Aldrich (2002)

• Level 7 asks whether the e-learning sponsor(s) or champion(s) are promoted in the organization.
• While both of these additional levels address the people involved in the e-learning initiative or plan, such recognition will likely hinge on the results of evaluation at the other five levels.

Page 116:

ROI Alternative: Cost/Benefit Analysis (CBA)

• ROI may be ill-advised since not all impacts hit the bottom line, and those that do take time.
• Shifts the attention to longer-term results, quantifying impacts with numeric values, such as:
– increased revenue streams,
– increased employee retention, or
– a reduction in calls to a support center.

• Reddy, A. (2002, January). E-learning ROI calculations: Is a cost/benefit analysis a better approach? e-learning. 3(1), 30-32.

Page 117:

Cost/Benefit Analysis (CBA)

• CBA attends to both qualitative and quantitative measures:
– job satisfaction ratings,
– new uses of technology,
– reduction in processing errors,
– quicker reactions to customer requests,
– reduction in customer call rerouting,
– increased customer satisfaction,
– enhanced employee perceptions of training,
– global post-test availability.

• Reddy, A. (2002, January). E-learning ROI calculations: Is a cost/benefit analysis a better approach? e-learning. 3(1), 30-32.

Page 118:

Cost/Benefit Analysis (CBA)

• In effect, CBA asks how the sum of the benefits compares to the sum of the costs.
• Yet it often leads to or supports ROI and other more quantitatively-oriented calculations.

• Reddy, A. (2002, January). E-learning ROI calculations: Is a cost/benefit analysis a better approach? e-learning. 3(1), 30-32.
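
To make that comparison concrete, a benefit/cost ratio can be computed directly; a ratio above 1.0 means benefits exceed costs. A minimal sketch, again with hypothetical figures echoing the benefit categories above:

    # Benefit/cost ratio: sum of benefits / sum of costs.
    # All figures are hypothetical placeholders.
    benefits = [40_000, 25_000, 10_000]  # e.g., revenue, retention, fewer support calls
    costs = [30_000, 20_000]             # e.g., acquisition, implementation

    bcr = sum(benefits) / sum(costs)
    print(f"Benefit/cost ratio = {bcr:.2f}")  # -> 1.50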

Page 119:

Other ROI Alternatives

• Time to competency (need benchmarks)
– Online databases of frequently asked questions can help employees in call centers learn skills more quickly and without requiring temporary leaves from their position for such training
• Time to market
– Might be measured by how e-learning speeds up the training of sales and technical support personnel, thereby expediting the delivery of a software product to the market

Raths, D. (2001, May). Measure of success. Online Learning, 5(5), 20-22, & 24.

Page 120:

Still Other ROI Alternatives

• Return on Expectation
1. Ask employees a series of questions related to how training met expectations of their job performance.
2. When questioning is complete, they place a dollar figure on that.
3. Correlate or compare such reaction data with business results, or supplement Level 1 data to include more pertinent info about the applicability of learning to the employee's present job situation.

Raths, D. (2001, May). Measure of success. Online Learning, 5(5), 20-22, & 24.

Page 121:

AEIOU

• Provides a framework for looking at different aspects of an online learning program
• Fortune & Keith, 1992; Sweeney, 1995; Sorensen, 1996

Page 122:

A = Accountability

• Did the training do what it set out to do?
• Data can be collected through:
– Administrative records
– Counts of training programs (# of attendees, # of offerings)
– Interviews or surveys of training staff

Page 123:

E = Effectiveness

• Is everyone satisfied?
– Learners
– Instructors
– Managers
• Were the learning objectives met?

Page 124:

I = Impact

• Did the training make a difference?

• Like Kirkpatrick’s level 4 (Results)

Page 125:

O = Organizational Context

• Did the organization’s structures and policies support or hinder the training?

• Does the training meet the organization’s needs?

• OC evaluation can help identify a mismatch between the training design and the organization

• Important when using third-party training or content

Page 126:

U = Unintended Consequences

• Unintended consequences are often overlooked in training evaluation

• May give you an opportunity to brag about something wonderful that happened

• Typically discovered via qualitative data (anecdotes, interviews, open-ended survey responses)

Page 127:

Consumer-Oriented Evaluation

• Uses a consumer point-of-view
– Can be a part of vendor selection process
– Can be a learner-satisfaction issue
• Relies on benchmarks for comparison of different products or different learning media

Page 128:

What About Evaluation Issues in Higher Education???

Page 129:

My Evaluation Plan… Considerations in an Evaluation Plan

1. Student
2. Instructor
3. Training
4. Task
5. Tech Tool
6. Course
7. Program
8. University or Organization

Page 130:

What to Evaluate?

1. Student—attitudes, learning, jobs.
2. Instructor—popularity, survival.
3. Training—effectiveness, integratedness.
4. Task—relevance, interactivity, collab.
5. Tool—usable, learner-centered, friendly, supportive.
6. Course—interactivity, completion.
7. Program—growth, model(s), time to build.
8. University—cost-benefit, policies, vision.

Page 131:

1. Measures of Student Success
(Focus groups, interviews, observations, surveys, exams, records)

• Positive Feedback, Recommendations
• Increased Comprehension, Achievement
• High Retention in Program
• Completion Rates or Course Attrition
• Jobs Obtained, Internships
• Enrollment Trends for Next Semester

Page 132:

1. Student Basic Quantitative

• Grades, Achievement
• Number of Posts
• Participated
• Computer Log Activity—peak usage, messages/day, time on task or in system (see the sketch below)
• Attitude Surveys
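
Several of these counts can be pulled straight from discussion-forum logs. A minimal sketch, assuming a hypothetical log of (student, post date) records; the format is illustrative, not tied to any particular courseware system:

    # Posts per student and average messages/day from a simple post log.
    # The (student id, date) log format here is a hypothetical placeholder.
    from collections import Counter
    from datetime import date

    posts = [("s1", date(2002, 2, 4)), ("s1", date(2002, 2, 5)),
             ("s2", date(2002, 2, 5))]

    posts_per_student = Counter(student for student, _ in posts)
    active_days = {d for _, d in posts}
    print(posts_per_student)              # Counter({'s1': 2, 's2': 1})
    print(len(posts) / len(active_days))  # messages per active day -> 1.5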

Page 133:

1. Student High-End Success

• Message complexity, depth, interactivity, questioning
• Collaboration skills
• Problem finding/solving and critical thinking
• Challenging and debating others
• Case-based reasoning, critical thinking measures
• Portfolios, performances, PBL activities

Page 134:

Focus of Assessment?

1. Basic Knowledge, Concepts, Ideas

2. Higher-Order Thinking Skills, Problem Solving, Communication, Teamwork

3. Both of Above!!!

4. Other…

Page 135:

Assessments Possible

• Online Portfolios of Work

• Discussion/Forum Participation

• Online Mentoring

• Weekly Reflections

• Tasks Attempted or Completed, Usage, etc.

Page 136:

More Possible Assessments

• Quizzes and Tests

• Peer Feedback and Responsiveness

• Cases and Problems

• Group Work

• Web Resource Explorations & Evaluations

Page 137:

Increasing Cheating Online
($7-$30/page, http://www.syllabus.com/, January 2002, Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?)

• http://www.academictermpapers.com/
• http://www.termpapers-on-file.com/
• http://www.nocheaters.com/
• http://www.cheathouse.com/uk/index.html
• http://www.realpapers.com/
• http://www.pinkmonkey.com/ (“you’ll never buy Cliffnotes again”)

Page 138:

Page 139:

Page 140:

Page 141:

Reducing Cheating Online

• Ask yourself, why are they cheating?
• Do they value the assignment?
• Are tasks relevant and challenging?
• What happens to the task after it is submitted—reused, woven in, posted?
• Due at end of term? Real audience?
• Look at pedagogy before calling the plagiarism police!

Page 142:

Reducing Cheating Online

• Proctored exams
• Vary items in exam
• Make course too hard to cheat
• Try Plagiarism.com ($300)
• Use mastery learning for some tasks
• Random selection of items from item pool
• Use test passwords, rely on IP# screening
• Assign collaborative tasks

Page 143:

Reducing Cheating Online
($7-$30/page, http://www.syllabus.com/, January 2002, Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?)

• http://www.plagiarism.org/ (resource)
• http://www.turnitin.com/ (software, $100, free 30-day demo/trial)
• http://www.canexus.com/ (software; essay verification engine, $19.95)
• http://www.plagiserve.com/ (free database of 70,000 student term papers & cliff notes)
• http://www.academicintegrity.org/ (assoc.)
• http://sja.ucdavis.edu/avoid.htm (guide)

Page 144:

Page 145:

Turnitin Testimonials

"Many of my students believe that if they do not submit their essays, I will not discover their plagiarism. I will often type a paragraph or two of their work in myself if I suspect plagiarism. Every time, there was a "hit." Many students were successful plagiarists in high school. A service like this is needed to teach them that such practices are no longer acceptable and certainly not ethical!”

Page 146: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Part III: Applying Kirkpatrick’s 4 Levels to Online Learning Evaluation & Evaluation Design

Page 147: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Why Use the 4 Levels?

• They are familiar and understood

• Highly referenced in the training literature

• Can be applied to two delivery media for comparative results

Page 148: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Conducting 4-Level Evaluation

• You need not use every level
  – Choose the level that is most appropriate to your need and budget
• Higher levels will be more costly and difficult to evaluate
• Higher levels will yield more useful information

Page 149: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Kirkpatrick Level 1: Reaction

• Typically involves “Smile sheets” or end-of-training evaluation forms.

• Easy to collect, but not always very useful.

• Reaction-level data on online courses have been found to correlate with the ability to apply learning to the job.

• The survey ideally should be Web-based, keeping the medium the same as the course.

Page 150: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Kirkpatrick Level 1: Reaction

• Types of questions:
  – Enjoyable?
  – Easy to use?
  – How was the instructor?
  – How was the technology?
  – Was the pace too fast or too slow?

Page 151: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Kirkpatrick Level 2: Learning

• Typically involves testing learners immediately following the training

• Not difficult to do, but online testing has its own challenges
  – Did the learner take the test on his/her own?

Page 152: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Kirkpatrick Level 2: Learning

• Higher-order thinking skills (problem solving, analysis, synthesis)

• Basic skills (articulate ideas in writing)

• Company perspectives and values (teamwork, commitment to quality, etc.)

• Personal development

Page 153: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Kirkpatrick Level 2: Learning

• Might include:
  – Essay tests.
  – Problem-solving exercises.
  – Interviews.
  – Written or verbal tests to assess cognitive skills.

Shepard, C. (1999b, July). Evaluating online learning. TACTIX from Fastrak Consulting. Retrieved February 10, 2002, from: http://fastrak-consulting.co.uk/tactix/Features/evaluate/eval01.htm.

Page 154: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Kirkpatrick Level 3: Behavior

• More difficult to evaluate than Levels 1 & 2

• Looks at whether learners can apply what they learned (does the training change their behavior?)

• Requires post-training follow-up to determine whether behavior has changed

• Less common than levels 1 & 2 in practice

Page 155: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Kirkpatrick Level 3: Behavior

• Might include:
  – Direct observation by supervisors or coaches (Wisher, Curnow, & Drenth, 2001).
  – Questionnaires completed by peers, supervisors, and subordinates related to work performance.
  – On-the-job behaviors, automatically logged performance, or self-report data.

Shepard, C. (1999b, July). Evaluating online learning. TACTIX from Fastrak Consulting. Retrieved February 10, 2002, from: http://fastrak-consulting.co.uk/tactix/Features/evaluate/eval01.htm.

Page 156: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Kirkpatrick Level 4: Results

• Often compared to return on investment (ROI)

• In e-learning, the higher up-front cost of course development is widely believed to be offset by the lower cost of delivering the training

• A new way of training may require a new way of measuring impact

Page 157: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Kirkpatrick Level 4: Results

• Might include:
  – Labor savings (e.g., reduced duplication of effort or faster access to needed information).
  – Production increases (faster turnover of inventory, forms processed, accounts opened, etc.).
  – Direct cost savings (e.g., reduced cost per project, lowered overhead costs, reduction of bad debts, etc.).
  – Quality improvements (e.g., fewer accidents, fewer defects, etc.).

Horton, W. (2001). Evaluating e-learning. Alexandria, VA: American Society for Training & Development.

Page 158: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Kirkpatrick + Evaluation Design

• Kirkpatrick’s 4 Levels may be achieved via various evaluation designs

• Different designs help answer different questions

Page 159: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Pre/Post Control Groups

• One group receives OL training and one does not

• As a variation, try three groups:
  – No training (control)
  – Traditional training
  – OL training
• Recommended because it may help neutralize contextual factors (see the sketch below)
• Relies on random assignment as much as possible
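A minimal sketch, not from the workshop, of how this design's data might be analyzed: compare post-minus-pre gain scores between the trained and control groups. The numbers are fabricated, and the two-sample t-test is one reasonable choice, not a prescription.

```python
from scipy import stats

# Fabricated post-minus-pre gain scores for each group.
ol_gains = [22, 18, 25, 30, 17]   # online training group
control_gains = [5, 8, 2, 6, 4]   # no-training control group

# Two-sample t-test: is the OL group's gain larger than chance variation?
t, p = stats.ttest_ind(ol_gains, control_gains)
print(f"t = {t:.2f}, p = {p:.4f}")  # a small p suggests a real training effect
```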

Page 160: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Multiple Baselines

• Can be used for a program that is rolling out

• Each group serves as a control group for the previous group

• Look for improvement in subsequent groups

• Eliminates need for tight control of control group

Page 161: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Time Series

• Looks at benchmarks before and after training

• Practical and cost-effective
• Not considered as rigorous as other designs because it doesn’t control for contextual factors

Page 162: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Single Group Pre/Post

• Easy and inexpensive
• Criticized for lack of rigor (absence of control)
• Needs to be pushed into Kirkpatrick Levels 3 and 4 to see whether there has been impact

Page 163: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Case Study

• A rigorous design in academic practice, but often after-the-fact in corporate settings

• Useful when no preliminary or baseline data have been collected

Page 164: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Part IV: ROI and Online Learning

Page 165: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

The Importance of ROI

• OL requires a great amount of $$ and other resources up front

• It gives the promise of financial rewards later on

• ROI is of great interest because of the investment and the wait period before the return

Page 166: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Calculating ROI

• Look at:
  – Hard cost savings
  – Hard revenue impact
  – Soft competitive benefits
  – Soft benefits to individuals

See: Calculating the Return on Your eLearning Investment (2000) by Docent, Inc.

Page 167: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Possible ROI Objectives

• Better Efficiencies
• Greater Profitability
• Increased Sales
• Fewer Injuries on the Job
• Less Time off Work
• Faster Time to Competency

Page 168: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Hard Cost Savings

• Travel
• Facilities
• Printed material costs (printing, distribution, storage)
• Reduction of business costs through increased efficiency
• Instructor fees (sometimes)

Page 169: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Hard Revenue Impact

• Consider:
  – Opportunity cost of improperly trained or untrained personnel
  – Shorter time to productivity through shorter training times with OL
  – Increased time on the job (no travel time)
  – Ease of delivering the same training to partners and customers (for a fee?)

Page 170: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Soft Competitive Benefits

• Just-in-time capabilities
• Consistency in delivery
• Certification of knowledge transfer
• Ability to track users and gather data easily
• Increased morale from simultaneous roll-out at different sites

Page 171: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Individual Values

• Less wasted time
• Support available as needed
• Motivation from being treated as an individual

Page 172: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Talking about ROI

• As a percentage:
  – ROI = [(Payback - Investment) / Investment] * 100
• As a ratio:
  – ROI = Return / Investment
• As time to break even:
  – Break-even time = (Investment / Return) * Time Period
(A worked sketch of these three framings follows below.)
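To make the three framings concrete, here is a minimal Python sketch; the functions and dollar figures are invented for illustration and are not from the workshop.

```python
# A sketch of the three ROI framings above; dollar figures are invented.
def roi_percent(payback: float, investment: float) -> float:
    """ROI as a percentage: [(Payback - Investment) / Investment] * 100."""
    return (payback - investment) / investment * 100

def roi_ratio(returns: float, investment: float) -> float:
    """ROI as a simple ratio: Return / Investment."""
    return returns / investment

def break_even_time(investment: float, returns: float, period: float) -> float:
    """Time to break even: (Investment / Return) * Time Period."""
    return investment / returns * period

# Hypothetical program: $200,000 invested, $260,000 returned over 12 months.
print(roi_percent(260_000, 200_000))          # 30.0 (%)
print(roi_ratio(260_000, 200_000))            # 1.3
print(break_even_time(200_000, 260_000, 12))  # ~9.2 months
```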

Page 173: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

What is ROI Good For?

• Prioritizing Investment
• Ensuring Adequate Financial Support for an OL Project
• Comparing Vendors

Page 174: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

The Changing Face of ROI

• “Return-on-investment isn’t what it used to be … The R is no longer the famous bottom line and the I is more likely a subscription fee than a one-time payment” (Cross, 2001)

Page 175: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

More Calculations

• Projected Net Savings = Total Admin Costs of Former Program - Total Admin Costs of OL Program

• Cost Per Student (CPS) = Total Cost of Training / # of Students

• ROI% = (Total Benefits * 100) / Total Program Cost
(see the worked sketch below)
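A short worked example of these calculations, again with invented figures for a hypothetical 500-learner rollout:

```python
# Invented figures for a hypothetical 500-learner rollout.
former_admin_costs = 400_000                       # classroom program
ol_admin_costs = 250_000                           # online program
net_savings = former_admin_costs - ol_admin_costs  # Projected Net Savings: 150,000

total_training_cost = 250_000
students = 500
cps = total_training_cost / students               # Cost Per Student: 500.0

total_benefits = 325_000
roi_pct = total_benefits * 100 / total_training_cost  # ROI%: 130.0
print(net_savings, cps, roi_pct)
```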

Page 176: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

At the End of the Day...

• Are all training results quantifiable?
• NO! Putting a price tag on some costs and benefits can be very difficult
• NO! Some data may not have much meaning at face value
  – What if more courses are offered and annual student training hours drop simultaneously? Is this bad?

Page 177: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Part V: Collecting Evaluation Data & Online Evaluation Tools

Page 178: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Collecting Evaluation Data

• Learner Reaction
• Learner Achievement
• Learner Job Performance
• Manager Reaction
• Productivity Benchmarks

Page 179: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Forms of Evaluation

• Interviews
• Focus Groups
• Self-Analysis
• Supervisor Ratings
• Surveys and Questionnaires
• ROI
• Document Analysis
• Data Mining (changes in pre- and post-training data; e.g., sales, productivity)

Page 180: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

How to Collect Data?

• Direct observation in the work setting
  – By supervisors, co-workers, subordinates, clients

• Surveys, interviews, and focus groups
  – With supervisors, co-workers, subordinates, clients

• Self-report by learners or teams

Page 181: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Learner Data

• Online surveys are the most effective way to collect online learner reactions

• Learner performance data can be collected via online tests
  – Pre- and post-tests can be used to measure learning gains (see the sketch below)
• Learner post-course performance data can be used for Level 3 evaluation
  – May look at on-the-job performance
  – May require data collection from managers
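As an illustration of the pre/post idea, a minimal sketch with made-up scores that turns test data into raw and normalized learning gains for Level 2 reporting:

```python
# Made-up percent-correct scores for four learners.
pre_scores = [55, 60, 42, 70]
post_scores = [80, 85, 70, 90]

# Raw gain: points improved. Normalized gain: share of possible improvement.
raw_gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
norm_gains = [(post - pre) / (100 - pre)
              for pre, post in zip(pre_scores, post_scores)]

print(sum(raw_gains) / len(raw_gains))    # average raw gain: 24.5 points
print(sum(norm_gains) / len(norm_gains))  # average normalized gain: ~0.58
```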

Page 182: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Example: Naval Physiology Training Follow-Up Evaluation

• A naval training unit uses an online survey/database system to track the performance of recently trained physiologists
• Learners self-report their performance
• Managers report on learner performance
• Unit heads report on overall productivity

Page 183: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Learning System Data

• Many statistics are available, but which are useful?
  – Number of course accesses
  – Log-in times/days
  – Time spent accessing course components
  – Frequency of access for particular components
  – Quizzes completed and quiz scores
  – Learner contributions to discussion (if applicable)
(A roll-up sketch follows below.)
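A minimal sketch of how raw log rows might be rolled up into per-learner statistics like those above; the record layout is an assumption, not any particular LMS's format.

```python
from collections import defaultdict

# Hypothetical raw log rows: (learner, course component, minutes spent).
log_rows = [
    ("kim", "module_1", 35),
    ("kim", "quiz_1", 12),
    ("raj", "module_1", 20),
    ("raj", "forum", 15),
]

# Roll rows up into per-learner access counts and total time.
usage = defaultdict(lambda: {"accesses": 0, "minutes": 0})
for learner, component, minutes in log_rows:
    usage[learner]["accesses"] += 1
    usage[learner]["minutes"] += minutes

for learner, stats in sorted(usage.items()):
    print(learner, stats)  # kim {'accesses': 2, 'minutes': 47} ...
```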

Page 184: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Learner System Data

• IF learners are being evaluated based on number and length of accesses, it is only fair that they be told

• Much time can be wasted analyzing statistics that don’t tell much about the actual impact of the training

• Bottom line: easy data to collect, but not always useful for evaluation purposes
  – Still useful for management purposes

Page 185: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Benchmark Data

• Companies need to develop benchmarks for measuring performance improvement

• Managers typically know the job areas that need performance improvement

• Both pre-training and post-training data need to be collected and compared

• Must also look for other contextual factors

Page 186: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Online Testing Tools (see: http://www.indiana.edu/~best/)

Page 189: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Test Selection Criteria (Hezel, 1999)

• Easy to Configure Items and Test
• Handles Symbols
• Scheduling of Feedback (immediate?)
• Provides Clear Input of Dates for Exam
• Easy to Pick Items for Randomizing
• Randomizes Answers Within a Question
• Weighting of Answer Options

Page 190: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

More Test Selection Criteria

• Recording of Multiple Submissions

• Timed Tests
• Comprehensive Statistics
• Summarize in Portfolio and/or Gradebook
• Confirmation of Test Submission

Page 191: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

More Test Selection Criteria (Perry & Colon, 2001)

• Supports multiple item types—multiple choice, true-false, essay, keyword
• Can easily modify or delete items
• Incorporates graphic or audio elements?
• Control over the number of times students can submit an activity or test
• Provides feedback for each response

Page 192: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

More Test Selection Criteria (Perry & Colon, 2001)

• Flexible scoring—score first, last, or average submission
• Flexible reporting—by individual or by item, and cross tabulations
• Outputs data for further analysis
• Provides item analysis statistics (e.g., test item frequency distributions; see the sketch below)
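A minimal sketch of two such statistics, item difficulty and an answer frequency distribution, using fabricated response data:

```python
from collections import Counter

# Fabricated 0/1 scores: learner -> correctness on each of three items.
responses = {
    "a": [1, 0, 1],
    "b": [1, 1, 0],
    "c": [0, 1, 1],
    "d": [1, 1, 1],
}

# Item difficulty: proportion of learners answering each item correctly.
n_items = 3
difficulty = [sum(scores[i] for scores in responses.values()) / len(responses)
              for i in range(n_items)]
print(difficulty)  # [0.75, 0.75, 0.75]

# Frequency distribution of options chosen on one multiple-choice item.
item_1_choices = ["B", "B", "C", "B"]
print(Counter(item_1_choices))  # Counter({'B': 3, 'C': 1})
```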

Page 193: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Computer Log Data
Chen, G. D., Liu, C. C., & Liu, B. J. (2000). Discovering decision knowledge from Web log portfolio for managing classroom processes by applying decision tree and data cube technology. Journal of Educational Computing Research, 23(3), 305-332.

• Determine student behavior patterns:
  – posting opinions,
  – asking questions,
  – replying to opinions,
  – posting articles, etc.

• Web logs can also help instructors make informed pedagogical decisions. For instance, does a particular teaching strategy or task improve student interaction? (A decision-tree sketch follows below.)
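A minimal sketch in the spirit of the cited decision-tree approach; this is not the authors' code, and the features, labels, and use of scikit-learn are illustrative assumptions.

```python
from sklearn.tree import DecisionTreeClassifier

# Fabricated per-learner features derived from Web logs:
# [opinions posted, replies, questions asked, logins per week]
X = [[12, 8, 3, 5],
     [1, 0, 0, 1],
     [7, 4, 2, 3],
     [0, 1, 0, 1]]
y = [0, 1, 0, 1]  # 1 = learner later struggled (fabricated label)

# Fit a small decision tree and flag a new learner for follow-up.
clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(clf.predict([[2, 1, 0, 1]]))  # e.g. [1] -> instructor follow-up
```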

Page 194: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Computer Log Data
Chen, G. D., Liu, C. C., & Liu, B. J. (2000). Discovering decision knowledge from Web log portfolio for managing classroom processes by applying decision tree and data cube technology. Journal of Educational Computing Research, 23(3), 305-332.

• In a corporate training situation, computer log data can correlate online course completions with actual job performance improvements, such as:
  – fewer violations of safety regulations,
  – reduced product defects,
  – increased sales, and
  – timely call responses.

Page 195: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Email and Chat

• Chats and email messages might provide data about the effectiveness of the training event.

Page 196: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Online Survey Tools for Assessment

Page 197: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Sample Survey Tools

• Zoomerang (http://www.zoomerang.com)
• IOTA Solutions (http://www.iotasolutions.com)
• QuestionMark (http://www.questionmark.com/home.html)
• SurveyShare (http://SurveyShare.com; from Courseshare.com)
• Survey Solutions from Perseus (http://www.perseusdevelopment.com/fromsurv.htm)
• Infopoll (http://www.infopoll.com)

Page 200: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Survey Tool Features

• Maintain email lists and email invitations
• Conduct polls
• Adaptive branching and cross tabulations
• Modifiable templates
• Maintain library of past surveys
• Publish reports
• Technical support, chat advice
• Different types of accounts—hosted, corporate, professional, etc.

Page 206: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Web-Based Survey Advantages

• Faster collection of data
• Standardized collection format
• Computer graphics may reduce fatigue
• Computer-controlled branching and skip sections
• Easy to answer with a click
• Wider distribution of respondents

Page 207: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Web-Based Survey Problems: Why Lower Response Rates?

• Low response rate
• Lack of time
• Unclear instructions
• Too lengthy
• Too many steps
• Can’t find URL
• Perceived as aggressive

Page 208: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Web-Based Survey Solutions: Some Tips…

• Send a second request
• Make the URL link prominent
• Offer incentives near the top of the request
• Shorten the survey; make it attractive and easy to read
• Credible sponsorship—e.g., a university
• Disclose purpose, use, and privacy
• E-mail cover letters
• Prenotify of intent to survey

Page 209: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Tips on Authentication

• Check e-mail access against a list
• Use password access
• Provide a keycode, PIN, or ID # (see the sketch below)
• (Futuristic: palm print, fingerprint, voice recognition, iris scanning, facial scanning, handwriting recognition, picture ID)
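A minimal sketch of the keycode idea, with hypothetical PINs; a real deployment would add measures such as salting and rate limiting.

```python
import hashlib

# Hypothetical keycodes issued to respondents; only hashes are stored.
issued = {hashlib.sha256(pin.encode()).hexdigest() for pin in ("4821", "9034")}

def is_valid(pin: str) -> bool:
    """Accept a submission only if its keycode was actually issued."""
    return hashlib.sha256(pin.encode()).hexdigest() in issued

print(is_valid("4821"))  # True
print(is_valid("0000"))  # False
```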

Page 210: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Some Final Advice…

Page 211: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

• As venture capital dries up and state funding is cut, evaluation and accountability take center stage in e-learning decision-making and discussion.

Page 212: Session P16 Evaluating Online Learning: Frameworks and Perspectives (Workshop: Sunday Feb 17th, Training 2002) Dr. Curtis J. Bonk President, CourseShare.com

Questions?

Comments?

Concerns?