
Page 1

Evaluating the Success of Digital Content

WCET Annual Meeting
San Antonio, Texas
November 2, 2012

Presenters:

Dr. Darlene Williams, Northwestern State University (LA)
Ms. Julie Ricke (eduKan)

Page 2

Evaluating the Success of Digital Content

Abstract

Institutions are converting textbook-based courses to more interactive courses with digitally embedded content. The presenters will discuss measures of success with the implementation of digitally embedded content at two very different institutions. Additionally, they will describe the model and strategy employed by their respective institutions for their digital content initiatives; share the successes and stumbling blocks encountered during implementation; and enumerate methods to measure and analyze student success and engagement with digital content and open educational resources.

Page 3

Northwestern State University of Louisiana

32 Online Degree Programs

68% of Students Take Online Courses

30% are Exclusively Online

12,000 Enrollments Each Semester

550-600 Course Sections Each Semester

LMS - Moodle

Page 4

Northwestern State University of Louisiana

Experimentation (1999-2000)

(Desktop Video, Course Submission Database, Student Assessment Splash Page)

“Core” Course Development

Video Conferencing Infrastructure

Technology Costing Methodology Project (2001: WCET and NCHEMS Joint Project)

Recognizing the Need to be Adult Friendly

SREB Assessment (September 2004)

Professional Development Redesign (2005)

Competency Levels

Staff (Instructional and Media)

Mobile Initiatives (Web, Course Content, etc.)

Digital Content and Social Media (2008-2009)

Storage, Sharing, Backup, Lecture Capture

Page 5

Barriers to the Adoption of Digital Content

Digital resources (web resources, video, audio) provide great options when developing digital content

Findings:

During the development and review processes established for online courses, the resources are:

Not necessarily evaluated for quality

Not necessarily evaluated as an appropriate measure of learning outcomes

Alignment with learning outcomes is a time-intensive process

University resources to support the digital effort are often limited

Not necessarily used to their fullest potential

Page 6

Early Initiatives

Publishers' Course Packs Designed for Learning Management Systems

Learning Objects (Merlot)

Faculty Produced Digital Content (Video Cameras, Audio Recorders)

Video Conferencing Recordings (Full and Partial Lectures)

Podcasts (Podcast Producer) RSS Feeds, Imported into the LMS, Designed for Multiple Devices

There was still limited digital content being created and distributed in online courses.

Page 7

Faculty as Digital Content Adopters and Developers

Ebooks (Student Choice)

Audio

Video

YouTube

Vimeo

Blogs

Wikis

Documents

Portfolio Elements

Page 8

Faculty as Digital Content Adopters and Developers

Page 9

Challenges in the Support and Development of Digital Content

• Creating Content

• Methods for Creating Access to the Content

• Sharing with Students/Departments/Colleagues

• Backup/Storage/Disaster Recovery Protocol

Establishing an IT Support Protocol is an Important Part of the Process

Page 10

Challenges in the Support and Development of Digital Content

What is the Life Cycle of the Digital Content?

How Do You Manage the Process?

What Support is Required in Order to be Successful?

How Do We Manage the Promotion of Effective Digital Pedagogy?

What are the Criteria for Evaluating Digital Content?

How Can we Better Assess the Effectiveness of Digital Content?

Page 11

Early Initiative

Early Deployment Architecture

Transition to Fully Integrated Comprehensive Content Management System

Current Capability:

47 Classrooms Capable of Recording Lectures Automatically or Ad Hoc

A Web Interface Allows for the Upload of Content from the Office, Home, or in the Field

Video Available for Faculty to Publish

Page 12

Measuring Success of Digital Content

Students:

Usage: Do Students Access and View the Content?

Track “Hits” on the Learning Management System

Track “Views” on VIC (Video Integrated Content System): content created as full lectures in the classroom, ad hoc from the office or field, through Jabber (MOVI), Podcast Producer, etc.

First Week of Classes Tracked Hundreds of Views by Students

*VIC is also the name of NSU's mascot.
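The “hits” and “views” tracking described above amounts to counting event-log rows per content item. A minimal sketch of that kind of tally, assuming hypothetical CSV exports from the LMS and the VIC system (the filenames and column names here are invented for illustration and are not NSU's actual schema):

```python
# Minimal sketch: count LMS "hits" and VIC "views" per content item from
# simple event logs. CSV filenames and column names are assumptions only.
import csv
from collections import Counter

def tally(path, item_field):
    """Count event rows per content item in a CSV log with one row per event."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row[item_field]] += 1
    return counts

if __name__ == "__main__":
    lms_hits = tally("lms_activity_log.csv", "content_item")   # one row per "hit"
    vic_views = tally("vic_view_log.csv", "video_id")          # one row per "view"

    for item, hits in lms_hits.most_common(10):
        print(f"{item}: {hits} hits")
    for video, views in vic_views.most_common(10):
        print(f"{video}: {views} views")
```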

Page 13

Measuring Success of Digital Content

Faculty adoption

Participation in professional development

Work with appropriate staff (instructional and media positions) to assist in the development and assessment of desired course content

Adoption of standards and rubrics to assess learning outcomes

Involvement in a peer review process

Page 14

Current Efforts

Assist Faculty in the Development of All Forms of Content:

Open content

Mobile content

Connected content

Collaborative content

Cross-media content

Page 15

Future

“Long-range planning does not deal with future decisions, but with the future of present decisions.” (Peter Drucker)

Developing content that provides the best learning experience for students

Adoption of “digital content” best practices

Incorporate assessment strategies for digital content

Page 16

Thank You!

Dr. Darlene Williams

Vice President for Technology, Research, and Economic Development

Northwestern State University

[email protected]

Page 17


EduKan provides access to quality higher education through degrees, certificates, individual courses, support services and emerging market-driven programming.

We are accessible, convenient, and affordable.

Page 18

Average cost of a textbook: $157.56 (range $18.29 to $307.90)

Spring 2010: 1,338 students spent $233,082

10 most popular classes:

60% of total textbook costs

Average cost of $224.20
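A quick arithmetic check using only the figures on this slide (rounded):

\[
\frac{\$233{,}082}{1{,}338\ \text{students}} \approx \$174\ \text{per student (Spring 2010)},
\qquad
0.60 \times \$233{,}082 \approx \$139{,}800\ \text{spent in the 10 most popular classes.}
\]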

Page 19

Project Aristotle

Page 20

Retention

Course ID | Course Title | Match Index | Format | Completed | Withdrew | Students | Completion Rate | Difference
4933338 | Introduction to Computer Concepts and Applications, Herrera | 1 | ARISTOTLE | 25 | 6 | 31 | 0.81 |
4576193 | Introduction to Computer Concepts and Applications, Herrera | 1 | TEXTBOOK | 16 | 2 | 18 | 0.89 | -0.08
4933352 | Principles of Macroeconomics, Reynolds | 2 | ARISTOTLE | 18 | 2 | 20 | 0.90 |
4576219 | Principles of Macroeconomics, Reynolds | 2 | TEXTBOOK | 8 | 2 | 10 | 0.80 | 0.10
4933324 | Beginning Algebra, Wenzel | 3 | ARISTOTLE | 18 | 6 | 24 | 0.75 |
4576098 | Beginning Algebra, Wenzel | 3 | TEXTBOOK | 12 | 5 | 17 | 0.71 | 0.04
4933336 | Intermediate Algebra, Goymerac | 4 | ARISTOTLE | 12 | 2 | 14 | 0.86 |
4576187 | Intermediate Algebra, Goymerac | 4 | TEXTBOOK | 23 | 0 | 23 | 1.00 | -0.14
4933328 | College Algebra, Dowell | 5 | ARISTOTLE | 15 | 4 | 19 | 0.79 |
4576113 | College Algebra, Dowell | 5 | TEXTBOOK | 19 | 7 | 26 | 0.73 | 0.06
4933390 | American Government | 6 | ARISTOTLE | 12 | 2 | 14 | 0.86 |
4573597 | American Government, Kryschtal | 6 | TEXTBOOK | 16 | 0 | 16 | 1.00 | -0.14
4933384 | Personal Finance, Niederman | 7 | ARISTOTLE | 11 | 7 | 18 | 0.61 |
4576166 | Personal Finance, Niederman | 7 | TEXTBOOK | 14 | 2 | 16 | 0.88 | -0.26
4933375 | Introduction to Business, M Hatcher | 8 | ARISTOTLE | 14 | 2 | 16 | 0.88 |
4576141 | Introduction to Business, M Hatcher | 8 | TEXTBOOK | 9 | 3 | 12 | 0.75 | 0.13
4955234 | College Algebra, Faullin | 9 | ARISTOTLE | 20 | 2 | 22 | 0.91 |
4576091 | College Algebra, Faullin | 9 | TEXTBOOK | 11 | 1 | 12 | 0.92 | -0.01
4933400 | Introduction to Computer Concepts and Applications, Herrera | 10 | ARISTOTLE | 13 | 4 | 17 | 0.76 |
4573626 | Introduction to Computer Concepts and Applications, Herrera | 10 | TEXTBOOK | 13 | 6 | 19 | 0.68 | 0.08
4933392 | Beginning Algebra, Wenzel | 11 | ARISTOTLE | 11 | 2 | 13 | 0.85 |
4573600 | Beginning Algebra, Wenzel | 11 | TEXTBOOK | 9 | 1 | 10 | 0.90 | -0.05
4933399 | Intermediate Algebra, Goymerac | 12 | ARISTOTLE | 10 | 0 | 10 | 1.00 |
4573624 | Intermediate Algebra, Goymerac | 12 | TEXTBOOK | 13 | 1 | 14 | 0.93 | 0.07
4930692 | Introduction to Computer Concepts and Applications, Herrera | 13 | ARISTOTLE | 9 | 0 | 9 | 1.00 |
3979823 | Introduction to Computer Concepts and Applications, Herrera | 13 | TEXTBOOK | 8 | 1 | 9 | 0.89 | 0.11
4930686 | American Government, Kryschtal | 14 | ARISTOTLE | 4 | 2 | 6 | 0.67 |
3979813 | American Government, Kryschtal | 14 | TEXTBOOK | 3 | 2 | 5 | 0.60 | 0.07

(Difference is the ARISTOTLE section's completion rate minus the matched TEXTBOOK section's rate, shown on the TEXTBOOK row of each pair.)

Average difference: -0.003

6 pairs: lower retention in the Aristotle section

8 pairs: higher retention in the Aristotle section

Almost no impact
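For readers who want to reproduce the comparison: completion rate is completers divided by students enrolled (completed + withdrew), and the difference is the Aristotle rate minus the matched textbook rate. A minimal sketch of that arithmetic follows; the row format is an assumption for illustration, and only the first two matched pairs from the table are entered.

```python
# Recompute completion rates and Aristotle-vs-textbook differences from rows
# shaped like the table above. Only two matched pairs are shown; the rest of
# the table would be entered the same way.
from collections import defaultdict

rows = [
    # (match_index, format, completed, withdrew)
    (1, "ARISTOTLE", 25, 6),
    (1, "TEXTBOOK", 16, 2),
    (2, "ARISTOTLE", 18, 2),
    (2, "TEXTBOOK", 8, 2),
]

pairs = defaultdict(dict)
for match, fmt, completed, withdrew in rows:
    enrolled = completed + withdrew
    pairs[match][fmt] = completed / enrolled  # completion rate

diffs = []
for match in sorted(pairs):
    rate = pairs[match]
    diff = rate["ARISTOTLE"] - rate["TEXTBOOK"]
    diffs.append(diff)
    print(f"pair {match}: ARISTOTLE {rate['ARISTOTLE']:.2f} "
          f"vs TEXTBOOK {rate['TEXTBOOK']:.2f} (diff {diff:+.2f})")

print(f"average difference: {sum(diffs) / len(diffs):+.3f}")
```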

Page 21

Other Findings

2011–12:

Students saved approximately $68,000

eduKan retained approximately $24,000

Page 22

Introduction to Business, M Hatcher

Aristotle Design: 4933375

Completion Rate: 88%
Average Grade: 83%

Pre Aristotle Design: 4576141

Completion Rate: 75%
Average Grade: 86%

Page 23

Aristotle Design

Introduction to Business, M Hatcher

Pre Aristotle Design

Instructor and student interactivity higher in the new design – Of particular note is the inclusion of all course users

Interactivity Node/Edge chart

Page 24

Aristotle Design / Pre Aristotle Design

Introduction to Business, M Hatcher

Scale: Average Minutes of Activity per User, 1 to >15

Activity intensity comparison by feature

Page 25

Introduction to Business, M Hatcher

Aristotle Design

Pre Aristotle Design

Week Number: 1–12

Scale: Average Minutes of Activity per User, 1 to >35

Activity comparison by week

Page 26

Aristotle Design / Pre Aristotle Design

Introduction to Business, M Hatcher

Week Number: 1–12 (each panel)

Scale: Average Minutes of Activity per User, 1 to >35

Point accumulations (per user) paired to weekly activity

Page 27

Completion Rates vs. Enrollment

EduKan Ops Review

Completion Rates – Census to Course End

Page 28

Advice & Lessons Learned

• Plan, plan, plan!

• Determine the textbook.

• Map out your course.

• Decide which learning objects you want to use.

• Allow adequate time for delivery

Page 29

Design Process

Consultation with Textbook representative

Selection of Textbooks

Work with instructional designer

Review the course upon delivery

Page 30

Research on My Labs Resources

Assets:

Interactive tutorials as a supplement to reading, definition pop-ups, note-taking capability

Podcasts

Simulations

Videos

Selection of My Labs Resources

Page 31

Highlights

Fully Customized Course

Capitalization of Teaching Style

Completely Embedded Digital Content

“I Don’t Have” Student Excuse is Eliminated

Some books are now available on iPad and Android tablets.

Retention

Best of All Worlds in Resources (Subjective Opinion)

Page 32

Speed Bumps

Timeline

Approximately 3 Months

Repagination

Reading Electronic Book

Page 33

Lessons Learned

Slowly Integrate

Timeline Expectations

Valuable Project

Page 34

Questions

Page 35

Node/Edge Charts explained:

• Thread Interactivity

• Nodes: Users in threads

• Node color is final grade at course end (Red = < 50%, Green = > 80%)

• Blue node: Instructor

• Grey node: Dropped user or zero-activity user

• Node size represents total number of posts

• Node location is influenced by node size as it relates to transitive edges (1)

• Edges (lines) indicate connections

• Edge weight is how often that connection is made

• Edge color corresponds to direction: takes on source color

• Content map

• Nodes: Feature type

• Node color and size: how much that activity was engaged

• Edges indicate connections between features within the course interaction

• Edge weight is how often that connection is made

• Edge color corresponds to direction: edge takes source color

(1) The positioning of the node is influenced by the size of the node (total interactions) and the push and pull of total edges per node. Consider each node floating in space: each connection that leaves a node provides a push away from the destination node, and each connection to a node provides a pull towards the source node. The imbalance between those two influences, weighted by the size of the node, determines location.
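As a concrete illustration of how the node and edge data described above can be derived from threaded-discussion activity, here is a small sketch. The post records, grade values, and the color used for the 50–80% grade range are assumptions for the example; this is not the presenters' actual visualization code.

```python
# Build node/edge data for an interactivity chart from discussion posts:
# node size = total posts per user, edge weight = how often one user replies
# to another, node color = final grade bucket (blue = instructor, grey = dropped).
from collections import Counter

posts = [  # (author, replied_to) -- replied_to is None for a thread-starting post
    ("instructor", None), ("alice", "instructor"), ("bob", "alice"),
    ("alice", "bob"), ("instructor", "alice"), ("alice", "instructor"),
]
final_grades = {"alice": 0.92, "bob": 0.44}  # dropped users have no grade entry

node_size = Counter(author for author, _ in posts)                # total posts
edge_weight = Counter((a, r) for a, r in posts if r is not None)  # repeat connections

def node_color(user):
    """Color rule from the bullets above; the 50-80% color is an assumption."""
    if user == "instructor":
        return "blue"
    grade = final_grades.get(user)
    if grade is None:
        return "grey"                      # dropped or zero-activity user
    if grade < 0.50:
        return "red"
    return "green" if grade > 0.80 else "yellow"

for user, size in node_size.items():
    print(f"node {user}: size {size}, color {node_color(user)}")
for (src, dst), weight in edge_weight.items():
    print(f"edge {src} -> {dst}: weight {weight}")
```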

EDUKAN section analysis - Visuals

Page 36

Activity Intensity explained:

• Week Number across the top - range is determined by course start and end date

• Units and Feature items across the vertical – Features are listed by their containing units

• Measures a combination of activity minutes and record insertions

• Intended to reflect the intensity against a feature

• Scale is given in minutes for clarity, but there is an underlying scoring to capture record inserts (1)

(1) Average count of record inserts measured per feature, per user. If the amount of inserts is greater than the average, a 1.5 multiplier is applied. This only has an influence for threaded discussion, as an attempt to capture the number of posts, not just time on task.
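One plausible reading of that scoring, sketched in code. Exactly how the 1.5 multiplier combines with activity minutes is not spelled out above, so the combination used here (and all of the sample numbers) is an assumption for illustration.

```python
# Intensity score per footnote (1): activity minutes, boosted by 1.5x when a
# user's record inserts for a feature exceed the per-feature average.
# (In practice this mainly moves the needle for threaded discussion.)
from statistics import mean

def intensity(minutes, inserts, avg_inserts):
    """Scored intensity for one user on one feature; scale remains 'minutes'."""
    return minutes * (1.5 if inserts > avg_inserts else 1.0)

# Hypothetical discussion feature: per-user insert counts and activity minutes.
inserts = {"alice": 9, "bob": 2, "cara": 5}
minutes = {"alice": 30, "bob": 45, "cara": 20}
avg_inserts = mean(inserts.values())  # ~5.3 for this example

for user in inserts:
    print(f"{user}: {intensity(minutes[user], inserts[user], avg_inserts):.1f}")
```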

EDUKAN section analysis - Visuals

Page 37

Point accumulation plots explained:

• Week Number across the horizontal – range is determined by course start and end date

• Cumulative points along the vertical. Scale is determined by what is possible (weighted)

• Each line represents a user in the course.

• Intended to expose common/different point accumulation patterns between users and courses

EDUKAN section analysis - Visuals

Page 38

Student Survival plots explained:

• Week Number across the horizontal – range is determined by course start and end date

• % of total activity (minutes only) on the y axis.

• Each area represents a user in the course; the width of the line indicates what % of total minutes for the course were earned by them during that week.

• Not intended to differentiate all users, only the users that dropped.

• Intended to measure improvement in student survival (how long are the dropped students staying engaged).

• Red bars indicate points where user(s) dropped out.

• Blue line is instructor contribution.
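A sketch of the quantity being plotted, under the assumption that "total minutes for the course" means the course-wide total across all users; the field names and numbers below are illustrative only, not eduKan data.

```python
# Weekly activity share per user: a user's minutes in a given week as a
# percentage of all activity minutes logged in the course.
from collections import defaultdict

activity = [  # (user, week, minutes) -- illustrative rows, not real data
    ("alice", 1, 120), ("alice", 2, 90), ("alice", 3, 60),
    ("bob", 1, 60), ("bob", 2, 10),                  # bob drops after week 2
    ("instructor", 1, 45), ("instructor", 2, 50), ("instructor", 3, 40),
]

course_total = sum(m for _, _, m in activity)
weekly = defaultdict(float)
for user, week, minutes in activity:
    weekly[(user, week)] += minutes

for (user, week), minutes in sorted(weekly.items()):
    share = 100 * minutes / course_total
    print(f"week {week}: {user:<10} {share:5.1f}% of course minutes")
```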

EDUKAN section analysis - Visuals

Page 39

Thank You!

Ms. Julie Ricke

eduKan

[email protected]