
The Virtuous Loop of Learning Analytics & Academic Technology Innovation

John Whitmer, Ed.D., Director for Teaching & Learning Analytics and Research, Blackboard

Adjunct Faculty Fellow, San Diego State University

[email protected] | @johncwhitmer

www.johnwhitmer.info

Online Learning Consortium Collaborate, November 19, 2015

Quick bio

15 years managing academic technology at public higher ed institutions (R1, 4-year, CCs)

• Always multi-campus projects, innovative uses of academic technologies

• Driving interest: what's the impact of these projects?

Most recently: California State University, Chancellor's Office, Academic Technology Services

Doctorate in Education from UC Davis (2013), with a learning analytics study of a hybrid, large-enrollment course

Active academic research practice (San Diego State Learning Analytics, MOOC Research Initiative, Udacity SJSU Study…)

Quick poll

How familiar are you with learning analytics?

A. Unfamiliar; never heard of it

B. Somewhat familiar; I've seen a reference or two

C. Very familiar; I follow the literature and/or use it in my practice

D. Expert; I'm very knowledgeable and actively contributing to the field

Do you ever wonder (or perhaps worry) …

• How much do the programs you invest your time, energy, and resources into improve student outcomes? And in what ways? (post-hoc)

• Whether your programs are helping the right students (e.g. those who need it)?

• Whether you could understand how students interact with technology experiences to (a) create optimal experiences for students or (b) identify students who are struggling? (design research)

1. What’s Learning Analytics

2 .What we’re learning from research

3. Examples of Learning Analytics (time permitting)

Outline

1. Defining Learning Analytics

200MB of data emissions annually

The Economist. (2010, November 4). Augmented business: Smart systems will disrupt lots of industries, and perhaps the entire economy.

Logged into course within 24 hours

Interacts frequently in discussion boards

Failed first exam

Hasn’t taken college-level math

No declared major

What is learning analytics?

"…measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs."

(Learning and Knowledge Analytics Conference, 2011)

Strong interest by faculty & students

From Eden Dahlstrom, D. Christopher Brooks, and Jacqueline Bichsel. The Current Ecosystem of Learning Management Systems in Higher Education: Student, Faculty, and IT Perspectives. Research report. Louisville, CO: ECAR, September 2014. Available from http://www.educause.edu/ecar.

2. What we’re learning from research

Study #1: Learning analytics pilot study for Introduction to Religious Studies

• Redesigned to hybrid delivery through Academy eLearning

• Enrollment: 373 students (54% increase over the largest previous section)

• Highest LMS (Vista) usage on the entire campus, Fall 2010 (>250k hits)

• Bimodal outcomes:
  – 10% increase in SLO mastery
  – 7% & 11% increase in DWF (54 F's)

• Why? Can't tell with aggregated reporting data

Grades Significantly Related to Access

Course: "Introduction to Religious Studies," CSU Chico, Fall 2013 (n=373)

Variable                          % Variance
Total hits                        23%
Assessment activity hits          22%
Content activity hits             17%
Engagement activity hits          16%
Administrative activity hits      12%
Mean, all significant variables   18%
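To make the table above concrete: for a single-predictor linear regression, the "% variance" figure is simply the squared correlation between that activity count and the final grade. Here is a minimal Python sketch of that computation; the file and column names are hypothetical, not from the actual study dataset.

```python
# Minimal sketch: % variance explained by each activity count, one predictor
# at a time. For a one-predictor regression, R^2 = squared Pearson r.
# "course_activity.csv" and all column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("course_activity.csv")  # one row per student (assumed)

predictors = ["total_hits", "assessment_hits", "content_hits",
              "engagement_hits", "admin_hits"]

for col in predictors:
    r, p = stats.pearsonr(df[col], df["final_grade"])
    if p < 0.05:  # report only statistically significant predictors
        print(f"{col}: variance explained = {r**2:.0%} (p = {p:.4f})")
```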

LMS Use a Better Predictor than Demographic Variables

Variable                               % Variance
HS GPA                                 9%
URM and Pell-eligibility interaction   7%
Under-represented minority             4%
Enrollment status                      3%
URM and gender interaction             2%
Pell eligible                          2%
First in family to attend college      1%
Mean, all significant variables        4%

Not statistically significant: gender, major-college


At-risk students: “Over-working gap”

[Chart: activities by Pell eligibility and grade (A, B+, C, C-), split by activity type (Admin, Assess, Engage, Content), y-axis 0K–35K hits. Annotation: extra effort in content-related activities.]

Study #2: Learning analytics triggers & interventions

• President-level initiative

• Goals: (1) find accurate learning analytics triggers; (2) create effective interventions

• Multiple "triggers" (e.g. LMS access, clicker use, grade) used to identify at-risk students, who were sent "awareness" messages

• Conducted over 2 academic years (Spring 2014 – present)

Study Design

1. Select courses
   • High integration of academic technologies
   • High repeatable-grade rates

2. Identify meaningful triggers for the course
   • Consult with faculty
   • Consider timing for impact

3. Recruit students
   • Assign to experimental/control group

4. Run weekly triggers (sketched below)
   • Identify students at risk (or deserving praise)

5. Send "intervention" message
   • To students in the experimental group only
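Here is a minimal sketch of what a weekly trigger run (steps 4–5) could look like in code. The thresholds, field names, and messaging stub are illustrative assumptions, not the study's actual rules.

```python
# Illustrative weekly trigger run; thresholds and field names are assumptions.
from dataclasses import dataclass

@dataclass
class StudentWeek:
    student_id: str
    lms_logins: int         # LMS accesses this week
    clicker_responses: int  # clicker use this week
    current_grade: float    # running course grade, 0-100

def triggers_for(s: StudentWeek) -> list[str]:
    """Return the trigger events this student activates this week."""
    fired = []
    if s.lms_logins == 0:
        fired.append("low/no Blackboard use")
    if s.clicker_responses == 0:
        fired.append("no clicker use")
    if s.current_grade < 70:
        fired.append("low grade")
    elif s.current_grade >= 90:
        fired.append("high grade (praise)")
    return fired

def send_awareness_message(student_id: str, reasons: list[str]) -> None:
    # stub mailer; a real run would send email or an LMS notification
    print(f"message to {student_id}: triggered on {', '.join(reasons)}")

def run_weekly(students: list[StudentWeek], experimental_ids: set[str]) -> None:
    # per the study design, messages go to the experimental group only
    for s in students:
        fired = triggers_for(s)
        if fired and s.student_id in experimental_ids:
            send_awareness_message(s.student_id, fired)
```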

Study Participation by Course (Spring 2015)

Course         Professor            Format   Enrolled   Participating   % Participation
ANTH 101-01    S. Kobari            Online   454        126             28%
ANTH 101-03    S. Kobari            F2F      175        67              38%
COMPE 271-01   Y. Ozturk            Hybrid   96         63              66%
ECON 102-04    C. Amuedo-Dorantes   Hybrid   139        50              36%
PSY 101-01     M. Laumakis          Hybrid   137        81              59%
PSY 101-02     M. Laumakis          Hybrid   482        257             53%
STAT 119-03    H. Noble             Hybrid   496        305             61%
STAT 119-04    H. Noble             Hybrid   372        190             51%
TOTAL                                        2,351      1,139           48%

[Chart: Learning Analytics Trigger Events by Type (Spring 2015). Trigger counts per course (y-axis 0–25) for four trigger types: high grade triggers, low grade triggers, no clicker, and low/no Blackboard use.]

Correlation of Individual Variables with Outcomes (all courses)

Significant demographic/educational-preparation variables:

Variable                   Numeric Grade   Repeatable Grade
GPA @ Census               0.2989*         0.2508*
Units that term            0.0879*         0.0643*
SAT (Composite)            0.0686*         0.0817*
EOP status                 -0.0783*        -0.0477
Age                        -0.0769*        -0.0719*
Pell eligibility           -0.0843*        -0.0806*
LA interventions           -0.4261*        -0.2576*
LA interventions + grade   -0.5979*        -0.4305*

Not significant:** student level, sex, ethnicity, enrollment status, major, college, honors, disabled, EOP, dorm resident, low-income EFC, first gen/some college, learning community

** Variables highly correlated with other predictors were excluded in favor of the variable closest to the current experience, e.g. GPA @ Census (not HS GPA), SAT Composite (not SAT Math), etc.
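As a rough sketch of how correlations like those above can be computed: Pearson's r against the numeric grade, and the point-biserial r against the binary repeatable-grade outcome. All file and variable names below are assumptions, not the study's actual data dictionary.

```python
# Sketch of the correlation analysis: Pearson r vs. numeric grade,
# point-biserial r vs. the binary repeatable grade (1 = D/F/W).
# "students.csv" and all column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("students.csv")  # hypothetical merged dataset

def star(p: float) -> str:
    return "*" if p < 0.05 else ""  # flag significance at p < .05, as above

for var in ["gpa_at_census", "units_this_term", "sat_composite",
            "age", "pell_eligible", "la_interventions"]:
    r_num, p_num = stats.pearsonr(df[var], df["numeric_grade"])
    # repeatable grade is binary, so use the point-biserial correlation
    r_rep, p_rep = stats.pointbiserialr(df["repeatable_grade"], df[var])
    print(f"{var}: {r_num:.4f}{star(p_num)}  {r_rep:.4f}{star(p_rep)}")
```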

Relationship between Learning Analytics Triggers Activated and Final Grade (Spring 2015)

R² values by course:

Course                 Behavioral Triggers   Behavioral + Grade Triggers
Anth 101 (Online)      0.68                  0.74
Anth 101 (In Person)   0.13                  0.25
Comp Eng 271           0.00                  0.10
Econ 102               0.21                  0.38
Psych 101-01           0.44                  0.64
Psych 101-02           0.27                  0.50
Stat 119-03            0.17                  0.33
Stat 119-04            0.28                  0.46

Triggers Activated vs. Repeatable Grade

[Chart: average repeatable-grade rate (0–100%) vs. number of triggers activated (1–15).]
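The aggregation behind this chart is straightforward. A minimal sketch, assuming a per-student table with a trigger count and a binary repeatable-grade flag (both names hypothetical):

```python
# Sketch of the chart's aggregation: average repeatable-grade rate,
# grouped by number of triggers activated. Column names are assumed.
import pandas as pd

df = pd.read_csv("triggers.csv")  # hypothetical: one row per student

rate_by_count = (df.groupby("triggers_activated")["repeatable_grade"]
                   .mean()       # fraction of students with a D/F/W
                   .mul(100))    # express as a percentage
print(rate_by_count.round(1))
```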


What does this mean?

What students DO is more important than who they are

LMS use is a proxy for student effort

Can we get more sophisticated? Yes.


A Typical Intervention


… data that I've gathered over the years via clickers indicates that students who attend every face-to-face class meeting reduce their chances of getting a D or an F in the class from almost 30% down to approximately 8%.

So, please take my friendly advice and attend class and participate in our classroom activities via your clicker. You'll be happy you did!

Let me know if you have any questions.

Good luck,
Dr. Laumakis

Did the interventions make a difference? Nope.

Comparison of Course Grade Average between Experimental and Control Groups (by Course)

Course                 Control   Experimental
Anth 101 (Online)      80        81
Anth 101 (In Person)   86        87
Comp Eng 271           69        65
Econ 102               80        82
Psych 101-01           74        70
Psych 101-02           79        79
Stat 119-03            77        80
Stat 119-04            73        75
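A minimal sketch of how this experimental/control comparison could be tested per course, using Welch's two-sample t-test. The dataset layout and column names are assumptions, not the study's actual analysis code.

```python
# Sketch: per-course comparison of final grades between experimental and
# control groups. "outcomes.csv" and column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("outcomes.csv")  # one row per student (assumed)

for course, g in df.groupby("course"):
    ctrl = g.loc[g["group"] == "control", "final_grade"]
    expt = g.loc[g["group"] == "experimental", "final_grade"]
    t, p = stats.ttest_ind(expt, ctrl, equal_var=False)  # Welch's t-test
    print(f"{course}: control={ctrl.mean():.1f}, "
          f"experimental={expt.mean():.1f}, p={p:.3f}")
```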


Next Steps

1. Currently testing the "Supplemental Instruction" near-peer tutoring approach developed by UMKC. Initial results look very promising: a >10-point increase in the grades of students who attend Supplemental Instruction vs. those who don't.

2. Combine with predictive model analysis: Learning Analytics ("Doing the Right Thing" score) + demographic variables (see the sketch below).

3. Expand to additional courses; evaluate other intervention approaches.
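For step 2, here is a sketch of the kind of predictive model described: a logistic regression combining a behavioral score with demographic variables to predict a repeatable grade. The feature names, including the "doing_the_right_thing_score" column, are hypothetical placeholders, not the actual model.

```python
# Sketch of a combined behavioral + demographic predictive model.
# All feature and file names are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("students.csv")  # hypothetical merged dataset

X = df[["doing_the_right_thing_score", "gpa_at_census",
        "pell_eligible", "age"]]
y = df["repeatable_grade"]  # 1 = earned a D/F/W, 0 = otherwise

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```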

Summary of findings from previous LMS analytics studies

[Chart: % of final-grade variance explained (0%–60%) across prior studies, grouped by study type: Institution-Wide Analysis with Only LMS Data; Course-Specific with Only LMS Data; Course-Specific with LMS Data & Other Sources. Values shown: 25%, 4%, 51%, 0%, 33%, 31%, 57%, 35%. Studies: Whitmer (2013a); Campbell (2007a); Campbell (2007b); Jayaprakash & Lauria (2014); Macfadyen & Dawson (2010); Morris, Finnegan et al. (2005); Whitmer & Dodge (2015); Whitmer (2013b). Course formats noted: hybrid, online.]

Factors affecting growth of learning analytics

[2×2 map: factors positioned along two axes, Enabler ↔ Constraint and Widespread ↔ Rare]

• New education models
• Sufficient resources ($$$, talent)
• Clear data governance (privacy, security, ownership)
• Clear goals and linked actions
• Data valued in academic decisions
• Tools/systems for data co-mingling and analysis
• Academic technology adoption
• Low data quality (fidelity with meaningful learning)
• Difficulty of data preparation
• "Not invented here" syndrome

Call to action (from a May 2012 keynote presentation @ San Diego State U)

• You're not behind the curve; this is a rapidly emerging area that we can (and should) lead...

• Metrics reporting is the foundation for analytics

• Start with what you have! Don't wait for student characteristics and detailed database information; LMS data can provide significant insights

• If there are any ed tech software folks in the audience, please help us with better reporting!

3. Examples of Learning Analytics

Forward-looking statements

Statements regarding our product development initiatives, including new products and future product upgrades, updates or enhancements, represent our current intentions, but may be modified, delayed or abandoned without prior notice, and there is no assurance that such offering, upgrades, updates or functionality will become available unless and until they have been made generally available to our customers.

Purpose-Built Learning Data Collection (in Bb Learn)

Learn activity in the Q4 release:

• Grade history events
• Content management events
• Test access events
• External launch events
• Student contribution events
• User sessions
• Log events
• Tracking events (Activity Accumulator)

[Diagram: these sources feed the institution data store through event, clickstream, client, and log listeners, plus grade and content point-in-time snapshots.]
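To make the pipeline sketched above concrete, here is a generic sketch of purpose-built event records flowing through registered listeners into a data store. This is not Blackboard's actual API; every name here is hypothetical.

```python
# Illustrative only: a generic event-record/listener shape for the kind of
# purpose-built collection described above. NOT Blackboard's actual API.
from dataclasses import dataclass
from datetime import datetime
from typing import Callable

@dataclass
class LearnEvent:
    event_type: str   # e.g. "grade_history", "test_access", "external_launch"
    user_id: str
    course_id: str
    timestamp: datetime
    payload: dict

# registry mapping event types to the listeners that handle them
listeners: dict[str, list[Callable[[LearnEvent], None]]] = {}

def on(event_type: str):
    """Register a listener function for one event type."""
    def register(fn):
        listeners.setdefault(event_type, []).append(fn)
        return fn
    return register

@on("test_access")
def store_test_access(evt: LearnEvent) -> None:
    # stub: a real listener would write to the institution data store
    print(f"storing {evt.event_type} for {evt.user_id} in {evt.course_id}")

def dispatch(evt: LearnEvent) -> None:
    for fn in listeners.get(evt.event_type, []):
        fn(evt)

dispatch(LearnEvent("test_access", "u123", "ANTH101", datetime.now(), {}))
```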

Bb Data Privacy/Confidentiality

Blackboard will:

• Respect regional data privacy/confidentiality laws, starting with keeping detailed customer data within its region (e.g. Australia, Germany, North America)

• Analyze only anonymized data for cross-institutional insights

• Provide raw data to campuses at no additional charge as part of "core" platform services

Blackboard will not:

• Sell raw or individually identifiable customer data

Platform Analytics Initiative

Improved, purpose-built data sources

• Initially about academic technology interactions

• Extending to other aspects of student experience

Validated data elements and models

• Based on large-scale analysis, using inferential statistics and data mining on anonymized data

Integrated interventions and actions within core application workflows

• Providing actionable insights where action can be taken immediately

Embedded Learning Analytics in Learn SaaS “Ultra”

Learner Experience

Teacher Experience

X-Ray Learning Analytics for Moodlerooms

Activity over Time

Activity by Week Relative to Other Students

Discussion Forum Wordcloud

Social Network Analysis of Discussion Forum w/Critical Thinking Coefficient

Risk Status

Discussion Forum Grading Suggestions

Thank you!

John Whitmer, [email protected]

@johncwhitmer

www.johnwhitmer.info/research