
Using Learning Analytics to Assess Innovation & Improve Student Achievement

John Whitmer, Ed.D. | john.whitmer@blackboard.com | @johncwhitmer

UK Learning Analytics Network Event (JISC) March 5, 2015

http://bit.ly/jwhitmer-jisc

Quick bio

15 years managing academic technology at public higher ed institutions (R1, 4-year, CC’s)

• Always multi-campus projects, innovative uses of academic technologies

• Most recently: California State University, Chancellor’s Office, Academic Technology Services

Doctorate in Education from UC Davis (2013), with a learning analytics study of a hybrid, large-enrollment course

Active academic research practice (San Diego State Learning Analytics, MOOC Research Initiative, Udacity SJSU Study…)

Meta-questions driving my research

1. How can we provide students with immediate, real-time feedback? (especially to identify students at risk of failing a course)

2. How can we design effective interventions for these students?

3. How can we assess innovations (or status quo deployments) of academic technologies?

4. Do these findings apply equally to students ‘at promise’ due to their background (e.g. race, class, family education, geography)?

Outline

1. Defining & Positioning Learning Analytics
2. A Few Empirical Research Findings
   • Understanding Contradictory Outcomes in a Redesigned Hybrid Course (Chico State)
   • Creating Accurate Learning Analytics Triggers & Effective Interventions (SDSU)
3. How we’re Applying this Research @ Blackboard
4. Discussion


Economist. (2010, November 4). Augmented business: Smart systems will disrupt lots of industries, and perhaps the entire economy. The Economist.

200MB of data emissions annually


• Logged into course within 24 hours
• Interacts frequently in discussion boards
• Failed first exam
• Hasn’t taken college-level math
• No declared major


What is learning analytics?

Learning and Knowledge Analytics Conference, 2011

“...measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.”

Strong interest by faculty & students

From Eden Dahlstrom, D. Christopher Brooks, and Jacqueline Bichsel. The Current Ecosystem of Learning Management Systems in Higher Education: Student, Faculty, and IT Perspectives. Research report. Louisville, CO: ECAR, September 2014. Available from http://www.educause.edu/ecar.

Source: Educause and AIR (2012), http://goo.gl/337mA

2. A Few Empirical Research Findings

Study 1: Understanding Contradictory Outcomes in a Redesigned Hybrid Course (Chico State)

Course redesigned for hybrid delivery in year-long program

Enrollment: 373 students (54% increase in the largest section)

Highest LMS usage on the entire campus in Fall 2010 (>250k hits)

Bimodal outcomes:
• 10% increased SLO mastery
• 7% & 11% increase in DWF

Why? Can’t tell with aggregated reporting data

Study Overview

54 F’s

Grades Significantly Related to Access

Course: “Introduction to Religious Studies” CSU Chico, Fall 2013 (n=373)

Variable                                % Variance
Total Hits                              23%
Assessment activity hits                22%
Content activity hits                   17%
Engagement activity hits                16%
Administrative activity hits            12%
Mean value, all significant variables   18%

LMS Activity better Predictor than Demographic/Educational Variables

Variable                                % Variance
HS GPA                                  9%
URM and Pell-Eligibility Interaction    7%
Under-Represented Minority              4%
Enrollment Status                       3%
URM and Gender Interaction              2%
Pell Eligible                           2%
First in Family to Attend College       1%
Mean value, all significant variables   4%
Not statistically significant: Gender, Major-College
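For readers who want to reproduce this kind of figure, the “% variance” for a single predictor can be read as the r² from a one-predictor linear regression against final grade. The following is a minimal sketch with made-up data and hypothetical column names (total_hits, assessment_hits, final_grade), not the study’s actual dataset or code.

```python
# Minimal sketch (not the study's analysis): estimate the share of variance in
# final grade explained by each LMS activity measure as the r-squared of a
# simple one-predictor regression. Data and column names are illustrative only.
import pandas as pd
from scipy import stats

df = pd.DataFrame({
    "total_hits":      [812, 1540, 260, 980, 2100, 430],
    "assessment_hits": [120,  260,  40, 150,  390,  70],
    "content_hits":    [300,  610,  90, 420,  800, 160],
    "final_grade":     [2.7,  3.7, 1.3, 3.0,  4.0, 2.0],  # grade points (A = 4.0)
})

for predictor in ["total_hits", "assessment_hits", "content_hits"]:
    fit = stats.linregress(df[predictor], df["final_grade"])
    print(f"{predictor}: r^2 = {fit.rvalue ** 2:.2f}")
```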

At-risk students: “Over-working gap”

[Chart: Activities by Pell eligibility and grade. Stacked bars of Admin, Assess, Engage, and Content activity counts (0K to 35K), comparing Pell-eligible and not-Pell-eligible students at grades A, B+, C, and C-.]


Extra effort in content-related activities

Study 2: Creating Accurate Learning Analytics Triggers & Effective Interventions (SDSU)


Study Overview

• President-level initiative

• Goal: identify effective interventions driven by Learning Analytics “triggers”

• Multiple “triggers” (e.g., LMS access, Grade, Online Homework/Quiz, Clicker use)

• At scale & over time: conducted for 3 terms, 5 unique courses, 3,529 students

• “Gold standard” experimental design (control / treatment)



Focus on High Need Courses

[Chart: grade outcomes by course]

Course      Non-Repeatable Grades   Repeatable Grades
ANTH 101    84%                     16%
COMPE 270   62%                     38%
ECON 101    69%                     31%
PSY 101     78.5%                   21.5%
STAT 119    70.8%                   29.2%

Study Protocol

1. Identify courses and recruit instructors
2. Prior to course start, review the syllabus and schedule meaningful “triggers” for each course (e.g. attendance, graded items, Blackboard use, etc.)
3. Run reports in Blackboard and the Online Homework/Quiz software to identify students with low activity or performance (~weekly)
4. Send “flagged” students in the experimental group a notification/intervention (a sketch of steps 3–4 follows this list)
5. Aggregate data, add demographic data, and analyze
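A minimal sketch of what steps 3 and 4 could look like in code, assuming a weekly report DataFrame with hypothetical columns (lms_logins_this_week, quiz_pct_to_date, group, email) and a placeholder notify() helper; thresholds and names are illustrative, not the study’s actual triggers or tooling.

```python
# Minimal sketch of the weekly trigger/intervention loop (steps 3-4).
# Column names, thresholds, and notify() are hypothetical illustrations.
import pandas as pd

def notify(email: str, template: str) -> None:
    # Placeholder for whatever mail/LMS messaging mechanism is actually used.
    print(f"Would send '{template}' message to {email}")

def flag_students(report: pd.DataFrame,
                  min_logins: int = 2,
                  min_quiz_pct: float = 60.0) -> pd.DataFrame:
    """Students whose weekly activity or running quiz score fell below a trigger."""
    low_activity = report["lms_logins_this_week"] < min_logins
    low_performance = report["quiz_pct_to_date"] < min_quiz_pct
    return report[low_activity | low_performance]

def run_weekly_cycle(report: pd.DataFrame) -> None:
    for _, student in flag_students(report).iterrows():
        # Only students randomized into the experimental group are contacted;
        # flagged control-group students are recorded but receive no message.
        if student["group"] == "experimental":
            notify(student["email"], template="concerned_friend")

# Hypothetical weekly report for three students.
weekly_report = pd.DataFrame({
    "email": ["a@example.edu", "b@example.edu", "c@example.edu"],
    "lms_logins_this_week": [0, 5, 1],
    "quiz_pct_to_date": [55.0, 88.0, 72.0],
    "group": ["experimental", "control", "control"],
})
run_weekly_cycle(weekly_report)
```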


Key Questions

1. Are triggers accurate predictors of course grade?

2. Do interventions (based on triggers) improve student grades?

3. Do these relationships vary based on student background characteristics?

Frequency of interventions (Spring 2014)

# Students receiving >0 interventions: PSY: 177 (84%); STAT: 165 (70%)

[Chart: distribution of students by number of interventions received (0 to >10), shown as percentages separately for PSY and STAT.]

Frequency of interventions (Spring 2015)

[Chart: triggers activated per student (Spring 2015), 0 to 14 triggers; percentage of students at each count, by course: Anthro1, Anthro3, Comp Engr, Econ4, Psych1, Psych2, Stats3, Stats4.]


Interventions

Spring 2014

Fall 2014

Spring 2015


A Typical Intervention: “Concerned Friend” tone

… data that I've gathered over the years via clickers indicates that students who attend every face-to-face class meeting reduce their chances of getting a D or an F in the class from almost 30% down to approximately 8%.

So, please take my friendly advice and attend class and participate in our classroom activities via your clicker. You'll be happy you did! Let me know if you have any questions.

Good luck,
Dr. Laumakis

Poll question

Did triggers predict achievement? At what significance level? How much variation in student grade was explained?

A. Not significant
B. <10%, significant at the .05 level
C. 20%, significant at the .01 level
D. 30%, significant at the .001 level
E. 50%+, significant at the .0001 level

Poll question (answer)

Did triggers predict achievement? At what significance level? How much variation in student grade was explained?

Answer: E. 50%+, significant at the .0001 level (Spring 2014, Fall 2014)

Learning analytics triggers vs. final course points
Spring 2014: 4 sections, 2 courses (Statistics, Psychology), 882 students

p < 0.0001, r² = 0.4828; p < 0.0001, r² = 0.6558

Fall 2014 results: almost identical
5 sections, 3 courses, N = 1,220 students

p < 0.00001, r² = 0.4836

Spring 2015 results (tentative): lower relationship
8 sections, 5 courses, N = 1,390 students

p < 0.00001, r² = 0.28


Explained by differences between courses (Spring 2015 Results by Course)
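One way to probe that explanation is to compare a single pooled regression against per-course fits: if the per-course r² values are noticeably higher than the pooled one, between-course differences (e.g. different point scales or trigger definitions) are dragging the pooled figure down. The sketch below uses made-up data and hypothetical column names, not the study’s dataset.

```python
# Sketch: pooled vs. per-course r-squared for triggers vs. final course points.
# Illustrative data only; column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.DataFrame({
    "course":       ["PSY", "PSY", "PSY", "STAT", "STAT", "STAT"],
    "triggers":     [0, 3, 6, 1, 4, 8],
    "final_points": [480, 390, 300, 900, 760, 520],  # courses use different point scales
})

pooled = stats.linregress(df["triggers"], df["final_points"])
print(f"Pooled r^2: {pooled.rvalue ** 2:.2f}")

for course, group in df.groupby("course"):
    fit = stats.linregress(group["triggers"], group["final_points"])
    print(f"{course} r^2: {fit.rvalue ** 2:.2f}")
```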


So did the interventions make a difference in learning outcomes?

Experimental participation vs. repeatable grade (Spring 2014)

Group            Passing Grade   Repeatable Grade
Control (STAT)   79%             21%
Exp. (STAT)      79%             21%
Exp. (PSY)       89%             11%
Control (PSY)    82%             18%

Experimental participation vs. repeatable grade, Pell-eligible students (n=168, Spring 2014, PSY 101)

Group                                         Passing Grade   Repeatable Grade
No interventions (n=87, PSY, Pell-eligible)   77%             23%
Interventions (n=81, PSY, Pell-eligible)      91%             9%

24 additional Pell-eligible students would have passed the class if the intervention had been applied to all participating students.
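As a rough check on that figure, assuming the 14-point gap in repeatable-grade rates (23% without interventions vs. 9% with) applied across all 168 Pell-eligible participants: (0.23 − 0.09) × 168 ≈ 24 students.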


Fall 2014 / Spring 2015 Intervention Results:

No Significant Difference Between Experimental/Control Groups.


One Explanation: Low Reach


Fall 2014 (n = 1,220)

Course        # Triggers   Message Open Rate   Clickthrough Rate
Econ1         8            76%                 36%
Psych1        6            70%                 29%
Psych2        7            69%                 35%
Stat3         9            62%                 25%
Stat4         8            65%                 27%
Grand Total   38           68%                 30%

Spring 2015 (n = 1,138)

Course             # Triggers   Message Open Rate   Clickthrough Rate
Anthro-In Person   17           57%                 10%
Anthro-Online      7            71%                 35%
Comp Engineering   15           52%                 14%
Econ               15           44%                 13%
Psych1             17           60%                 13%
Psych2             17           63%                 13%
Stat3              21           64%                 9%
Stat4              20           55%                 5%
Grand Total        129          58%                 12%
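For context on how reach numbers like these could be tallied, here is a minimal sketch under assumed data: a per-message notification log with hypothetical course, opened, and clicked fields (not Blackboard’s actual reporting output).

```python
# Sketch: tally per-course open and clickthrough rates from a hypothetical
# notification log (one row per intervention message sent).
import pandas as pd

log = pd.DataFrame({
    "course":  ["Econ1", "Econ1", "Psych1", "Psych1", "Stat3"],
    "opened":  [True, False, True, True, False],
    "clicked": [True, False, False, True, False],
})

rates = log.groupby("course").agg(
    messages=("opened", "size"),
    open_rate=("opened", "mean"),
    clickthrough_rate=("clicked", "mean"),
)
print(rates)
```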

Proposed Next Steps

Add interventions that move “beyond informing” students to address underlying study skills and behaviors

Supplemental Instruction <http://www.umkc.edu/asm/si/>

Adaptive Release within online courses (content, activities)


Conclusions and Implications

1. Data from academic technology use predicts student achievement; diverse sources provide better predictions.

2. Technology use is a better predictor of course success than demographic data; adding demographic data provides nuanced understandings and identifies trends not otherwise visible.

3. Academic technology use is not a “cause” in itself, but reveals underlying study habits and behaviors (e.g. effort, time on task, massed vs. distributed activity); see the sketch below.

4. Predictions are necessary, but not sufficient, to change academic outcomes. Research into interventions is promising.

5. We’re at an early stage in Learning Analytics; expect quantum leaps in the near future.
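To illustrate how log data can surface behaviors like those in point 3, here is a minimal sketch that derives raw effort and a massed-vs-distributed activity measure from LMS timestamps; the event log and feature names are hypothetical, not the study’s actual pipeline.

```python
# Sketch: derive simple behavior features (raw effort and massed vs. distributed
# activity) from a hypothetical LMS event log.
import pandas as pd

events = pd.DataFrame({
    "student_id": [1, 1, 1, 1, 2, 2, 2, 2],
    "timestamp": pd.to_datetime([
        "2014-02-01 10:00", "2014-02-01 10:20", "2014-02-01 11:00", "2014-02-01 11:30",
        "2014-02-01 09:00", "2014-02-08 09:15", "2014-02-15 20:00", "2014-02-22 20:30",
    ]),
})

features = events.groupby("student_id").agg(
    total_hits=("timestamp", "size"),                           # raw effort
    active_days=("timestamp", lambda t: t.dt.date.nunique()),   # spread of activity
)
# Student 1 crams all hits into one day (massed); student 2 spreads the same
# number of hits across four weeks (distributed).
print(features)
```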

3. How we’re Applying this Research @ Blackboard


Blackboard’s “Platform Analytics” Project

A new effort to enhance our analytics offerings across our academic technology applications that includes:

• Improved instrumentation for learning activity within applications
• Applied findings from analysis (inc. inferential statistics and data mining)
• Integrated analytics into user experiences (inc. student and faculty)
• Aggregated usage data across cloud applications (anonymized, rolled-up, privacy-compliant)

Blackboard Analytics for Learn

4. Wrap-Up and Discussion

Factors affecting growth of learning analytics

[Chart: factors positioned along two axes, Enabler vs. Constraint and Widespread vs. Rare]

• New education models
• Resources ($$$, talent)
• Data governance (privacy, security, ownership)
• Clear goals and linked actions
• Data valued in academic decisions
• Tools/systems for data co-mingling and analysis
• Academic technology adoption
• Low data quality (fidelity with meaningful learning)
• Difficulty of data preparation
• “Not invented here” syndrome

Call to action [with amendments] (from a May 2012 keynote presentation @ San Diego State U)

You’re not behind the curve; this is a rapidly emerging area that we can (and should) lead... [together with interested partners]

Metrics reporting is the foundation for analytics [don’t under- or over-estimate its importance]

Start with what you have! Don’t wait for student characteristics and detailed database information; LMS data can provide significant insights

If there are any ed tech software folks in the audience, please help us with better reporting! [we’re working on it and feel your pain!]

Discussion / Questions

John Whitmer, Ed.D. | john.whitmer@blackboard.com | @johncwhitmer

http://bit.ly/jwhitmer-jisc
