Research into Practice: Building and implementing learning analytics at Tribal
TRANSCRIPT
Research into Practice: Building and implementing learning analytics at Tribal
Chris Ballard, Data Scientist
Building and implementing learning analytics
1. Start at the very beginning
2. How research and practice differ
3. Building a learning analytics platform
4. Implementing learning analytics
5. Summary
About Tribal

A leading provider of technology-enabled management solutions for the international education, learning and training markets

Higher Education: Research universities, Employment-focused universities, Government agencies
Vocational Learning: Further education colleges, Training providers and employers, Government agencies
Schools / K-12: Schools, School groups, State and district government agencies
Objectives
• Predict student academic performance to optimise success
• Predict students at risk of non-continuation
• Build on research into link between VLE activity and academic success
• Scale data processing
• Understand risk factors and compare to cohorts
3 years of matched student and activity data used to build predictive models
Staff can use student, engagement and academic data to understand how these factors affect student outcomes.
Information is accessible in one place on easy-to-understand dashboards.
Integrated with Tribal SITS:Vision and staff e:vision portal
Consultation with academic staff on presentation and design
Accuracy of module academic performance predictions*: 79%
*Using module academic history and demographic factors
R&D project overview
Current projects
Providing learning analytics for 160,000 students across a state-wide vocational and further education provider in Australia
Student Insight is being implemented as part of the JISC UK Effective Learning Analytics programme
From research to practice
Research
• Domain knowledge
• Interpretation
• In-depth understanding
• Testing an approach

Practice
• Integrated into everyday life
• Interpret easily
• Take action
• Implementing an approach
Domain and people
• What is the problem?
• Identify the users and stakeholders
• Data owners
• Are research results sufficient?
• Design
• Project cost
Technical
• Research limitations
• Architecture
• Data munging
• Automating manual processes
• Data suitability
• Robustness of technical platform
Building a learning analytics platform
Key design decisions
1. Transparency – knowing why a student is at risk
2. Flexibility – viewing learning analytics which relates to an institution's curriculum and organisation
3. Efficiency – ease of use, implementation and interpretation
Datasets and ensemble learning

[Diagram: each dataset (enrolment, demographics, module history, historic module results, assessments and formative assessments, and engagement data from VLE, library and attendance events) feeds its own model; the ensemble weights each model's decision and combines them into a single risk prediction for the student.]
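As an illustration of the combination step, the sketch below shows per-dataset risk probabilities combined with per-model weights into a single score. The dataset names, probabilities and weights are hypothetical assumptions, not Tribal's production models or weightings.

```python
# Illustrative sketch of the ensemble's combination step (not Tribal's code).
# Each dataset-specific model outputs a probability that the student is at
# risk; the ensemble combines these with per-model weights into one score.

def combine_risk(predictions, weights):
    """Weighted average of per-model risk probabilities (weights sum to 1)."""
    return sum(weights[name] * prob for name, prob in predictions.items())

# Hypothetical per-dataset model outputs for one student
predictions = {
    "enrolment": 0.20,
    "demographics": 0.35,
    "module_history": 0.60,
    "vle_engagement": 0.80,
    "library_engagement": 0.55,
    "attendance": 0.70,
}

# Hypothetical weights, e.g. chosen from each model's validation performance
weights = {
    "enrolment": 0.10,
    "demographics": 0.10,
    "module_history": 0.25,
    "vle_engagement": 0.25,
    "library_engagement": 0.10,
    "attendance": 0.20,
}

print(f"Combined risk prediction: {combine_risk(predictions, weights):.0%}")
# -> Combined risk prediction: 60%
```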
Reflecting differences between courses
Student activity data is not consistent across all courses/modules
1. Standardise data so it is comparative across all courses and modules
2. Build different models for each course or module
When building separate models, take care that each has sufficient data to generalise to new data.
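A minimal sketch of option 1 follows, assuming a pandas DataFrame with hypothetical course, student and vle_hours columns: activity is z-scored within each course so that values are comparable across courses with very different levels of VLE use.

```python
import pandas as pd

# Sketch of option 1: standardise activity within each course so values are
# comparable across courses and modules. Column names are hypothetical.
activity = pd.DataFrame({
    "course": ["A", "A", "A", "B", "B", "B"],
    "student": [1, 2, 3, 4, 5, 6],
    "vle_hours": [2.0, 5.0, 8.0, 20.0, 35.0, 50.0],
})

# z-score within course: how far a student's activity sits from the norm for
# their own course, rather than from an institution-wide average
by_course = activity.groupby("course")["vle_hours"]
activity["vle_hours_z"] = (
    activity["vle_hours"] - by_course.transform("mean")
) / by_course.transform("std")

print(activity)
```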
Provide opportunity for intervention

[Workflow diagram: learning analytics identifies a student at risk, allocates the intervention to a student support team and assigns an SLA; SLA-based alerts and progress monitoring help the team log intervention details and assess intervention effectiveness.]
Embed into business process
Consider how learning analytics becomes embedded into the day-to-day working life of academic and support staff.
Notifications – analytics becomes proactive; support different types of notification
Integration – accessible from existing tools and services through single sign-on
Implementing learning analytics
CRISP-DM – Cross-Industry Standard Process for Data Mining
https://the-modeling-agency.com/crisp-dm.pdf
Data understanding
Understanding which features are important
Example: end month of unit for successful and failed units
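As a sketch of this kind of check, assuming a pandas DataFrame with hypothetical end_month and outcome columns, a simple cross-tabulation shows whether units ending at particular points in the year fail more often:

```python
import pandas as pd

# Sketch of a data-understanding check: does a candidate feature differ
# between successful and failed units? Column names are hypothetical.
units = pd.DataFrame({
    "end_month": [6, 6, 6, 6, 11, 11, 11, 11],
    "outcome": ["pass", "pass", "pass", "fail", "fail", "fail", "pass", "fail"],
})

# Proportion of pass/fail outcomes for each unit end month
print(pd.crosstab(units["end_month"], units["outcome"], normalize="index"))
```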
Data preparation
Creating comparative features
Example: Total proportion of hours worked on failed units
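A sketch of building such a comparative feature, assuming hypothetical student, unit, hours and outcome columns: because the result is a proportion, it is comparable across students who log very different amounts of time.

```python
import pandas as pd

# Sketch of a comparative feature: the proportion of a student's total
# recorded hours that was spent on units they failed. Column names are
# hypothetical.
records = pd.DataFrame({
    "student": [1, 1, 1, 2, 2],
    "unit": ["U1", "U2", "U3", "U4", "U5"],
    "hours": [40, 10, 50, 30, 30],
    "outcome": ["pass", "fail", "pass", "fail", "fail"],
})

failed_hours = records[records["outcome"] == "fail"].groupby("student")["hours"].sum()
total_hours = records.groupby("student")["hours"].sum()

# Students with no failed units get NaN from the division, so fill with 0
prop_failed = (failed_hours / total_hours).fillna(0.0).rename("prop_hours_failed_units")
print(prop_failed)
```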
Modelling and evaluation
Understanding whether the model is under- or over-fitting
Example: Learning curve for Random Forest model
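A minimal sketch of that check with scikit-learn's learning_curve and a Random Forest; the feature matrix and labels below are random placeholders standing in for the prepared student data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import learning_curve

# Sketch of a learning-curve check for under/over-fitting. X and y stand in
# for the prepared student feature matrix and the at-risk label; here they
# are random placeholders rather than real institutional data.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = rng.integers(0, 2, size=500)

train_sizes, train_scores, val_scores = learning_curve(
    RandomForestClassifier(n_estimators=100, random_state=0),
    X, y, cv=5, train_sizes=np.linspace(0.1, 1.0, 5), scoring="accuracy",
)

# A large, persistent gap between training and validation scores suggests
# over-fitting; low scores that sit close together suggest under-fitting.
for n, tr, va in zip(train_sizes, train_scores.mean(axis=1), val_scores.mean(axis=1)):
    print(f"n={n:4d}  train={tr:.2f}  validation={va:.2f}")
```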
Evaluation
Define business-focused success criteria
Define model-focused success criteria
Define what baseline performance is acceptable
Consider a model cost-benefit analysis that takes into account intervention cost
                         Actual
                     Withdrawn    Enrolled
Predicted  Withdrawn  benefit      cost
           Enrolled   cost         benefit
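A sketch of such a cost-benefit calculation over the confusion matrix above; the benefit and cost values are illustrative assumptions about the value of a timely intervention versus the cost of an unnecessary one or a missed withdrawal.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Sketch of a cost-benefit evaluation over the confusion matrix above.
# The benefit and cost figures are illustrative assumptions only.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])   # 1 = withdrawn, 0 = enrolled
y_pred = np.array([1, 0, 0, 1, 1, 0, 1, 0])   # model's prediction

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

BENEFIT_TRUE_WITHDRAWN = 1000   # value of a timely, well-targeted intervention (assumed)
COST_FALSE_WITHDRAWN = 200      # cost of an unnecessary intervention (assumed)
COST_MISSED_WITHDRAWN = 1500    # cost of a withdrawal the model missed (assumed)

net_value = (tp * BENEFIT_TRUE_WITHDRAWN
             - fp * COST_FALSE_WITHDRAWN
             - fn * COST_MISSED_WITHDRAWN)
print(f"Net value of acting on the model's predictions: {net_value}")
```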
Summary
Design
Embed learning analytics into business process
Ensure that analytics can be interpreted easily by staff
Clearly articulate intervention processes
Measure intervention effectiveness
Implementation
Use a standard project approach such as CRISP-DM
Evaluate data in the context of the business problem and process
Define what success means, including acceptable accuracy and how it needs to be measured