
Learnometrics: Metrics for Learning Objects
Xavier Ochoa

Uploaded by xavier-ochoa on 17-May-2015


DESCRIPTION

Ph.D. defense presentation at K.U.Leuven: how to measure the characteristics of the different processes involved in the learning object lifecycle.

TRANSCRIPT

Page 1: Learnometrics: Metrics for Learning Objects

Learnometrics: Metrics for Learning Objects
Xavier Ochoa

Page 2:

Learning Object

Any digital resource that can be reused to support learning

(Wiley, 2004)

Page 3:

Share and Reuse

Page 4:

Sharing

Page 5:

Sharing

Page 6:

Repository

Page 7:

Metadata

Book Metadata

Page 8:

Learning Object Metadata

General
Title: Landing on the Moon

Technical
File format: QuickTime movie
Duration: 2 minutes

Educational
Interactivity level: low
End user: learner

Relational
Relation: is-part-of
Resource: History course

Learning Object

LOM
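The slide's example record can be sketched as a nested structure. The following Python snippet is purely illustrative: the field names follow the slide, not the full IEEE LOM schema.

```python
# Illustrative sketch of the slide's LOM example as a nested record.
# Category names and fields mirror the slide, not the complete LOM standard.
lom_record = {
    "general": {"title": "Landing on the Moon"},
    "technical": {"format": "QuickTime movie", "duration": "2 minutes"},
    "educational": {"interactivity_level": "low", "end_user": "learner"},
    "relational": {"relation": "is-part-of", "resource": "History course"},
}

# Each category groups related descriptors about one learning object
for category, fields in lom_record.items():
    print(category, "->", fields)
```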

Page 9:

Learning Object Repository

Object repository and/or metadata repository

Page 10:

Learning Object Economy

Market Makers

Producers Consumers

Policy Makers

Market

Page 11:

How does it work? How can it be improved?

Page 12:

Purpose

Generate empirical knowledge about the LOE

Test existing techniques to improve LO tools

Page 13:
Page 14:

Quantitative Analysis

Page 15:

Metrics Proposal and Evaluation

Page 16:

Quantitative Analysis of the Publication of LO

• What is the size of repositories?

• How do repositories grow?

• How many objects per contributor?

• Can it be modeled?


Page 17:

Size is very unequal


Page 18:

Size Comparison

Types compared: repository, referatory, OCW, LMS, IR

Page 19:

Growth is Linear

Fitted models: bi-phase linear and ln(a·exp(b·x) + c)
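The ln(a·exp(b·x) + c) model is consistent with the "growth is linear" headline: for large x the constant c becomes negligible and the model approaches the straight line ln(a) + b·x. A minimal sketch, with made-up parameter values (the thesis fits this family to real repository data):

```python
import math

# Illustrative parameters only; real values come from fitting repository data
a, b, c = 2.0, 0.5, 10.0

def growth(x):
    """Repository-size model ln(a*exp(b*x) + c)."""
    return math.log(a * math.exp(b * x) + c)

# For large x, a*exp(b*x) dominates c, so consecutive differences
# approach the constant slope b -- i.e., asymptotically linear growth.
slope = growth(101) - growth(100)
print(round(slope, 6))  # close to b = 0.5
```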

Page 20:

Objects per Contributor
• Heavy-tailed distributions (no bell curve)

LORP / LORF: Lotka with cut-off

“fat tail”

Page 21:

Objects per Contributor
• Heavy-tailed distributions (no bell curve)

OCW / LMS: Weibull

“fat belly”

Page 22:

Objects per Contributor
• Heavy-tailed distributions (no bell curve)

IR: Lotka with high alpha

“light tail”
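The "fat tail" vs "light tail" contrast can be made concrete with a Lotka (power-law) distribution: the lower the exponent alpha, the more probability mass survives far out in the tail. A small sketch with illustrative alpha values (the thesis estimates the actual exponents from repository data):

```python
def lotka_pmf(alpha, n_max=10_000):
    """Truncated Lotka (power-law) distribution: P(n) proportional to 1/n^alpha."""
    weights = [1.0 / n**alpha for n in range(1, n_max + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def tail_mass(pmf, threshold):
    """Probability that a contributor publishes at least `threshold` objects."""
    return sum(pmf[threshold - 1:])

low_alpha = lotka_pmf(alpha=2.0)   # "fat tail": mass persists far out
high_alpha = lotka_pmf(alpha=3.5)  # "light tail": mass decays much faster

print(tail_mass(low_alpha, 100))
print(tail_mass(high_alpha, 100))
```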

Page 23:

Engagement

Page 24:

Model

Page 25:

Analysis Conclusions
– A few big repositories concentrate most of the material
– Repositories are not growing as they should
– There is no such thing as an average contributor
– Differences between repositories are based on the engagement of the contributors
– Results point to a possible lack of a “value proposition”

Page 26:

Quantitative Analysis of the Reuse of Learning Objects

• What percentage of learning objects is reused?

• Does granularity affect reuse?

• How many times is a learning object reused?


Page 27:

Reuse Paradox

Page 28:

Measuring Reuse

Page 29:

Measuring Reuse

Page 30:

Measuring Reuse

~20%
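One way to operationalize "measuring reuse" is to count how many objects appear in more than one course. The sketch below uses a hypothetical usage log of (object, course) pairs; it is an illustration of the idea, not the thesis's actual measurement pipeline.

```python
from collections import Counter

# Hypothetical usage log: (learning_object_id, course_id) pairs
usages = [
    ("lo1", "c1"), ("lo1", "c2"),                  # lo1 reused across two courses
    ("lo2", "c1"),
    ("lo3", "c3"), ("lo3", "c4"), ("lo3", "c5"),   # lo3 reused across three
    ("lo4", "c2"),
    ("lo5", "c5"),
]

# Count distinct courses per object; an object used in >1 course counts as reused
courses_per_object = Counter(obj for obj, _ in set(usages))
reused = [o for o, n in courses_per_object.items() if n > 1]
reuse_rate = len(reused) / len(courses_per_object)
print(f"{reuse_rate:.0%}")  # 2 of 5 objects reused -> 40%
```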

Page 31:

Distribution of Reuse

Page 32:

Analysis Conclusions
– Learning objects are being reused with or without the help of learning object technologies
– The reuse paradox needs to be re-evaluated
– Reuse seems to be the result of a chain of successful events

Page 33:

Quality of Metadata

Page 34:

Quality of Metadata

Title: “The Time Machine”
Author: “Wells, H. G.”
Publisher: “L&M Publishers, UK”
Year: “1965”
Location: ----

Page 35:

Metrics for Metadata Quality
– How can the quality of metadata be measured? (metrics)
– Do the metrics work?
• Do the metrics correlate with human evaluation?
• Do the metrics separate good-quality from bad-quality metadata?
• Can the metrics be used to filter low-quality records?
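One of the simplest quality metrics of this kind is completeness: the fraction of expected fields that are actually filled in. A minimal sketch, applied to the book record from the previous slide (the field list and the empty-value markers are assumptions for illustration):

```python
def completeness(record, fields):
    """Completeness metric: fraction of expected fields that are filled in."""
    filled = sum(1 for f in fields if record.get(f) not in (None, "", "----"))
    return filled / len(fields)

# Field list assumed from the slide's book-metadata example
FIELDS = ["title", "author", "publisher", "year", "location"]
record = {
    "title": "The Time Machine",
    "author": "Wells, H. G.",
    "publisher": "L&M Publishers, UK",
    "year": "1965",
    "location": "----",  # missing in the slide's example
}
print(completeness(record, FIELDS))  # 4 of 5 fields filled -> 0.8
```

A filter would then flag records whose score falls below some threshold.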

Page 36:

Textual information correlates with human evaluation

Page 37:

Some metrics could filter low-quality records

Page 38:

Study Conclusions
– Humans and machines have different needs for metadata

–Metrics can be used to easily establish some characteristics of the metadata

–The metrics can be used to automatically filter or flag low quality metadata

Page 39:

Abundance of Choice


Page 40:

Relevance Ranking Metrics
– What does relevance mean in the context of learning objects?

– How can existing ranking techniques be used to produce metrics to rank learning objects?

– How can those metrics be combined into a single ranking value?

– Can the proposed metrics outperform simple text-based ranking?
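The simplest way to combine several relevance metrics into one ranking value is a weighted sum of normalized scores (the slides later report a learned combination, RankNet, which replaces the hand-picked weights). A sketch with hypothetical metric scores and weights:

```python
def normalize(values):
    """Min-max normalize a list of metric scores to [0, 1]."""
    lo, hi = min(values), max(values)
    return [0.0 if hi == lo else (v - lo) / (hi - lo) for v in values]

def combine(metric_columns, weights):
    """Weighted sum of normalized metric scores -> one ranking value per object."""
    normalized = [normalize(col) for col in metric_columns]
    return [sum(w * col[i] for w, col in zip(weights, normalized))
            for i in range(len(metric_columns[0]))]

# Hypothetical scores for three learning objects under two metrics
text_sim = [0.9, 0.4, 0.1]   # base text-based ranking score
popularity = [10, 50, 30]    # e.g. a usage-based metric
final = combine([text_sim, popularity], weights=[0.6, 0.4])
ranking = sorted(range(3), key=lambda i: -final[i])
print(ranking)  # object indices, best first -> [1, 0, 2]
```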

Page 41:

Metrics improve over Base Rank

Page 42:

RankNet outperforms the base ranking by 50%

Page 43:

Relevance Ranking Metrics
• Implications
– Even basic techniques can improve the ranking of learning objects
– The metrics are scalable and easy to implement

• Warning
– Preliminary results: not based on real-world observation

Page 44:

Applications - MQM

Page 45:


Applications - RRM

Page 46:

Applications - RRM


Page 47:

General Conclusions
• Publication and reuse are dominated by heavy-tailed distributions

• LMSs have the potential to bootstrap the LOE

• The models and metrics set a baseline against which new models and metrics can be compared and improvement measured

• More questions are raised than answered

Page 48:

Publications
• Chapter 2

– Quantitative Analysis of User-Generated Content on the Web. Proceedings of the First International Workshop on Understanding Web Evolution (WebEvolve2008) at WWW2008. 2008, 19-26

– Quantitative Analysis of Learning Object Repositories. Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications ED-Media 2008, 2008, 6031-6040

• Chapter 3
– Measuring the Reuse of Learning Objects. Third European Conference on Technology Enhanced Learning (ECTEL 2008), 2008, accepted.

Page 49:

Publications
• Chapter 4

– Towards Automatic Evaluation of Learning Object Metadata Quality. LNCS: Advances in Conceptual Modeling - Theory and Practice, Springer, 2006, 4231, 372-381

– SAmgI: Automatic Metadata Generation v2.0. Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications ED-Media 2007, AACE, 2007, 1195-1204

– Quality Metrics for Learning Object Metadata. World Conference on Educational Multimedia, Hypermedia and Telecommunications 2006, AACE, 2006, 1004-1011

Page 50:

Publications
• Chapter 5

– Relevance Ranking Metrics for Learning Objects. IEEE Transactions on Learning Technologies. 2008. 1(1), 14

– Relevance Ranking Metrics for Learning Objects. LNCS: Creating New Learning Experiences on a Global Scale, Springer, 2007, 4753, 262-276

– Use of contextualized attention metadata for ranking and recommending learning objects. CAMA '06: Proceedings of the 1st international workshop on Contextualized attention metadata at CIKM 2006, ACM Press, 2006, 9-16

Page 51:

My Research Metrics (PoP)
• Papers: 14
• Citations: 55
• Years: 6
• Cites/year: 9.17
• Cites/paper: 4.23
• Cites/author: 21.02
• Papers/author: 6.07
• Authors/paper: 2.77
• h-index: 5
• g-index: 7
• hc-index: 5
• hI-index: 1.56
• hI-norm: 3
• AWCR: 13.67
• AW-index: 3.70
• AWCRpA: 5.62
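As an aside, the h-index reported above is itself a simple metric: the largest h such that at least h papers have h or more citations each. A small sketch; the citation counts below are invented to illustrate the computation, not the author's actual per-paper counts.

```python
def h_index(citations):
    """h-index: largest h such that at least h papers have >= h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:   # the paper at this rank still has enough citations
            h = rank
        else:
            break
    return h

# Invented citation counts for 14 papers, chosen to yield the slide's h-index
print(h_index([12, 9, 8, 7, 5, 3, 3, 2, 2, 1, 1, 1, 1, 0]))  # -> 5
```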

Page 52:

Thank you for your attention

Questions?
