
Who are you and what makes you special?

Simon Buckingham Shum, Professor of Learning Informatics and Director, UTS Connected Intelligence Centre. @sbuckshum • Simon.BuckinghamShum.net

utscic.edu.au

Keynote, Library Data Carpentry 2016, Sydney http://librarydatacarpentry.github.io

Learning Technology • KMi, Open U. • AI & Argumentation • Learning Dispositions • Human-Centred Informatics • Learning Analytics • Semantic Scholarly Publishing • Dialogue / Issue / Argument Visualisation

Introducing my quantified background (at least, as of Nov. 2013, courtesy of LinkedIn Labs)

OUR CONTEXT

Large scale data and analytics are pervading societal life

Data and algorithms have deep societal implications, good and bad, demanding informed debate

Implications for the future workforce…

How universities teach, research, operate — and are assessed…

How to equip graduates for “the age of complexity” (Stephen Hawking)


Envisioning “the Data Intensive University”: a UTS-wide Forum to consider the profound implications of the data revolution, followed by UTS-wide consultation, strategy development, and the launch of CIC.

UTS STRATEGIC CONVERSATION AROUND ANALYTICS

UTS CONNECTED INTELLIGENCE CENTRE


CIC catalyses the use of data and analytics among UTS students, educators, researchers and leaders

We teach human-centred data science • design analytics tools for UTS • evaluate these • disseminate internally and globally

We aim to shape critical debate on big data in education, and societal learning

PARTNERING ACROSS UTS

[Diagram: CIC at the centre, partnering with Faculties & Institutes, Student Support Units, and Business Units]

“LibrAIrian”: a University Library staff member who advises students, educators and researchers on the uses and abuses of AI, Data Science and Human-Centered Computing for learning, knowledge and innovation.

What’s the difference between a LibrAIrian and a data technician?

Panel debate, LAK 2013. With thanks to John Behrens (Pearson).

http://simon.buckinghamshum.net/2013/03/lak13-edu-data-scientists-scarce-breed

Things that make me crazy

“Looks cool. What does it mean?” “I don’t know.”

“Looks great at a high level; how have you explored it and tested the assumptions?” “What assumptions?”

“What are the relevant background issues in this area?” “Ask the SME, I’m just the data person.”

“Does it really make sense to get that result?” “I don’t know, it just came out that way.”

Be a philosopher

critical perspective: algorithmic + scholarly + creative intelligence

http://www.uts.edu.au/future-students/analytics-and-data-science

A DISTINCTIVE APPROACH TO DATA SCIENCE

utscic.edu.au

machine learning • statistics • data curation • ethics • user experience • information science • visualization • narrative • social computing • learning analytics • project management • project-based learning • authentic assessment • regular, meaningful employer engagement

What’s the difference between a LibrAIrian and a learning analytics technician?

critical perspective: learning analytics means many things to many people; learning analytics are not neutral

It’s out of the labs and into products: every learning tool now has an “analytics dashboard” (a Google image search)


https://guides.instructure.com/m/4152/l/66789-what-are-course-analytics

Summary statistics in the LMS (Canvas)


https://grockit.com/research

Written skills mastery for the SAT (Grockit)


Intelligent tutoring for skills mastery (CMU)

Lovett M, Meyer O and Thille C. (2008) The Open Learning Initiative: Measuring the effectiveness of the OLI statistics course in accelerating student learning. Journal of Interactive Media in Education 14. http://jime.open.ac.uk/article/2008-14/352

“In this study, results showed that OLI-Statistics students [blended learning] learned a full semester’s worth of material in half as much time and performed as well or better than students learning from traditional instruction over a full semester.”


Purdue University Signals: real time traffic-lights for students based on predictive model

Campbell et al. (2007). Academic Analytics: A New Tool for a New Era. EDUCAUSE Review, vol. 42, no. 4 (July/August 2007): 40-57. http://bit.ly/lmxG2x

Validate a statistical model from:
• ACT or SAT score
• Overall grade-point average
• CMS usage composite
• CMS assessment composite
• CMS assignment composite
• CMS calendar composite

Predicted 66%-80% of struggling students who needed help
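To make the mechanics concrete, here is a minimal sketch of a Signals-style model, assuming a logistic regression over the kinds of composites listed above. The column names, example records and traffic-light thresholds are invented for illustration; they are not Purdue's actual features or cut-offs.

```python
# Illustrative sketch only: a Signals-style risk model built from the kinds
# of features listed above. Columns, data and thresholds are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical student records: admissions score, GPA and LMS composites
students = pd.DataFrame({
    "sat_score":      [1050, 1300, 980, 1210, 1120],
    "gpa":            [2.1, 3.6, 1.9, 3.2, 2.8],
    "lms_usage":      [0.30, 0.85, 0.20, 0.70, 0.55],  # normalised composites
    "lms_assessment": [0.40, 0.90, 0.25, 0.80, 0.60],
    "lms_assignment": [0.35, 0.95, 0.15, 0.75, 0.50],
    "struggled":      [1, 0, 1, 0, 0],                 # historical outcome label
})

X = students.drop(columns="struggled")
y = students["struggled"]

model = LogisticRegression().fit(X, y)  # fit on past cohorts

def traffic_light(p):
    """Map a predicted probability of struggling to a Signals-style colour."""
    return "red" if p >= 0.7 else "amber" if p >= 0.4 else "green"

risk = model.predict_proba(X)[:, 1]
students["signal"] = [traffic_light(p) for p in risk]
print(students[["gpa", "signal"]])
```

In practice such a model would be validated against historical cohorts before its red/amber/green signals are ever shown to students.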

Spatial clustering algorithm to provoke reflection

Eric Coopey, R. Benjamin Shapiro, and Ethan Danahy (2014). Collaborative spatial classification. In Proceedings of the 4th International Conference on Learning Analytics & Knowledge (LAK '14). ACM, New York, NY, USA, 138-142. http://dx.doi.org/10.1145/2567574.2567611

Co-located collaboration spaces: analyse the students’ activity traces for significant patterns and provide timely feedback for personal and team reflection.
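As a rough illustration of what a spatial clustering step can look like (not the authors' actual algorithm), the sketch below groups 2-D contribution positions from a shared workspace so a team can see whether its activity is spread out or concentrated. The coordinates and the number of clusters are assumptions made for the example.

```python
# Illustrative sketch: cluster 2-D contribution positions from a shared
# workspace so a team can reflect on how its activity is distributed.
# The coordinates and the choice of k are invented for this example.
import numpy as np
from sklearn.cluster import KMeans

# (x, y) positions of contributions on a shared canvas (hypothetical data)
positions = np.array([
    [0.10, 0.20], [0.15, 0.25], [0.12, 0.18],  # activity bunched bottom-left
    [0.80, 0.85], [0.78, 0.90],                # a second pocket top-right
    [0.50, 0.10],
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(positions)

for label in np.unique(kmeans.labels_):
    count = int((kmeans.labels_ == label).sum())
    centre = kmeans.cluster_centers_[label]
    print(f"cluster {label}: {count} contributions around "
          f"({centre[0]:.2f}, {centre[1]:.2f})")
```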

Co-located collaboration spaces can now be instrumented with sensors: voice • gesture • pen • touch

Co-location activity dashboards: multimodal data fusion and analysis to deliver visual analytics for reflection.

For example, this dashboard shows team member participation across different modalities.

Applications for researchers working on high-performance teams and group dynamics?

R. Martinez, K. Yacef, J. Kay, and B. Schwendimann. An interactive teacher’s dashboard for monitoring multiple groups in a multi-tabletop learning environment. Proceedings of Intelligent Tutoring Systems, pages 482-492. Springer, 2012.


Visual analytics of face-to-face teamwork

R. Martinez, K. Yacef, J. Kay, and B. Schwendimann. An interactive teacher’s dashboard for monitoring multiple groups in a multi-tabletop learning environment. Proceedings of Intelligent Tutoring Systems, pages 482-492. Springer, 2012.
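A minimal sketch of the data-fusion step behind such participation dashboards, under the assumption that each sensor emits simple time-stamped events: streams from voice, gesture, pen and touch are merged and tallied per team member and modality. The event log and field names are hypothetical.

```python
# Illustrative sketch: fuse per-modality event logs into a participation
# summary per team member. Event data and field names are hypothetical.
from collections import Counter

# Each tuple: (timestamp_seconds, student, modality)
events = [
    (12.0, "A", "voice"), (13.5, "B", "touch"), (14.2, "A", "pen"),
    (15.0, "C", "voice"), (15.8, "A", "voice"), (16.3, "B", "gesture"),
    (17.1, "C", "pen"),   (18.0, "B", "touch"), (19.4, "A", "touch"),
]

# Tally events per (student, modality): the numbers behind a dashboard view
tally = Counter((student, modality) for _, student, modality in events)

students = sorted({s for _, s, _ in events})
modalities = ["voice", "gesture", "pen", "touch"]

print("student  " + "  ".join(f"{m:>7}" for m in modalities))
for s in students:
    row = "  ".join(f"{tally.get((s, m), 0):>7}" for m in modalities)
    print(f"{s:<7}  {row}")
```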

A field exercise: posture analysis of fieldwork students

Masaya Okada and Masahiro Tada (2014). Formative assessment method of real-world learning by integrating heterogeneous elements of behavior, knowledge, and the environment. Proceedings of the 4th International Conference on Learning Analytics and Knowledge (LAK '14). ACM, New York, NY, USA, 1-10. http://dx.doi.org/10.1145/2567574.2567579

1st International Workshop on Discourse-Centric Learning Analytics

analytics that look beneath the surface, and quantify linguistic proxies for ‘deeper learning’

Beyond number / size / frequency of posts; ‘hottest thread’

Image: http://www.glennsasscer.com/wordpress/wp-content/uploads/2011/10/iceberg.jpg

http://solaresearch.org/events/lak/lak13/dcla13
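To see why surface counts fall short, a toy contrast might look like the sketch below: the first measure simply counts posts per student, while the second counts reasoning-marker words as a crude linguistic proxy for deeper engagement. The forum posts and the marker-word list are invented for illustration, not a validated instrument.

```python
# Illustrative sketch: surface metrics (post counts) versus a crude
# linguistic proxy for deeper engagement. Posts and the marker list
# are invented for this example.
import re
from collections import defaultdict

posts = [
    ("A", "I agree."),
    ("A", "Nice work everyone!"),
    ("A", "+1"),
    ("B", "I disagree, because the sample is too small; therefore the claim "
          "does not generalise, although the method itself seems sound."),
]

REASONING_MARKERS = {"because", "therefore", "however", "although", "whereas"}

post_counts = defaultdict(int)
marker_counts = defaultdict(int)

for student, text in posts:
    post_counts[student] += 1
    words = re.findall(r"[a-z]+", text.lower())
    marker_counts[student] += sum(w in REASONING_MARKERS for w in words)

for student in sorted(post_counts):
    print(f"{student}: {post_counts[student]} posts, "
          f"{marker_counts[student]} reasoning markers")
```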

AWA: Academic Writing Analytics (analytical writing)

Highlighted sentences are colour-coded according to their broad type. Sentences have Function Keys signalling where an academic rhetorical move has been recognised (e.g. a claim of Novelty).

https://utscic.edu.au/tools/awa
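The following toy sketch suggests the flavour of sentence-level rhetorical-move tagging; AWA itself uses a much richer natural-language pipeline, so the cue phrases, move labels and matching rules here are purely illustrative assumptions.

```python
# Toy sketch of sentence-level rhetorical-move highlighting. The cue
# phrases and move labels are illustrative only, not AWA's actual rules.
import re

# Hypothetical cue phrases mapped to the rhetorical move they might signal
CUES = {
    "Novelty":  [r"\bfor the first time\b", r"\bnovel\b", r"\bnew approach\b"],
    "Contrast": [r"\bhowever\b", r"\bin contrast\b", r"\bon the other hand\b"],
    "Emphasis": [r"\bimportantly\b", r"\bcrucially\b", r"\bit is essential\b"],
}

def tag_sentences(text):
    """Yield (sentence, [moves]) pairs for a piece of academic writing."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    for sentence in sentences:
        moves = [move for move, patterns in CUES.items()
                 if any(re.search(p, sentence, re.IGNORECASE) for p in patterns)]
        yield sentence, moves

sample = ("Prior studies measured engagement by counting posts. "
          "However, this ignores the quality of the contribution. "
          "We present a new approach that, importantly, analyses rhetorical moves.")

for sentence, moves in tag_sentences(sample):
    label = ", ".join(moves) if moves else "-"
    print(f"[{label}] {sentence}")
```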

Reflective writing (Nursing)

Applications for researchers working with text corpora, e.g. interview transcripts, literature analysis, scenario planning?

Buckingham Shum, S., Ágnes Sándor, Rosalie Goldsmith, Xiaolong Wang, Randall Bass and Mindy McWilliams (2016, In Press). Reflecting on Reflective Writing Analytics: Assessment Challenges and Iterative Evaluation of a Prototype Tool. 6th International Learning Analytics & Knowledge Conference (LAK16). Edinburgh, UK. ACM Press. http://dx.doi.org/10.1145/2883851.2883955 Preprint: http://bit.ly/LAK16paper

Educational worldview

epistemology • pedagogy • assessment

Knight, S., Buckingham Shum, S. and Littleton, K. (2014). Epistemology, Assessment, Pedagogy: Where Learning Meets Analytics in the Middle Space. Journal of Learning Analytics, 1, (2), pp.23-47. http://epress.lib.uts.edu.au/journals/index.php/JLA/article/download/3538/4156 Knight, S. and Buckingham Shum, S. (In Press). Theory & Learning Analytics. Handbook of Learning Analytics & Educational Data Mining.

the middle space of learning analytics

What epistemological assumptions are shaping the assessment regime, and hence the pedagogy? What questions are analytics used to help answer?

To go deeper into analytics for “21st century competencies”

http://simon.buckinghamshum.net/2015/05/cfp-learning-analytics-for-c21-competencies

Contributions are invited to this special issue:
• Analytics for higher order competencies such as critical thinking, curiosity, resilience, creativity, collaboration, sensemaking, self-regulation, reflection/meta-cognition, transdisciplinary thinking, or skilful improvisation
• Theoretical arguments around the opportunities, or indeed the limits, for analytics in illuminating particular competencies
• Principles and methodologies for combining complementary analytical approaches, including reflections on conventional educational assessment instruments, and computational approaches
• Methodologies for validating analytics
• Analytics for learning dispositions/mindsets/“non-cognitive” factors known to shape readiness to engage in learning
• Analytics for different kinds of authentic assessment and inquiry-based learning
• Technological challenges and opportunities for lifelong, life-wide learning analytics extending beyond formal educational contexts
• Arguments regarding whether analytics could effect a shift in the assessment regimes, and associated pedagogies and epistemologies, promoted by conventional education policy
• Analysis of the systemic organisational adoption issues for such analytics
• Visualisation design for different user groups, in particular to promote increasing learner self-awareness and capacity to take responsibility for one’s learning

Next Special Issue (due July 2016)

What’s the difference between a LibrAIrian and a knowledge infrastructure scholar?

critical perspective: knowledge infrastructures embody values and assign power

Framing future knowledge infrastructures

http://knowledgeinfrastructures.org


This too, however, is not a neutral feature. As knowledge infrastructures shape, generate and distribute knowledge, they do so differentially, often in ways that encode and reinforce existing interests and relations of power. […] At scale, the effect of these choices may be an aggregate imbalance in the structure and distribution of our knowledge.


“Transformative infrastructures cannot be merely technical; they must engage fundamental changes in our social institutions, practices, norms and beliefs as well. For that reason, many scholars have dropped the dualistic vocabulary of “technical” and “social” altogether as anything other than a first order approximation, replacing those terms with concepts such as collectives (Latour 2005), assemblages (Ong & Collier 2005), or configurations (Suchman 2007…”

Accounting tools are not neutral

Du Gay, P. and Pryke, M. (2002) Cultural Economy: Cultural Analysis and Commercial Life. Sage, London. pp. 12-13

“accounting tools...do not simply aid the measurement of economic activity, they shape the reality they measure”


Bowker, G. C. and Star, L. S. (1999). Sorting Things Out: Classification and Its Consequences. MIT Press, Cambridge, MA, pp. 277, 278, 281

“Classification systems provide both a warrant and a tool for forgetting [...] what to forget and how to forget it [...] The argument comes down to asking not only what gets coded in but what gets coded out of a given scheme.”


Selwyn, N. (2014).  Data entry: towards the critical study of digital data and education. Learning, Media and Technology. http://dx.doi.org/10.1080/17439884.2014.921628

“…observing, measuring, describing, categorising, classifying, sorting, ordering and ranking […] these processes of meaning-making are never wholly neutral, objective and ‘automated’ but are fraught with problems and compromises, biases and omissions.”


To learn more…

https://youtu.be/RVgXvmeSnUk

http://governingalgorithms.org

In an increasingly algorithmic world […] What, then, do we talk about when we talk about “governing algorithms”?


To learn more…


http://simon.buckinghamshum.net/2016/03/algorithmic-accountability-for-learning-analytics

“LibrAIrian”: a University Library staff member who advises students, educators and researchers on the uses and abuses of AI, Data Science and Human-Centered Computing for learning, knowledge and innovation.


Thank You! Discussion…

@sbuckshum