
xAPI Introduction and AcrossX Solution

Jessie Chuang, Classroom Aid Inc.

Paradigm shift

Time to Rethink Learning!
● Mobile learning
● Blended learning
● Distributed learning
● Flipped classroom
● Inquiry-based learning
● Self-directed learning
● Project (problem)-based learning
● Social learning
● Cooperative Problem Solving (PISA 2015 will add a CPS assessment)
● 70-20-10 model of learning
● Performance support


Technology is evolving very fast and is fragmented

Mobile first!

Simulations, AR, location-based learning, wearables, IoT ...


xAPI: tracking all kinds of learning experiences


Recording Learning Events

Learning happens in interactions:

The learner interacts with:
● Instructors, Peers, Experts…
● Contents: Courses, Books, Web pages, Games, AR…
● Activities: making, exercises, researching, online, offline…
(Image credit: openclipart.org)


Learning data from courses, web pages, games, simulators, coaching, social learning, projects, mobile apps, and other activities is sent to the LRS.


Learning records can be delivered to LMSs, LRSs, or reporting tools.


Recording Learning Events

This is only the basic idea. Crafting the statements with more context and related information is necessary to support analytics and reporting.

Based on JSON, xAPI originates from Activity Streams (AS), a stream of activity-data statements borrowed from social analytics, and has been adapted for learning.
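As a rough illustration (not taken from the deck), a minimal xAPI statement is an actor-verb-object JSON document posted to an LRS's statements resource. The endpoint URL, credentials, and IDs below are placeholders.

# Minimal sketch of sending one xAPI statement to an LRS.
# Endpoint, credentials, and IDs are placeholders, not real values.
import requests

LRS_ENDPOINT = "https://example-lrs.example.com/xapi"   # placeholder LRS base URL
AUTH = ("lrs_key", "lrs_secret")                        # placeholder Basic-auth credentials

statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Example Learner"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"id": "http://example.com/activities/lesson-1",
               "definition": {"name": {"en-US": "Lesson 1"}}},
}

resp = requests.post(
    f"{LRS_ENDPOINT}/statements",
    json=statement,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
resp.raise_for_status()   # on success the LRS returns the stored statement id(s)
print(resp.json())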


The freedoms of the Tin Can API

● Statement freedom: the structure of “statements” using nouns, verbs and objects lets you record almost any activity.

● History freedom: the Tin Can API allows LRSs to talk to each other. LRSs can share data and transcripts with one another, and your experiences can follow you from one LRS (or organization) to another. Learners can even have their own “personal data lockers” with their personal learning information.

● Device freedom: any enabled device can send Tin Can API statements (mobile phones, simulations, games, a CPR dummy, the list goes on). A constant network connection isn’t necessary — occasional connectivity is fine.

● Workflow freedom: tracking learning events doesn't have to start or end in an LMS; it can start wherever the learner is, on whatever device they choose to use. Your content isn't tied to an LMS.

From "What's the Tin Can API?"

Elements of Data–Enriched Assessment

Continuous: There is no need to distinguish between learning activities and moments of assessment. Instead, a model of the learner's knowledge state is continually assessed and updated. This enables learning to be modeled as an ongoing process rather than as a set of discrete snapshots over time.

Feedback-oriented: Feedback can be provided directly to the learner, to an instructor, or to the system (e.g., an adaptive test or an intelligent tutor).

Personalized feedback: For instance, based on a design principle proposed by Shute (2008) in a review of the feedback literature, the system could offer direct hints to low-achieving learners and reflection prompts to higher-achieving learners.

The effective presentation of feedback in online learning environments poses an interesting design challenge.

Elements of Data–Enriched Assessment (cont'd)

Multifaceted: Learners' ability to learn from resources or interactions with others is influenced by factors beyond their current knowledge state. (The following facets have been researched.)

● Affective state: the learner's mood or emotions
● Interpersonal competencies (communicate, collaborate)
● Self-regulation (study strategies) (Zimmerman, 1990)
● Goal orientation (a learner's purpose)
● Mindset (a learner's beliefs)
● Learners' attributions of social cues in their environment (social belonging)

The multiple facets of a learner translate into key competencies for individuals to be productive and resilient in future educational and professional settings. Explicitly assessing these competencies as desired outcomes of learning can inform the design of learning environments to support their development and thereby better serve learners for the long term.

A Whole Picture


Now...

Don't get them (schools) started on data! (a pain point)

“Give me my $#%* data.”

“It’s our data. Why do we have to negotiate for it?”

“Don’t hold our data hostage!”

“Data silos is the biggest issue in EdTech!”

What school systems want:

● Direct access to their data from software vendors
● Help managing their data
● Better data warehousing and data-mart solutions that provide actionable, real-time data
● Common data standards that are shared among software vendors

Citation: research report from EdSurge: “School and Software, What’s Now and What’s Next”

Users are willing to trade product features for better system and data integration!

Goal: Data Loops AND Action Loops

AcrossX Solution (adaptive design, learner preference): xAPI data connects the LRS and its Activity/Agent Profile API with DEI, learning planning, the IRS/quiz service, reporting, a gamification dashboard, mobile learning, and micro learning.

Flipped learning system

This image was made with Edynco.

Before Class → In Class → After Class

An example of a learning plan enabled by xAPI

Major Learning Design Goals

● Social layer
● Gamification layer
● Self-paced learning and practicing
● Integrated data & experiences across systems

Gamification Across Systems

Teachers can experiment with the weighting on each behavior. There are four metrics (badges) for four different categories, to recognize multiple kinds of value (a rough scoring sketch follows the list):

● XP (efforts, engagement): time spent in all activities
● PowerA (personal competency in subject knowledge)
  ○ Test scores, practiced exercises, and passed levels
  ○ Thinking process in CPS and the forum (the teacher evaluates and inputs a score)
● PowerX (study skills, with goals)
  ○ Verb counts from ebook, video, self-practice, and interaction with the feedback widget (highlighted, noted, attempted, completed, answered, requested, interacted) => encouraging taking action!
● Force (soft skills)
  ○ Verb counts from the forum and CPS (asked, posted, responded, liked, edited)
  ○ Evaluated by the teacher
    ■ Teamwork (peer learning, from group data improvement)
    ■ Helping others on the forum (from xAPI records of answering others' questions, and peers' ratings)
    ■ Cooperation shown in problem solving (the teacher judges and inputs a score)
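A minimal sketch of how such teacher-adjustable weighting might be applied: verb counts aggregated from xAPI records are combined into metrics. The weights, counts, and formulas below are invented examples, not the actual AcrossX scoring rules.

# Hypothetical weighted gamification metrics built from xAPI verb counts.
# Verb names come from the slide; weights and values are made-up examples.

# Example verb counts for one learner, aggregated from xAPI statements.
verb_counts = {
    "highlighted": 12, "noted": 5, "attempted": 9, "completed": 4,
    "answered": 7, "requested": 2, "interacted": 15,                     # PowerX verbs
    "asked": 3, "posted": 6, "responded": 8, "liked": 10, "edited": 1,   # Force verbs
}

# Teacher-adjustable weights per behavior (hypothetical values).
powerx_weights = {"highlighted": 1, "noted": 2, "attempted": 2, "completed": 3,
                  "answered": 3, "requested": 1, "interacted": 1}
force_weights = {"asked": 2, "posted": 1, "responded": 2, "liked": 1, "edited": 1}

def weighted_score(counts: dict, weights: dict) -> int:
    """Sum of (verb count x weight) over the verbs a metric cares about."""
    return sum(counts.get(verb, 0) * weight for verb, weight in weights.items())

time_spent_minutes = 340   # from statement durations (example value)
teacher_cps_score = 8      # teacher-entered score (example value)

xp = time_spent_minutes                                                   # XP: effort / engagement
powerx = weighted_score(verb_counts, powerx_weights)                      # PowerX: study skills
force = weighted_score(verb_counts, force_weights) + teacher_cps_score    # Force: soft skills

print({"XP": xp, "PowerX": powerx, "Force": force})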

Learner Feedback Widget Embedded in Contents

● The Agent Profile API will record the learner's competency level:
  ○ Real-time display in the feedback widget
  ○ Learners will get appropriate practice items no matter what resource they are using
● Next step: assignments, repetitions, recommended practices, study-skill reminders, urgent messages from the teacher, special events... (must-have: content units aligned to a competency standard, etc.)
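For illustration only: storing and reading a learner's competency level through the xAPI Agent Profile resource could look roughly like this. The endpoint, credentials, profileId, and the shape of the profile document are assumptions, not the deck's actual implementation.

# Sketch of writing/reading a learner competency level via the Agent Profile resource.
# Endpoint, credentials, profileId, and document shape are placeholders.
import json
import requests

LRS_ENDPOINT = "https://example-lrs.example.com/xapi"   # placeholder
AUTH = ("lrs_key", "lrs_secret")                        # placeholder
HEADERS = {"X-Experience-API-Version": "1.0.3"}

agent = {"mbox": "mailto:learner@example.com"}
params = {"agent": json.dumps(agent), "profileId": "acrossx-competency-level"}  # hypothetical id

# Store the profile document holding the learner's level.
# "If-None-Match: *" asks the LRS to create it only if it does not already exist.
put_resp = requests.put(
    f"{LRS_ENDPOINT}/agents/profile",
    params=params,
    json={"subject": "algebra", "level": 3},   # hypothetical document shape
    auth=AUTH,
    headers={**HEADERS, "If-None-Match": "*"},
)
put_resp.raise_for_status()

# Read it back, e.g. for real-time display in the feedback widget.
get_resp = requests.get(f"{LRS_ENDPOINT}/agents/profile",
                        params=params, auth=AUTH, headers=HEADERS)
get_resp.raise_for_status()
print(get_resp.json())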

Widget display: PowerA level, XP, Learning Power, encouragement messages ("Good job!")

Different Metrics at Different Levels (conceptual), by Classroom Aid Inc.

Item level: MCQs, Prompts, Quests…

Test level: Summative assessment, Formative assessment, Group polling…

Raw level: Videos, Texts…

LO/Activity level: UDL options, Competency-aligned

Authoring Tools

Learning Design Tools

Learning Patterns / Pedagogies / Gamification

Metrics: Duration / Timestamp; Response; Completion / Attempts / Usages (e.g., skip)

Metrics: Duration / Timestamp / Attempts; Score / Success / Rating; Affective states; Communicate, Collaborate

Metrics: Patterns vs. Performance; Questions to be answered
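To make these metric names concrete: in xAPI, most item- and test-level metrics ride in a statement's "result" object. The values below are invented for illustration and are not from the deck.

# Illustrative statement showing where duration, response, score, success,
# and completion live inside a statement's "result" object.
statement_with_result = {
    "actor": {"mbox": "mailto:learner@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/answered",
             "display": {"en-US": "answered"}},
    "object": {"id": "http://example.com/activities/quiz-1/item-3"},
    "result": {
        "duration": "PT1M30S",    # ISO 8601 duration: time spent on the item
        "response": "choice-B",   # the learner's answer
        "score": {"scaled": 0.75, "raw": 3, "min": 0, "max": 4},
        "success": True,          # passed / correct
        "completion": True,       # finished the item
    },
    "timestamp": "2015-08-17T10:15:00Z",
}
# Attempts and usage patterns are typically derived by counting statements
# over time rather than read from a single field.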

Visualization, Communication, Action(able), Iteration

Analytics Backbone

+Customized

Interaction design => Data can be drilled down

activities included in the lesson plan

Time patterns

Each activity type can be drilled down on the timeline.
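A rough sketch of the kind of query behind such a drill-down: the xAPI Statements resource can be filtered by activity, verb, and time window. The endpoint, credentials, and IRIs below are placeholders.

# Sketch of drilling down into one activity over a time window via Statements filters.
import requests

LRS_ENDPOINT = "https://example-lrs.example.com/xapi"   # placeholder
AUTH = ("lrs_key", "lrs_secret")                        # placeholder

resp = requests.get(
    f"{LRS_ENDPOINT}/statements",
    params={
        "activity": "http://example.com/activities/video-lecture-2",  # activity to drill into
        "verb": "http://adlnet.gov/expapi/verbs/experienced",          # optional verb filter
        "since": "2015-08-01T00:00:00Z",   # time window for the timeline view
        "until": "2015-08-31T23:59:59Z",
    },
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
resp.raise_for_status()
for st in resp.json()["statements"]:
    print(st["timestamp"], st["verb"]["id"], st["actor"].get("name"))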

Video

Can be drilled down

Accumulated or individual view

Practice

Encoding data with colors

Can be drilled down

Different practices offered for different learners

Practice visualization, drilled down

Time spent

Use hint/feedback?

How many tries?

Device preference

Multivariate analysis

Which verb counts matter? (for the performance index selected)

Customizable index for performance & engagement

(more data can be encoded here)

Correlation Analysis
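As an assumed illustration of this correlation analysis, per-learner verb counts can be correlated with a chosen performance index to see which behaviors matter. The column names and numbers below are invented examples.

# Hypothetical correlation analysis: which verb counts track the selected
# performance index? Data are invented for illustration.
import pandas as pd

data = pd.DataFrame({
    "noted":       [5, 2, 9, 1, 7],
    "attempted":   [9, 4, 12, 3, 8],
    "responded":   [8, 1, 10, 2, 6],
    "performance": [0.82, 0.55, 0.91, 0.48, 0.77],   # customizable index per learner
})

# Pearson correlation of each verb count with the performance index.
correlations = (data.corr()["performance"]
                    .drop("performance")
                    .sort_values(ascending=False))
print(correlations)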

Questions to Answer and Actions

● Outside the classroom: which content is better for self-study (compare each content's impact)? Which type of content is better for whom (learning style)? You can do A/B testing and Design-Based Implementation Research (DBIR).

● Before or after class: who has done what, when, and for how long? Time spent in granular events? The sequence of events on the timeline? Online or offline?

● How well has each learner been doing in practice? How many tries, and how long for each try? With or without hints/feedback?

● Group/social interactions? Peer-learning impact? Teamwork records? Does group learning help high performers or low performers? Who should be grouped with whom?

● Everyone's contributions (actions and content) to a collaborative problem-solving task? (Teachers can evaluate and score after the due date.)

● Teachers can intervene whenever and wherever needed.

Integrated services across systems

● Services from all vendors
● Services from schools
● Services from the Department of Education
● Services from Taipei City
● CooC services

Users (teachers, students, parents..)

Decisions

life-long ePortfolio

Modeling / algorithms

Evaluation by teacher

The foundations of adaptive learning

● Competency framework
● Knowledge map
● Learning standards
● Performance framework
● Bloom's taxonomy
● Rubrics
● Content units

Mapping Interactions

Content/activity types

Data as fuel for analytics

Visualization

Image credit: LACEproject

Note: We now have whole new kinds of data from xAPI (compared with what learning analytics (LA) analysts had in the past).

● Help evaluate learners
● Help develop modeling

4 levels of Learning Analytics (LA)

The whole picture = Training and Learning Architecture (TLA)

● ePortfolio
● Learner modeling
● Machine readable
● Competency standards
● Knowledge map
● Standard alignment
● xAPI COP
● Common vocabulary
● Learning Design
● Sharing of metadata & paradata
● Re-usability
● Semantic analysis

Thank you!

Contact: [email protected]