Investigating Performance: Design & Outcomes with xAPI | LSCon 2017


Upload: ht2-labs

Posted on 09-Apr-2017


TRANSCRIPT

Designing for Data


Med Story

We can see the completions and failures. What is the next question we should ask?

Why? Why did people fail?

Why?

We are going to talk about data in the initial design process, and the use of data to assess the effectiveness of the learning intervention and to inform future design improvements.

Know Your Data
What makes the data come into being?
How hard is it to get the data I want/need?
What's my Data Supply Chain?

Data Supply Chain
Activity Providers
Systems architecture

Data management, ownership, and accessibility
Potential Data Integration concerns
What's good enough?

Content Strategy
Align
Audit
Strategy


Content Strategy
Content: What activities can provide measurable data?
Analysis: What data can we (and should we) act on?

Technical: What data can we get from our systems?


xAPI Format
Janet presented Content Strategy

The Statement API looks a bit like this:


With: LSCon 2017
Place: Orlando
Result: 80% of room still awake
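The statement on the slide can be sketched as actual xAPI JSON. This is a minimal illustration, not the presenter's exact statement: the actor's mbox, the verb IRI, and the extension IRIs are hypothetical placeholders, while the actor/verb/object structure itself is what the xAPI spec requires.

```python
import json

# Sketch of "Janet presented Content Strategy" as an xAPI statement.
# All example.com IRIs and the mbox are made-up placeholders.
statement = {
    "actor": {
        "name": "Janet",
        "mbox": "mailto:janet@example.com",  # hypothetical identifier
    },
    "verb": {
        "id": "http://example.com/verbs/presented",  # hypothetical verb IRI
        "display": {"en-US": "presented"},
    },
    "object": {
        "id": "http://example.com/activities/content-strategy",  # hypothetical
        "definition": {"name": {"en-US": "Content Strategy"}},
    },
    "context": {
        "extensions": {
            "http://example.com/ext/event": "LSCon 2017",  # "With"
            "http://example.com/ext/place": "Orlando",     # "Place"
        }
    },
    "result": {
        "extensions": {
            "http://example.com/ext/room-awake": "80%",    # "Result"
        }
    },
}

print(json.dumps(statement, indent=2))
```

The actor-verb-object triple is the required core of every statement; context and result are optional places to hang the "With/Place/Result" details.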


Event Mapping


We are looking for confirmation that the learner got what they needed to know, and then we can ask the next question.

We need to figure out what change means to us. If we were just looking to see how many people accessed something, numbers are enough. But when it comes to really seeing what behavior change our intervention has made, we need to figure out how we measure that change.

xAPI: Collecting more than just completions and test scores

We are looking at collecting more than just completions and test scores. But if we are collecting completions and test scores, what do they actually mean? If I score 90% on a quiz, how does that affect my job performance? What path did I take to get to the quiz, and does that affect my score? Why did I miss the last 10%? Was there a job role that scored better than others? Why?

Data Strategy

Who are the Players?

L&D Owners
L&D Customers
Data Owners
Data Customers


Strategic Goals

Org/Institution (Strategic goals, Impact)
L&D (Courses, Strategies)

Strategic Goals

Informational

Behavioral (micro): habits, individual professional development

Behavioral (macro): large-scale process or procedural change across BUs or across time

Obstacles

Examples:
The data isn't where it should be
Interoperability over time
Change management

Data Stewardship

What Lies Beneath?
You'll have to teach someone.
When everybody's in charge, nobody's in charge.

What about data quality?

Australian Bureau of Statistics
https://www.nss.gov.au/dataquality/aboutqualityframework.jsp

We're ultimately interested in what is:

observable

measurable

actionable

Did the intervention actually work?


Quantitative


Quantitative is what you normally would think of as data. It is numbers, something that is quantifiable.

Here we see a typical quantitative view of data: how many people looked at a page, how many people looked at a piece of content. Quantitative data is pure numbers.

How do we collect the data? There are really two main ways to collect the data from our content: we can use Google Analytics or the Experience API. How many people have heard of the Experience API?
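As a sketch of the xAPI route: statements are POSTed to a Learning Record Store's Statements resource. The endpoint URL and credentials below are placeholders, but the `X-Experience-API-Version` header is genuinely required by the xAPI specification. The request is only built here, not sent.

```python
import base64
import json
import urllib.request

# Placeholder LRS endpoint and credentials for illustration only.
LRS_ENDPOINT = "https://lrs.example.com/xapi/statements"
credentials = base64.b64encode(b"username:password").decode("ascii")

statement = {
    "actor": {"name": "Janet", "mbox": "mailto:janet@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/experienced",
             "display": {"en-US": "experienced"}},
    "object": {"id": "http://example.com/activities/content-strategy"},
}

request = urllib.request.Request(
    LRS_ENDPOINT,
    data=json.dumps(statement).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "X-Experience-API-Version": "1.0.3",  # required by the xAPI spec
        "Authorization": "Basic " + credentials,
    },
    method="POST",
)
# urllib.request.urlopen(request) would actually send it; omitted here.
```

Most LRS vendors wrap this in a client library, but the underlying wire format is just this JSON-over-HTTP call.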


However, numbers by themselves mean almost nothing. In a lot of cases the numbers will simply show us a quantity.

We need to set some context for the numbers. Numbers are just numbers; on their own there is no meaning. By adding context to them we give them meaning.
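Here is a small sketch of what adding context does, using made-up statements: a raw completion count says little, but grouping by a context field (department here, purely as an example) turns it into something you can act on.

```python
from collections import defaultdict

# Made-up, simplified statements: verb plus a context field.
statements = [
    {"verb": "completed", "context": {"department": "Sales"}},
    {"verb": "completed", "context": {"department": "Sales"}},
    {"verb": "failed",    "context": {"department": "Sales"}},
    {"verb": "completed", "context": {"department": "Support"}},
    {"verb": "failed",    "context": {"department": "Support"}},
    {"verb": "failed",    "context": {"department": "Support"}},
]

# Tally completions and failures per department.
totals = defaultdict(lambda: {"completed": 0, "failed": 0})
for s in statements:
    totals[s["context"]["department"]][s["verb"]] += 1

for dept, counts in totals.items():
    rate = counts["completed"] / (counts["completed"] + counts["failed"])
    print(f"{dept}: {rate:.0%} completion")
```

The overall count (three completions) is the same either way; the per-group view is what raises the next "why" question.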

Qualitative

Qualitative data is data that captures feelings and observations that are not quantifiable.

So what is Qualitative Data?

So what exactly is qualitative data?

Qualitative looks at how you feel about something. In what we are doing, it is watching someone use an interface, watching their reaction to a piece of content or a design. It is basically trying to see how people are reacting to what is in front of them. This is used a lot in interface design, like I did in the opening slide.


How do we collect qualitative data?

So now that we know what qualitative data is, how do we collect it?

One approach is collecting feedback from the deployed material. We can seek out feedback through interviews, where you sit down and interview the person who has performed the task. User interviews can be conducted one-on-one or in groups.


Surveys can be sent out to the users who performed the test to gather how they felt about the design that was delivered to them. There are many ways to set this up, with ratings or short-answer questions. You can also use surveys as step one and then conduct an interview to gain more perspective from the user.

Be careful: many studies have found that when you directly interview someone or give them a survey, they are not always truthful. How many times have you been asked about an interface and said it's OK? Or if the person who designed it is the one talking to you, you might tell them it is fine when really there were portions that confused or aggravated you.

Actually observing users is the best way to get real information. Although you can (and should) observe users after deployment for feedback, by then it can be too late. Really, the best and most effective time to observe users is during a prototyping phase. Let's take a deep dive into prototyping and how it is effective for qualitative analysis.

Investigating performance is not just about improving our learners, but also ourselves.


How can we get this information earlier in the process?

In most cases we start to look at qualitative data during interface design. Does it work? Is there too much travel? Is there confusion on the part of the end user?

The last thing we want is to require our users to have a GPS to get through the material. If we have to provide detailed instructions to use an intervention, we have already failed.

The last thing you want to see is a frustrated user after you have deployed some content. Collecting some qualitative data early in the design process from stakeholders and end users can help avoid this.


A key to gathering this observation data is to start collecting it as early as possible. The best way to do that is to build prototypes.


Physical Prototype

Next we build a physical prototype. These are great for watching users, as we get a first look at how they interact with a design. These are great first pieces of data that will affect how the design goes forward.

Image of paper prototype

One method that I love in the early design phases is the paper prototype. What I love about the paper prototype is the instant feedback and how quick and easy it is to make. The other great thing about it is that you can hand the user or stakeholder a marker and some Post-it notes to help generate the designs.

Here we see a more advanced version of the paper prototype in action. Notice the media changing as the user clicks through the interface. It is a great bit of feedback to see how they interact with the screen. You can watch for excessive movement, confusion about where to go next, things like that. After tweaking and coming up with a few designs, you can move on to the next step.


Physical Prototype
Wireframe
Digital Prototype

Once we narrow down the designs using physical prototypes, we can build some wireframe prototypes; we will get to an example of this in just a few minutes.

Digital prototypes

Digital prototypes are also good because you can expand the reach of a prototype. Using Skype or a GoTo session with a webcam, you can watch the person as they work through the content. Watching the face is important because it can tell you a lot about what they are thinking. After gathering data from this wider group of users, you can keep refining the design.

The prototypes are incomplete; keep them simple as long as possible. Changes are much less expensive to make to prototypes than to released designs.

Prototype example: Palm Pilot

Physical Prototype
Wireframe
Digital Prototype
Beta Testing

Finally, we can collect the data using a refined digital prototype. This is a prototype that is very close to the final product. So let's dive into this a little further.

Use it yourself; sometimes just getting something into a physical state allows you to see the initial problems. You can gather data from yourself, so don't discount your feelings as you use the first digital prototypes.


Workshops

User Testing


We can add context by using qualitative data. Why did so many people click on a certain element? Why was a certain path followed through a set of modules? We are making a correlation between the design and the data that we are collecting.

Qualitative Data

The numbers can only tell us so much. Qualitative data is more work, but it also gives more nuanced information.



From Data to Analysis
Available
Relevant

Available? What's relevant? Where do they intersect?

Let's start with what's available.


What data am I likely to get?

These are the kinds of data you can get that can be built into evidence and information, and used to tell stories.

Explore Your Data
Clean
Sense Check
Spread
Centers
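The "spread" and "centers" steps can be sketched with the standard library on a made-up set of quiz scores: a center (mean, or the outlier-resistant median) tells you where the data sits, and a spread (standard deviation, range) tells you how much it varies.

```python
import statistics

# Made-up quiz scores for illustration.
scores = [55, 62, 70, 71, 73, 74, 75, 78, 90, 98]

center_mean = statistics.mean(scores)      # a center
center_median = statistics.median(scores)  # a center, robust to outliers
spread_stdev = statistics.stdev(scores)    # a spread measure
spread_range = max(scores) - min(scores)   # simplest spread measure

print(f"mean={center_mean}, median={center_median}, "
      f"stdev={spread_stdev:.1f}, range={spread_range}")
```

A sense check is as simple as asking whether these numbers are plausible for the instrument: a mean of 74.6 on a 0-100 quiz passes; a mean of 250 would mean the data needs cleaning first.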

"He uses statistics as a drunken man uses lampposts: for support rather than for illumination."

Andrew Lang


What Do You Want to Know?

Participation

Performance


Participation Related
How are people using the resources available?
How much time are they spending?
What elements/resources are proving useful?
What aspects of participation affect results?
What aspects of participation affect completions?
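The "how much time are they spending" question can be sketched against xAPI data: statements carry a result duration in ISO 8601 form (e.g. "PT12M30S"). The statements below are made up and heavily simplified, and the parser handles only the hours/minutes/seconds subset of ISO 8601.

```python
import re
from collections import defaultdict

def duration_seconds(iso: str) -> float:
    """Convert a simple ISO 8601 duration (PT..H..M..S) to seconds."""
    match = re.fullmatch(
        r"PT(?:(\d+)H)?(?:(\d+)M)?(?:(\d+(?:\.\d+)?)S)?", iso)
    h, m, s = (float(g) if g else 0.0 for g in match.groups())
    return h * 3600 + m * 60 + s

# Made-up, simplified statements: which resource, how long.
statements = [
    {"object": "intro-module", "result": {"duration": "PT12M30S"}},
    {"object": "intro-module", "result": {"duration": "PT7M"}},
    {"object": "safety-video", "result": {"duration": "PT1H5M"}},
]

# Total time spent per resource.
time_per_resource = defaultdict(float)
for s in statements:
    time_per_resource[s["object"]] += duration_seconds(s["result"]["duration"])

for resource, seconds in sorted(time_per_resource.items()):
    print(f"{resource}: {seconds / 60:.1f} minutes")
```

Joining these totals against results or completions is what answers the last two questions on the slide.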

Performance Related
Are we teaching the right things?
How does course performance correspond to job performance?
Did X have the desired effect on Y?

The game is on

Time for Analysis
But we have a problem.

When it comes to analysis, your brain may be your worst enemy

Illustration by Gerald Fisher, from Psychology of Intelligence Analysis by Richards J. Heuer


Analysis of Competing Hypotheses
Richards J. Heuer

From Heuer's Psychology of Intelligence Analysis. In science we called this Multiple Working Hypotheses. It keeps you from being married to one idea and pushes you to dig deeper and ask better questions than you would otherwise, which gives a more complete, more nuanced understanding.

A simple example: email nudges. Did they help? What if we were talking about sales training?

Is it diagnostic?

Is the data diagnostic?

We have talked about a single course, but think about if we had modules.

So let's expand this out to think about a series of modules that we might have in, say, an LMS or just out on a website somewhere. Looking at the data that appears...


With a series of modules, we can see how people use the content and what the end result of that use looks like.

We could start to see what people are doing and see learning paths emerge. When we compare this to the behavior change data, we can start to make suggestions to future users. If a certain path proves to provide a good route to expertise for a job type, we can suggest that path to people looking to move into that job type.

How did the expert get here?


A software company was looking to certify users. Now, in software, a written test really provides almost no insight into actual usage of the software. They needed to see the user actually working in the software. So the company started generating xAPI statements from their software: the user clicked a button, interacted with a panel, created something.

Show final result

That allowed them to have a user run through a process and then compare that to what an expert did on the same process. The number of interactions and the amount of time it took to perform them were tracked. If the user took 40% longer than the expert, a self-paced module on the topic was assigned to them. They could go take the module, which can be tracked with xAPI, and then retake the test. They could then see if the user could perform the task after the intervention. I know that this was a very quick, high-level overview of designing for data, so let me leave you with one thought.
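The comparison rule described above can be sketched in a few lines. The names, times, and the flat 300-second expert baseline are made up for illustration; only the 40%-longer threshold comes from the story.

```python
# Expert baseline time on the process, in seconds (made-up value).
EXPERT_SECONDS = 300
# Users who take more than 40% longer than the expert get a module.
THRESHOLD = 1.4

def needs_module(user_seconds: float,
                 expert_seconds: float = EXPERT_SECONDS) -> bool:
    """True when the user's time exceeds 140% of the expert's time."""
    return user_seconds > THRESHOLD * expert_seconds

# Hypothetical users with their tracked process times.
users = {"alice": 290, "bob": 450, "carol": 415}
assignments = {name: needs_module(t) for name, t in users.items()}
print(assignments)
```

At 300 seconds for the expert, the cutoff is 420 seconds, so only the 450-second run triggers a remediation module; in practice the interaction counts the company tracked could feed a second rule alongside this one.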


What suggestions can I make to future users as to the path they take?

Repeatable?

This is an iterative process: did the intervention work? If not, why not? What can we do to make it better? Adjustments can be made based on the results we see in the data generated. Did the interventions actually work? This feedback loop is something that is missing in a lot of current cases.


Wrap-up story

"Data by itself is useless. Data is only useful if you apply it."

Todd Park, Chief Technology Officer, White House


@[email protected]

