microfeedback - ux camp switzerland


Post on 10-Feb-2017






Microfeedback / Maximum Insight? Let's look at examples and share experiences

UX Camp Switzerland #uxcampch 2016
Julius Dietz | brandwatch.com | @juliusdietz

UX Camp Europe & UX Camp HH...

Warning ;)

& UX Camp CH :)

Warning ;)

Julius Dietz / @juliusdietz

About me

VP Engineering UX at Brandwatch (@brandwatch)

Heading up a team of great UX designers at Brandwatch

Microfeedback / What is it?

Let's start out with a short definition of Microfeedback.

Sarah Doody says it better than I could:

What is Microfeedback

"Microfeedback is little bits of information collected from customers at specific trigger points in your product's experience. The goal is to get definitive feedback about key interactions with or outcomes from your product."
Sarah Doody, in "Get Better Qualitative Data ... With Microfeedback"

As the name suggests, microfeedback is little bits of information collected from customers at specific trigger points in your product's experience. The goal of gathering microfeedback is to get definitive feedback about key interactions with or outcomes from your product.

Paternity leave: Friedrich, my 5-year-old son, last night. One way to show how simple microfeedback is: even he can do it and give instant feedback about key interactions. (Also a way to get him into his first presentation ;) )

Microfeedback / Where does it fit?

First of all, I want to quickly discuss where Microfeedback fits in the design process...

Design Process / Usually looks something like this...

Source: http://www.meetpatrick.de/

So our design process usually looks something like this one, which is based on the Stanford design school process:
first we look at our users, their environment, and their problems, needs, and jobs to be done
then we define the problem space that we want to create a solution for
we create lots of ideas for possible solutions, go wide
then converge on one or two solutions
prototype them with the least effort required, since we need to test the assumptions we defined earlier
(And, missing for me in this diagram: we launch, then ideally we continue to iterate and go back to any of the steps, depending on what change we find is required.)

Design Process / Feedback & Input Mechanisms

Source: http://www.meetpatrick.de/

Mostly in the Understand/Observe and the Test phases, you're likely to use any of these (and lots more):

Feedback & Input / Mechanisms

Interviews
Usability Tests
Contextual Inquiries
Surveys
& loads more...

Feedback & input = user research tools:

Interviews
Usability Tests
Contextual Inquiries
Surveys
...

Surveys... let's zoom in.

So let's look at surveys, as they are the most relevant when looking at microfeedback.

When do you send out surveys?

Surveys / A closer look

Before launch (learn about the user)

We've classically sent them out before launch, to learn about the user...

When do you send out surveys?

Before launch (learn about user)

After launch, quite some time later (learn about happiness, get feedback)

...and after launch. (Let some time pass, so enough people have actually interacted with your product.)

Problem / Out of context

Reaches people out of context...

Vague memories

Unfortunately, people often have only vague memories of that one feature that you're currently interested in.

Lost valuable time to react.

Also...

Building the next thing?

Info from the survey comes quite late, because you have to wait before sending it out. Quite practically: you may have moved on to designing and developing the next thing.

And finally

You want to learn about several aspects of your product, so the risk is you end up sending out something like this (next slide):

Credit: Sarah Doody, post on Microfeedback

Doesn't look like much fun to fill that in!

To the rescue...

You're guessing it already: that's where I claim...

To the rescue...

Microfeedback, e.g. micro surveys

that Microfeedback can come to the rescue

As you can imagine, the idea here is to ask users during/shortly after the experience

Fresh / Emotionally engaged

What's good about this?

Microfeedback lets you ask for feedback while it's still fresh in the user's mind.

I think also while users are still emotionally engaged: angry, happy, don't care.

Again, a reason why they're more likely to feed back.

Leads to

More accurate feedback
Higher willingness to participate

so this leads to

More accurate feedback
Higher willingness to participate

Leads to

Less effort for users, more reliable insights for you

So you end up with a win-win situation: less effort for users, more reliable insights for you.

Out in the wild

You've seen one example of microfeedback already: the punch buttons from the DIY shop.

Now I'd like to show you some more examples.

Example / (Automated) Email

Source: https://userbrain.net/blog/how-to-integrate-continuous-micro-feedback-into-your-business

41% response rate according to Alex! And that's not even quite so MICRO, rather medium ;)

The response rate might be even higher if it were shorter!

But... something to consider building into a product; it should be quite simple!

Example / (Automated) Email 2

Source: https://userbrain.net/blog/how-to-integrate-continuous-micro-feedback-into-your-business

60% response rate

(Though I'd be careful: many advise giving more options, as otherwise you may get the wrong picture; often 5 are recommended.)

Example / On-site overlay with text field

Source: https://userbrain.net/blog/how-to-integrate-continuous-micro-feedback-into-your-business

Asks the question immediately after the specified behavior (in this case, leaving the registration page), and the question appears right on the page: there's no link to click.
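The trigger logic behind such an overlay can be sketched in a few lines. This is a hypothetical sketch, not the actual implementation of any of the tools shown: the event name and the 7-day re-show interval (a detail mentioned later in this talk) are assumptions for illustration.

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical sketch of on-site micro-survey trigger logic: fire right
# after a specific behavior instead of emailing a survey weeks later.
# The event name and the 7-day cool-down are illustrative assumptions.
RESHOW_AFTER = timedelta(days=7)

def should_show_survey(event: str, last_shown: Optional[datetime],
                       now: datetime) -> bool:
    """Show the overlay only at the chosen trigger point, and only if
    the user hasn't seen it recently."""
    if event != "left_registration_page":  # the trigger from this example
        return False
    if last_shown is not None and now - last_shown < RESHOW_AFTER:
        return False  # shown too recently; don't nag
    return True
```

The point of keeping this server- or client-side logic tiny is that the survey appears in context, at the moment of the behavior you care about.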

Example / Classic: newsletter unsubscribe

Source: https://userbrain.net/blog/how-to-integrate-continuous-micro-feedback-into-your-business

You've probably all seen this, in this case from Mailchimp.

Example / Classic: helpful page?

Example / Classic: helpful page?

More sophisticated version

Example / Classic: helpful page?

Example / Classic: helpful page?

Example / How can we make it better?

Source: https://userbrain.net/blog/how-to-integrate-continuous-micro-feedback-into-your-business

If you don't have thousands of users feeding back, this might be the most effective approach, especially if you are more after improvement suggestions and qualitative feedback than satisfaction rates.

GoToWebinar on the iPad

Progressive disclosure could have helped a lot here, though.

Example / SMS rating FTW

Via SMS. You increasingly see services that ask you for just one number via text message to feed back about your experience.

(Other places I've experienced this: call centers etc.)

Example / Wednesday night...

Good intentions

Example / Thursday night...

Good intentions

Example / Teasing ;)

Good intentions

Example / Teasing ;)

Good intentions

UX Camp CH / Sticky notes feedback at the door

Good intentions

How we did it (v1)

What we did: this resulted from a Funky Friday project (explain FF).

Just clickable, but with the option to give more feedback.

There are lots of tools that you could embed, like Intercom etc., but what is nice...

Brandwatch Query Builder / New Version

A query editor that takes you through writing these (quite complicated) queries.


Brandwatch Query Builder / New Version

Brandwatch Query Builder / New Version

Then ask them to fill in a longer survey

3 Minute Survey / Let's see what we really need

3 Minute Survey / That's better!

Focusing on qualitative feedback
Doesn't want to influence
Open but short
Should have the same effect as the one on the left (the left one could only be done with guidance)

Little Detail:

Different messages for
Promoters (9-10)
Passives (7-8)
Detractors (0-6)

Little Detail 2: Promoters vs Detractors

Difference:
Promoters: what do you LIKE? (want a free mug? 2nd step)
Detractors: what do you NOT LIKE? (didn't ask for identity, so people wouldn't be less honest)
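The routing above can be sketched as two tiny functions. A minimal sketch: the thresholds are the standard NPS buckets from the slide, but the question texts are illustrative stand-ins, not the actual copy used in the product.

```python
def nps_bucket(score: int) -> str:
    """Classify a 0-10 rating into the standard NPS buckets."""
    if not 0 <= score <= 10:
        raise ValueError("score must be between 0 and 10")
    if score >= 9:
        return "promoter"
    if score >= 7:
        return "passive"
    return "detractor"

def follow_up_question(score: int) -> str:
    """Tailor the follow-up as described on the slide: promoters are asked
    what they like, detractors what they don't. Wording is illustrative."""
    bucket = nps_bucket(score)
    if bucket == "promoter":
        return "What do you like?"
    if bucket == "detractor":
        return "What do you not like?"  # kept anonymous so answers stay honest
    return "What could we improve?"
```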

Animation (for design reasons, not for effect)

How did it go?


The obligatory stats

Shown to 2,151 users
782 users fed back / 902 closed

36% response rate

(might even be a 50% response rate)

Nearly half the interactions were feedback rather than closing

More than a third of the users who were shown the micro survey actually gave feedback rather than closing it! We showed it again after 7 days (the closers were likely to close again).
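The headline numbers can be reproduced with a couple of lines (the counts are taken from the slide):

```python
# Reproducing the response-rate figures from the slide.
shown = 2151      # users who were shown the micro survey
fed_back = 782    # users who gave feedback
closed = 902      # users who closed it instead

response_rate = fed_back / shown                    # feedback vs everyone shown
interaction_share = fed_back / (fed_back + closed)  # feedback vs all interactions

print(f"{response_rate:.0%}")      # 36%: the headline response rate
print(f"{interaction_share:.0%}")  # 46%: nearly half of the interactions
```

Counting only actual interactions (feedback plus closes) rather than everyone shown is what pushes the rate toward the "nearly half" figure.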

Both actions, one click

Though of course it's a bit more mental effort to rate.

Not too bad, it seems.

The obligatory stats

Rated vs Full Survey:
1 in 5 users clicked on the full survey
1 in 20 users completed it
NPS score of ca. 12
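The NPS of ca. 12 comes from the standard formula: the percentage of promoters minus the percentage of detractors. The ratings below are made up for illustration; the deck only reports the final score.

```python
def nps(scores: list[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical ratings: 4 promoters, 4 passives, 2 detractors -> NPS 20
print(nps([10, 9, 9, 8, 8, 7, 7, 6, 5, 10]))  # 20.0
```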

Success!

Enough quantitative feedback to have a benchmark
Good qualitative answers to learn from

So all in all, we saw this as a success.