Hearing the voice of the customer in the enterprise product development lifecycle


DESCRIPTION

These slides were co-presented at UX Australia 2014 by U1 Group and Telstra. Here we discuss our approach to solving a common business problem: how do you measure the user experience of a service you have designed while working within common constraints? See the methodology we apply, using customer journey maps and analysing touchpoints, and the metrics we use, namely NPS (Net Promoter Score) and LTR (Likelihood to Recommend). To read the article, please visit our blog: http://u1group.com/blog/article/service-walkthrough

TRANSCRIPT

Hearing the voice of the customer in the enterprise product development lifecycle

Net Promoter Score (NPS)

The Net Promoter Score is a simple metric

Customers’ likelihood to recommend

Net Promoter Score (NPS) = % of Advocates - % of Detractors

“How likely are you to recommend Telstra?”

Rated on a scale from 0 to 10: Detractor (0–6), Passive (7–8), Advocate (9–10)
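To make the formula concrete, here is a minimal sketch (not from the slides) of computing NPS from raw 0–10 responses, using the standard banding above:

```python
def nps(responses):
    """Compute Net Promoter Score from a list of 0-10 ratings.

    Standard banding: 0-6 detractor, 7-8 passive, 9-10 advocate.
    NPS = % advocates - % detractors, so it ranges from -100 to +100.
    """
    if not responses:
        raise ValueError("need at least one response")
    advocates = sum(1 for r in responses if r >= 9)
    detractors = sum(1 for r in responses if r <= 6)
    return 100.0 * (advocates - detractors) / len(responses)

# Illustrative only: mostly low scores yield a strongly negative NPS.
print(nps([2, 4, 6, 9, 10, 3, 7, 8, 1, 5]))  # -> -40.0
```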

From customer satisfaction model to customer advocacy

Knowing we're creating great experiences... before launch

We’re a services company

Measuring Advocacy Prior to Launch

Executive statement…

Then we had to deliver…

“We will not launch an initiative that is not expected to provide a superior NPS to the experience it replaces”

“We will not launch a new initiative with a negative NPS score in pre-launch testing”

We came up with a complicated model…

Measuring service

The number of possible combinations of interfaces, devices, infrastructure, telecoms & data, and service

Timeframe and budget

Consistency and repeatability

Measurability

Service walkthrough

Case Study 1: Telstra Platinum

NPS = -48 (+/- 10.5)*  Sample: n = 180

LTR = 9.6 (+/- 0.51)*  Sample: n = 18

Product LTR has REACHED the target of 7.3

SUS = 90.7 (+/- 4.4)*  Sample: n = 19

Project SUS has REACHED the target of 70

This indicates the system usability is above average and unlikely to detract.

BP1 = 1 (Critical Impact)

BP2 = 1 (High Impact)

BP3 = 3

BP4 = 0

BP5 = 3

A range of metrics (not real ones)
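The slides don't define the asterisked “(+/- x)*” values; assuming they are 95% confidence margins of error on the mean (a common convention for small samples like n = 18), a minimal sketch of computing one, with hypothetical ratings:

```python
import math
from statistics import mean, stdev

from scipy import stats  # for the Student's t critical value


def margin_of_error(scores, confidence=0.95):
    """Two-sided t-based margin of error on the sample mean.

    For small UX samples (e.g. n = 18 walkthrough participants) the
    t-distribution is the usual choice over a normal approximation.
    """
    n = len(scores)
    t_crit = stats.t.ppf((1 + confidence) / 2, df=n - 1)
    return t_crit * stdev(scores) / math.sqrt(n)


# Hypothetical LTR ratings from a walkthrough session (n = 18):
ltr = [9, 10, 9, 10, 10, 9, 10, 9, 10, 10, 9, 10, 9, 10, 10, 9, 10, 9]
print(f"LTR = {mean(ltr):.1f} (+/- {margin_of_error(ltr):.2f})")
```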

Issues relating to Live Chat

Problems with remote desktop access for support

Technical jargon used in collateral

Opportunities to improve support agent scripts

Case Study 2: Telstra Cloud Services

Journey stages: Pre-sales | Sales | Install | Usage | Assurance | Billing | MAC

Episode metrics:

        Week 1            Week 2            Week 3
LTR     9.7 (+/- 0.81)    3.6 (+/- 0.92)    7.6 (+/- 1.10)
SUS     80.5 (+/- 7.8)    50.1 (+/- 9.3)    90.7 (+/- 6.9)
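For context, SUS values like the 80.5 and 90.7 above come from the standard 10-item System Usability Scale questionnaire. A minimal sketch of the standard Brooke (1996) scoring, with one hypothetical participant's responses:

```python
def sus_score(responses):
    """Score one 10-item SUS questionnaire (responses are 1-5 Likert).

    Standard scoring: odd-numbered items contribute (response - 1),
    even-numbered items contribute (5 - response); the sum is scaled
    by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based i, so even i = odd item
        for i, r in enumerate(responses)
    )
    return total * 2.5


# Hypothetical participant: positive on odd items, negative
# (which is good, since even items are negatively worded) on even items.
print(sus_score([5, 1, 5, 2, 4, 1, 5, 1, 4, 2]))  # -> 90.0
```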

Experience changes over time

Identify the discrepancies between Virtual server management and Dedicated server management

Interactive Voice Response (IVR) terminology mismatch

Navigational issues

Platform instability

Wins for Telstra

Pushing boundaries of traditional lab-based user testing & iterating methodology

Discovering and mitigating issues relating to the overall service

Being able to show with tangible evidence how the overall service impacts the uptake of a product

Operationalised testing model

Faster and better decision making on product launch & risk

Reportable outcomes for exec governance forums

Ensuring uptake

Limitations

The NPS is a funny metric

The tail wagging the dog

Difference between a service walkthrough and the true live state

Outcomes when the horse has bolted

Thank you

Megan Cruickshank, Telstra (@brightsparrow)
Nilma Perera, U1 Group (@u1group)
