TRANSCRIPT
A/B Tests in Google Analytics To Make UI
Decisions
David Schlotfeldt, Co-owner of Plaudit Design
Who do you think you are!?
What’s this about?
“A/B Tests in Google Analytics to Make UI Decisions”
First, a couple comments on the title...
● Applicable beyond “UI” - making any decisions you can test on your website (e.g. price of a product)
● “A/B” but beyond “A” and “B” to allow for many variations
Covered in This Presentation
● What “A/B Tests” Are
● Running an Experiment
○ How to Set Up an Experiment in GA
○ Reading Results of an Experiment During and After
What “A/B Tests” Are
Explaining, by explaining why
Modified questions on contact form
Why an increase?
Because my form change frickin rocked!!
..or…
● We are in a seasonal industry
● Sales department picked up involvement in expos
● Started a large social marketing campaign
● Many reasons!
If we could see what would have happened if we didn’t make the change...
Same blue line...
Modified questions on contact form
A/B Tests
● Allow you to compare “with the change” to “without the change”
● Split users into the control group (they see the original) and those who will see changes (variations)
● Isolate the effects of the change
○ Everything beyond the change we care to measure modifies the results of each segment equally.
Types of Tests
A/B - Testing two variations of a page.
Multivariate - Testing various combinations of components on a single page
e.g. 5 different images x 4 different headlines = 20 variations
A/B/N - Testing two or more variations of a page.
Google Analytics Experiments == A/B/N model
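For multivariate testing, the variation count is just the cartesian product of the component options. A minimal sketch (the image and headline names below are placeholders, not from any real site):

```javascript
// Enumerate every multivariate combination.
// 5 images x 4 headlines = 20 variations, matching the example above.
const images = ['img1', 'img2', 'img3', 'img4', 'img5'];
const headlines = ['h1', 'h2', 'h3', 'h4'];

function combinations(images, headlines) {
  const result = [];
  for (const image of images) {
    for (const headline of headlines) {
      result.push({ image, headline });
    }
  }
  return result;
}

const variations = combinations(images, headlines);
console.log(variations.length); // 20
```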
Running an Experiment
Example
Will be using a real example throughout this presentation
A/B test selected because it is:
● Recent
● Small
Perfect for after presentation group discussion
Example
Goal: Increase sales leads
Metric: Online contact form submissions
Identified: A low number of users who view the first step of the contact form move on to the second step
... Continued
We believe the first step is overwhelming users. It contains a large number of options, and each has a paragraph explanation.
Hypothesis:
If we simplify and decrease the number of options a user chooses between at any given time, then form submissions will increase.
Steps to running an experiment:
1. Plan
2. Create Variations
3. Configure Experiment
4. Let it Run
5. Analyze Results
1) Plan
What variations will be tested?
The original is always one variation.
When testing multiple changes at once, attempt to have a variation per combination.
How many variations?
● Consider how long you want the experiment to run for and the traffic available when selecting number of variations.
● Up to 10 variations per experiment in Google Analytics (according to the docs, but in practice you can configure 35 plus the original).
How to evaluate?
We’ll expand on the following as we walk through configuring an experiment.
Need to decide:
● Metric that will be used to evaluate the results
○ Page viewed, session duration, bounce, etc.
● Percentage of traffic to include in the experiment
● Distribute traffic evenly across variants (yes or no)
Control length of experiment:
● Specify minimum time the experiment will run
● Minimum confidence threshold to be achieved before GA declares a winner
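To give a feel for what a confidence threshold means, here is a rough sketch using a two-proportion z-test. Note this is not the statistical machinery GA actually uses (GA relies on multi-armed bandit statistics, covered later); it only illustrates why more data is needed before a winner can be declared. The visit and conversion counts are made up:

```javascript
// Rough sketch: compare two variations' conversion rates with a
// two-proportion z-test. NOT GA's internal formula; illustration only.
function zScore(convA, totalA, convB, totalB) {
  const pA = convA / totalA;
  const pB = convB / totalB;
  const pooled = (convA + convB) / (totalA + totalB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
  return (pB - pA) / se;
}

// |z| > 1.96 roughly corresponds to 95% confidence (two-sided).
const z = zScore(50, 1000, 80, 1000); // original: 5%, variation: 8%
console.log(Math.abs(z) > 1.96); // true
```

With only 100 visits per variation (5 vs 8 conversions), the same rates fall well short of the threshold, which is why a minimum run time matters.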
Our Example
Evaluate: Increase contact form submissions
Variations
Variation 1 - Present options in multiple steps and remove descriptions. We believe users won’t become overwhelmed, and their level of commitment will increase with each selection they make.
NOTE: Ideally we would have run multiple variations to test multiple changes, but as noted earlier, more variations increase the timeline, and it is best to minimize changes per variation.
2) Create Variations
Develop the Variations● HTML, CSS, blah, tech stuff, blah
● Each variation will either be
○ Its own page
○ Differences contained on one page
● Which approach you use depends on how you integrate (More on this later)
3) Configure Experiment
Create Experiment
1. Login
2. “Reports” (Top)
3. “Behavior” > “Experiments” (Left side)
4. Click “Create experiment” (Right side)
Percentage of users to include in experiment.
Those not included will see the original.
When deciding consider:
● Volume of traffic to page
● Number of variations
● Riskiness of the changes
Multi-armed bandit experiments
● “Distribute traffic evenly across all variants”
○ “off” by default
○ “on” means GA does not adjust traffic dynamically based on variation performance
● GA uses a multi-armed bandit approach for experiments, with the goal of finding the best performer and adjusting the distribution of users as the experiment progresses.
● Twice per day, GA adjusts the fraction of traffic that each variation will receive going forward.
● Adjustments made using a statistical formula that considers sample size and performance metrics together.
● You benefit by traffic being moved towards winning variations gradually, instead of having to wait for a "final answer"
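Google hasn’t published GA’s exact reallocation formula, but the idea behind the bullets above can be sketched: simulate draws from each variation’s uncertain conversion rate, and route traffic in proportion to how often each variation “wins” the draw. All numbers here are made up for illustration:

```javascript
// Simplified illustration of multi-armed bandit traffic reallocation.
// NOT GA's actual statistical formula; it only shows the idea of
// shifting traffic toward likely winners as evidence accumulates.
function reallocate(variations, draws = 10000) {
  const wins = new Array(variations.length).fill(0);
  for (let d = 0; d < draws; d++) {
    let best = 0;
    let bestSample = -Infinity;
    variations.forEach((v, i) => {
      // Normal approximation to the uncertainty in the conversion rate.
      const p = v.conversions / v.visitors;
      const se = Math.sqrt(p * (1 - p) / v.visitors) || 1 / v.visitors;
      const sample = p + se * gaussian();
      if (sample > bestSample) { bestSample = sample; best = i; }
    });
    wins[best]++;
  }
  return wins.map(w => w / draws); // fraction of traffic per variation
}

function gaussian() { // Box-Muller standard normal
  const u = 1 - Math.random();
  const v = Math.random();
  return Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
}

const shares = reallocate([
  { conversions: 50, visitors: 1000 }, // original: 5%
  { conversions: 80, visitors: 1000 }, // variation: 8%
]);
console.log(shares); // most traffic goes to the stronger variation
```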
Multi-armed bandit experiments
Two Options
Options presented are:
1. “Manually insert the code”
2. “Send the code to webmaster”
These options are:
● Easy to implement
● Simply copy and paste Javascript
● Code will select a variation for each user and redirect them to the URL of the variation.
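Conceptually, the pasted snippet does something like the sketch below: pick a variation for the user (reusing a stored choice so they always see the same page), then redirect. This is not Google’s actual snippet, which also reports the choice back to GA; the URLs are hypothetical:

```javascript
// Conceptual sketch of the copy-and-paste redirect approach.
const variationUrls = [
  '/contact/',            // index 0 = original
  '/contact-variation1/', // hypothetical variation URL
];

function chooseVariation(stored, count) {
  // Reuse a previously stored choice so the user always sees the same page.
  if (stored !== null && stored >= 0 && stored < count) return stored;
  return Math.floor(Math.random() * count);
}

const chosen = chooseVariation(null, variationUrls.length);
if (chosen !== 0) {
  // In a browser: location.replace(variationUrls[chosen]);
}
```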
Oh wait! There are two more options!
Additional options not presented in wizard
● Both require programming
● Both do not redirect to a separate page
● URL fields for variations can be fake. E.g. “1.com”, “2.com”. Validation errors can be safely ignored
Both are more involved so I’m only introducing them here.
Option 3: Javascript API Client Side
Advanced Javascript Implementation
More Info: https://developers.google.com/analytics/solutions/experiments-client-side
<!-- Load the Content Experiment JavaScript API client for the experiment -->
<script src="//www.google-analytics.com/cx/api.js?experiment=YOUR_EXPERIMENT_ID"></script>
<script>
// Ask Google Analytics which variation to show the user.
var chosenVariation = cxApi.chooseVariation();
</script>
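`cxApi.chooseVariation()` returns a numeric index, where 0 is always the original. A hedged sketch of acting on that index; the class names here are made up for illustration:

```javascript
// Map the index returned by cxApi.chooseVariation() to page styling.
// The class names are hypothetical, not part of the GA API.
function applyVariation(chosenVariation) {
  // 0 is always the original; anything else is one of your variations.
  const pageClasses = ['form-original', 'form-simplified'];
  return pageClasses[chosenVariation] || pageClasses[0];
}

// In the browser you would then do something like:
// document.body.classList.add(applyVariation(cxApi.chooseVariation()));
console.log(applyVariation(1)); // "form-simplified"
```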
Option 4 : Server Side
● Requires backend coding - PHP, Java, .NET, etc. (Google provides great documentation on exactly how to code this.)
● No delay/flash in page load/initialization waiting for JS
● Able to test more - e.g. change the price of a product
Option 4 : Server Side
1. Store and Refresh - Periodically update local data about experiments
2. Choose Variation
a. For the current user interaction (e.g. pageview), is there an experiment running?
b. Has the user been previously exposed to this experiment?
c. Should the user be included in the experiment?
d. Choose a variation for the user
3. Send experiment and selected variation to Google Analytics (normally by injecting JavaScript into the page)
More Info: https://developers.google.com/analytics/solutions/experiments-server-side
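The “Choose Variation” step above can be sketched as a pure function. The hash and the inclusion rule here are assumptions for illustration, not Google’s implementation, and a real integration must also report the choice back to GA (step 3):

```javascript
// Hedged sketch of server-side variation choice (step 2 above).
function hashUserId(userId) {
  // Simple deterministic string hash (illustrative, not Google's).
  let h = 0;
  for (const ch of userId) {
    h = (h * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return h;
}

function chooseVariationServerSide(userId, previous, variationCount, trafficFraction) {
  if (previous !== null) return previous;           // 2b: already exposed
  const h = hashUserId(userId);
  if ((h % 100) / 100 >= trafficFraction) return 0; // 2c: not included -> original
  return h % variationCount;                        // 2d: deterministic choice
}

// Same user always gets the same variation:
const v1 = chooseVariationServerSide('user-123', null, 2, 1.0);
const v2 = chooseVariationServerSide('user-123', null, 2, 1.0);
console.log(v1 === v2); // true
```

Hashing the user ID (or a cookie value) keeps the choice stable across pageviews without server-side session storage.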
4) Let it Run
Let it run!
● Last step of wizard: Click “Start Experiment”
● Automatically stops when there is a clear winner or at the 3-month maximum
List of Running Experiments:
View of an Experiment:
Once started, you can modify:
● Name of experiment and variation pages
● Percentage of users in the experiment
● If email notifications are sent
● Variations can be disabled.
○ e.g. If you feel one of your variations is clearly underperforming.
○ Will shorten the experiment since more users are split between fewer variations.
● Stop an experiment. Keep in mind it cannot be restarted
5) Analyze Results
It finished! (Or you got tired of waiting…)
Now what?
● Original page only - Users will no longer be presented with variations
● Best variations - If a variation won, modify the site
● If variations existed at different URLs, set up permanent redirects to new page for those who bookmarked a variation
● Run another test! With more insight from that experiment consider running a new one for additional improvement
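For the permanent-redirect cleanup mentioned above, a minimal sketch of the mapping (the paths are hypothetical; the same idea works as .htaccess or web-server rewrite rules):

```javascript
// Map retired variation URLs to the winning page with a 301.
const permanentRedirects = {
  '/contact-variation1/': '/contact/', // hypothetical paths
};

function redirectFor(path) {
  return permanentRedirects[path] || null;
}

// In an http server handler:
// const target = redirectFor(req.url);
// if (target) { res.writeHead(301, { Location: target }); res.end(); }
console.log(redirectFor('/contact-variation1/')); // "/contact/"
```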
Analyzing Beyond Who Won
● Create a segment to analyze behavior of the users who were served a specific variation
● Can use segments to analyze and compare any report in Google Analytics
○ How they used the site: Behavior reports (e.g. Users Flow, Behavior Flow, etc)
○ How they found the site: Acquisition reports (e.g. Channels, AdWords, etc)
More Information: https://support.google.com/analytics/answer/4380401?hl=en&ref_topic=1745208
Wrapping Up
The Preso
Summary
● Experiments allow you to isolate the effect of changes
● Google Analytics uses the A/B/N model
● The process to running an experiment:
1. Plan
a. What variations will be tested?
b. How to evaluate?
2. Create Variations
a. Development
3. Configure Experiment
a. Setup in Google Analytics
b. Configure on site with “four” potential approaches
i. “Manually insert the code”
ii. “Send the code to webmaster”
iii. Javascript API Client Side
iv. Server Side <-- My preference
4. Let it Run
a. Watch reports.
b. Modify if needed, including disabling variations.
5. Analyze Results
a. Put winner in place.
b. Use segments to further analyze.
Critique - Room for improvement!
What changes could be tested in the next experiment to further improve conversions?
Process
1. In small groups brainstorm ideas
2. Share ideas with the larger group
3. As a group select top ideas
4. Define variations
The end!
Until next time