Tips for Making Your Website Load Faster


Website Performance Tips

ISL X Addthis / June 2016

Thanks Justin, I appreciate the introduction.

Let's get started with some tips for improving website performance and load times!

[next slide]

HELLO, internet!
Andrew Krawchyk
Senior Web Developer at ISL
@krawchyk on Twitter
[email protected] on Email
cold-brew coffee

Like Justin mentioned, I'm Andrew Krawchyk and I work at ISL. I'm a Senior Web Developer and way more of a nerd than I should admit especially when it comes to performance.

If you want to reach out to me, I'm @krawchyk on Twitter, but if that's too many consonants you can also reach me by shooting an email to [email protected].

Side-note, I do love cold-brew coffee, so if you mention you'll get me some I'll be sure to respond to you quite quickly.

[next slide]

Today's agenda
Performance What? / Images / Third Party Resources / Caching / Content Delivery Networks / Perceived Performance

Let's go over the agenda for the webinar.

First, we're going to look at an overview of what website performance is, what load time and page weight mean, and most importantly why it all matters for your users.

Then, we'll cover some common areas in website development that are key for reducing load time: Images, 3rd party resources, caching, and content delivery networks. These are the easiest parts of a website to change that can result in dramatic decreases in load time and page weight.

Finally, we'll cover approaches to consider if all else fails. We'll discover how to hide a long wait without making the user really feel like they're waiting on something.

So to get started... [next slide]

Performance What?

ISL x Add This / June 2016

First up, let's take an overview of what I mean when I say "website performance". In the physical world, it's easy to measure performance with things like size, time, and distance. For example, a larger object takes more time to move across a longer distance.

And similar rules apply to the Internet. The bigger a resource is the more your browser needs to download which increases the time it takes to load a website.

[next slide]

Website Performance
How quickly a website responds to web requests.

ISL x Add This / June 2016

So, simply put, website performance is "how quickly a website responds to web requests". Size matters, distance matters, time matters. All of these factors (and more) compound and result in loooooooooooong wait times and slooooooooow loads.

Our job in making websites is to minimize the amount of time it takes to fulfill the user's desire to retrieve content. You want to keep the user's attention, but unfortunately attention spans are shorter than ever.

And the reality is... [next slide]

Nobody likes to wait.

Nobody likes to wait. A user chose to come to your website out of the millions of websites out there. If they have an unpleasant experience on your website, they'll remember that. It will impact repeat visits, if they even decide to come back.

Let's start out with an example to help frame the conversation... [next slide]

Example

http://www.webpagetest.org/

This example was captured by WebpageTest.org which is a fantastic tool for testing website performance.

This is a side-by-side visual comparison of the same website, where only one version has been optimized. Both have identical content and resources, so let's take a look at the visual impact performance optimizations can have on page load.

First on the left is the unoptimized version... [play left video] OK, what did we see there? A blank white screen until all the content finally popped in. That was less than 5 seconds, but the wait still felt uncomfortable and unnatural. When we do things in the physical world we always get a response. Pick up an object, and you can immediately feel its weight in your hand. Touch something, and you can immediately feel its hardness, or temperature, and so on. But this is not the case with the Internet.

The second video on the right is of the optimized website... [play right video] A little bit faster there, about a second faster overall. But there is a very important distinction from the last example. Here, in the optimized version, the content appears faster and the site is more visually complete earlier. This is measured by the "Speed Index", which captures how quickly the critical content becomes visually complete while the rest of the page finishes loading. We'll talk about this more later, so keep it on your radar.

[next slide]

Example

Blank canvas, all content pops in (finally)
Critical content loads early

Now let's look at the play-by-play of the previous comparison videos. This is broken down in half-second increments and it makes the difference between the two examples much more clear. The optimized site displays content as early as 1 second instead of making the user wait almost 5 seconds for feedback that the page is loading.

Also notice in the optimized example the sidebar is loaded last. That's OK because it is a non-critical element of the page. The user is probably not dying to see your page's sidebar, so why bother prioritizing it?

Instead, the header and the article are prioritized. The header is small, but loading even a small, critical part of the website early signals to the user that the page is loading quickly.

[next slide]

Critical Path
Contents of a website to display that relate to the user's current action.

ISL x Add This / June 2016

So, the critical path is the content needed for the user's current action on the site. That content should have the highest priority and be loaded as quickly as possible.

In the optimized example, the article was prioritized, not the sidebar. The user wants to read the article... they clicked to read content, not to see your sidebar.

[next slide]

Need to know

http://httparchive.org/trends.php?s=Top100&minlabel=May+15+2013&maxlabel=May+15+2016

Let's switch it up and take a look at some data gathered by HTTPArchive.org. These two graphs show recent trends in website page weight and composition of the top 100 websites on the internet (Alexa list).

The first graph on the left shows how the average page weight has increased over the past five years. It also shows a corresponding increase in image size over the same period. Images are one of the most commonly forgotten resources to optimize, even though they are comparatively easy to optimize.

The second graph on the right shows how different resource types make up the contents of these websites in the past year. Images take the largest slice at 68% of total page weight, and coming in second are scripts with a hefty 22%.

[next slide]

Need to know

http://engineroom.ft.com/2016/04/04/a-faster-ft-com/

Here's another graph of engagement related to wait time.

Earlier this year, the engineers at the Financial Times published an article about an experiment they ran to see how page load times impacted return on investment. They purposefully imposed increases of 1-5 seconds of load time on their users to test the impact on their browsing sessions.

It shows a unified look at the engagement these users had with the website in pages per session. There's a drop in engagement for users visiting more than 2 pages. That's abandonment. That's fewer ads seen per user, fewer subscriptions per user, and potentially fewer return visits per user.

Straight up, waiting leads to abandonment.

[next slide]

Need to know

Nielsen Norman Group identified three important limits for response times (https://www.nngroup.com/articles/response-times-3-important-limits/):
1/10 Second: Users feel they are directly manipulating the interface
1 Second: Users feel they are freely navigating the website
10 Seconds: Users feel lost, struggle to keep attention on the task

To help quantify this, let's look at some critical limits in a user's experience on a website as defined by the Nielsen Norman Group. At each of these limits, a user experiences a different feeling. At 1/10th of a second the user feels the page is reacting instantly and they are directly manipulating the interface. At 1 second, the user's thought process remains uninterrupted and they feel they are navigating the website with ease.

However, the closer the wait approaches the 10 second threshold of keeping a user's attention on a task, the more likely the user is to abandon the task. Anything slower than this needs an alternative way to handle the wait, either with UI indicators like spinners or progress bars (showing the user the remaining time is great), or even by sending them a notification once their task has completed. This frees up the user to concentrate on something else besides their frustration :)

[next slide]

Butter
Butter is that feeling you get when everything goes so smoothly, you don't even notice it.

ISL x Add This / June 2016

Butter is a silly analogy I like to use for relating performance to users. It's like when you're cooking and you're in the middle of a recipe that needs butter, and then you hit a giant hiccup because you realize you don't have any. Like performance, you don't want to run out of butter when you need it the most.

(You can also say it's like plumbing, you don't realize it's there until it's broken or missing. Except when you make some performance improvements, 'Butter!' is a lot more fun to shout)

Butter is what users should feel on a website, not frustration. Their experience should be smooth and seamless. A user's taps and clicks should feel like natural extensions of their bodies as they get immediate feedback from a device.

[next slide]

let's make the web fast!

So all in all, let's make the web fast! Waiting affects a user's ability to accomplish a task and it leads to user abandonment (fewer subscriptions, fewer ad views). Ultimately, performance shapes a user's perception of a website or even a brand as a whole. When a user thinks of my website or brand, I'd want to be remembered as being fast!

[next slide]

Images

ISL x Add This / June 2016

OK, so now that we have an overview of website performance and load time, let's talk shop.

First up, how to optimize images to improve performance on websites. Like we saw in the graphs before, images are commonly the largest slice of the pie when it comes to resources on a webpage. That means they can hold the most opportunity for improvement in performance.

But before we do that... [next slide]

Image Type
Bitmap images are most common, but vector images are small and scalable.

ISL x Add This / June 2016

We need to discuss the differences between types of images. There are two big players in the image arena, bitmap and vector images. Both are great for different purposes so you'll need to learn when to use each.

Bitmap images have been around the longest and are the most familiar, with well-known file formats like JPG, PNG, and GIF.

Vector images are newer to the scene but are fantastic for images on the web. The most common format is SVG, which has been a web standard since 2001, so it's very well supported.

Let's look at the difference between the two... [next slide]

vs.

On the right, we have a bitmap image representation of PacMan. The grid is 18x18, where each square represents a single pixel, so there are a total of 324 pixels in the image. Each square must save the image data of that single pixel, so for larger and larger images you need to save more and more pixels. That means bigger files!

Also, if you enlarge this image you don't get new pixels for free. Imagine there were 2 pixel margins between each pixel here. You'd have to fill in the empty spaces to make the image look seamless, so the end result looks jagged and aliased instead of smooth and round.

With SVG, you don't have to worry about either of these drawbacks. SVG files are just XML-like text files (think HTML, similar to HTML tags but for shapes) that instruct the browser how to draw shapes, paths, colors, gradients, etc. This means the browser can just increase the inputs to those instructions to enlarge without any quality loss. It's all just math.
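To make that concrete, here's a small sketch in TypeScript; the `pacmanish` circle is just an illustrative stand-in, but it shows that an SVG is nothing more than drawing instructions written as text, which the browser can re-run at any size.

```typescript
// Sketch: an SVG is just markup text describing shapes, not a grid of pixels.
const pacmanish = `
  <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 100 100" width="100" height="100">
    <circle cx="50" cy="50" r="40" fill="gold" />
  </svg>`;

// Rendering the same text at 10x the size costs no extra bytes and loses no
// quality: the browser simply redoes the math.
document.body.insertAdjacentHTML(
  "beforeend",
  pacmanish.replace('width="100" height="100"', 'width="1000" height="1000"')
);
```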

So what about optimizing... [next slide]

Optimizing Images
Removing unneeded image data (lossy) to reduce the size of the resource.

ISL x Add This / June 2016

Image optimization works on both bitmap and vector image types. Unfortunately, to properly optimize images for websites, you need to use "lossy" compression.

Lossy compression reduces quality of images by removing image data. That means each time you optimize an image it will decrease in quality. But, this can dramatically decrease the file size of images, so it's necessary to find a quality/file-size balance.

So, let's see what options we have for compressing images... [next slide]

This is a purposefully over-optimized JPG image. Its dimensions were reduced by half and it was exported at 10% quality.

It doesn't look terrible, but it's pretty bad. We can still understand the image, though we probably wouldn't want to use it fullscreen; I'm using it here just as an example.

[next slide]

Quality Choices

Original: 14MB
Optimized: 172KB

https://www.flickr.com/photos/78948650@N08/17388497476/

So, what's the deal with that image and why does it look so bad? Well here's a side-by-side comparison of the optimized image next to its original. Now, I definitely over-optimized this and it shows in the poor quality of the resulting image. But, notice how much of a difference the optimization made on the file-size. The optimized image is 1% the size of the original. There's a very clear lack of image detail, but it's microscopic in size comparatively. You wouldn't want a user to load the original 14MB over 3G, that's for sure.

[next slide]

pro tips

Always request full-quality source assets
Determine actual size of image
Resize or crop unnecessary image size
Aggressively optimize images
TinyPNG.com / TinyJPG.com
gifsicle, jpegtran, optipng, pngquant

https://tinyjpg.com

Here's a few tips for getting the best results out of your optimized images. First, always request full-quality source assets from wherever you get your images. That way, if you over-optimize your images you can always cut a new one, and if you under-optimize them you won't have to suffer the extra quality loss of re-optimizing a previously optimized image.

Second, determine the actual size of the image on the page where you'll be using it. After you've done that, resize or crop away any unnecessary image size to reduce the amount of pixels.

And last, always aggressively optimize images. There's no silver-bullet quality level to choose since all images will optimize differently. But I like to start at 60% and work my way down until I can't stand the quality loss. Most of the time, your users aren't going to be putting your images under a microscope anyway.

Also, if you'd like to automate this process, TinyPNG and TinyJPG are essentially the same tool, and they work great! And the CLI utilities listed on the slide (gifsicle, jpegtran, optipng, pngquant) can be used in scripts to run on deployments and such.
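As a rough sketch of what that automation could look like, here's a small Node/TypeScript script that shells out to pngquant, one of the CLI tools from the slide. The file names and quality range are placeholders, not recommendations, so adjust them for your own pipeline.

```typescript
// Sketch of automating bitmap optimization in a build or deploy script.
// Assumes pngquant is installed on the machine running the script.
import { execFile } from "child_process";

function optimizePng(input: string, output: string): void {
  // pngquant performs lossy PNG compression within the given quality range.
  execFile(
    "pngquant",
    ["--quality=60-80", "--force", "--output", output, input],
    (err) => {
      if (err) {
        console.error(`pngquant failed for ${input}:`, err);
      } else {
        console.log(`Optimized ${input} -> ${output}`);
      }
    }
  );
}

// Hypothetical usage; wire this into your deployment step.
optimizePng("hero.png", "hero.min.png");
```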

[next slide]

Vector (SVG)

https://github.com/SVG-access-W3CG/use-case-examples/blob/gh-pages/refMapWithScatter.svg

Scalable to any size
No quality loss when scaled
Use SVGOMG to optimize: https://jakearchibald.github.io/svgomg/
Compress better than bitmap (gzip)

Vector images are quite different from bitmap images though. Like I mentioned before, they are scalable to any size and don't suffer from loss in quality from scaling. Optimizing them can be tricky because of the many different possible optimizations. I use SVGOMG which has a nice interface for showing how the different optimizations affect my image.

SVG also compresses better than bitmap images because SVG files are just text and can take advantage of GZIP compression.
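To make that claim a little more concrete, here's a tiny sketch you can run under Node; the markup is a trivial stand-in, but it shows how much repetition GZIP can squeeze out of SVG text.

```typescript
// Sketch: because SVG is plain text, gzip finds plenty of repetition to remove.
import { gzipSync } from "zlib";

// A stand-in SVG with lots of repeated markup, as real icons often have.
const svg = `<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 100 100">
  ${Array.from({ length: 50 }, (_, i) => `<circle cx="${i}" cy="${i}" r="2" fill="teal" />`).join("\n  ")}
</svg>`;

const compressed = gzipSync(Buffer.from(svg));
console.log(`raw: ${svg.length} bytes, gzipped: ${compressed.length} bytes`);
```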

[next slide]

Vector vs Bitmap

SVG is great for geometric shapes and text, as opposed to bitmap images

If we look at the same image side-by-side in SVG vs a bitmap format, we can see the differences in their use cases. SVG images are great for geometric shapes and displaying text, while bitmap images are not suited for this. The bitmap's lines are much blurrier, and its gradients are much less smooth than those of the SVG image.

However, it would surely be impossible to use SVG to reconstruct the palm tree photo we saw earlier. So, when choosing a format, consider whether the image is photography to help determine what type of image to use.

[next slide]

Reduce image weight by X%

The ultimate goal is to reduce the image weight of your webpages by as much as possible. The amount will depend on the number of images on your site, so fill in whatever number makes sense for you here.

One thing to remember is the easiest way to reduce image weight is to remove the image. An image can't increase load time if it isn't there. Really consider if the images are necessary and always optimize them as much as possible.

[next slide]

3rd Party Resources

ISL x Add This / June 2016

Alright next up let's talk about 3rd party resources and how they impact website performance. Most 3rd party resources come with some JavaScript snippet to insert into your website. They enhance your website, adding features like analytics, ads, support, notifications, and share buttons. These are great since they don't require a lot of effort to implement, but they can introduce performance issues.

[next slide]

Out of your control
3rd Party Resources provide additional functionality on your website, but cost users time.

ISL x Add This / June 2016

The biggest drawback to adding 3rd party content to your website is the lack of control over this content. When you put the resource on the page, you agree to ship anything that vendor chooses to the user. This means if your website has 3 images on it, and the 3rd party widget has 11, the user will be forced to download all 14 images.

Hopefully this is a contrived example, but images are not the only concern. Certain scripts can "block" the page from loading, forcing the browser to wait until the script has finished downloading before it can continue rendering the rest of the page. Other scripts are not optimized and could be much, much larger than they need to be. It's not always possible to fix these upstream issues because you probably don't have access to the scripts.

[next slide]

http://www.nytimes.com/2016/06/10/science/bees-asexual-south-africa.html

Here's an example of a popular newspaper website, NYT. Just by looking at the top part of the screenshot, you can tell there's plenty of 3rd party content on the page. I see 2 ads and some share buttons, and that's just above the fold.

Diving under the hood, we can see the network tab of the Chrome developer tools open here. The big green arrow is where the `DOMContentLoaded` event is fired, which means the HTML has finished loading and parsing without waiting for any other resources to finish. That HTML includes the critical article content, which loads in under 1s. Pretty good, that has to feel nearly instant for the user!

The big red arrow is when the `document load` event is fired, which means the page and all its resources have finished loading. That's over 5 seconds from where the content first loaded for the user. That's not terrible, but it's important to remember there are competing interests here.
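If you want to see those two milestones on your own pages, a small sketch like this one (plain TypeScript using only standard browser APIs) will log them to the console:

```typescript
// Sketch: log the two loading milestones discussed above.
document.addEventListener("DOMContentLoaded", () => {
  // The HTML has been parsed; critical content can already be on screen.
  console.log(`DOMContentLoaded at ${performance.now().toFixed(0)} ms`);
});

window.addEventListener("load", () => {
  // Every image, stylesheet, script, and 3rd party resource has finished.
  console.log(`load at ${performance.now().toFixed(0)} ms`);
});
```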

[next slide]

Here's the same page at the green arrow, and notice we don't see any of the 3rd party content compared to the fully loaded page. And we shouldn't because it's not critical content.

You get to decide what the critical content is for your users, but ensure it matches a user's expectations. You don't want to stop the page from appearing to load by blocking with scripts. This goes for all scripts, including web font loading scripts and social network scripts. Fortunately, a lot of companies like AddThis take performance seriously and provide non-blocking ways to load their scripts.

[next slide]

Performance Budget

Under Budget: Add feature
Over Budget: Optimize existing features to make room / Remove low priority feature from the page / Don't add feature

Determine page load and performance goals
Measure the performance impact new content/features have
Repeat for each new feature

So when should we add 3rd party content? Well, performance budgets are a great way to help balance user needs with business needs when it comes to external scripts. For example, if you have a goal of your page being visually complete within 2 seconds, you might have to sacrifice some 3rd party features to accomplish this. Instead of loading ads in the critical area of the page, you'll have to display them after the user scrolls. That might mean adjusting more than just scripts; design may need to be taken into account as well. But that way, the script isn't needed during the initial page load and can be loaded later instead.

[next slide]

Pro Tips

Make a performance budget for your website
Speed Index: 1000 for cable, 3000 for 3G
LoadJS fetches files asynchronously (without blocking page rendering): https://github.com/filamentgroup/loadJS
Test your pages with content blockers and throttled internet

3rd party scripts can also change without your knowledge so keep that in mind when monitoring for performance changes. A good way to start is by making a performance budget around the Speed Index metric I mentioned earlier. An ideal target for cable connections would be 1000, and 3000 for a 3G connection. These aren't easy targets, so use what makes sense for you but be ambitious!

A lot of 3rd party vendors will provide instructions for asynchronously loading their scripts, including AddThis. If they don't, or you just want a little more control over the loading, you can use the LoadJS function by the Filament Group. This function always loads scripts in a non-blocking way, and allows you to conditionally load scripts however you please.
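Here's a minimal sketch of what that might look like, assuming the loadJS helper is already on the page; the widget URL is just a placeholder.

```typescript
// Sketch of loading a 3rd party script without blocking rendering.
// Assumes Filament Group's loadJS helper is available as a global.
declare function loadJS(src: string, callback?: () => void): void;

// Only fetch the widget once the page itself has finished loading.
window.addEventListener("load", () => {
  loadJS("https://example.com/widget.js", () => {
    console.log("widget script loaded without blocking the initial render");
  });
});
```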

Also, make sure to test your pages thoroughly as your users would use them. A lot of users are still without true broadband, so make sure to test on slow connections to see how slow script loading impacts your website. And make sure your page is not dependent on 3rd party scripts, because users will inevitably block 3rd party content with ad-blockers and the like.

[next slide]

Only Load critical scripts.

Moral of the story is, you never know what a vendor will ship in their scripts that you add to your website. You want to balance user need with business needs and only load critical content first.

Caching / CDN

ISL x Add This / June 2016

So far we've looked at two ways to decrease page weight and load time by reducing image size and 3rd party content. Now, let's see how caching resources and content delivery networks benefit the user. I'm going to refer to content delivery networks as C-D-N from here on out since it's a mouthful.

[next slide]

Browser Cache
Temporary storage of web resources used to reduce response time.

ISL x Add This / June 2016

Let's start with caching. Every time you visit a webpage, your browser needs to download resources to display the content. Nothing new, we've been talking about loading resources this whole time. But chances are that while you're browsing the site, its resources won't change, and if they do, the changes will be piecemeal. It'd be rare to be browsing and then suddenly see, on the very next page, a brand new redesign of the site with completely different assets!

So, since these resources we've downloaded probably won't be stale anytime soon, the browser keeps them in a cache to be used again. Instead of needing to download the same resource twice, we can use the cached version and save a lot of loading time!

[next slide]

how it works

Request / Response (first visit)

Request / Cached (repeat visit)

Here's a simple example of how a browser handles a request to the same page on a repeat view. The first time we visit, in the top half, the browser sends a request to the server to retrieve the resources on the page. That dotted line in the middle represents the internet, or where we need to reach out to get content.

With the repeat visit on the bottom, we don't need to request those resources again. We just browsed to the same website, so the resources probably haven't changed since our last page view. So instead of getting the assets from the server, we just use those stored in our browser's cache. This can greatly reduce load times for subsequent views on any website.

[next slide]

pro-tips

Identify static content for caching: Images, Video, Text files
Set long Cache-Control max-age headers
Cache-bust updated static assets with versioned hashes
Automate it!
https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/http-caching

Setting up caching on a website is more technical than the previous tips I've discussed. But, there's a few simple rules to remember when caching. First, only static content is suitable for being cached. If you were to cache a user's dashboard page, any other user that browses to the same url will see that user's dashboard, not their own! So, it's a good idea to limit caching to resources you know are going to remain static for a long period of time. Resources like images, video, and text files are great candidates to be cached.

Second, you'll need to set Cache-Control max-age headers on the responses your server sends to browsers. I've highlighted this in the image to the right from the Chrome Developer Tools network tab. The max-age is set to roughly 31 million seconds, which is about a year. So, assuming the files remain unchanged and the cache hasn't been cleared, a user can browse this website for the next year without needing to request new resources from the server! Notice the Status Code is `304 Not Modified`, which means we didn't need to download the resource again since it hasn't changed.

Third, make sure to cache-bust your assets with versioned hashes. This way, if you do need to ship updated files to a user, they are guaranteed to download fresh updates from the server since the filename is different. This is highlighted in the top of the image.

A best practice is to automate all of this. There are Grunt and Gulp workflows out there for this, and web frameworks like Rails and Django also have techniques for caching and cache busting.
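As a rough illustration of the header and cache-busting ideas above, here's a sketch using Express as an example server; the paths and durations are placeholders, and in practice the hashed filenames come from your build tool.

```typescript
// Sketch: long cache lifetimes for hashed static assets, no caching for
// per-user pages. Express is used here only as an example server.
import express from "express";

const app = express();

// Serve hashed static assets (e.g. app.3f2a9c.js) with a one-year max-age.
app.use(
  "/static",
  express.static("dist", {
    maxAge: "365d",   // becomes Cache-Control: max-age=31536000
    immutable: true,  // hints that a hashed file will never change
  })
);

// Dynamic pages stay uncached so users never see someone else's dashboard.
app.get("/dashboard", (req, res) => {
  res.setHeader("Cache-Control", "no-store");
  res.send("per-user content");
});

app.listen(3000);
```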

[next slide]

Cache everything.

Cache everything. OK, maybe not everything... Like I said, dynamic content is not suited for caching. But, there's just no excuse not to cache every static resource on your page. Even if your initial page load is abysmal, caching can save you from user abandonment since repeat page views will be speedy!

[next slide]

Content Delivery Network (CDN)
Network of servers distributed over the globe to serve content to users at high availability.

ISL x Add This / June 2016

Now let's talk about CDNs. So far we've been talking mostly about size and time with respect to website performance. However, distance is a big factor in the time it takes a server to deliver a response to a browser. CDNs reduce the distance between your website's resources and your users by distributing the resources across the globe. Instead of a user in Asia needing to request an asset from a data center in Virginia, they can go to a closer CDN server in Asia for the same resource.

This is also more technically involved to set up, but there are many CDN providers that have step-by-step instructions for how to configure them with your website today.

[next slide]

To help me visualize this, I like to think of a pond with a bunch of lily pads in it. If there's food or a treat we want on the lily pads, we'll likely need to hop across them to get to it. In this case, we only want to jump to the pads that are closer to us to avoid extra time waiting for our feast.

It's similar with websites: we want to deliver that resource as fast as possible to the user, which means using the closest server that has that asset.

[next slide]

Need to know

The further away your users are from your servers, the slower the website will respond.
Using a CDN in conjunction with long cache headers is best.
AWS CloudFront is free up to 50GB of data transferred or 2 million requests.

https://aws.amazon.com/cloudfront/

CDNs make the heavy lifting of spreading your resources across the world a breeze. AWS CloudFront has servers in the US (21), Europe (16), Asia (14), Australia (2), and South America (2). It's never been easier to distribute static resources to speed up a user's experience.

To get the maximum benefit of a CDN, you'll want to use both caching and a CDN in conjunction so the initial request is speedy as well as all subsequent requests.

[next slide]

Always use a CDN.

Key takeaway, just like caching everything possible you should always use a CDN where possible. They're relatively cheap, even free like we just saw with AWS, and the performance benefits for the user can be huge.

Perceived Performance

ISL x Add This / June 2016

OK, so we've covered 4 distinct ways to decrease load times and page weight on websites. But, what if you try all of these and you still can't seem to meet your performance budget? You'll have to get a little creative :)

Perceived performance is how quickly a feature appears to perform its task. This can be accomplished in different ways, so let's look at a few approaches.

[next slide]

We're all just Sheep
Users won't feel something they don't notice.

ISL x Add This / June 2016

It's important to know that just because you're aware of a heavy resource or a long-running task doesn't mean the user needs to feel it. Perceived performance is all about hiding wait time from the user instead of making them wait for the task to complete.

Especially when the task falls into the 10 second or greater bucket, the user can still get the impression of a smooth interaction if the wait is managed appropriately.

[next slide]

Low Quality Image Placeholders (LQIP)

Start Small, Get Big
Prepare low quality images
Load low quality images first
Lazy-load the high quality images
Fade in the new image quickly
4.6 KB vs. 68 KB
https://github.com/aFarkas/lazysizes/tree/2.0.0#lqipblurry-image-placeholderblur-up-image-technique

Using low quality image placeholders requires a bit of orchestration. You need to prepare separate low-quality copies of your image in order to first serve these to the user. Although the initial image is very poor quality, it still captures the essence of the image. You still get green and orange colors, and some abstract trees.

This can greatly reduce the load of a page while still giving the appearance of it being populated with content. In the images in this slide for example, imagine we had a gallery page with 24 images on it. If they were all high-quality thumbnails initially, that would be 24 × 68KB ≈ 1.6MB of image thumbnails for the initial page load. Even half that is a lot of initial weight. Instead, we can use low quality images, which would only use about 110KB of initial page weight for the same number of images. That's 93% less image weight!

Once the page has been loaded, our lazy-loading script then kicks off the loading of the replacement high-quality images. You'll want to use subtle and fast CSS transitions to style the loading with a fade so there isn't a popping-in appearance. It's also a good idea to provide no-js fallbacks with `noscript` tags. This technique is detailed in the lazysizes library I like to use.
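Here's a minimal sketch of the blur-up swap itself, independent of any particular library. It assumes hypothetical markup like `<img class="lqip" src="low.jpg" data-src="high.jpg">` and a CSS transition keyed off an `is-loaded` class; lazysizes handles far more edge cases than this.

```typescript
// Sketch: swap a low quality placeholder for the full image after page load.
function upgradeImage(img: HTMLImageElement): void {
  const fullSrc = img.dataset.src;
  if (!fullSrc) return;

  const loader = new Image();
  loader.onload = () => {
    img.src = fullSrc;              // swap in the high quality version
    img.classList.add("is-loaded"); // CSS handles the quick fade-in
  };
  loader.src = fullSrc;             // kicks off the background download
}

// Upgrade placeholders only after the page itself has finished loading.
window.addEventListener("load", () => {
  document.querySelectorAll<HTMLImageElement>("img.lqip").forEach(upgradeImage);
});
```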

[next slide]

Just In Time (JIT)

Take advantage of idle time
Wait for the user to interact with the page.
If that interaction needs another resource, use idle time and load the resource in the background.

Here's an example from a website I work on. We have a user registration form and we want to make sure the user considers the strength of their password during registration. Notice the password fields are far down in the form. This means we don't need to immediately load any of the password strength checking scripts. Good thing too, because the library we used weighs in at 320KB so it's great to be able to delay that heavy resource.

Instead of loading it immediately, we wait for the user to focus a field in the form. The likelihood of a user focusing one of the password fields first is slim, so only after the user indicates they'll fill out the form do we load the resource with the LoadJS function mentioned earlier.
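A sketch of that idea, using the loadJS helper from earlier; the form selector and script URL are placeholders, not the actual site's code.

```typescript
// Sketch: just-in-time loading of a heavy password-strength library.
declare function loadJS(src: string, callback?: () => void): void;

let strengthLibRequested = false;

const form = document.querySelector<HTMLFormElement>("#registration-form");

// focusin bubbles, so any field in the form gaining focus triggers the load.
form?.addEventListener("focusin", () => {
  if (strengthLibRequested) return;
  strengthLibRequested = true;
  loadJS("/js/password-strength.js", () => {
    console.log("password strength checks ready");
  });
});
```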

[next slide]

Alternative Feedback

Acknowledge the wait for the user
If you know a task is long, decrease the delay between a user's request and visual feedback.
Free up a user's attention by providing alternative notifications, e.g. email.

And finally, if we need to do some long-running task, make sure to acknowledge that for the user. They'll understand that the work is continuing to process, and they can free their attention to focus elsewhere. This leads to more productivity for the user and no frustration from staring at a spinner on the page. All Mac users know how annoying it can be to watch the rainbow pinwheel spin spin spin!

[next slide]

Hide the Wait.

Hide the wait. Your users will thank you!

And with that... [next slide]

thank you!

I want to thank everyone for participating! I hope everyone is more equipped with knowledge to improve performance on their websites.

Also big thanks to AddThis for hosting the webinar!

I'll hand it back over to Justin to start off the Q&A.