Caching for the Mobile Web and Macklemore's Thrift Shop
DESCRIPTION
From the New York Front End Coders Meetup on Sept 13, 2013. Ninjas and rockstars are poor role models. This talk is about caching for the mobile web.
TRANSCRIPT
Sometimes I make poor decisions
The rest of the time
● I work on shutterstock.mobi
● Eventually needs to handle 100k visits per day around the world
● Uses a handful of web servers
● That means serious caching
Caching means better performance for users and servers
1. Make fewer requests
2. Fetch smaller resources
3. Travel shorter distances
4. Use generic pages that are cacheable
Don’t kill the servers. Do serve fast pages.
Start by hunting for easy wins
● Use YSlow and PageSpeed to find ways to make fewer requests and fetch smaller resources
● Aim for scores in the high 80s to low 90s
● Minifying, concatenating, and gzipping is only the first step
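That first step can be as small as a few server directives. A hedged sketch of gzipping text responses in nginx (values and types are illustrative, not shutterstock.mobi's real config):

```nginx
# Sketch: compress text responses at the server.
gzip on;
gzip_comp_level 5;                                          # balance CPU vs. size
gzip_types text/css application/javascript application/json;
gzip_min_length 1024;                                       # tiny responses aren't worth compressing
```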
I’m, I’m hunting, looking for a come up, this is f*cking awesome
HTTP basics are important too
● HTTP is a core protocol of the internet
● HTTP headers determine how content will be cached on the browser AND along the network
● If HTTP headers are set properly the browser will travel shorter distances
Imma take your grandpa’s style
Caching involves more than a browser and a server
[Diagram: Web Browser → Mobile Device w/ Radio → Wireless Towers → Internet Routers → Optional CDNs → Reverse Proxies → Web Servers, with scale labels: billions, millions, 100’s, 100k’s, a few, a handful, a few]
Peep game, come take a look through my telescope
If headers are set correctly, the network serves the pages
This isn’t beer or basketball. Fewer hops is better.
[Diagram repeated: Web Browser → Mobile Device w/ Radio → Wireless Towers → Internet Routers → Optional CDNs → Reverse Proxies → Web Servers]
A brief example
● Cache-Control: public, max-age=31536000
● Date: Tue, 17 Sep 2013 01:59:24 GMT
● Expires: Wed, 17 Sep 2014 01:59:24 GMT
● Server: nginx
● ETag: "151472-1379361363000"
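Those fields are internally consistent: max-age=31536000 is exactly one 365-day year, and the Expires stamp is the Date stamp plus that interval. A quick check in plain JavaScript, using just the example values above:

```javascript
// Verify that Expires = Date + max-age for the example headers above.
const date = new Date("Tue, 17 Sep 2013 01:59:24 GMT");
const expires = new Date("Wed, 17 Sep 2014 01:59:24 GMT");
const maxAgeSeconds = 31536000; // from Cache-Control: max-age=31536000

const diffSeconds = (expires - date) / 1000;
console.log(diffSeconds === maxAgeSeconds); // true: 365 days * 86400 s/day
```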
Savin' my money and I'm hella happy that's a bargain, b*tch
Keep pages generic where possible
● Separate user-specific content from public pages at the request level to make pages generic and cacheable
Dressed in all pink except my gator shoes, those are green
[Screenshot: page with regions labeled Generic and User-Specific]
The overall effect
Generic pages are loaded from the network. User-specific content follows from the web server via AJAX.
[Diagram repeated: Web Browser → Mobile Device w/ Radio → Wireless Towers → Internet Routers → Optional CDNs → Reverse Proxies → Web Servers]
● Any request caches generic pages in the network, creating a pull effect
● The more users there are, the faster the site appears to respond
● For layering in user-specific content via AJAX:
○ Use different domains
○ Use the same domain and scope user-specific content (and cookies) to a subdirectory
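A minimal sketch of the subdirectory approach (the /user/ path, endpoint, and element id are assumptions for illustration, not shutterstock.mobi's actual API):

```javascript
// Sketch: keep the page generic and cacheable, then layer in
// user-specific content after load. Scoping user content (and its
// cookies) under /user/ keeps cookies off requests for generic pages.
function userContentUrl(fragment) {
  return "/user/" + encodeURIComponent(fragment);
}

// Runs in the browser after the cached generic page renders.
function loadUserContent(fragment, targetId) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", userContentUrl(fragment)); // hypothetical endpoint
  xhr.onload = function () {
    document.getElementById(targetId).innerHTML = xhr.responseText;
  };
  xhr.send();
}
```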
More generic pages means more caching
Don’t send cookies with generic pages. EVER.
● Sending cookies with generic pages signals that the response is not generic and should not be cached
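One way to enforce this at a reverse proxy, sketched in nginx (the upstream name and paths are assumptions):

```nginx
# Sketch: keep cookies off generic pages so caches can store them.
location / {
    proxy_pass http://app_servers;
    proxy_hide_header Set-Cookie;     # never emit cookies with generic pages
    proxy_ignore_headers Set-Cookie;  # don't let upstream cookies block caching
}

location /user/ {
    proxy_pass http://app_servers;    # user-specific: cookies allowed, not cached
}
```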
Having the same one as six other peoplein this club is a hella don’t
Everyone go visit TMZ right now
Raise your hand if you can identify why it’s slow in this diagram
[Diagram repeated: Web Browser → Mobile Device w/ Radio → Wireless Towers → Internet Routers → Optional CDNs → Reverse Proxies → Web Servers]
If the first page load is slow in the forest, and no one is around
● On the first page load, mobile network conditions vary too much to tell whether the site itself is slow
● I submit: don’t optimize the first page load
● I submit: the most important page load is the second
What you know about rockin' a wolf on your noggin?What you knowin' about wearin' a fur fox skin?
Optimize for the second load
● Serve all JS/CSS/Icons for the site on every page with a year-long cache expiry
● First Load: 297.3K 16 requests
● Homepage: 7.7K 1 request
● Login: 5.4K 1 request
● Search: 28.0K 1 request + images
As for performance impact:
Browsers get better all the time, the network doesn’t
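For reference, the year-long expiry used above can come from a single nginx directive (a sketch; /static/ is an illustrative path, not the site's real layout):

```nginx
# Sketch: long-lived caching for the site's JS/CSS/icons.
location /static/ {
    expires 1y;   # emits Expires and Cache-Control: max-age=31536000
}
```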
Questions?
Special shout out to @ShutterTech! Keepin’ me honest since 2011