In Optimizing Optimizely Part 1 we uncovered how the implementation of the snippet impacts customer experience and discussed best practices for implementing the Optimizely snippet. In this article we look at how to optimize further and improve web performance for client-side, in-browser implementations.
Sizing the Optimizely snippet
Part 1 covered how long the snippet can take to download and be processed by the browser. A key tenet of web performance is 'only download what the web page needs', as this conserves both network bandwidth and browser CPU.
Figure 1, based on HTTP Archive data, shows a wide range of deployed snippet sizes, from 60 KB to almost 400 KB. These compressed values are large downloads in their own right, but they must also be decompressed and processed by the client CPU: even a 147 KB gzipped download (90th percentile) expands to 688 KB of code to be parsed and executed.
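You can measure this gap on your own pages with the browser's Resource Timing API, which exposes both the transferred (compressed) and decoded sizes of each resource. The sketch below assumes entry objects with the `encodedBodySize` and `decodedBodySize` fields of `PerformanceResourceTiming`; in a browser you would obtain real entries via `performance.getEntriesByType('resource')`.

```javascript
// Flag resources whose decoded (parse/execute) size exceeds a byte budget.
// The entry shape mirrors PerformanceResourceTiming fields.
function findHeavyScripts(entries, decodedBudgetBytes) {
  return entries
    .filter((e) => e.decodedBodySize > decodedBudgetBytes)
    .map((e) => ({
      name: e.name,
      compressionRatio: e.decodedBodySize / e.encodedBodySize,
      decodedKB: Math.round(e.decodedBodySize / 1024),
    }));
}

// Example using the figures above: a 147 KB gzipped snippet that
// decompresses to 688 KB (the URL is illustrative only).
const heavy = findHeavyScripts(
  [{
    name: 'https://cdn.optimizely.com/js/XXXXXXXX.js',
    encodedBodySize: 147 * 1024,
    decodedBodySize: 688 * 1024,
  }],
  300 * 1024 // 300 KB decoded-size budget
);
```

A gzip ratio of roughly 4-5x is typical for minified JavaScript, so a compressed size alone understates the CPU cost of parsing and executing the snippet.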
As Optimizely X does not have jQuery as a prerequisite, jQuery no longer needs to be loaded either before or as part of Optimizely. At a minimum, this saves over 30 KB of gzipped and minified jQuery code that would otherwise have to be loaded and executed.
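Before dropping jQuery, it is worth auditing existing experiment code for references to it. The helper below is a hypothetical sketch, not part of the Optimizely API: a naive string scan that can produce false positives (for example, `$` used by another library), so treat matches as candidates for manual review.

```javascript
// Naive scan for jQuery usage in experiment source code.
// Hypothetical helper for auditing, not an Optimizely API.
function referencesJQuery(source) {
  // Matches the global `jQuery` name or a `$(` call.
  return /jQuery|\$\s*\(/.test(source);
}

referencesJQuery("$('.banner').hide();");                              // true
referencesJQuery("document.querySelector('.banner').hidden = true;");  // false
```

Experiments that still match can be rewritten against native DOM APIs (`querySelector`, `classList`, and so on) before jQuery is removed from the snippet.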
React Has Overheads Too!
The removal of jQuery offers improved web performance for Optimizely users but this gain is being negated by the adoption of React.js, or similar, in Optimizely snippet development.
There is no doubt that React can deliver a productivity gain for development engineers, as they can develop and test in Optimizely before consolidating new features into the codebase with minimal code change.
However, from a web performance perspective, this approach has the same negative impact as jQuery.
These delays can be similar to those observed with jQuery (as discussed in Part 1). Unfortunately, legacy experiments may still require jQuery to be present, adding further web performance delays until jQuery is completely removed.
Secondary Page Loading
The benefits of caching data are well known, but when running experiments, caching snippets for too long may result in the experiment becoming stale and out of date. Optimizely's time-to-live (TTL) feature provides a dynamic caching facility to control this problem: the validity of the snippet is re-checked once the TTL has elapsed. Within that window, the website visitor benefits from a cached version of the snippet that is reused as they move from page to page on the site.
The default TTL is 2 minutes, which may result in repeated downloads of the snippet if the session lasts longer than this and extends across several pages.
An alternative is to extend the TTL to a value just longer than your average session length. Options are limited, as the TTL can only be set in 5-minute intervals, so most websites will benefit from a setting of 5 or 10 minutes, unless the snippet changes more frequently than that. However, beware of making this setting too long: the cache may then not expire for some time, leaving website visitors unable to benefit from any updates that you implement.
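To reason about the trade-off, you can estimate how often the snippet must be re-validated during a typical session. The model below is a simplification (evenly spaced page views, a check whenever the cached copy is older than the TTL), not Optimizely's exact caching behaviour, but it illustrates why a TTL close to the session length cuts repeat downloads sharply.

```javascript
// Estimate snippet re-validations across a session, assuming page views
// are evenly spaced and a check happens whenever the cached copy is older
// than the TTL. A simplified model, not Optimizely's exact behaviour.
function snippetRevalidations(sessionMinutes, pageViews, ttlMinutes) {
  const gap = sessionMinutes / pageViews; // minutes between page views
  let sinceCheck = 0;
  let checks = 1; // initial download on the first page
  for (let i = 1; i < pageViews; i++) {
    sinceCheck += gap;
    if (sinceCheck >= ttlMinutes) {
      checks += 1;
      sinceCheck = 0;
    }
  }
  return checks;
}

// A 12-minute session spread over 6 pages:
snippetRevalidations(12, 6, 2);  // default 2-minute TTL: 6 checks
snippetRevalidations(12, 6, 10); // 10-minute TTL: 2 checks
```

In this illustrative session, raising the TTL from 2 to 10 minutes reduces the checks from one per page to two for the whole session, at the cost of updates taking up to 10 minutes to reach returning visitors.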
Wrapping it all up
In these two articles we have reviewed many aspects of Optimizely optimization that you can investigate on your website to help improve web performance. The key recommendations are summarized below:
- Ensure priority CSS resources are referenced in the HTML ahead of the Optimizely Snippet.
- Do not use a tag manager for implementing the snippet.
- Remove redundant A/A tests once all the supporting development work has been completed.
- Review the TTL value for caching of the snippet to reflect the length, in minutes, of the majority of visitor sessions.