Page load time has become a crucial indicator of user experience, one that Google even factors into its ranking algorithm through the Core Web Vitals.
Caching is certainly one of the most important elements for page load optimization. However, improper use of it could lead to other problems.
This article will help you understand why setting up caching for websites is essential and make you aware of the common pitfalls. First, let’s explore the benefits in detail.
With caching, data retrieval gets faster because it reduces the need for full network round trips. If we cache content in the browser, retrieval is nearly instantaneous.
However, the user might experience a delay in the initial page load until the cache is populated.
In practice, there are many caching strategies you can use. Among them, the lazy loading strategy improves page load time by loading data into the cache only when it is actually needed.
This avoids filling up the cache with unnecessary data.
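To make that concrete, here is a minimal cache-aside (lazy loading) sketch using the browser’s Cache API. The cache name and the idea of wrapping `fetch` in a helper are illustrative assumptions, not a prescribed implementation.

```typescript
// A minimal cache-aside (lazy loading) sketch using the browser Cache API.
// The cache name and the wrapped endpoint are illustrative.
const CACHE_NAME = 'api-cache-v1';

async function getWithCache(url: string): Promise<Response> {
  const cache = await caches.open(CACHE_NAME);

  // Serve straight from the cache when an entry already exists.
  const cached = await cache.match(url);
  if (cached) {
    return cached;
  }

  // Otherwise fetch from the network and populate the cache lazily,
  // so only data that is actually requested ends up cached.
  const response = await fetch(url);
  if (response.ok) {
    await cache.put(url, response.clone());
  }
  return response;
}

// Usage: the first call pays the network cost, later calls are served locally.
// getWithCache('/api/products').then((res) => res.json());
```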
Caching static content in the browser is safer than caching dynamic content, since the browser takes care of the complexities for you. Browser caching is a poor fit for dynamic content, however, because you have little control over when the browser invalidates its cache.
If you need to cache dynamic content, I would recommend using a Content Delivery Network (CDN), which offers a useful middle ground.
With a CDN, content is cached closer to the user in the network path, reducing latency. In addition, you can perform cache invalidation at the CDN by sending it the relevant instructions, such as a purge request. That level of control is a clear differentiator from the browser cache and is what makes CDNs suitable for dynamic content caching.
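As a rough illustration, the sketch below shows how a server could hint CDN caching for dynamic content with standard `Cache-Control` directives. The Express endpoint and the specific TTL values are assumptions for the example, and support for directives such as `stale-while-revalidate` varies between CDN providers.

```typescript
import express from 'express';

const app = express();

// Hypothetical endpoint returning semi-dynamic content.
app.get('/api/products', (_req, res) => {
  // max-age=0 keeps the browser from holding on to the response,
  // s-maxage=60 lets the CDN cache it for one minute, and
  // stale-while-revalidate lets the CDN serve a stale copy while it
  // refreshes the entry in the background.
  res.set(
    'Cache-Control',
    'public, max-age=0, s-maxage=60, stale-while-revalidate=30'
  );
  res.json({ products: [] });
});

app.listen(3000);
```

Keeping `max-age` at 0 means the browser always revalidates, while the CDN (via `s-maxage`) absorbs most of the traffic and can be purged centrally when the content changes.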
Caching helps to improve the availability of the website at different levels. When we cache content at the browser level, it avoids navigation issues under poor network conditions.
Also, we can use service workers to improve the offline experience.
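As a minimal sketch, the service worker below pre-caches a small application shell and falls back to the cache when the network is unavailable. The file list, cache name, and network-first strategy are placeholder choices.

```typescript
// sw.ts: a minimal service worker sketch for offline support.
// The pre-cached URLs and cache name are placeholders.
const OFFLINE_CACHE = 'offline-v1';
const PRECACHE_URLS = ['/', '/styles.css', '/app.js'];

self.addEventListener('install', (event: any) => {
  // Pre-cache the application shell during installation.
  event.waitUntil(
    caches.open(OFFLINE_CACHE).then((cache) => cache.addAll(PRECACHE_URLS))
  );
});

self.addEventListener('fetch', (event: any) => {
  // Try the network first; fall back to the cache when offline.
  event.respondWith(
    fetch(event.request).catch(async () => {
      const cached = await caches.match(event.request);
      return cached ?? new Response('You are offline', { status: 503 });
    })
  );
});
```

The worker still needs to be registered from the page, typically with `navigator.serviceWorker.register('/sw.js')`.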
Besides, using CDNs will help to manage availability issues when the origin server goes offline. For instance, while doing a release or server update, the cache could serve as a failover.
Handling many requests through a single server requires a powerful infrastructure (better memory, storage, processing power, etc.). Maintaining such a server with high availability can be very costly.
Since frontend caching decreases the number of requests sent to the server, the need for a very powerful machine is avoided.
For example, if you have many high-resolution images on your website, fetching those files will always generate heavy traffic on the server. However, if they are cached, a large percentage of that traffic can be eliminated, and a less powerful machine will suffice.
With frontend caching implemented, even SMEs can scale to handle high traffic, which gives them a better chance of competing with the big players in the market.
So far, we have looked at how caching can make your application better. However, is caching always suitable? Are there instances where cached content can create problems? — Let’s find out!
Expiry headers, which set the Time To Live (TTL) for content, determine how long the browser may serve it from cache before fetching a fresh copy. Correctly set expiry headers invalidate the cache in a timely manner, so users see the application's latest updates instead of stale data.
However, if you have set an expiry header far into the future for frequently changing content, users will see stale data on the site. Far-future cache headers are only suitable for static content that may not change often. Therefore, setting the correct expiry headers is very important to keep your application up to date.
Two main factors need to be considered when setting the TTL for cached content.
And remember, you have minimal control over the browser cache once the TTL is set. Therefore, you might also need to consider release frequency and ensure that the pages can use the correct version of the assets (CSS, JS, Images).
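As an illustration of differing TTLs, here is a hedged Express sketch that gives fingerprinted static assets a long, immutable TTL and frequently changing content a short one. The paths and durations are assumptions.

```typescript
import express from 'express';

const app = express();

// Fingerprinted static assets rarely change, so they get a long TTL.
// 'immutable' tells the browser it never needs to revalidate before expiry.
app.use(
  '/assets',
  express.static('dist/assets', { maxAge: '365d', immutable: true })
);

// Frequently changing content gets a short TTL so users see fresh data quickly.
app.get('/api/news', (_req, res) => {
  res.set('Cache-Control', 'public, max-age=60');
  res.json({ articles: [] });
});

app.listen(3000);
```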
Most of the time, developers only focus on improving page load time for the website's homepage. However, when a user visits your site, they will go through a few pages before purchasing your service.
Therefore, even if your homepage loads really fast, if other pages are underperforming, there is a high chance that the user will not proceed with the purchase.
If that happens, spending so much effort to optimize the homepage will not give you the expected results. Therefore, it is important to implement caching across the entire site, or at least across all the important pages that help turn a visitor into a customer.
When the content is updated, a new copy needs to be loaded for the users. If a content refresh cannot be enforced automatically (using cache invalidation strategies), we should inform the users to do a hard refresh.
Failing to communicate this may result in showing stale data to a user. Therefore, it is important to send a notification or display a banner asking the user to refresh the site to retrieve the latest updates when a new version of the application is pushed.
You can use a technology like WebSockets to inform the web application that a new version is available. The message could be displayed as a banner instructing the user to reload the page.
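Here is a minimal client-side sketch of that idea. The WebSocket endpoint and the message format are assumptions, and a real application would render a proper banner component instead of a confirm dialog.

```typescript
// Client-side sketch: listen for a deployment notification over a WebSocket
// and prompt the user to reload. The endpoint URL and message shape are assumptions.
const socket = new WebSocket('wss://example.com/updates');

socket.addEventListener('message', (event) => {
  const message = JSON.parse(event.data);

  if (message.type === 'new-version') {
    // A real application would render a banner component; a confirm dialog
    // keeps the sketch self-contained.
    if (window.confirm('A new version is available. Reload now?')) {
      window.location.reload();
    }
  }
});
```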
The first caching strategy implemented on your website may not be the best when it grows and attracts more traffic. As we update the website with new content and features, it is important to re-visit the caching configuration.
For example, if your website started with static content initially, the expiry of cache control headers would be set to far-future dates. However, with new implementations, if there is content that is changed more often, the cache configurations should also be modified with lower TTLs to retrieve fresh elements often.
Cache invalidation is important to tell the browser that a new version of the asset is available. If cache-invalidation strategies aren’t implemented, users of your application will always see stale data. Cache invalidation will help break the cache and force the browser to download a new copy of the asset.
For single-page web applications, it’s a best practice to prevent the browser from caching the index page, so that users are not served an older version after a release. A stale index page can also hurt stability, because it may reference a mix of old and new asset versions.
Besides, the impact on performance is low since the HTML is typically loaded once per user session.
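A hedged Express sketch of that advice is shown below: the entry page is served with `no-cache` so the browser revalidates it on every navigation, while the hashed assets it references can still be cached for a long time. The paths are illustrative.

```typescript
import path from 'path';
import express from 'express';

const app = express();

// Hashed asset files can be cached aggressively (see the TTL example above).
app.use(
  '/assets',
  express.static('dist/assets', { maxAge: '365d', immutable: true })
);

// The SPA entry point is served with 'no-cache' so the browser revalidates it
// on every navigation and always picks up a new release.
app.get('*', (_req, res) => {
  res.set('Cache-Control', 'no-cache');
  res.sendFile(path.resolve('dist', 'index.html'));
});

app.listen(3000);
```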
Let’s look at a few cache-invalidation strategies.
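Common approaches include filename fingerprinting (cache busting), query-string versioning, and explicit purges at the CDN. As one example, the sketch below shows filename fingerprinting with webpack, where a content hash embedded in each filename means any change produces a new URL; the entry point and output paths are assumptions about the project layout.

```typescript
// webpack.config.ts: a minimal sketch of filename fingerprinting.
// The entry point and output paths are assumptions about the project layout.
import path from 'path';
import type { Configuration } from 'webpack';

const config: Configuration = {
  entry: './src/index.ts',
  output: {
    // A hash of the file contents is embedded in each filename, so any change
    // produces a new URL and the browser's old copy is bypassed automatically.
    filename: '[name].[contenthash].js',
    path: path.resolve(__dirname, 'dist', 'assets'),
    clean: true, // remove previous builds so stale files do not accumulate
  },
};

export default config;
```

Because the HTML references the new hashed filenames after each build, browsers download fresh copies without any manual cache clearing.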
As we’ve seen, caching brings many advantages to our applications.
However, the best caching strategy depends on the context of your application. For example, suppose your website is full of dynamic data that needs to be cached. In that case, it is better to cache it at the CDN level, where dynamic parameters can be taken into account when updating and serving cached content.
You can use tools like Chrome DevTools to inspect the cached content in your browser, for example via the Network and Application panels.
At the same time, you must be well aware of the correct caching strategies and the pitfalls discussed above so you can avoid them throughout the development process. In fact, a wrong caching strategy can be worse than no caching at all.
Therefore, watch out for those corner cases and establish the best practices around caching.