Smart Caching with Sulu: Improve your performance and serve dynamic content
Website performance means more than just a speedy site. It has a trickle-down effect on everything from search ranking, user engagement, and bounce rates to sales, conversion rates, and operational costs. When it comes to website performance, it’s not just seconds but milliseconds that count. The effect is proportional: the more traffic that hits your site, the greater the impact on each of these factors. That’s why nailing site performance is crucial for any website today.
For site visitors, performance affects their experience as soon as they visit your site. Users will jump to a competitor if the site takes longer than the blink of an eye to load — a mere 400 milliseconds. And it’s not just speed that matters; up-to-date content is crucial too. For your DevOps teams, responsible for hosting, managing, and configuring the web servers, a heavy load on the infrastructure translates directly to a heavy load on them. These effects trickle down to CTOs and CEOs, who want to reduce unnecessary operational costs while staying competitive with a fast-performing website and an engaged user base.
How can you decrease the load on your servers and DevOps teams, serve visitors the latest content, and improve overall website performance to keep CEOs happy? Sulu’s answer to these issues is smart caching.
Considerations with caching
A lot of different factors impact how you implement caching in your infrastructure, including:
- Typical traffic — how many simultaneous requests your server generally receives over a given period.
- Repeat users — understand how people use your site. Without caching, every repeat visit re-downloads the same content, which takes longer and adds unnecessary load.
- Peak times or “exceptional circumstances” — consider the nature of the content of your site. Will you often have ad campaigns to drive new users? Be aware of your high traffic times and anticipate increased loads.
- Location of servers — the physical distance from your servers to your users can affect load time.
- Size and power of your servers — size your servers to match demand: too much power wastes resources, while too little hurts performance.
- Amount of dynamic vs. static content — regularly updated or personalized content (known as dynamic content) creates more complexity, longer load times, and more strain on your servers.
Introducing smart caching with Sulu
Taking an honest assessment of these factors will help you to configure a solution that best fits your needs. A basic configuration makes sense if your site’s content is more static and you have a smaller amount of traffic. You will need a more complex configuration if you have heavy traffic, or if your site has more dynamic content that changes frequently.
Whether a basic or complex configuration suits your needs, Sulu communicates with HTTP caching proxies as part of its smart caching approach. These caching proxies act as a buffer between your application and your users, reducing the time it takes to serve a request.
Use Symfony HttpCache for a basic configuration
Symfony HttpCache is Sulu’s default caching proxy. Written in PHP, it provides fast caching out of the box, and it tends to work well for smaller companies that receive less traffic and need a basic configuration. Symfony HttpCache supports:
- Full page caching, where the whole page is rendered and returned to the user — a good fit for pages with mostly static content.
- Cache invalidation and cache expiry, which come pre-packaged so you don’t have to build them yourself.
- Edge Side Includes (ESI) to handle dynamic content.
You can read more about caching with Symfony HttpCache in Sulu’s documentation.
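For illustration, enabling a Symfony-style HttpCache proxy typically means wrapping the application kernel in the front controller. The sketch below is simplified and not Sulu’s exact bootstrap — Sulu ships its own HttpCache wrapper, and class names vary between versions:

```php
// public/index.php (simplified sketch — not Sulu's exact bootstrap code)
use Symfony\Component\HttpKernel\HttpCache\HttpCache;

$kernel = new Kernel($_SERVER['APP_ENV'], (bool) $_SERVER['APP_DEBUG']);

// In production, wrap the kernel so responses are cached in front of the app
if ('prod' === $_SERVER['APP_ENV']) {
    $kernel = new HttpCache($kernel);
}
```

Because the proxy is just PHP running in the same process, there is nothing extra to install or operate — which is exactly why it suits basic configurations.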
Use Varnish for a more complex configuration
The other caching proxy Sulu uses is Varnish, a fully featured reverse proxy cache. Varnish is a better fit for a large website with more dynamic content, especially one that receives high traffic. One of Varnish’s main benefits is its speed: it’s roughly twice as fast as the default caching proxy, and it’s highly configurable to boot. Like Symfony HttpCache, Varnish supports ESI to handle dynamic content, and it adds:
- A distributed setup across multiple servers
- Cache invalidation, so your site stays up-to-date
- Cache warming, to mitigate the “blank page” issue
- The Varnish Configuration Language (VCL), to customize and control cache behavior
Use Varnish with ESI to handle dynamic content
Sometimes, the majority of content on a page stays the same for every user, but certain elements are dynamic. In the past, this was a difficult scenario to work with, because while dynamic content is especially engaging for users, it was not always considered cacheable. Technologies like Edge Side Includes (ESI) have changed this: ESI allows specific parts of a page to follow a different caching strategy from the main page.
As an example, say you have a page that is relatively static except for a news ticker at the bottom of the content. With ESI, you can cache the news ticker independently of the rest of the page.
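In Sulu’s Symfony-based Twig templates, a fragment like the news ticker can be embedded via ESI. A sketch — the controller name here is hypothetical:

```twig
{# page.html.twig (sketch): the surrounding page is cached with a long lifetime #}
{{ render_esi(controller('App\\Controller\\NewsController::ticker')) }}
```

The ticker controller would return its own Response with a short shared max age (for example, `$response->setSharedMaxAge(30)`), so the caching proxy refreshes just that fragment every 30 seconds while the rest of the page stays cached far longer.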
How Varnish helps solve the “blank page” issue
If you have a high number of users who request the same data at the same time, they have to wait while the data is fetched from the server, and in the meantime they see the dreaded “blank page”. Most users don’t tolerate a blank page for long, and will quickly bounce from your site. To prevent this, Varnish provides a grace mode, which allows Varnish to deliver slightly stale content to clients while getting a fresh version from the backend.
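In VCL, grace mode comes down to a small setting. A sketch using Varnish 4+ syntax, where the two-minute window is an arbitrary example value:

```vcl
sub vcl_backend_response {
    # Keep objects usable for 2 minutes past their TTL: stale copies can be
    # served immediately while a fresh copy is fetched in the background.
    set beresp.grace = 2m;
}
```

The trade-off is that a few users may briefly see content that is seconds or minutes old — usually far better than showing them a blank page.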
When is it time to upgrade to a more complex caching setup?
It can be difficult to know when it’s time to move from a more basic caching proxy, like Symfony HttpCache, to a more fully-featured one, like Varnish. Some pointers on what to look for:
- If any of the factors above apply to your infrastructure, such as pronounced traffic peaks or a distributed system with multiple servers
- If you have high traffic that has been steadily growing, especially if the growth is over a period of years
If you notice either of these two points, it may be time to upgrade to Varnish.
Getting started with Varnish
The first step is simple: install Varnish.
Next, reconfigure your web server. If you’re running Apache or Nginx, change your virtual host to listen on a port other than 80 (the default HTTP port), for example 8080. After reconfiguring your web server, it’s time to configure Varnish to work with Sulu: Varnish takes over port 80 as the public entry point for HTTP traffic and forwards cache misses to your web server on the new port.
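For example, with the web server moved from port 80 to 8080, Varnish’s backend definition would point at it. A sketch — the addresses and paths are illustrative:

```vcl
# /etc/varnish/default.vcl (sketch)
vcl 4.0;

# The web server that Varnish fetches cache misses from
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}
```

Varnish itself is then started listening on port 80, for example with `varnishd -a :80 -f /etc/varnish/default.vcl`.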
Finally, restart Varnish. If you’re experiencing issues, make sure you’ve deactivated Symfony HttpCache by commenting out the relevant lines in your index.php file, and that you’ve changed the “proxy client” setting from “symfony” to “varnish”.
Varnish also has an optimal configuration, which Sulu has used in many projects. To get the most out of Varnish, set the tags option to “true”. This ensures that any updates made on the admin side are available immediately.
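As a sketch, the settings above might come together in a configuration like the following. Key names and defaults can differ between Sulu versions, and the server address is an assumption:

```yaml
# config/packages/sulu_http_cache.yaml (sketch — verify against your Sulu version)
sulu_http_cache:
    tags:
        enabled: true              # invalidate by cache tags so admin edits show up immediately
    proxy_client:
        varnish:
            enabled: true
            servers:
                - '127.0.0.1:80'   # assumed address where Varnish is listening
```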
Now you are ready to start using Varnish for faster and more powerful caching. To learn more details on caching with Varnish, you can check out Sulu’s documentation.
Sulu uses smart caching to help clients handle high traffic and complex, dynamic content smoothly
Sulu’s client, Küchengötter, is a high-traffic recipe website with over 40,000 recipes online. With over 7 million visitors per month, Küchengötter would get flooded with requests around mealtimes. These spikes of traffic brought the servers down at least once a week, leading to frustration and high bounce rates from users, and a scramble for the Küchengötter DevOps team. The nature of Küchengötter’s pages was complex — the system needed to assemble content like recipe ingredients, cooking steps, and ratings. Combining this with the number of parallel requests Küchengötter received during its high traffic peaks, along with trying to clear the cache simultaneously, meant that Küchengötter needed a caching configuration that could evolve over time. Johannes Wachter, Core Sulu Developer, explains, “It was a tricky situation because we had to decide if we go with bigger servers to handle those few minutes of load while we clear the cache, or we could better use the server power by optimizing the Varnish configuration.”
With the Varnish configuration, only one request per stale cache entry reaches the backend, which made Küchengötter better able to predict the load on their servers and saved their DevOps team a lot of headaches. It also reduced the frequent downtime they had been experiencing to almost zero. Küchengötter was also able to use ESI to update only the dynamic parts of their pages, rather than waste valuable server resources on static content.
Smart caching can improve user experience and your site’s performance
Smart caching matters a lot to your site’s performance, and by extension, your users’ level of engagement. From a business point of view, it impacts conversion rates, SEO, and operational costs. With Sulu’s smart caching practices and the support of a powerful proxy like Varnish, performance and user experience can both be vastly improved. For more details and examples, check out the Sulu documentation, or join the free Slack channel to talk to us and ask more questions.