
Let browsers cache static files to greatly speed up your site

There are many factors that determine how long a page takes to show up in a browser. When a browser requests a page, the server needs some time to compute the page contents (controller, model/database, view) before it returns HTML to the browser. Besides network latency and throughput (which can be optimized by choosing a good hosting provider), there are several ways to speed up the page generation itself (like memoization, action/fragment/query caching, using memcached, and so on).

But even after the HTML page itself has been transferred to the browser, the page is not necessarily ready to display yet. A page usually references more resources like stylesheets, javascripts and images that need to be requested and transferred. These are mostly static files (located in the public folder of a Rails application) and are served directly by the webserver without invoking the Rails application. So these files take very little computation time on the server, but they still need some time and bandwidth to transfer, which can be optimized.

Reducing the number of static file requests per host

A browser usually limits the number of concurrent requests it sends to a single IP address. If an application returns pages with many references, page loading will slow down because the browser needs to serialize requests. To speed up the loading of many static files, it's common practice to reduce the number of static files, e.g. by bundling multiple stylesheets and javascripts into a single file (see the :cache option of javascript_include_tag and stylesheet_link_tag), or to set up multiple asset hosts so that more requests can be done concurrently.
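As an illustration, bundling with the :cache option mentioned above might look like this in a layout (assuming the Rails 2.x-era asset helpers; the bundled files are only generated when caching is enabled):

```erb
<%# Bundles all stylesheets into one all.css and all javascripts into one all.js %>
<%# (takes effect when config.action_controller.perform_caching is enabled) %>
<%= stylesheet_link_tag :all, :cache => true %>
<%= javascript_include_tag :all, :cache => true %>
```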

Reducing the number of static file requests altogether

Modern browsers usually don't request all static files again for each page. Files are cached between requests and they're only transferred again if they have changed on the server. For this purpose, the browser asks the server whether its cached version of a file is still valid by adding a special HTTP header (like If-Modified-Since) to the request. If the file hasn't changed, it doesn't need to be transferred again and the server answers with a 304 Not Modified response without sending the file again. As far as I know, this behaviour is enabled by default on today's webserver installations.
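The validation handshake can be sketched in plain Ruby (a simplified model, not actual server code): the server compares the file's modification time against the If-Modified-Since header and either sends the file or answers with a bare 304.

```ruby
require "time"

# Simplified model of a conditional GET: compare the file's
# modification time against the If-Modified-Since request header.
def respond_conditionally(file_mtime, if_modified_since)
  if if_modified_since && file_mtime <= Time.httpdate(if_modified_since)
    [304, nil]                # cached copy is still valid, send no body
  else
    [200, "file contents"]    # file changed (or no header), send it
  end
end

mtime = Time.httpdate("Mon, 01 Jan 2024 00:00:00 GMT")
respond_conditionally(mtime, "Tue, 02 Jan 2024 00:00:00 GMT") # => [304, nil]
respond_conditionally(mtime, nil)                             # => [200, "file contents"]
```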

Btw, you can use this for your HTML content as well, if you add conditional get support to your controllers.
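In Rails this is the stale?/fresh_when API (available since Rails 2.2). A sketch, assuming a hypothetical ArticlesController and Article model:

```ruby
class ArticlesController < ApplicationController
  def show
    @article = Article.find(params[:id])
    # Renders the view only if the client's cached copy is out of date;
    # otherwise Rails answers with 304 Not Modified and an empty body.
    if stale?(:last_modified => @article.updated_at.utc, :etag => @article)
      respond_to do |format|
        format.html
      end
    end
  end
end
```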

Conditional get support prevents unnecessary data transfer and therefore saves you quite some bandwidth. However, browsers still contact the server for every static file to ask if their cached version is up to date. This can take some time and slows down loading of pages with many images. Since static files don't ever change (unless you deploy a new version of your application), it would be fine for browsers not to ask the server at all and just use their locally cached version. This greatly speeds up page loading since the browser can skip requesting any referenced static file (except for the first time a visitor comes to your site).

For Apache, simply enable mod_expires and add the following to your Apache configuration:

ExpiresActive On
<FilesMatch "\.(ico|gif|jpe?g|png|js|css)$">
  ExpiresDefault "access plus 1 year"
</FilesMatch>

Using these configuration statements, Apache adds response headers for static files that instruct browsers to cache the files for up to one year without further checking.
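With mod_expires enabled, a response for a matching file then carries headers roughly like these (the dates are illustrative; mod_expires sets both Expires and a matching Cache-Control max-age):

```
HTTP/1.1 200 OK
Expires: Thu, 31 Jul 2026 10:00:00 GMT
Cache-Control: max-age=31536000
```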

You don't even need to worry about cache invalidation when you deploy a new version of your app. Rails automatically adds timestamps to all references to static files if you use the corresponding helper methods (like stylesheet_link_tag, javascript_include_tag and image_tag). This means that the URL of a static file changes whenever the file's timestamp changes. This way, browsers automatically request the new file.
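For example, the helpers append the file's modification time as a query string, so a redeployed file gets a fresh URL (the timestamp below is illustrative):

```erb
<%= image_tag "logo.png" %>
<%# renders something like: <img src="/images/logo.png?1234567890" alt="Logo" /> %>
```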