Server Cache – Basics

All internet users have some favourite websites that they visit often, or certain information online that they refer to quite frequently. What does this indicate? One, every visit to a website places a demand on the enterprise bandwidth. Two, time is wasted going through the web server to reach the website of interest each time. This is where a server cache comes into the picture.

A server cache is a dedicated server that saves information from web pages or other internet content locally. "Cache" means temporary storage, so a server cache is simply a temporary storage point. This has been found to be extremely useful because it speeds up access to data and also reduces the demand on the enterprise bandwidth, thereby enabling effective traffic management. Another benefit of cache servers is that the stored information remains available offline.

Now, a huge database of internet information or website usage cannot be placed in a cache server. The latter has a limited size that is optimized for the performance desired. So several web caching techniques are in place to select the particular information that will be stored in the cache server. "Least Recently Used" (LRU) is one such technique, wherein the web page that is farthest back in the usage history is thrown out first. The cached list thus stays dynamic, entirely dependent on the user's activity on the server. Another technique is based on the size of the cached files.
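
To make the LRU idea concrete, here is a minimal, illustrative Python sketch (not the implementation of any particular cache server) of a page cache that evicts whichever entry has gone unused the longest:

from collections import OrderedDict

class LRUPageCache:
    """Toy Least-Recently-Used cache for web pages (illustrative only)."""

    def __init__(self, capacity=3):
        self.capacity = capacity          # maximum number of pages kept
        self.pages = OrderedDict()        # URL -> cached body, oldest first

    def get(self, url):
        """Return a cached page and mark it as most recently used."""
        if url not in self.pages:
            return None                   # cache miss: would fetch from the origin
        self.pages.move_to_end(url)       # refresh its place in the usage history
        return self.pages[url]

    def put(self, url, body):
        """Store a page, evicting the least recently used one if the cache is full."""
        if url in self.pages:
            self.pages.move_to_end(url)
        self.pages[url] = body
        if len(self.pages) > self.capacity:
            self.pages.popitem(last=False)   # drop the page unused the longest

cache = LRUPageCache(capacity=2)
cache.put("http://example.com/a", "<html>A</html>")
cache.put("http://example.com/b", "<html>B</html>")
cache.get("http://example.com/a")                     # "a" becomes most recently used
cache.put("http://example.com/c", "<html>C</html>")   # evicts "b"
print(list(cache.pages))                              # ['http://example.com/a', 'http://example.com/c']

The capacity of two pages here is purely for demonstration; a real cache server's limit would be tuned to its memory and performance requirements.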

Certain servers keep two caches, a small-file cache and a large-file cache. Each of these caches has predefined minimum and maximum limits on the size of a file that may enter it. Files larger than the maximum size allowed for entry into the large-file cache are not kept in cache at all. While the small-file cache uses physical memory, the large-file cache uses virtual memory.
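
The size thresholds in the following sketch are hypothetical and only illustrate how a file might be routed to the small-file cache, to the large-file cache, or left uncached based on its size:

# Hypothetical size limits (bytes), for illustration only; real servers
# expose such limits as tunable parameters.
SMALL_MIN, SMALL_MAX = 0, 512 * 1024                   # small-file cache: up to 512 KB
LARGE_MIN, LARGE_MAX = 512 * 1024, 4 * 1024 * 1024     # large-file cache: 512 KB to 4 MB

def choose_cache(file_size):
    """Decide which cache, if any, a file of the given size enters."""
    if SMALL_MIN <= file_size <= SMALL_MAX:
        return "small-file cache (physical memory)"
    if LARGE_MIN < file_size <= LARGE_MAX:
        return "large-file cache (virtual memory)"
    return "not cached (exceeds large-cache maximum)"

for size in (10_000, 2_000_000, 50_000_000):
    print(size, "->", choose_cache(size))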

The server cache is enabled by setting the cache_enable parameter in the server {} block to "yes" and disabled by setting it to "no"; by default it is set to "yes". Now, not all files can be cached. Legally, caching can be applied only to content that has been declared cacheable by its owner. But these days content owners exercise more control over their content because of the monetary aspect and for tighter traffic management, so much of the content is declared non-cacheable. The goal of caching is to eliminate the need to send requests in many cases, and to eliminate the need to send full responses in many others. The former mechanism is termed "expiration" while the latter is termed "validation".
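
As a rough illustration of these two mechanisms, the Python sketch below (using a hypothetical URL and a stand-in for the actual network call) serves from the cache while an expiration time holds, and otherwise sends a conditional, validating request so that a full response can be skipped when the content is unchanged:

import time

# Hypothetical cached entry, roughly as a cache server might record it.
cached_entry = {
    "body": "<html>cached copy</html>",
    "expires": time.time() + 3600,   # expiration: treat as fresh for one hour
    "etag": '"abc123"',              # validator supplied by the origin server
}

def fake_conditional_fetch(url, headers):
    """Stand-in for a real conditional HTTP request; pretends the page is unchanged."""
    return 304, None                 # 304 Not Modified: no full response is sent

def serve(url, fetch=fake_conditional_fetch):
    """Serve from cache while fresh (expiration); otherwise revalidate (validation)."""
    if time.time() < cached_entry["expires"]:
        return cached_entry["body"]                       # fresh: no request sent at all
    status, body = fetch(url, {"If-None-Match": cached_entry["etag"]})
    if status == 304:
        return cached_entry["body"]                       # unchanged: full response avoided
    cached_entry["body"] = body                           # changed: refresh the cached copy
    return body

print(serve("http://example.com/page"))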
