Performance is one of the most complicated aspects of the application development cycle. It can decide the fate of your application, and it demands continuous attention and improvement at every stage.
Every application has two sides of the coin: the content it displays and the underlying infrastructure where it is hosted. The goal is a careful balance, making your application load fast while keeping the load on your infrastructure light.
To turn this theory into reality, you cannot depend on a single tool or system. Let's look at the different systems that can be used at different levels, each for a specific purpose.
Here is a quick look at the cache layers of a web application:
01. Browser Cache
Enabling the browser cache improves performance and speeds up page delivery in the browser. But when services such as CloudFront, Varnish, Redis, or DB-level caches are already in use, the browser cache can cause problems by showing stale content.
Hence, it is best not to rely on the browser cache, as web applications have no (or minimal) control over purging it.
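Since the application cannot purge a browser's cache, the usual lever is the `Cache-Control` response header. A minimal sketch, using standard HTTP header names (the helper functions themselves are hypothetical):

```python
def no_store_headers():
    """Headers that tell the browser not to cache the response at all."""
    return {
        "Cache-Control": "no-store, no-cache, must-revalidate",
        "Pragma": "no-cache",  # for legacy HTTP/1.0 clients
        "Expires": "0",
    }

def short_lived_headers(max_age=60):
    """Allow brief browser caching, then force revalidation."""
    return {"Cache-Control": f"public, max-age={max_age}, must-revalidate"}
```

Dynamic or session-specific pages would get `no_store_headers()`, while static assets can safely use a long `max_age`.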
02. Content Delivery Network
Options: Cloudflare, CloudFront, Akamai, Fastly.
A CDN decentralizes your cached data and dramatically reduces direct hits to the origin server.
It serves cached copies of your web pages from the data center nearest to the user, over the shortest possible route.
A CDN caches full pages and is most useful for anonymous users.
We need to configure exactly what kind of data needs to be cached and for how long.
- Bypass the cache for admin pages
- Bypass the cache for anything specific to user sessions
- Cache everything (excluding above two specs)
With the rules above, the entire site is cached except admin pages and pages tied to an active user session.
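The three rules above boil down to a per-request cache decision. A minimal sketch, where the `/admin` prefix and the `session_id` cookie name are assumptions for illustration (real CDNs express this as path/cookie rules in their own configuration):

```python
def should_cache(path, cookies):
    """Decide whether the CDN may cache this request's response."""
    if path.startswith("/admin"):   # bypass the cache for admin pages
        return False
    if "session_id" in cookies:     # bypass anything tied to a user session
        return False
    return True                     # cache everything else
```

For example, `should_cache("/blog/42", {})` caches, while the same path with a session cookie does not.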
03. HTTP Reverse Proxy
Options: Varnish, Nginx, Squid and TrafficServer
An HTTP reverse proxy is also known as an HTTP accelerator. It caches content and serves it directly to browsers or the CDN, acting as a mediator between client and server. It speeds up content delivery primarily by skipping the work of the business and DB layers.
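The core behavior can be sketched in a few lines: repeat requests are answered from the proxy's own store, so the origin (and its business/DB layers) is only hit on a miss or after the TTL expires. The origin callable and TTL are illustrative assumptions; Varnish and friends do this at scale with far more policy control.

```python
import time

class ReverseProxyCache:
    """Toy HTTP-accelerator cache: URL -> (expiry, body)."""

    def __init__(self, origin, ttl=300):
        self.origin = origin      # callable that renders a page (the app)
        self.ttl = ttl            # seconds a cached copy stays fresh
        self._store = {}
        self.origin_hits = 0      # counts how often the app layer was hit

    def get(self, url):
        entry = self._store.get(url)
        if entry and entry[0] > time.time():
            return entry[1]       # cache hit: app and DB layers are skipped
        self.origin_hits += 1
        body = self.origin(url)   # cache miss: fall through to the origin
        self._store[url] = (time.time() + self.ttl, body)
        return body
```

Two requests for the same URL cost only one origin render; that gap is where the acceleration comes from.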
04. In Memory Cache
Options: Memcached, Redis, Amazon DynamoDB, Apache Ignite
In-memory caching temporarily stores data in dynamic memory (RAM), enabling significantly faster retrieval when the application follows common, repetitive data-access patterns.
The flexibility of such systems depends on the volume of dynamic data and the scalability they need to achieve.
If the web application caches pages internally, that internal caching can be disabled and an in-memory cache such as Memcached configured instead.
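The typical usage is the cache-aside pattern: check the cache first, fall back to the DB on a miss, and populate the cache for the next request. In this sketch a plain dict stands in for the Memcached/Redis client so it stays self-contained, and `load_user_from_db` is a hypothetical stand-in for a real DB read:

```python
cache = {}  # stand-in for a Memcached/Redis client

def load_user_from_db(user_id):
    """Hypothetical DB read; in reality this is the expensive call."""
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id):
    key = f"user:{user_id}"
    if key in cache:                    # hit: served from RAM, no DB trip
        return cache[key]
    user = load_user_from_db(user_id)   # miss: fall through to the DB
    cache[key] = user                   # populate for subsequent requests
    return user
```

A real client would also set a TTL on each key so entries expire on their own.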
05. DB Query Caching
Options: MySQL, Postgres, MariaDB, MongoDB, DB2
Query caching is a mechanism in which frequently queried data is stored temporarily in memory. Instead of hitting the DB engine and re-running the operation, the cached result is served directly.
It is highly efficient in high-read, low-write situations, which are common on most websites.
06. Cache Purging
In simple words, caching generates a static copy of a web page and serves it to the client without making dynamic calls to the servers (app / DB).
Purging the cache means that the next time a client renders the page, it gets the latest (fresh) content from the underlying layers such as the HTTP proxy, in-memory cache, or DB.
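Against a simple key-to-page cache, a purge is just deleting the key so the next request falls through to the live layers and rebuilds the copy. The path-based key naming here is an assumption:

```python
page_cache = {"/blog/42": "<html>old copy</html>"}

def purge(path):
    """Drop the cached copy; the next request regenerates it from origin."""
    page_cache.pop(path, None)

purge("/blog/42")  # e.g. after the post at /blog/42 is edited
```

Real layers expose the same idea as explicit operations, such as CDN invalidation requests or Varnish bans, typically triggered when content is published or updated.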
Not every web application will have all the cache layers explained here. Based on the application, its audience, and the nature of its visitors, layers can be added or removed to achieve the right performance.