Low-level Query Caching in Rails
Brought page load time down from 200ms to 100ms
Performance is all we need
Almost every day we talk about "performance". It is one of the most crucial parts of problem-solving. In the context of web development, performance usually means how fast the server responds to clients' requests. It depends on a lot of factors, but the one I am interested in here is caching. Caching is the silver bullet used in almost every web application to provide a seamless experience to its users.
Caching in Rails
Rails has several kinds of caching, like Page Caching, Action Caching, Fragment Caching, Low-level Caching, etc. We will talk about the last one, that is,
Low-level Query Caching.
How to enable caching with Redis in Ruby on Rails?
Caching can be enabled from the environment-specific config files. To enable it for the development environment, we will add the following to config/environments/development.rb (the :redis_store store is provided by the redis-rails gem; Rails 5.2+ also ships a built-in :redis_cache_store):

```ruby
config.cache_store = :redis_store, "redis://localhost:6379"
```

We are good to go now and can cache our queries.
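Assuming the store is configured, a quick way to sanity-check it from rails console (these are the standard Rails.cache read/write helpers, shown here only as a smoke test):

```ruby
# In rails console, with Redis running locally:
Rails.cache.write("greeting", "hello")
Rails.cache.read("greeting")    # => "hello"
Rails.cache.delete("greeting")
Rails.cache.read("greeting")    # => nil
```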
Cache Posts List
For this blog, we are taking the posts list page as an example and we are going to cache the posts query.
Initially we had:
```ruby
# posts_controller.rb
def index
  @posts = Post.all
end
```
This would run the SQL query on every request even though the result is the same. Why not run it the first time and cache the result? On subsequent requests, we can serve the result directly from the cache instead of hitting the DB to run the SQL query.
Storing the result in the cache the first time adds a little overhead (a few milliseconds), so the first request won't show any improvement. From then on, though, the cache is checked first, and if a value is stored there, it is returned directly, saving the time it would take to execute the SQL query.
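The fetch-then-store behaviour described above can be sketched in plain Ruby (a Hash standing in for Redis; TinyCache and all names here are illustrative, not the actual Rails implementation):

```ruby
# On a cache miss the block runs and its result is stored;
# on a hit the stored value is returned without running the block.
class TinyCache
  def initialize
    @store = {} # stands in for Redis
  end

  def fetch(key)
    return @store[key] if @store.key?(key)
    @store[key] = yield
  end

  def delete(key)
    @store.delete(key)
  end
end

cache = TinyCache.new
queries_run = 0
slow_query = -> { queries_run += 1; [1, 2, 3] }

cache.fetch("all_posts") { slow_query.call } # miss: runs the "query"
cache.fetch("all_posts") { slow_query.call } # hit: served from the store
# queries_run is now 1, not 2
```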
```ruby
# ..controllers/posts_controller.rb
def index
  @posts = Post.cached_all_posts # cached_all_posts is a method defined in the Post model
end

# ..models/post.rb
class Post < ApplicationRecord
  ...
  # This method checks for the `all_posts` key in Redis.
  # If it finds it, it returns the stored value.
  # If it doesn't, it runs the block given to it
  # and stores the result under the `all_posts` key in Redis.
  def self.cached_all_posts
    Rails.cache.fetch(["all_posts"]) do
      all.to_a
    end
  end
end
```
In the attached screenshot, you can see that when we ran
Post.cached_all_posts.count the first time, it executed the SQL and took 2.5ms, but when we ran it the second time, it didn't execute any SQL and returned the count almost instantly.
But we are not yet done!
With great power comes great responsibility
The missing part here is that if we add new posts or delete existing ones, the cached result will still be the same, since the cache/Redis is unaware of the change, and hence we would end up serving a stale, incorrect response. 😨 😨
To tackle that problem, we have to bust the cache whenever a new record is added or an existing one is deleted. Busting the cache means deleting the key from Redis. When a request for posts is made again,
cached_all_posts won't find the
all_posts key in Redis, so it will execute the block, which in turn will save the fresh result in the cache.
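The bust-then-refetch cycle can be sketched in plain Ruby (a Hash standing in for Redis; all names here are illustrative, not the actual Rails implementation):

```ruby
store = {}
version = 0
fetch_posts = lambda do
  store["all_posts"] ||= begin
    version += 1
    "posts v#{version}" # stands in for running the SQL query
  end
end

fetch_posts.call           # => "posts v1" (miss: computed and stored)
fetch_posts.call           # => "posts v1" (hit: no query runs)
store.delete("all_posts")  # bust the cache, e.g. from an after_commit hook
fetch_posts.call           # => "posts v2" (miss again: fresh result)
```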
With cache-busting added in place:
```ruby
# ..controllers/posts_controller.rb
def index
  @posts = Post.cached_all_posts
end

# ..models/post.rb
class Post < ApplicationRecord
  ...
  after_commit :flush_cache

  def self.cached_all_posts
    Rails.cache.fetch(["all_posts"]) do
      all.to_a
    end
  end

  def flush_cache
    Rails.cache.delete(["all_posts"])
  end
end
```
The after_commit hook will get executed whenever we create, update, or delete a post, and hence it will, in turn, delete
all_posts from Redis.
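If you only want to bust the cache on the writes that actually change the list (an assumption about your needs; the version above flushes on every commit, including plain updates), Rails lets you scope the callback with the standard on: option:

```ruby
# Bust the cache only when a post is created or destroyed
after_commit :flush_cache, on: [:create, :destroy]
```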
Before and after of UI
This is just the tip of the iceberg
What we just did is not even 1% of the possible performance enhancements. We can gain a whole lot of performance by aggressively yet carefully caching an application.
Thanks for going through this article.