How to Use Redis Caching in ASP.NET Core

In this article, let's talk about caching in general, and about integrating and working with a Redis cache in an ASP.NET Core application.

Caching is one of the most important strategies developers generally consider while building efficient and scalable APIs or applications.

Why Caching?

Caching helps reduce load on backend databases by keeping frequently accessed data close at hand, improving overall application performance and the turnaround time for such requests.

Types of Caching

A cache generally refers to dedicated memory that holds this data with very fast read times. In the world of ASP.NET Core (or any framework, really), we have two caching options to consider:

  1. In-Memory Caching and
  2. Distributed Caching

In-memory caching refers to storing frequently accessed data inside the application's own memory, which works perfectly fine for a single application instance. One disadvantage, however, is that the cache is internal to a single application node, and it is cleared whenever the application restarts for any reason.

This approach also doesn't work well in load-balanced environments, where successive requests might be handled by different application server nodes, each with its own private cache. This paves the way for the distributed caching mechanism.
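The in-memory idea can be sketched with a minimal, self-contained cache. This is an illustrative stand-in for ASP.NET Core's built-in `IMemoryCache`, not its actual implementation; the class and method names here are hypothetical:

```csharp
using System;
using System.Collections.Concurrent;

// A minimal in-memory cache sketch: entries live only inside this process,
// which is exactly why a restart (or a second load-balanced node) loses them.
public class SimpleMemoryCache
{
    private readonly ConcurrentDictionary<string, (object Value, DateTime ExpiresAt)> _entries = new();

    public void Set(string key, object value, TimeSpan ttl) =>
        _entries[key] = (value, DateTime.UtcNow.Add(ttl));

    public bool TryGet(string key, out object value)
    {
        value = null;
        if (_entries.TryGetValue(key, out var entry) && entry.ExpiresAt > DateTime.UtcNow)
        {
            value = entry.Value;
            return true;
        }
        _entries.TryRemove(key, out _); // lazily drop expired or missing entries
        return false;
    }
}
```

Every node running this code holds its own `_entries` dictionary, so two nodes behind a load balancer can disagree about what is cached, which is the consistency problem distributed caching solves.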

Pros of Distributed Caching

In the distributed caching mechanism, the cache is maintained external to the application nodes as a separate, distributed component, and the applications can still access these cache nodes with very fast read times.

This results in:

  1. Consistency – of data across nodes
  2. Shareability – of cached data across nodes
  3. Persistence – of data even when one or more nodes restart or fail

There are two popular solutions for Distributed Caching, which are implemented and provided as managed services by almost all popular cloud providers:

  1. Redis
  2. Memcached

We have previously looked in detail at working with Memcached, and later at integrating Memcached as a managed service when deployed in the AWS environment using the ElastiCache service.

Integrating Redis Cache with ASP.NET Core

In this article, we shall look in detail at integrating a Redis cache into our ASP.NET Core application, and at the general idea of implementing a cached layer of data.

Setup Redis Server (via Docker)

Redis is a caching server which needs to run as a service for the application to communicate with. Redis can be installed on Windows/Linux machines via its distributions, but the fastest and simplest approach (and my personal favorite) is to run it as a Docker container. To run Redis in a Docker container, use the following command:

> docker run -d -p 6379:6379 --name myredis redis 

This gets Redis up and running, available for other applications to communicate with over port 6379.

Install Redis Package

ASP.NET Core provides a generic interface, IDistributedCache, with methods to store and retrieve data from any cache implementation that is registered against this interface. To install and register Redis as a cache provider, we use the NuGet package below.

> dotnet add package Microsoft.Extensions.Caching.StackExchangeRedis

And then we register the service in the ConfigureServices() method of the Startup class as below:

services.AddStackExchangeRedisCache(options =>
{
    options.Configuration =
        $"{Configuration.GetValue<string>("Redis:Server")}:{Configuration.GetValue<int>("Redis:Port")}";
});
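If you're on .NET 6+ with the minimal hosting model (no Startup class), the equivalent registration would live in Program.cs. This is a sketch assuming the same "Redis" configuration section:

```csharp
// Program.cs (minimal hosting, .NET 6+) — registration sketch, assuming
// the same "Redis:Server" and "Redis:Port" keys in appsettings.json
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration =
        $"{builder.Configuration.GetValue<string>("Redis:Server")}:{builder.Configuration.GetValue<int>("Redis:Port")}";
});

var app = builder.Build();
```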

Where the Redis server information is configured in the appsettings.json as below:

"Redis": {
    "Server": "localhost",
    "Port": 6379
}

Integrating the Service

As mentioned before, ASP.NET Core provides a generic IDistributedCache service with the methods necessary to interact with the cache in a simple way: it stores and retrieves data in the form of bytes or strings. I'd want to wrap this service further into my own CacheService, which can handle complex types, say a Hero record from the database.

My ICacheService interface looks like below:

namespace RedisHeroesApi.Contracts
{
    public interface ICacheService
    {
        T Get<T>(string key);
        T Set<T>(string key, T value); 
    }
}

I’d not say this is the perfect design of an abstraction, but it gets things done for us.

The CacheService which implements this is as below:

using System;
using Microsoft.Extensions.Caching.Distributed;
using Newtonsoft.Json;
using RedisHeroesApi.Contracts;

namespace RedisHeroes.Core.Services
{
    public class CacheService : ICacheService
    {
        private readonly IDistributedCache _cache;

        public CacheService(IDistributedCache cache)
        {
            _cache = cache;
        }

        public T Get<T>(string key)
        {
            var value = _cache.GetString(key);

            if (value != null)
            {
                return JsonConvert.DeserializeObject<T>(value);
            }

            return default;
        }

        public T Set<T>(string key, T value)
        {
            var options = new DistributedCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromHours(1),
                SlidingExpiration = TimeSpan.FromMinutes(10)
            };

            _cache.SetString(key, JsonConvert.SerializeObject(value), options);
            
            return value;
        }
    }
}
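Under the hood, the service is just doing a JSON round-trip through the cache's string surface. Here is a self-contained sketch of that idea, using System.Text.Json in place of Newtonsoft and a plain dictionary as a hypothetical stand-in for IDistributedCache:

```csharp
using System.Collections.Generic;
using System.Text.Json;

// Hypothetical record standing in for the article's Hero entity.
public record Hero(long Id, string Name);

// Dictionary standing in for the string-based surface of IDistributedCache.
public class FakeStringCache
{
    private readonly Dictionary<string, string> _store = new();
    public void SetString(string key, string value) => _store[key] = value;
    public string GetString(string key) => _store.TryGetValue(key, out var v) ? v : null;
}

public static class CacheRoundTrip
{
    public static T Set<T>(FakeStringCache cache, string key, T value)
    {
        cache.SetString(key, JsonSerializer.Serialize(value)); // store as a JSON string
        return value;
    }

    public static T Get<T>(FakeStringCache cache, string key)
    {
        var json = cache.GetString(key);
        return json is null ? default : JsonSerializer.Deserialize<T>(json); // miss => default
    }
}
```

The real CacheService does exactly this serialize/deserialize dance, only against Redis instead of a dictionary, and with expiration options attached.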

The IDistributedCache interface provides two Get methods, Get() and GetString(), which return a byte array and a string for an input key respectively; similarly, Set() and SetString() operate on a byte array and a string respectively. Here I'd go with GetString(), because strings are simpler to work with. GetString() returns a stringified Hero record (for example) for a given key, and I'd return a Hero object deserialized from that string.

In the case of SetString(), we also need to specify caching options, such as how long the cache entry remains valid. We've set two options here:

  1. AbsoluteExpirationRelativeToNow -> the time after which the cache entry expires, measured from the moment of insertion (the "now")
  2. SlidingExpiration -> the window of inactivity after which the cache entry expires; each cache hit within the window resets the timer, extending the entry's lifetime
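The interplay of the two options can be modeled as pure logic: an entry is alive only while it is within both the absolute deadline and the sliding window measured from its last access. This is an illustrative model, not the actual IDistributedCache internals:

```csharp
using System;

// Illustrative model of how the two expirations combine (not the real cache internals).
public class ExpiryModel
{
    public DateTime InsertedAt { get; }
    public DateTime LastAccessAt { get; private set; }
    public TimeSpan AbsoluteTtl { get; }   // e.g. 1 hour, as in the article's options
    public TimeSpan SlidingTtl { get; }    // e.g. 10 minutes

    public ExpiryModel(DateTime now, TimeSpan absoluteTtl, TimeSpan slidingTtl)
    {
        InsertedAt = LastAccessAt = now;
        AbsoluteTtl = absoluteTtl;
        SlidingTtl = slidingTtl;
    }

    // Alive only while within BOTH the absolute deadline and the sliding window.
    public bool IsAlive(DateTime now) =>
        now < InsertedAt + AbsoluteTtl && now < LastAccessAt + SlidingTtl;

    // A cache hit slides the inactivity window forward.
    public void Touch(DateTime now) { if (IsAlive(now)) LastAccessAt = now; }
}
```

Note that sliding hits can never extend an entry past the absolute deadline; after one hour (in the article's settings) the entry is gone no matter how frequently it was read.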

This service shall be injected into the HeroesRepository where we use it to cache and return a single Hero record for a GetSingleHero request. I’m reusing the DapperHeroesRepository which we created while working with Dapper Integration into our Heroes API.

using System.Linq;
using Dapper;

namespace RedisHeroesApi.Core.Repositories
{
    public class DapperHeroesRepository : IHeroesRepository
    {
        private readonly IDapperr _dapperr;
        private readonly ICacheService _cache;

        public DapperHeroesRepository(IDapperr dapperr, ICacheService cache)
        {
            _dapperr = dapperr;
            _cache = cache;
        }

        public Hero Single(long id)
        {
            // TryGet Hero from Cache
            // If not Available pull from DB
            var cached = _cache.Get<Hero>(id.ToString());

            if (cached != null) return cached;
            else
            {
                var sql = @"SELECT h.* FROM Heroes h WHERE h.Id = @id";
                var dp = new DynamicParameters(new { id });
                var result = _dapperr.Query<Hero>(sql, dp).FirstOrDefault();

                // insert into cache for future calls
                return _cache.Set<Hero>(id.ToString(), result);
            }
        }

        // ... other implementations
    }
}

The API layer is unchanged and doesn't need to know that we've created a cached implementation of our repository, with a Redis cache server helping behind the scenes.

We can further improve this approach by implementing a decorator pattern for our repository, and by using caching strategies such as lazy loading and write-through caching, which we've discussed in detail earlier.
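The decorator idea can be sketched as follows. The interfaces and the in-block dictionary here are hypothetical minimal stand-ins for the article's real repository and cache service; the point is that the inner repository stays completely cache-unaware:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical minimal types standing in for the article's real ones.
public record Hero(long Id, string Name);

public interface IHeroesRepository { Hero Single(long id); }

// A trivial "database" repository used as the decorated inner component.
public class InMemoryDbHeroesRepository : IHeroesRepository
{
    public int DbReads; // counts how often the "database" was actually hit
    public Hero Single(long id) { DbReads++; return new Hero(id, $"Hero-{id}"); }
}

// Decorator: wraps any IHeroesRepository and adds a cache-aside layer.
public class CachedHeroesRepository : IHeroesRepository
{
    private readonly IHeroesRepository _inner;
    private readonly Dictionary<string, Hero> _cache = new(); // stand-in for ICacheService

    public CachedHeroesRepository(IHeroesRepository inner) => _inner = inner;

    public Hero Single(long id)
    {
        var key = $"hero:{id}";
        if (_cache.TryGetValue(key, out var hit)) return hit; // cache hit

        var hero = _inner.Single(id);        // miss: fall through to the real repository
        if (hero != null) _cache[key] = hero; // populate the cache for future calls
        return hero;
    }
}
```

With DI, you'd register CachedHeroesRepository as the IHeroesRepository implementation and hand it the concrete Dapper repository, so callers never know a cache sits in between.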

Conclusion

Caching is an important yet simple solution for performance optimization. By reducing database reads, we can improve both response time and cost.

Distributed caching takes this a step further and enables efficient caching in a load-balanced environment. Redis is one of the most popular cache servers on the market, offering features such as key expiry events and pub/sub messaging in addition to the usual Get and Set operations.

ASP.NET Core further simplifies this by providing the IDistributedCache service, which wraps the functionality of a cache in a generic implementation.

To implement caching efficiently in applications, we can follow a decorator pattern to extend our existing non-cached functionality. We can also apply one of the two caching strategies, lazy loading and write-through caching, based on our requirements and design.

Full Example: https://github.com/referbruv/redis-heroes-dotnetcore-example


Buy Me A Coffee

Found this article helpful? Please consider supporting!

Ram

I'm a full-stack developer and a software enthusiast who likes to play around with cloud and tech stack out of curiosity. You can connect with me on Medium, Twitter or LinkedIn.
