Distributed Caching with NCache in ASP.NET Core

In this article, let's look at how we can implement distributed caching in ASP.NET Core, with NCache as the caching provider.

A cache is a high-speed memory store that applications use to hold frequently accessed data. Using a cache reduces unnecessary database hits: since the requested data is readily available in the cache, response times can be significantly lower than fetching it from the database every time. Caching is popularly used as a performance improvement technique.

Use of Caching in APIs

In the context of Web APIs, developers can adopt either Response Caching, where the API sends additional information about the response in the headers, using which the consuming client can cache the response; or Data Caching, where the server itself caches frequently accessed data.

In this article, we’ll focus on Data Caching, where the cache is used as an auxiliary store for performance optimisation.

Types of Caches

Speaking of using the cache as an auxiliary store, an application can set aside a part of its own memory for caching frequently accessed data. This is called In-Memory Caching. This approach is suitable for simpler applications that run on a single server node, where all requests are served by that node alone.
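
As a quick illustration, here’s a minimal in-memory caching sketch using ASP.NET Core’s built-in IMemoryCache. The Item type, the key format and the database stand-in are illustrative; the cache itself is registered with services.AddMemoryCache().

using Microsoft.Extensions.Caching.Memory;

public record Item(int Id, string Name);

public class ItemLookupService
{
    private readonly IMemoryCache _cache;

    public ItemLookupService(IMemoryCache cache) => _cache = cache;

    public async Task<Item?> GetItemAsync(int id)
    {
        // GetOrCreateAsync returns the cached entry when present;
        // otherwise it runs the factory and caches the result.
        return await _cache.GetOrCreateAsync($"item_{id}", entry =>
        {
            entry.SlidingExpiration = TimeSpan.FromMinutes(5);
            return LoadItemFromDbAsync(id); // stand-in for a real database call
        });
    }

    // Hypothetical data access; replace with a repository or DbContext call.
    private Task<Item?> LoadItemFromDbAsync(int id) =>
        Task.FromResult<Item?>(new Item(id, $"Item {id}"));
}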

But it isn’t helpful for a load-balanced distributed system where a request could be handled by one of the many application nodes.

This is where we externalize the cache out of the application server nodes and maintain it as a separate system. All the application server nodes connect to this “external” caching server and set or get data as required. This is called Distributed Caching.

What is a Distributed Cache?

A Distributed Cache is a cache placed external to the application nodes, with the same properties as an in-memory cache: high-speed memory with low-latency data reads and writes. In a distributed cache, the cached data spans several nodes in one or more clusters, which may themselves span regions.

Implementing a Distributed Caching system helps applications leverage the advantages of caching, while reducing data redundancy – a scenario where two or more application nodes end up caching the same data.

Since a distributed cache sits external to the application, it incurs slightly higher latency than an in-memory cache, although the difference is usually negligible. Using a distributed cache also brings high availability, scalability and fault tolerance to the system.

[Figure: distributed cache architecture]

Developers can choose from the many popular distributed caching options available in the market, such as Redis, Memcached and NCache. Cloud providers also offer managed caching solutions, such as Amazon ElastiCache on AWS.

Distributed Caching in ASP.NET Core

We can connect our ASP.NET Core applications to any distributed cache cluster and use it for caching as required via the IDistributedCache interface provided by .NET Core. Almost all the popular cache providers ship their own implementation of IDistributedCache, and we can register that implementation in the IoC container via IServiceCollection.
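
For illustration, once a provider’s implementation is registered, a service can consume IDistributedCache directly. Here’s a minimal sketch; the service, key format and expiration values are illustrative.

using Microsoft.Extensions.Caching.Distributed;

public class GreetingService
{
    private readonly IDistributedCache _cache;

    public GreetingService(IDistributedCache cache) => _cache = cache;

    public async Task<string> GetGreetingAsync(string name)
    {
        // Try the distributed cache first (illustrative key format).
        var cached = await _cache.GetStringAsync($"greeting_{name}");
        if (cached != null) return cached;

        // On a miss, compute the value and cache it for subsequent requests.
        var greeting = $"Hello, {name}!";
        await _cache.SetStringAsync($"greeting_{name}", greeting,
            new DistributedCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10)
            });

        return greeting;
    }
}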

For the rest of this article, let’s look at how to implement this with NCache as the caching provider.

What is NCache?

NCache is a popular cache provider in the .NET space. The cache itself is built using .NET and has very good support for .NET applications. It also offers a rich set of libraries, including support for implementing query caching over Entity Framework Core.

NCache offers a great set of features and comes in three flavors (Open Source, Professional and Enterprise) that customers can choose from based on their needs. Do check out the edition comparison while deciding which one to choose: https://www.alachisoft.com/ncache/edition-comparison.html

NCache fully supports caching in ASP.NET Core and provides its own implementation of IDistributedCache, which we can register and use accordingly.

Using NCache for Distributed Caching in ASP.NET Core

To demonstrate how to connect and work with NCache for caching, let’s take the example of an API that returns a list of Items from a database. This Items API has two endpoints – one which returns a list of all the Items and another which returns a single Item by Id.

We’ll integrate NCache as a caching tier for this API implementation and store frequently accessed items onto the cache.

To keep things simple, I’ll use the ContainerNinja.API project and integrate the caching tier over it. ContainerNinja.API is a part of the ContainerNinja.CleanArchitecture boilerplate solution, built using .NET 6 and following industry best practices. You can find the boilerplate here – https://github.com/referbruv/ContainerNinja.CleanArchitecture
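
For orientation, the two endpoints are shaped roughly like the sketch below. The boilerplate dispatches requests to MediatR handlers; GetItemByIdQuery appears later in this article, while GetAllItemsQuery is a hypothetical name used here for illustration.

using MediatR;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/v1/[controller]")]
public class ItemsController : ControllerBase
{
    private readonly IMediator _mediator;

    public ItemsController(IMediator mediator) => _mediator = mediator;

    // GET /api/v1/Items – returns all the Items
    [HttpGet]
    public async Task<IEnumerable<ItemDTO>> GetAll() =>
        await _mediator.Send(new GetAllItemsQuery()); // hypothetical query name

    // GET /api/v1/Items/{id} – returns a single Item by Id
    [HttpGet("{id}")]
    public async Task<ItemDTO> GetById(int id) =>
        await _mediator.Send(new GetItemByIdQuery(id));
}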

To connect with NCache, first I’ll install the NCache caching extensions package into my Core layer (./ContainerNinja.Core/):

> dotnet add package NCache.Microsoft.Extensions.Caching --version 5.1.0

Then I’ll register the NCache distributed cache in the IoC container, through which we can later access the caching service. I’ll place this inside the ServiceExtensions class in the ContainerNinja.Core layer. We need to mention the cache name and any additional options as required:

services.AddNCacheDistributedCache(configuration =>
{
    configuration.CacheName = "myCache";
    configuration.EnableLogs = true;
    configuration.ExceptionsEnabled = true;
});

We can also externalize these configuration values using another overload of the same method as below:

services.AddNCacheDistributedCache(configuration.GetSection("NCacheSettings"));

where the NCacheSettings section inside appsettings.json is as below:

{
  --- other sections ---

  "NCacheSettings": {
    "CacheName": "mycache",
    "EnableLogs": "True",
    "RequestTimeout": "90"
  }
}

Observe that I haven’t mentioned any details about the NCache server to which the API needs to connect. These settings live inside the “client.ncconf” file, which I’ll place inside the API layer; it is copied alongside the executables when deployed.

I have my NCache cluster running on the same local machine as the application, so by default it looks for the client.ncconf file inside the NCache installation folder.

It looks like below:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Client configuration file is used by client to connect to out-proc caches. 
Light weight client also uses this configuration file to connect to the remote caches. 
This file is automatically generated each time a new cache/cluster is created or 
cache/cluster configuration settings are applied.
-->
  <configuration>
    <ncache-server connection-retries="5" retry-connection-delay="0" retry-interval="1" command-retries="3" command-retry-interval="0.1" client-request-timeout="90" connection-timeout="5" port="9800" local-server-ip="xxx.xxx.xxx.xxx" enable-keep-alive="False" keep-alive-interval="0"/>
    <cache id="demoLocalCache" client-cache-id="" client-cache-syncmode="optimistic" skip-client-cache-if-unavailable="True" reconnect-client-cache-interval="10" default-readthru-provider="" default-writethru-provider="" load-balance="False" enable-client-logs="False" log-level="error">
      <server name="xxx.xxx.xxx.xxx"/>
    </cache>
    <cache id="mycache" client-cache-id="" client-cache-syncmode="optimistic" skip-client-cache-if-unavailable="True" reconnect-client-cache-interval="10" default-readthru-provider="" default-writethru-provider="" load-balance="False" enable-client-logs="False" log-level="error">
      <server name="xxx.xxx.xxx.xxx"/>
      <security>
<primary user-id="MYDOMAIN\bruv" password="*************"/>
        <secondary user-id="" password=""/>
      </security>
    </cache>
  </configuration>

Using the IDistributedCache Service to Access the Cache

Till now, we have completed the connectivity part – the package is installed, and the client.ncconf file is in place for connecting to the cache.

Now I need to integrate this cache into my business logic, to insert and fetch data as required.

To demonstrate, let’s say I want to implement caching over the GetItemById() API call, where I query the database for a given itemId passed by the client.

For this, I’ll inject the ICachingService abstraction (backed by IDistributedCache, as we’ll see shortly) into the query handler class GetItemByIdQueryHandler (./ContainerNinja.Core/Handlers/Queries/) and add a condition on top of the database query – the classic cache-aside pattern – basically to:

“check the cache first, and only if the item isn’t available there, query the database; and while returning the query results, store them in the cache for a possible future hit”.

The code looks like below:

namespace ContainerNinja.Core.Handlers.Queries
{
    public class GetItemByIdQuery : IRequest<ItemDTO>
    {
        public int ItemId { get; }
        public GetItemByIdQuery(int id)
        {
            ItemId = id;
        }
    }

    public class GetItemByIdQueryHandler : IRequestHandler<GetItemByIdQuery, ItemDTO>
    {
        private readonly IUnitOfWork _repository;
        private readonly IMapper _mapper;
        private readonly ICachingService _cache;
        private readonly ILogger<GetItemByIdQueryHandler> _logger;

        public GetItemByIdQueryHandler(ILogger<GetItemByIdQueryHandler> logger, IUnitOfWork repository, IMapper mapper, ICachingService cache)
        {
            _repository = repository;
            _mapper = mapper;
            _cache = cache;
            _logger = logger;
        }

        public async Task<ItemDTO> Handle(GetItemByIdQuery request, CancellationToken cancellationToken)
        {
            // Check the cache first.
            var cachedItem = _cache.GetItem<ItemDTO>($"item_{request.ItemId}");

            if (cachedItem != null)
            {
                _logger.LogInformation($"Item Exists in Cache. Return CachedItem.");
                return cachedItem;
            }

            _logger.LogInformation($"Item doesn't exist in Cache.");

            // Cache miss: fall back to the database.
            var item = await Task.FromResult(_repository.Items.Get(request.ItemId));

            if (item == null)
            {
                throw new EntityNotFoundException($"No Item found for Id {request.ItemId}");
            }

            var result = _mapper.Map<ItemDTO>(item);

            _logger.LogInformation($"Add Item to Cache and return.");

            // Store the result in the cache for a possible future hit.
            var _ = _cache.SetItem($"item_{request.ItemId}", result);
            return result;
        }
    } 
} 

The ICachingService implementation uses IDistributedCache to store and retrieve entities from the connected cache – which in this case is the NCache cluster.

namespace ContainerNinja.Core.Services
{
    public class DistributedCachingService : ICachingService
    {
        private readonly IDistributedCache _cache;

        // Entries live for at most 24 hours, or 60 minutes of inactivity.
        private readonly DistributedCacheEntryOptions options = new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromHours(24),
            SlidingExpiration = TimeSpan.FromMinutes(60)
        };

        public DistributedCachingService(IDistributedCache cache)
        {
            _cache = cache;
        }

        public T? GetItem<T>(string cacheKey)
        {
            // Entities are stored as JSON strings; deserialize on the way out.
            var item = _cache.GetString(cacheKey);
            if (item != null)
            {
                return JsonConvert.DeserializeObject<T>(item);
            }
            return default;
        }

        public T SetItem<T>(string cacheKey, T item)
        {
            // Serialize the entity and store it with the expiration options above.
            _cache.SetString(cacheKey, JsonConvert.SerializeObject(item), options);
            return item;
        }
    }
}

This implementation is registered in the container, inside the Core layer.

services.AddSingleton<ICachingService, DistributedCachingService>();
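
As a side note, IDistributedCache also exposes async counterparts (GetStringAsync and SetStringAsync). If you prefer non-blocking calls, async variants of the two methods could look like the sketch below; these are not part of the boilerplate and reuse the same _cache and options fields shown above.

public async Task<T?> GetItemAsync<T>(string cacheKey)
{
    // GetStringAsync avoids blocking a thread during the network round trip.
    var item = await _cache.GetStringAsync(cacheKey);
    return item != null ? JsonConvert.DeserializeObject<T>(item) : default;
}

public async Task<T> SetItemAsync<T>(string cacheKey, T item)
{
    await _cache.SetStringAsync(cacheKey, JsonConvert.SerializeObject(item), options);
    return item;
}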

Comparing the API response times – cache miss vs hit

When we run the API solution and hit the endpoint “/api/v1/Items/:id” with an available Id value for the first time, the API looks up the item in the cache; since the cache doesn’t yet have the item, it queries the database and returns the data. This is a cache miss scenario, and we can notice a typical response time that depends on the query performance.

The below screenshot shows the average time taken for the API call via Postman. The request took around 28 ms to respond.

[Screenshot: Postman response time without cache (cache miss)]

When I hit the same endpoint a second time, the API looks up the item in the cache, and this time, since the item was added during the first call, the cached item is pulled out and sent as the response. This is a cache hit scenario, and we can see the reduced response time in Postman (around 19 ms, roughly a 30% reduction, and it can go even lower).

[Screenshot: Postman response time with cache (cache hit)]
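
To reproduce a rough comparison yourself, a minimal console sketch using HttpClient and Stopwatch could look like this; the URL and Id are illustrative, so adjust them to your host.

using System.Diagnostics;

using var client = new HttpClient();
var url = "https://localhost:5001/api/v1/Items/1"; // illustrative host and Id

for (var attempt = 1; attempt <= 2; attempt++)
{
    var watch = Stopwatch.StartNew();
    var response = await client.GetAsync(url);
    watch.Stop();

    // The first call is typically a cache miss; the second should be a hit.
    Console.WriteLine($"Attempt {attempt}: {response.StatusCode} in {watch.ElapsedMilliseconds} ms");
}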

We can also see the cache hit in the NCache Web Manager portal, which helps us monitor the cache cluster’s health and traffic in real time.

[Screenshot: cache activity in the NCache Web Manager portal]

Final Thoughts

Caching is an important technique used to hold frequently accessed data so it can be readily used. The benefit is that you save the time spent querying the database, reducing latency as well as the costs involved.

While In-Memory caching gives a good headstart, Distributed Caching takes it to the next level and is most suitable for distributed systems and cloud native solutions.

NCache is one of the popular choices among caching solutions, along with others such as Redis and Memcached. It offers features such as real-time cluster management and a rich set of libraries for integration with ASP.NET Core.

Beyond IDistributedCache, NCache also offers query caching, which can be easily integrated with Entity Framework Core and helps in reducing latency at the infrastructure tier.

The code snippets used in this article are from the ContainerNinja.API project, which is a part of the ContainerNinja.CleanArchitecture boilerplate solution built using .NET 6 and following industry best practices.

You can find the boilerplate here – https://github.com/referbruv/ContainerNinja.CleanArchitecture. Please do leave a star if you find the solution useful.



Ram

I'm a full-stack developer and a software enthusiast who likes to play around with the cloud and various tech stacks out of curiosity. You can connect with me on Medium, Twitter or LinkedIn.
