Introduction
In today’s digital landscape, businesses must be able to access and process data quickly to stay ahead of the competition.
As microservices architecture becomes increasingly popular, accessing data from multiple microservices can lead to latency and slow response times.
Caching is a technique that can improve performance by reducing the time it takes to access data.
In this article, we will discuss the importance of caching in microservices, the types of caching available, and how we can use NCache for high-speed data access.
What is Microservice Architecture?
In a microservices architecture, a large application is broken down into smaller, independent components called microservices.
Each microservice takes care of a specific functionality of the application and communicates with other microservices using APIs.
Benefits of Microservice Architecture
- Loose coupling. Changes to one microservice do not impact other microservices.
- Developers can work on individual microservices independently, which can lead to faster delivery cycles.
- It provides better application scalability and resilience.
What is Caching?
Caching is the process of storing frequently accessed data in high-speed memory, usually for a limited amount of time, so that future requests for that data can be served faster. This high-speed memory is called a cache.
Instead of accessing the original data source every time data is requested, the data is first looked up in the cache. If the data is found in the cache, it is served from the cache instead of the original data source.
This can greatly reduce latency and improve application response times.
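To make the idea concrete, here is a minimal, illustrative sketch of this cache-aside lookup in C#. It uses a simple in-process dictionary as the cache, and LoadProductNameFromDatabase is a hypothetical placeholder for the slow data-source call.

using System.Collections.Concurrent;

public class ProductLookup
{
    // Illustrative in-process cache; real systems typically use a dedicated cache store
    private readonly ConcurrentDictionary<int, string> _cache = new();

    public string GetProductName(int productId)
    {
        // 1. Look the data up in the cache first
        if (_cache.TryGetValue(productId, out var cachedName))
        {
            return cachedName; // cache hit: served without touching the data source
        }

        // 2. Cache miss: fall back to the original data source
        var name = LoadProductNameFromDatabase(productId); // hypothetical slow call

        // 3. Store the result so future requests are served from the cache
        _cache[productId] = name;
        return name;
    }

    private string LoadProductNameFromDatabase(int productId)
    {
        // Placeholder for a database or downstream service call
        return $"Product-{productId}";
    }
}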
Benefits of Caching in Microservice Architecture
Implementing a cache layer in a microservice architecture can provide the following benefits –
- Performance Improvements. Caching can reduce data access times, which results in faster application performance.
- Reduced load on data sources. By caching frequently accessed data, microservices can avoid unnecessary requests to the original data source, resulting in reduced load and better resource utilization.
- Cost savings. As the processing load on the original data source is reduced, caching can also translate into lower compute and infrastructure costs.
- Reliability. Caching can help microservices remain available and responsive even when the original data source is down or unavailable for some time.
- Security. Caching can reduce the number of requests sent to backend data sources, limiting their exposure and lowering the risk of data breaches.
- User experience. Caching can help deliver a more consistent user experience by reducing the response time of frequently accessed data, improving the overall responsiveness of the application.
Types of Caching
Based on how and where the data is cached, caching can be broadly divided into two types.
In-Memory Caching
A microservice can use a part of its memory for caching frequently accessed data.
This is called In-Memory Caching. This approach is suitable for simpler applications that run on a single server node, where all requests are served by that node alone.
But it isn’t helpful for a load-balanced distributed system, where a request could be handled by any one of many application nodes. The same data could end up cached on multiple nodes, which is an inefficient use of resources.
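For instance, ASP.NET Core ships an in-memory cache abstraction (IMemoryCache from Microsoft.Extensions.Caching.Memory) that a single-node service can use. Below is a minimal sketch under that assumption; the OrderService and LoadOrderFromDatabase names are illustrative placeholders.

using System;
using Microsoft.Extensions.Caching.Memory;

public class OrderService
{
    // Register with services.AddMemoryCache() and inject via the constructor
    private readonly IMemoryCache _cache;

    public OrderService(IMemoryCache cache) => _cache = cache;

    public Order GetOrder(int orderId)
    {
        // GetOrCreate checks the cache and, on a miss, runs the factory and caches its result
        return _cache.GetOrCreate($"order:{orderId}", entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
            return LoadOrderFromDatabase(orderId); // hypothetical data-source call
        });
    }

    private Order LoadOrderFromDatabase(int orderId) => new Order { Id = orderId };
}

public class Order { public int Id { get; set; } }

Note that this cache lives inside each service instance's own process, which is exactly why it falls short in a load-balanced deployment.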
Distributed Caching
The caching layer is externalized from the application server nodes and maintained as a separate system.
All the application server nodes connect to this external caching layer and set or get data as required. This is called Distributed Caching.
This approach is efficient and beneficial for microservices because the caching layer can be maintained and scaled as required, and any microservice can interact with the caching layer for data that might be placed by another microservice.
Popular Caching Frameworks for Microservices
There are several popular distributed caching frameworks available, such as Redis, Hazelcast, Memcached, and NCache.
Cloud providers such as Azure (Azure Cache for Redis, NCache) and AWS (ElastiCache for Redis and Memcached) offer these caching frameworks as managed services for building cloud-native microservice architectures.
What is NCache?
NCache is a popular distributed caching solution for .NET and Java applications. It provides in-memory caching of frequently accessed data that can enhance application performance and scalability.
Features of NCache used in Microservice Architecture
NCache provides several solutions for implementing caching in microservices developed in .NET technologies.
Distributed Cache
NCache supports Distributed Caching, which can help microservices to use a common cache layer for storing frequently accessed data with minimal latency and high efficiency.
To connect with NCache, install the NCache caching extensions package into the microservice project:
dotnet add package NCache.Microsoft.Extensions.Caching --version 5.1.0
Then register the NCache distributed cache with the IoC container, through which we can access the caching service. We just need to specify the cache name and any additional options as required.
services.AddNCacheDistributedCache(configuration =>
{
    configuration.CacheName = "myCache";
    configuration.EnableLogs = true;
    configuration.ExceptionsEnabled = true;
});
I have written a detailed guide on how to implement distributed caching with NCache; you can read it here – https://referbruv.com/blog/distributed-caching-with-ncache-in-aspnet-core/
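Once registered, the microservice consumes the cache through the standard IDistributedCache abstraction. The following is a minimal, illustrative cache-aside sketch against that registered cache; the Product type and the LoadProductFromDatabaseAsync call are assumptions made for the example.

using System;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;

public class CatalogService
{
    private readonly IDistributedCache _cache;

    public CatalogService(IDistributedCache cache) => _cache = cache;

    public async Task<Product> GetProductAsync(int id)
    {
        var key = $"product:{id}";

        // Try the distributed cache (backed by NCache) first
        var cached = await _cache.GetStringAsync(key);
        if (cached != null)
            return JsonSerializer.Deserialize<Product>(cached);

        // Cache miss: load from the original data source (hypothetical call)
        var product = await LoadProductFromDatabaseAsync(id);

        // Cache the serialized object with a sliding expiration
        await _cache.SetStringAsync(key, JsonSerializer.Serialize(product),
            new DistributedCacheEntryOptions { SlidingExpiration = TimeSpan.FromMinutes(10) });

        return product;
    }

    private Task<Product> LoadProductFromDatabaseAsync(int id) =>
        Task.FromResult(new Product { Id = id });
}

public class Product { public int Id { get; set; } }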
Pub/Sub Messaging
NCache provides a simple but powerful Pub/Sub Messaging platform that allows microservices to communicate and notify one another when events occur. This asynchronous, event-driven style of communication keeps services more loosely coupled than synchronous calls over protocols such as HTTP/HTTPS or WebSockets.
A Publish/Subscribe (Pub/Sub) messaging paradigm uses an intermediary channel (called a topic) to exchange messages between applications, so that the sender (publisher) and the receiver (subscriber) do not need to know about each other. NCache hosts these topics: a publisher publishes a message to a topic, and the subscribers registered on that topic receive it.
This ensures loose coupling within the model. NCache also provides a dedicated Pub/Sub Messaging cache for this purpose.
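As a brief sketch of how this looks in code, the NCache client exposes topics through the cache's MessagingService. The cache name, topic name, and string payload below are assumptions for the example, and exact member names can vary slightly between NCache versions.

using Alachisoft.NCache.Client;
using Alachisoft.NCache.Runtime.Caching;

public class OrderEventsMessenger
{
    private readonly ITopic _topic;

    public OrderEventsMessenger()
    {
        // Connect to the cache and get (or create) the topic that acts as the channel
        ICache cache = CacheManager.GetCache("myCache"); // assumed cache name
        _topic = cache.MessagingService.GetTopic("OrderCreated")
                 ?? cache.MessagingService.CreateTopic("OrderCreated");
    }

    // Publisher side: push an event message onto the topic
    public void PublishOrderCreated(string orderJson)
    {
        _topic.Publish(new Message(orderJson), DeliveryOption.All);
    }

    // Subscriber side: register a callback that fires when a message arrives
    public ITopicSubscription Subscribe()
    {
        return _topic.CreateSubscription(OnMessageReceived);
    }

    private void OnMessageReceived(object sender, MessageEventArgs args)
    {
        var payload = args.Message.Payload as string;
        // React to the event, e.g. update a local view or trigger a follow-up workflow
    }
}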
SQL Query on Cache
NCache gives microservices the ability to quickly find relevant cached data through SQL-like search queries and grouping, which makes the data easy to process.
This helps significantly when a large amount of data is cached and microservices need to quickly look up the stored objects.
This feature is useful in cases where the keys under which the required data is stored are not known in advance.
For example, you can query for all cached items matching a condition as shown below –
// Get a handle to the cache through the NCache client SDK
var cache = CacheManager.GetCache("myCache");

// Query the cached items; the parameter placeholder (?) is bound below
string query = "select * from dbo.Items WHERE ItemId > ?";

// Use QueryCommand for query execution
var queryCommand = new QueryCommand(query);

// Provide parameters for the query
queryCommand.Parameters.Add("ItemId", 50000);

// Execute the QueryCommand through ICacheReader
ICacheReader reader = cache.SearchService.ExecuteReader(queryCommand);

// Check if the result set is not empty
if (reader.FieldCount > 0)
{
    while (reader.Read())
    {
        string result = reader.GetValue<string>(1);
        // Perform operations using the retrieved result
    }
}
else
{
    // Empty query result set retrieved
}
Conclusion
Caching is an important component of microservices architecture. It can significantly improve the performance and scalability of microservices-based applications.
By caching frequently accessed data, microservices can retrieve data from the cache instead of making expensive calls to the database or other services. This can improve response times and reduce network latency.
By using a distributed caching system, microservices can achieve high availability, scalability, and fault tolerance while improving performance and reducing latency.
NCache is one of the most popular distributed caching frameworks available in the market. It offers great features and support for building microservices with .NET technologies.
NCache offers features such as Distributed Caching for an externalized cache layer, Pub/Sub Messaging for event-driven architectures, and SQL Query on Cache to query cached data with simple SQL statements.
Check out the documentation references below for more details.
- https://www.alachisoft.com/use-cases/technical/microservices.html
- https://www.alachisoft.com/blogs/scale-microservices-performance-with-distributed-caching/
- https://www.alachisoft.com/resources/docs/ncache/prog-guide/sql-ncache.html
- https://www.alachisoft.com/resources/docs/ncache/prog-guide/publish-subscribe-ncache.html