
In-Memory Caching Strategies - Write-through Cache in ASP.NET Core

ASP.NET Core  • Posted 2 months ago

In a previous article we discussed the different strategies for effectively implementing an In-Memory Cache to improve performance. For starters, a cache is a high-speed memory section dedicated to storing and serving data that is read frequently but updated less often; slightly stale data, but not so stale. Using an In-Memory Cache improves application performance and reduces database queries for records, thereby reducing costs.

ASP.NET Core provides caching via the IMemoryCache interface, which offers a seamless and simple in-memory implementation for applications that run on a single node. As the application moves on to be deployed across multiple nodes in a load-balanced environment, there are popular centralized cache-tier options to choose from, such as Redis, Memcached, NCache and so on.
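As a quick refresher, IMemoryCache is wired up through the AddMemoryCache() extension from the Microsoft.Extensions.Caching.Memory package. A minimal sketch of the registration, assuming the .NET 6+ minimal hosting model (the app setup shown here is an assumption, not taken from the original article):

```csharp
var builder = WebApplication.CreateBuilder(args);

// registers IMemoryCache as a singleton in the DI container
builder.Services.AddMemoryCache();

var app = builder.Build();
app.Run();
```

Any class can then take IMemoryCache as a constructor dependency, as the repository decorator below does.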

There are two kinds of caching patterns which can be employed:

  1. Lazy loading
  2. Write-through

Write-through caching:

The Write-through caching strategy addresses the limitations of the "Lazy loading" cache approach. In Lazy loading, whenever data is missing from the cache, it is a "MISS", which adds to latency.

In the Write-through approach, the cache acts as a layer between the application-tier and the data-tier, with all data operations virtually passing through the cache-tier before reaching the data-tier. In this approach, any data that is written or updated in the database is also written to the cache along with the data-tier, so that no data available in the database is missing from the cache. In other words, the cache maintains a partial datastore of sorts for itself.

This approach comes with some advantages:

  1. Data in the cache is never stale (old). This is obvious, because all updates on the data-tier also happen on the data in the cache-tier, so the application-tier always has the latest and most recent data to access.
  2. Although the data is written twice, once at the data-tier and once at the cache-tier, this is passable because it only adds to the latency of a data write operation. It is a common perception that "writes" take more time than "reads", so a little added write latency is still acceptable in the application. This is compensated by the consistently faster data "reads" from the cache.

And there are disadvantages:

  1. A lot of data is unnecessarily stored in the cache-tier along with the data-tier. This adds to the storage and maintenance cost of the cache for data that is not so frequently accessed in the application.
  2. Whenever a node in the cache-tier fails, the data it held is missing from the cache, which is virtually "unusable" until new writes repopulate it. This is a potential limitation considering the scale of data which might be stored and processed in the node.
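One way to mitigate the first limitation, purely an assumption here and not part of the original design, is to attach an expiration policy to write-through entries so that rarely-read data eventually evicts itself:

```csharp
// hypothetical variant of the write-through cache.Set call:
// a sliding expiration evicts entries that go unread for 30 minutes
var options = new MemoryCacheEntryOptions()
    .SetSlidingExpiration(TimeSpan.FromMinutes(30));

cache.Set(result.Id, result, options);
```

The trade-off is that an evicted entry turns the next read into a cache "MISS", so the expiration window should be tuned to the application's read patterns.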

Implementing a Write-through cache:

Tagging along with the Readers application-tier we used in the Lazy loading approach, let's add a method for writing new data to the data-tier within the IReaderRepo abstraction, which shall be implemented by the Repository implementation ReaderRepo and a Decorator implementation CachedReaderRepo as below:


namespace ReadersMvcApp.Providers.Repositories
{
    public interface IReaderRepo
    {
        IQueryable<Reader> Readers { get; }

        Reader GetReader(Guid id);

        Reader AddReader(Reader reader);
    }
}


namespace ReadersMvcApp.Providers.Repositories
{
    public class ReaderRepo : IReaderRepo
    {
        private readonly DbSet<Reader> _readers;

        public ReaderRepo()
        {
            // initialization logic
        }

        public Reader GetReader(Guid id)
        {
            return _readers.Where(x => x.Id == id).FirstOrDefault();
        }

        public Reader AddReader(Reader reader)
        {
            // add reader to database
            _readers.Add(reader);
            return reader;
        }

        public IQueryable<Reader> Readers => _readers.AsQueryable();
    }
}

and the CachedReaderRepo implements the AddReader() method as below:


namespace ReadersMvcApp.Providers.Repositories
{
    public class CachedReaderRepo : IReaderRepo
    {
        private readonly IReaderRepo repo;
        private readonly IMemoryCache cache;

        public CachedReaderRepo(IReaderRepo repo, IMemoryCache cache)
        {
            this.repo = repo;
            this.cache = cache;
        }

        public Reader AddReader(Reader reader)
        {
            // add record to database
            var result = repo.AddReader(reader);

            // write record to cache
            return cache.Set(result.Id, result);
        }

        public Reader GetReader(Guid id)
        {
            // since all data resides in the cache
            // read from cache directly
            return cache.Get<Reader>(id);
        }

        public IQueryable<Reader> Readers => repo.Readers;
    }
}

Observe that each time AddReader() is called by the application-tier, the record added to the data-tier is also added to the cache-tier once it has been written to the datastore. This is what we call a "Write-through" approach. And since the data "virtually" always exists in the cache-tier, very few cache "MISS"es occur.
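To wire the decorator into the DI container, one option (an assumption here; libraries such as Scrutor offer a Decorate() helper for the same purpose) is to register the concrete ReaderRepo directly and build the CachedReaderRepo around it:

```csharp
builder.Services.AddMemoryCache();
builder.Services.AddScoped<ReaderRepo>();

// resolve IReaderRepo as the decorator wrapping the concrete repository
builder.Services.AddScoped<IReaderRepo>(sp =>
    new CachedReaderRepo(
        sp.GetRequiredService<ReaderRepo>(),
        sp.GetRequiredService<IMemoryCache>()));
```

With this wiring, consumers depend only on IReaderRepo and remain unaware of whether caching is in play, which is the point of the Decorator pattern.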

Combining two strategies together:

Since a perfect caching pattern doesn't exist in the general case, developers can combine these two individual strategies, one improving "query" (lazy loading) and the other "push" (write-through), when designing their caching-tier. The CachedReaderRepo can then look like this:


namespace ReadersMvcApp.Providers.Repositories
{
    public class CachedReaderRepo : IReaderRepo
    {
        private readonly IReaderRepo repo;
        private readonly IMemoryCache cache;

        public CachedReaderRepo(IReaderRepo repo, IMemoryCache cache)
        {
            this.repo = repo;
            this.cache = cache;
        }

        // Load cache along with a write "Write-through"
        public Reader AddReader(Reader reader)
        {
            var result = repo.AddReader(reader);

            return cache.Set(result.Id, result);
        }

        // Load cache for a Miss "Lazyloading"
        public Reader GetReader(Guid id)
        {
            Reader reader;

            if (!cache.TryGetValue(id, out reader))
            {
                var record = repo.GetReader(id);

                if (record != null)
                {
                    return cache.Set(record.Id, record);
                }
            }
            
            return reader;
        }

        public IQueryable<Reader> Readers => repo.Readers;
    }
}
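A hypothetical consumer of the combined repository; the controller and action names below are assumptions for illustration, not part of the original application:

```csharp
public class ReadersController : Controller
{
    private readonly IReaderRepo repo;

    public ReadersController(IReaderRepo repo)
    {
        this.repo = repo;
    }

    // read path: served from the cache, falling back
    // to the database on a MISS (lazy loading)
    public IActionResult Details(Guid id)
    {
        var reader = repo.GetReader(id);
        return reader == null ? (IActionResult)NotFound() : View(reader);
    }

    // write path: persisted to the database,
    // then pushed into the cache (write-through)
    [HttpPost]
    public IActionResult Create(Reader reader)
    {
        repo.AddReader(reader);
        return RedirectToAction(nameof(Details), new { id = reader.Id });
    }
}
```

Because the controller depends only on IReaderRepo, swapping the caching strategy requires no changes here.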
