Boost Application Performance with Caching in ASP.NET Core

Caching is an essential technique for improving the performance and scalability of ASP.NET Core web applications. By caching frequently accessed data in memory, you can significantly speed up response times and reduce load on the database and other backend systems.


In this comprehensive guide, you will learn:
  • What caching is and why it matters for ASP.NET Core apps
  • Built-in caching abstractions provided by ASP.NET Core
  • How to configure and use in-memory caching
  • Implementing a distributed Redis cache
  • Caching tag helpers to cache portions of a view
  • Caching best practices for high-performance web apps

What is Caching?

Caching refers to the temporary storage of frequently used data in memory for rapid access. Instead of reading from a slower backend data store, cached data can be retrieved from the much faster in-memory cache. This avoids unnecessary database and API calls.

The major benefits of caching include:
  • Improved page load times and response times
  • Reduced traffic and load on databases and APIs
  • Ability to scale applications to handle more traffic
  • Better resiliency to handle spikes in traffic

ASP.NET Core Caching Abstractions

ASP.NET Core has built-in abstractions for caching. These allow you to cache data without dealing with the underlying implementation details.

The key caching abstractions are:

IMemoryCache - represents an in-memory cache stored on the web server

IDistributedCache - interface for distributed caches like Redis

This means you can switch between different caching providers without changing your application code.
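For example, each abstraction is registered through dependency injection in Program.cs, and swapping the registration swaps the provider. A minimal sketch:

```csharp
var builder = WebApplication.CreateBuilder(args);

// Registers IMemoryCache (local to this server process)
builder.Services.AddMemoryCache();

// Registers an in-memory implementation of IDistributedCache,
// handy for development before moving to Redis or SQL Server
builder.Services.AddDistributedMemoryCache();

var app = builder.Build();
```

Because consuming classes depend only on IMemoryCache or IDistributedCache, replacing AddDistributedMemoryCache with a Redis registration later requires no changes to those classes.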

In-Memory Caching

The simplest way to get started with caching is to use the in-memory IMemoryCache. This caches data in the memory of the web server process.

Here is an example of using IMemoryCache to cache Order data:
public class OrderRepository
{
    private readonly IMemoryCache _cache;
    private readonly OrderStore _database; // hypothetical data store with a Find(int) method

    public OrderRepository(IMemoryCache memoryCache, OrderStore database)
    {
        _cache = memoryCache;
        _database = database;
    }

    public Order GetById(int id)
    {
        // Look for the order in the cache
        if (!_cache.TryGetValue(id, out Order order))
        {
            // Key not in cache, so fetch data from the database
            order = _database.Find(id);

            // Set cache options
            var cacheExpiryOptions = new MemoryCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10),
                Priority = CacheItemPriority.High
            };

            // Save data in cache
            _cache.Set(order.Id, order, cacheExpiryOptions);
        }

        return order;
    }
}
public class Order
{
    public int Id { get; set; }
    public DateTime Date { get; set; }
    public string Name { get; set; }
    public string Address { get; set; }
    public string ProductPurchased { get; set; }
}
The cached data has an absolute expiration time set, plus a priority that is considered when the cache runs low on memory: high-priority entries are among the last to be evicted.

This avoids an unnecessary database query when the Order already exists in the fast memory cache.
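IMemoryCache also offers a GetOrCreate convenience extension that collapses the try-get/fetch/set steps into a single call. A sketch, reusing the same hypothetical _cache and _database fields from the repository above:

```csharp
public Order GetByIdCompact(int id)
{
    // GetOrCreate runs the factory delegate only on a cache miss
    return _cache.GetOrCreate(id, entry =>
    {
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10);
        entry.Priority = CacheItemPriority.High;
        return _database.Find(id); // same hypothetical data store as above
    });
}
```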

Implementing a Distributed Cache

In-memory caching has a limitation: it is local to each web server. To share the cache across multiple servers, you need a distributed caching solution like Redis or SQL Server.

Distributed caching is a technique used to store frequently accessed data in a shared cache across multiple servers, reducing the need to repeatedly fetch data from the original source. Redis, a popular in-memory data store, provides an efficient solution for distributed caching due to its speed and versatility.

Here is an example using a Redis distributed cache, using the IDistributedCache interface.

Setting up Redis Cache in ASP.NET Core:

Add the following two NuGet packages to your ASP.NET Core project:
  • Microsoft.Extensions.Caching.StackExchangeRedis - the Redis implementation of IDistributedCache
  • Newtonsoft.Json - used in the example below to serialize data before caching it
Configure Redis Cache in Program.cs as shown below:
// Add the Redis distributed cache
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = builder.Configuration.GetConnectionString("Redis"); // Connection string for the Redis server
    options.InstanceName = "MyRedisInstance"; // Optional: provide a unique instance name
});
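The GetConnectionString("Redis") call expects a matching entry in appsettings.json; a sketch, using a placeholder local Redis address:

```json
{
  "ConnectionStrings": {
    "Redis": "localhost:6379"
  }
}
```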
Finally, use IDistributedCache in your code as shown below:
public class ProductService
{
    private readonly IDistributedCache _cache;
    private readonly IProductRepository _productRepository; // hypothetical repository used below

    public ProductService(IDistributedCache cache, IProductRepository productRepository)
    {
        _cache = cache;
        _productRepository = productRepository;
    }
 
    public async Task<List<Product>> GetProductsAsync(bool useCache = true)
    {
        if (useCache)
        {
            var cachedProducts = await _cache.GetStringAsync("CachedProducts");
 
            if (cachedProducts != null)
            {
                return JsonConvert.DeserializeObject<List<Product>>(cachedProducts);
            }
            else
            {
                var products = await _productRepository.GetProductsAsync();
 
                await _cache.SetStringAsync("CachedProducts", JsonConvert.SerializeObject(products), new DistributedCacheEntryOptions
                {
                    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10) // Cache for 10 minutes
                });
 
                return products;
            }
        }
        else
        {
            return await _productRepository.GetProductsAsync();
        }
    }
}
The Redis cache is shared across all web server instances, and scales seamlessly.
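When products change, the cached copy should be invalidated so stale data is not served. A sketch, assuming the same _cache, hypothetical _productRepository, and cache key as above:

```csharp
public async Task UpdateProductAsync(Product product)
{
    await _productRepository.UpdateAsync(product); // hypothetical repository method

    // Evict the cached list so the next read repopulates it from the database
    await _cache.RemoveAsync("CachedProducts");
}
```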

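Caching Portions of a View

As mentioned in the outline above, ASP.NET Core also provides a cache tag helper for caching rendered fragments of a Razor view in server memory. A minimal sketch, caching a fragment for ten minutes:

```cshtml
<cache expires-after="@TimeSpan.FromMinutes(10)">
    <p>Last rendered: @DateTime.UtcNow</p>
</cache>
```

The markup inside the tag is rendered once and then served from cache until the entry expires; attributes such as vary-by-user create separate cached copies per signed-in user.
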
Caching Best Practices

Here are some key best practices to follow when caching:
  • Set appropriate expiration times based on how often data changes
  • Use cache dependencies to refresh when related data changes
  • Watch cache hit rate and eviction rate
  • Use CDNs to cache static resources
  • Enable compression for cached responses
  • Cache appropriately: avoid caching frequently changing data
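For instance, a sliding expiration keeps hot entries alive while an absolute cap bounds staleness; a sketch with assumed ten-minute and one-hour windows:

```csharp
var options = new MemoryCacheEntryOptions
{
    // Extend the entry's lifetime each time it is read...
    SlidingExpiration = TimeSpan.FromMinutes(10),
    // ...but never keep it longer than one hour in total
    AbsoluteExpirationRelativeToNow = TimeSpan.FromHours(1)
};
_cache.Set("hot-data", someValue, options);
```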
Caching is a powerful technique to boost ASP.NET Core application performance. Make use of the built-in caching abstractions to easily add caching components like in-memory caches and distributed Redis caches.

Follow caching best practices to ensure high cache hit rates and efficiency. Measure cache performance over time and tune expiration times and cache keys to keep hot data cached.
