🚀 Unleash the Power of Multi-layer Caching in .NET Core and Watch Your App Fly!


DotNet Full Stack Dev
5 min read · Oct 25, 2024

Imagine this: your .NET Core app is gaining traction, users are coming in droves, and… BOOM! Your app’s response time is crawling like a snail. 😱 Don’t worry; it happens to the best of us. When your database gets hammered with tons of requests, performance naturally takes a hit.

But what if I told you that you could cut down on database calls, improve response times, and give your users the fast experience they deserve with one smart trick? The answer is multi-layer caching! This blog is going to take you on a deep dive into multi-layer caching in .NET Core, complete with code snippets and practical examples to get you started. Ready? Let’s dive in!

📌Explore more at: https://dotnet-fullstack-dev.blogspot.com/
🌟 Clapping would be appreciated! 🚀

🌟 What’s Multi-layer Caching Anyway?

Think of caching as giving your app a memory shortcut. Rather than repeatedly hitting the database for the same data, you store it somewhere quicker to access — like your server’s memory, a distributed cache like Redis, or even a persistent storage layer.

Multi-layer caching is like having several layers of backup plans:

  1. First, you check in-memory (super fast, but limited to your server).
  2. If that fails, you hit Redis or another distributed cache (still fast, shared across servers).
  3. As a last resort, you fall back to the database (slow, but you get the right data).

It’s all about finding the data as quickly as possible without overloading your database. 🔥

🧑‍💻 Why Multi-layer Caching? Because Your App Deserves to Be a Speed Demon!

You might be thinking, “Hey, isn’t one cache layer enough?” For a single server, it might be. But as your app scales out to multiple instances or moves to the cloud, a cache shared across servers becomes a necessity. With the multi-layer approach, you:

  • Boost performance: Data is served from memory or a fast cache rather than a slow database.
  • Enhance scalability: Redis handles distributed cache across multiple instances.
  • Improve reliability: If one layer fails, another can take over. No more performance bottlenecks.

🛠 Let’s Build Multi-layer Caching in .NET Core — Step by Step!

Enough theory! Let’s dive into the real magic — the code. We’ll walk through how to implement multi-layer caching with In-memory, Redis, and the Database in a typical .NET Core app.

🏗️ Step 1: In-memory Caching (Layer 1)

In-memory caching is the fastest layer. It stores frequently accessed data directly in the server’s memory. Let’s start by setting this up.

First, register it in Startup.cs (or directly on `builder.Services` in Program.cs for .NET 6+):

public void ConfigureServices(IServiceCollection services)
{
    // Enable in-memory caching
    services.AddMemoryCache();

    // Register the required services
    services.AddScoped<IItemService, ItemService>();
}

Then, use it in your ItemService:

public class ItemService : IItemService
{
    private readonly IMemoryCache _memoryCache;

    public ItemService(IMemoryCache memoryCache)
    {
        _memoryCache = memoryCache;
    }

    public async Task<Item> GetItemAsync(int itemId)
    {
        // Check if the item is already cached in memory
        if (_memoryCache.TryGetValue(itemId, out Item cachedItem))
        {
            Console.WriteLine("🚀 Item found in In-memory cache!");
            return cachedItem;
        }

        // Simulate database call
        var dbItem = await GetItemFromDatabase(itemId);

        // Store the item in memory for future requests
        _memoryCache.Set(itemId, dbItem, TimeSpan.FromMinutes(5));

        return dbItem;
    }

    private Task<Item> GetItemFromDatabase(int itemId)
    {
        // Simulate a database call (imagine this is slow!)
        return Task.FromResult(new Item { Id = itemId, Name = "Sample Item" });
    }
}

Here, we’re checking in-memory cache first. If the data’s there, it’s lightning fast. If not, we go to the database.

🏗️ Step 2: Redis Distributed Caching (Layer 2)

Next up, we bring in Redis. Redis gives you distributed caching, which is especially useful when you have multiple instances of your app running in a cloud environment.

First, install the Microsoft.Extensions.Caching.StackExchangeRedis NuGet package, then register Redis caching in Startup.cs:

public void ConfigureServices(IServiceCollection services)
{
    services.AddStackExchangeRedisCache(options =>
    {
        options.Configuration = "localhost:6379"; // Replace with your Redis connection
    });
}

Now, update ItemService to use Redis as Layer 2:

// using Microsoft.Extensions.Caching.Distributed;
// using Microsoft.Extensions.Caching.Memory;
// using Newtonsoft.Json;
public class ItemService : IItemService
{
    private readonly IMemoryCache _memoryCache;
    private readonly IDistributedCache _distributedCache;

    public ItemService(IMemoryCache memoryCache, IDistributedCache distributedCache)
    {
        _memoryCache = memoryCache;
        _distributedCache = distributedCache;
    }

    public async Task<Item> GetItemAsync(int itemId)
    {
        // Step 1: Check in-memory cache
        if (_memoryCache.TryGetValue(itemId, out Item cachedItem))
        {
            Console.WriteLine("🚀 Item found in In-memory cache!");
            return cachedItem;
        }

        // Step 2: Check Redis cache
        var cachedItemFromRedis = await _distributedCache.GetStringAsync(itemId.ToString());
        if (!string.IsNullOrEmpty(cachedItemFromRedis))
        {
            var item = JsonConvert.DeserializeObject<Item>(cachedItemFromRedis);
            // Cache in-memory for faster access next time (with an expiry,
            // so the entry doesn't live in memory forever)
            _memoryCache.Set(itemId, item, TimeSpan.FromMinutes(5));
            Console.WriteLine("🚀 Item found in Redis cache!");
            return item;
        }

        // Step 3: Go to the database
        var dbItem = await GetItemFromDatabase(itemId);
        if (dbItem != null)
        {
            // Cache in Redis (with an absolute expiry) and in memory
            await _distributedCache.SetStringAsync(
                itemId.ToString(),
                JsonConvert.SerializeObject(dbItem),
                new DistributedCacheEntryOptions
                {
                    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10)
                });
            _memoryCache.Set(itemId, dbItem, TimeSpan.FromMinutes(5));
        }

        return dbItem;
    }

    private Task<Item> GetItemFromDatabase(int itemId)
    {
        // Simulate database call
        return Task.FromResult(new Item { Id = itemId, Name = "Sample Item" });
    }
}

Now, you’ve got Redis as your second line of defense. If your app’s memory cache misses, Redis kicks in, giving you a speed boost even in a distributed environment. ⚡

🏗️ Step 3: Database (Layer 3)

If both your memory cache and Redis miss, you fall back to the database. Once you retrieve the data, cache it in both Redis and memory for future requests.
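Rather than repeating the three-step dance in every method, the fallthrough can be factored into one reusable helper. Here's a minimal sketch that assumes the same injected `_memoryCache` and `_distributedCache` fields as ItemService; the `GetOrSetAsync` name and the TTL handling are illustrative assumptions, not part of the service above:

```csharp
// Hypothetical generic helper: the same layered lookup as GetItemAsync,
// but reusable for any cacheable reference type.
public async Task<T> GetOrSetAsync<T>(
    string key,
    Func<Task<T>> loadFromDatabase,
    TimeSpan ttl) where T : class
{
    // Layer 1: in-memory (fastest, but local to this server)
    if (_memoryCache.TryGetValue(key, out T cached))
        return cached;

    // Layer 2: Redis (shared across all instances)
    var json = await _distributedCache.GetStringAsync(key);
    if (!string.IsNullOrEmpty(json))
    {
        var item = JsonConvert.DeserializeObject<T>(json);
        _memoryCache.Set(key, item, ttl); // promote back into Layer 1
        return item;
    }

    // Layer 3: the database, then backfill both cache layers
    var dbItem = await loadFromDatabase();
    if (dbItem != null)
    {
        await _distributedCache.SetStringAsync(
            key,
            JsonConvert.SerializeObject(dbItem),
            new DistributedCacheEntryOptions { AbsoluteExpirationRelativeToNow = ttl });
        _memoryCache.Set(key, dbItem, ttl);
    }
    return dbItem;
}
```

With a helper like this, `GetItemAsync` collapses to a single call: `return await GetOrSetAsync($"item:{itemId}", () => GetItemFromDatabase(itemId), TimeSpan.FromMinutes(5));`.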

🔥 Cache Expiry and Invalidation: Keeping Your Cache Fresh

It’s not enough just to cache everything. You need to manage expiration and invalidate stale data properly. Here’s how:

  • Sliding Expiration: The cache resets its expiration timer every time it’s accessed.
  • Absolute Expiration: The cache expires after a fixed time, no matter what.

Here’s how you can set cache options in .NET Core:

_memoryCache.Set(itemId, dbItem, new MemoryCacheEntryOptions
{
    // Hard cap: the entry is evicted 10 minutes after being cached, no matter what
    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10),
    // Evict earlier if the entry goes unaccessed for 2 minutes; each access resets this timer
    SlidingExpiration = TimeSpan.FromMinutes(2)
});

This keeps your cached data reasonably fresh and stops stale entries from hanging around forever. Pretty slick, huh?
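Expiration handles the "eventually stale" case, but when an item is updated or deleted you usually want it gone from every layer immediately. Here's a minimal sketch of explicit invalidation, assuming the same injected caches as ItemService; the `InvalidateItemAsync` name is an illustrative assumption:

```csharp
// Hypothetical eviction helper: remove the entry from both cache layers
// so the next read falls through to the database for fresh data.
public async Task InvalidateItemAsync(int itemId)
{
    _memoryCache.Remove(itemId);                            // Layer 1: this server only
    await _distributedCache.RemoveAsync(itemId.ToString()); // Layer 2: shared Redis
}
```

One caveat: `IMemoryCache.Remove` only clears the current server's memory, so other instances keep their Layer 1 copy until it expires. That's why a short expiration on the in-memory layer pairs well with explicit Redis invalidation.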

🔑 Final Thoughts: The Multi-layer Caching Advantage

With multi-layer caching, you’re setting your .NET Core app up for some serious performance gains. Here’s why you’ll love it:

  • Speed: Serving data from memory or Redis is way faster than constantly querying the database.
  • Scalability: Distributed cache like Redis helps you scale horizontally across multiple servers.
  • Reliability: Multiple caching layers give you a fallback if one cache layer fails.
