In this article, you learn about various caching mechanisms. Caching is the act of storing data in an intermediate layer, making subsequent data retrievals faster. Conceptually, caching is a performance optimization strategy and design consideration. Caching can significantly improve app performance by making infrequently changing (or expensive to retrieve) data more readily available. This article introduces three caching approaches and provides sample source code for each:
- Microsoft.Extensions.Caching.Memory: In-memory caching for single-server scenarios
- Microsoft.Extensions.Caching.Hybrid: Hybrid caching that combines in-memory and distributed caching with additional features
- Microsoft.Extensions.Caching.Distributed: Distributed caching for multi-server scenarios
Important
There are two MemoryCache classes within .NET: System.Runtime.Caching.MemoryCache and Microsoft.Extensions.Caching.Memory.MemoryCache.
This article doesn't use the System.Runtime.Caching NuGet package; all references to MemoryCache are to the type in the Microsoft.Extensions.Caching.Memory namespace.
All of the Microsoft.Extensions.* packages come dependency injection (DI) ready. The IMemoryCache and IDistributedCache interfaces, and the HybridCache abstract class, can be consumed as services.
In-memory caching
In this section, you learn about the Microsoft.Extensions.Caching.Memory package. The current implementation of IMemoryCache is a wrapper around ConcurrentDictionary<TKey,TValue>, exposing a feature-rich API. Entries within the cache are represented by ICacheEntry and can be any object. The in-memory cache solution is great for apps that run on a single server, where the cached data consumes memory in the app's process.
Tip
For multi-server caching scenarios, consider the Distributed caching approach as an alternative to in-memory caching.
In-memory caching API
The consumer of the cache has control over both sliding and absolute expirations:
- ICacheEntry.AbsoluteExpiration
- ICacheEntry.AbsoluteExpirationRelativeToNow
- ICacheEntry.SlidingExpiration
Setting an expiration causes entries in the cache to be evicted if they're not accessed within the expiration time allotment. Consumers have additional options for controlling cache entries through MemoryCacheEntryOptions. Each ICacheEntry is paired with MemoryCacheEntryOptions, which exposes expiration eviction via IChangeToken, priority settings via CacheItemPriority, and control over ICacheEntry.Size. The relevant extension methods are listed next, followed by a short example:
- MemoryCacheEntryExtensions.AddExpirationToken
- MemoryCacheEntryExtensions.RegisterPostEvictionCallback
- MemoryCacheEntryExtensions.SetSize
- MemoryCacheEntryExtensions.SetPriority
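As a minimal sketch (assuming a resolved IMemoryCache named cache, which isn't part of the list above), these extension methods can be chained on a MemoryCacheEntryOptions instance:
MemoryCacheEntryOptions options = new MemoryCacheEntryOptions()
    .SetSlidingExpiration(TimeSpan.FromMinutes(1))
    .SetAbsoluteExpiration(TimeSpan.FromMinutes(5))
    .SetPriority(CacheItemPriority.High)
    .SetSize(1); // Size is only enforced when the cache is configured with a size limit.
cache.Set("greeting", "Hello, World!", options);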
In-memory cache example
To use the default IMemoryCache implementation, call the AddMemoryCache extension method to register all the required services with DI. In the following code sample, the generic host is used to expose DI functionality:
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
HostApplicationBuilder builder = Host.CreateApplicationBuilder(args);
builder.Services.AddMemoryCache();
using IHost host = builder.Build();
Depending on your .NET workload, you might access the IMemoryCache differently, such as with constructor injection. In this sample, you use the IServiceProvider instance on the host and call the generic GetRequiredService<T>(IServiceProvider) extension method:
IMemoryCache cache =
host.Services.GetRequiredService<IMemoryCache>();
With in-memory caching services registered and resolved through DI, you're ready to start caching. This sample iterates through the letters of the English alphabet, 'A' through 'Z'. The AlphabetLetter record type holds the letter and generates a corresponding message.
file record AlphabetLetter(char Letter)
{
internal string Message =>
$"The '{Letter}' character is the {Letter - 64} letter in the English alphabet.";
}
Tip
The file access modifier is used on the AlphabetLetter type, as it's defined within and only accessed from the Program.cs file. For more information, see file (C# Reference). To see the full source code, see the Put it all together section.
The sample includes a helper function that iterates through the alphabet letters:
static async ValueTask IterateAlphabetAsync(
Func<char, Task> asyncFunc)
{
for (char letter = 'A'; letter <= 'Z'; ++letter)
{
await asyncFunc(letter);
}
Console.WriteLine();
}
In the preceding C# code:
- The Func<char, Task> asyncFunc is awaited on each iteration, passing the current letter.
- After all letters have been processed, a blank line is written to the console.
To add items to the cache, call one of the Create or Set APIs:
var addLettersToCacheTask = IterateAlphabetAsync(letter =>
{
MemoryCacheEntryOptions options = new()
{
AbsoluteExpirationRelativeToNow =
TimeSpan.FromMilliseconds(MillisecondsAbsoluteExpiration)
};
_ = options.RegisterPostEvictionCallback(OnPostEviction);
AlphabetLetter alphabetLetter =
cache.Set(
letter, new AlphabetLetter(letter), options);
Console.WriteLine($"{alphabetLetter.Letter} was cached.");
return Task.Delay(
TimeSpan.FromMilliseconds(MillisecondsDelayAfterAdd));
});
await addLettersToCacheTask;
In the preceding C# code:
- The variable addLettersToCacheTask delegates to IterateAlphabetAsync and is awaited.
- The Func<char, Task> asyncFunc is passed a lambda expression.
- The MemoryCacheEntryOptions is instantiated with an absolute expiration relative to now.
- A post-eviction callback is registered.
- An AlphabetLetter object is instantiated and passed into Set along with letter and options.
- The letter is written to the console as being cached.
- Finally, a Task.Delay is returned.
For each letter in the alphabet, a cache entry is written with an expiration and post-eviction callback.
The post-eviction callback writes the details of the value that was evicted to the console:
static void OnPostEviction(
object key, object? letter, EvictionReason reason, object? state)
{
if (letter is AlphabetLetter alphabetLetter)
{
Console.WriteLine($"{alphabetLetter.Letter} was evicted for {reason}.");
}
}
Now that the cache is populated, another call to IterateAlphabetAsync is awaited, but this time you call IMemoryCache.TryGetValue:
var readLettersFromCacheTask = IterateAlphabetAsync(letter =>
{
if (cache.TryGetValue(letter, out object? value) &&
value is AlphabetLetter alphabetLetter)
{
Console.WriteLine($"{letter} is still in cache. {alphabetLetter.Message}");
}
return Task.CompletedTask;
});
await readLettersFromCacheTask;
If the cache contains the letter key and the value is an instance of AlphabetLetter, it's written to the console. When the letter key isn't in the cache, it was evicted and its post-eviction callback was invoked.
Additional extension methods
The IMemoryCache interface comes with many convenience-based extension methods, including an asynchronous GetOrCreateAsync; a short usage sketch follows this list:
- CacheExtensions.Get
- CacheExtensions.GetOrCreate
- CacheExtensions.GetOrCreateAsync
- CacheExtensions.Set
- CacheExtensions.TryGetValue
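As a minimal sketch (assuming the cache instance and the AlphabetLetter record from the preceding sample), GetOrCreateAsync returns an existing entry or invokes the factory to create and cache one:
AlphabetLetter? letterZ = await cache.GetOrCreateAsync('Z', entry =>
{
    // The factory runs only on a cache miss.
    entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(1);
    return Task.FromResult(new AlphabetLetter('Z'));
});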
Put it all together
The entire sample app source code is a top-level program and requires two NuGet packages, Microsoft.Extensions.Caching.Memory and Microsoft.Extensions.Hosting:
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
HostApplicationBuilder builder = Host.CreateApplicationBuilder(args);
builder.Services.AddMemoryCache();
using IHost host = builder.Build();
IMemoryCache cache =
host.Services.GetRequiredService<IMemoryCache>();
const int MillisecondsDelayAfterAdd = 50;
const int MillisecondsAbsoluteExpiration = 750;
static void OnPostEviction(
object key, object? letter, EvictionReason reason, object? state)
{
if (letter is AlphabetLetter alphabetLetter)
{
Console.WriteLine($"{alphabetLetter.Letter} was evicted for {reason}.");
}
}
static async ValueTask IterateAlphabetAsync(
Func<char, Task> asyncFunc)
{
for (char letter = 'A'; letter <= 'Z'; ++letter)
{
await asyncFunc(letter);
}
Console.WriteLine();
}
var addLettersToCacheTask = IterateAlphabetAsync(letter =>
{
MemoryCacheEntryOptions options = new()
{
AbsoluteExpirationRelativeToNow =
TimeSpan.FromMilliseconds(MillisecondsAbsoluteExpiration)
};
_ = options.RegisterPostEvictionCallback(OnPostEviction);
AlphabetLetter alphabetLetter =
cache.Set(
letter, new AlphabetLetter(letter), options);
Console.WriteLine($"{alphabetLetter.Letter} was cached.");
return Task.Delay(
TimeSpan.FromMilliseconds(MillisecondsDelayAfterAdd));
});
await addLettersToCacheTask;
var readLettersFromCacheTask = IterateAlphabetAsync(letter =>
{
if (cache.TryGetValue(letter, out object? value) &&
value is AlphabetLetter alphabetLetter)
{
Console.WriteLine($"{letter} is still in cache. {alphabetLetter.Message}");
}
return Task.CompletedTask;
});
await readLettersFromCacheTask;
await host.RunAsync();
file record AlphabetLetter(char Letter)
{
internal string Message =>
$"The '{Letter}' character is the {Letter - 64} letter in the English alphabet.";
}
You can adjust the MillisecondsDelayAfterAdd and MillisecondsAbsoluteExpiration values to observe changes in the expiration and eviction behavior of cached entries. The following is sample output from running this code. (Due to the nondeterministic nature of .NET events, your output might differ.)
A was cached.
B was cached.
C was cached.
D was cached.
E was cached.
F was cached.
G was cached.
H was cached.
I was cached.
J was cached.
K was cached.
L was cached.
M was cached.
N was cached.
O was cached.
P was cached.
Q was cached.
R was cached.
S was cached.
T was cached.
U was cached.
V was cached.
W was cached.
X was cached.
Y was cached.
Z was cached.
A was evicted for Expired.
C was evicted for Expired.
B was evicted for Expired.
E was evicted for Expired.
D was evicted for Expired.
F was evicted for Expired.
H was evicted for Expired.
K was evicted for Expired.
L was evicted for Expired.
J was evicted for Expired.
G was evicted for Expired.
M was evicted for Expired.
N was evicted for Expired.
I was evicted for Expired.
P was evicted for Expired.
R was evicted for Expired.
O was evicted for Expired.
Q was evicted for Expired.
S is still in cache. The 'S' character is the 19 letter in the English alphabet.
T is still in cache. The 'T' character is the 20 letter in the English alphabet.
U is still in cache. The 'U' character is the 21 letter in the English alphabet.
V is still in cache. The 'V' character is the 22 letter in the English alphabet.
W is still in cache. The 'W' character is the 23 letter in the English alphabet.
X is still in cache. The 'X' character is the 24 letter in the English alphabet.
Y is still in cache. The 'Y' character is the 25 letter in the English alphabet.
Z is still in cache. The 'Z' character is the 26 letter in the English alphabet.
Since the absolute expiration (MemoryCacheEntryOptions.AbsoluteExpirationRelativeToNow) is set, all the cached items will eventually be evicted.
Worker Service caching
One common strategy for caching data is updating the cache independently of the consuming data services. The Worker Service template is a great example, as the BackgroundService runs independently of (or in the background of) the other application code. When an app that hosts an implementation of IHostedService starts running, the corresponding implementation (in this case the BackgroundService, or "worker") starts running in the same process. These hosted services are registered with DI as singletons through the AddHostedService<THostedService>(IServiceCollection) extension method. Other services can be registered with DI with any service lifetime.
Important
The service lifetimes are important to understand. When you call AddMemoryCache to register all of the in-memory caching services, the services are registered as singletons.
Photo service scenario
Imagine you're developing a photo service that relies on a third-party API accessible via HTTP. This photo data doesn't change often, but there's a lot of it. Each photo is represented by a simple record:
namespace CachingExamples.Memory;
public readonly record struct Photo(
int AlbumId,
int Id,
string Title,
string Url,
string ThumbnailUrl);
In the following example, you see several services being registered with DI. Each service has a single responsibility.
using CachingExamples.Memory;
HostApplicationBuilder builder = Host.CreateApplicationBuilder(args);
builder.Services.AddMemoryCache();
builder.Services.AddHttpClient<CacheWorker>();
builder.Services.AddHostedService<CacheWorker>();
builder.Services.AddScoped<PhotoService>();
builder.Services.AddSingleton(typeof(CacheSignal<>));
using IHost host = builder.Build();
await host.StartAsync();
In the preceding C# code:
- The generic host is created with defaults.
- In-memory caching services are registered with AddMemoryCache.
- An HttpClient instance is registered for the CacheWorker class with AddHttpClient<TClient>(IServiceCollection).
- The CacheWorker class is registered with AddHostedService<THostedService>(IServiceCollection).
- The PhotoService class is registered with AddScoped<TService>(IServiceCollection).
- The CacheSignal<T> class is registered with AddSingleton.
- The host is instantiated from the builder and started asynchronously.
The PhotoService is responsible for getting photos that match given criteria (or filter):
using Microsoft.Extensions.Caching.Memory;
namespace CachingExamples.Memory;
public sealed class PhotoService(
IMemoryCache cache,
CacheSignal<Photo> cacheSignal,
ILogger<PhotoService> logger)
{
public async IAsyncEnumerable<Photo> GetPhotosAsync(Func<Photo, bool>? filter = default)
{
try
{
await cacheSignal.WaitAsync();
Photo[] photos =
(await cache.GetOrCreateAsync(
"Photos", _ =>
{
logger.LogWarning("This should never happen!");
return Task.FromResult(Array.Empty<Photo>());
}))!;
// If no filter is provided, use a pass-thru.
filter ??= _ => true;
foreach (Photo photo in photos)
{
if (!default(Photo).Equals(photo) && filter(photo))
{
yield return photo;
}
}
}
finally
{
cacheSignal.Release();
}
}
}
In the preceding C# code:
- The constructor requires an IMemoryCache, CacheSignal<Photo>, and ILogger.
- The GetPhotosAsync method:
  - Defines a Func<Photo, bool> filter parameter and returns an IAsyncEnumerable<Photo>.
  - Calls and waits for cacheSignal.WaitAsync() to release; this ensures that the cache is populated before it's accessed.
  - Calls cache.GetOrCreateAsync(), asynchronously getting all of the photos in the cache.
  - The factory argument logs a warning and returns an empty photo array; this should never happen.
  - Each photo in the cache is iterated, filtered, and materialized with yield return.
  - Finally, the cache signal is released.
Consumers of this service are free to call the GetPhotosAsync method and handle photos accordingly. No HttpClient is required because the cache already contains the photos.
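The following is a hypothetical consumer sketch (not part of the sample app), assuming the host instance from the registration code above. It resolves the scoped PhotoService and enumerates photos for a single album:
using IServiceScope scope = host.Services.CreateScope();
PhotoService photoService =
    scope.ServiceProvider.GetRequiredService<PhotoService>();
// Filter to a single album; omit the filter to enumerate all photos.
await foreach (Photo photo in photoService.GetPhotosAsync(p => p.AlbumId == 1))
{
    Console.WriteLine($"Photo {photo.Id}: {photo.Title}");
}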
The asynchronous signal is based on an encapsulated SemaphoreSlim instance within a generic singleton. The CacheSignal<T> relies on an instance of SemaphoreSlim:
namespace CachingExamples.Memory;
public sealed class CacheSignal<T>
{
private readonly SemaphoreSlim _semaphore = new(1, 1);
/// <summary>
/// Exposes a <see cref="Task"/> that represents the asynchronous wait operation.
/// When signaled (consumer calls <see cref="Release"/>), the
/// <see cref="Task.Status"/> is set as <see cref="TaskStatus.RanToCompletion"/>.
/// </summary>
public Task WaitAsync() => _semaphore.WaitAsync();
/// <summary>
/// Exposes the ability to signal the release of the <see cref="WaitAsync"/>'s operation.
/// Callers who were waiting, will be able to continue.
/// </summary>
public void Release() => _semaphore.Release();
}
In the preceding C# code, the decorator pattern is used to wrap an instance of the SemaphoreSlim. Since the CacheSignal<T> is registered as a singleton, it can be used across all service lifetimes with any generic type—in this case, the Photo. It's responsible for signaling the seeding of the cache.
The CacheWorker is a subclass of BackgroundService:
using System.Net.Http.Json;
using Microsoft.Extensions.Caching.Memory;
namespace CachingExamples.Memory;
public sealed class CacheWorker(
ILogger<CacheWorker> logger,
HttpClient httpClient,
CacheSignal<Photo> cacheSignal,
IMemoryCache cache) : BackgroundService
{
private readonly TimeSpan _updateInterval = TimeSpan.FromHours(3);
private bool _isCacheInitialized = false;
private const string Url = "https://jsonplaceholder.typicode.com/photos";
public override async Task StartAsync(CancellationToken cancellationToken)
{
await cacheSignal.WaitAsync();
await base.StartAsync(cancellationToken);
}
protected override async Task ExecuteAsync(CancellationToken stoppingToken)
{
while (!stoppingToken.IsCancellationRequested)
{
logger.LogInformation("Updating cache.");
try
{
Photo[]? photos =
await httpClient.GetFromJsonAsync<Photo[]>(
Url, stoppingToken);
if (photos is { Length: > 0 })
{
cache.Set("Photos", photos);
logger.LogInformation(
"Cache updated with {Count:#,#} photos.", photos.Length);
}
else
{
logger.LogWarning(
"Unable to fetch photos to update cache.");
}
}
finally
{
if (!_isCacheInitialized)
{
cacheSignal.Release();
_isCacheInitialized = true;
}
}
try
{
logger.LogInformation(
"Will attempt to update the cache in {Hours} hours from now.",
_updateInterval.Hours);
await Task.Delay(_updateInterval, stoppingToken);
}
catch (OperationCanceledException)
{
logger.LogWarning("Cancellation acknowledged: shutting down.");
break;
}
}
}
}
In the preceding C# code:
- The constructor requires an ILogger, HttpClient, CacheSignal<Photo>, and IMemoryCache.
- The _updateInterval is defined as three hours.
- The ExecuteAsync method:
  - Loops while the app is running.
  - Makes an HTTP request to "https://jsonplaceholder.typicode.com/photos" and maps the response as an array of Photo objects.
  - Places the array of photos in the IMemoryCache under the "Photos" key.
  - Calls cacheSignal.Release(), releasing any consumers who were waiting for the signal.
  - Awaits the call to Task.Delay, given the update interval.
  - After delaying for three hours, updates the cache again.
Consumers in the same process could ask the IMemoryCache for the photos, but the CacheWorker is responsible for updating the cache.
Hybrid caching
The HybridCache library combines the benefits of in-memory and distributed caching while addressing common challenges with existing caching APIs. Introduced in .NET 9, HybridCache provides a unified API that simplifies caching implementation and includes built-in features like stampede protection and configurable serialization.
Key features
HybridCache offers several advantages over using IMemoryCache and IDistributedCache separately:
- Two-level caching: Automatically manages both in-memory (L1) and distributed (L2) cache layers. Data is retrieved from in-memory cache first for speed, then from distributed cache if needed, and finally from the source.
- Stampede protection: Prevents multiple concurrent requests from executing the same expensive operation. Only one request fetches the data while others wait for the result.
- Configurable serialization: Supports multiple serialization formats including JSON (default), protobuf, and XML.
- Tag-based invalidation: Groups related cache entries with tags for efficient batch invalidation.
- Simplified API: The GetOrCreateAsync method handles cache misses, serialization, and storage automatically.
When to use HybridCache
Consider using HybridCache when:
- You need both local (in-memory) and distributed caching in a multi-server environment.
- You want protection against cache stampede scenarios.
- You prefer a simplified API over manually coordinating IMemoryCache and IDistributedCache.
- You need tag-based cache invalidation for related entries.
Tip
For single-server applications with simple caching needs, in-memory caching might be sufficient. For multi-server applications without the need for stampede protection or tag-based invalidation, consider distributed caching.
HybridCache setup
To use HybridCache, install the Microsoft.Extensions.Caching.Hybrid NuGet package:
dotnet add package Microsoft.Extensions.Caching.Hybrid
Register the HybridCache service with DI by calling AddHybridCache:
var builder = Host.CreateApplicationBuilder(args);
builder.Services.AddHybridCache();
The preceding code registers HybridCache with default options. You can also configure global options:
var builderWithOptions = Host.CreateApplicationBuilder(args);
builderWithOptions.Services.AddHybridCache(options =>
{
options.MaximumPayloadBytes = 1024 * 1024; // 1 MB
options.MaximumKeyLength = 1024;
options.DefaultEntryOptions = new HybridCacheEntryOptions
{
Expiration = TimeSpan.FromMinutes(5),
LocalCacheExpiration = TimeSpan.FromMinutes(2)
};
});
Basic usage
The primary method for interacting with HybridCache is GetOrCreateAsync. This method checks the cache for an entry with the specified key and, if not found, calls the factory method to retrieve the data:
async Task<WeatherData> GetWeatherDataAsync(HybridCache cache, string city)
{
return await cache.GetOrCreateAsync(
$"weather:{city}",
async cancellationToken =>
{
// Simulate fetching from an external API
await Task.Delay(100, cancellationToken);
return new WeatherData(city, 72, "Sunny");
}
);
}
In the preceding C# code:
- The GetOrCreateAsync method takes a unique key and a factory method.
- If the data isn't in the cache, the factory method is called to retrieve it.
- The data is automatically stored in both in-memory and distributed caches.
- Only one concurrent request executes the factory method; others wait for the result.
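The WeatherData type used throughout these examples isn't part of any library; a minimal record definition like the following is assumed:
public sealed record WeatherData(string City, int TemperatureF, string Conditions);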
Entry options
You can override global defaults for specific cache entries using HybridCacheEntryOptions:
async Task<WeatherData> GetWeatherWithOptionsAsync(HybridCache cache, string city)
{
var entryOptions = new HybridCacheEntryOptions
{
Expiration = TimeSpan.FromMinutes(10),
LocalCacheExpiration = TimeSpan.FromMinutes(5)
};
return await cache.GetOrCreateAsync(
$"weather:{city}",
async cancellationToken => new WeatherData(city, 72, "Sunny"),
entryOptions
);
}
The entry options allow you to configure:
- HybridCacheEntryOptions.Expiration: How long the entry should be cached in the distributed cache.
- HybridCacheEntryOptions.LocalCacheExpiration: How long the entry should be cached in local memory.
- HybridCacheEntryOptions.Flags: Additional flags for controlling cache behavior.
Tag-based invalidation
Tags allow you to group related cache entries and invalidate them together. This is useful for scenarios where related data needs to be refreshed as a unit:
async Task<CustomerData> GetCustomerAsync(HybridCache cache, int customerId)
{
var tags = new[] { "customer", $"customer:{customerId}" };
return await cache.GetOrCreateAsync(
$"customer:{customerId}",
async cancellationToken => new CustomerData(customerId, "John Doe", "john@example.com"),
new HybridCacheEntryOptions { Expiration = TimeSpan.FromMinutes(30) },
tags
);
}
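As with WeatherData, the CustomerData type here isn't defined by any package; a minimal assumed definition might be:
public sealed record CustomerData(int Id, string Name, string Email);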
To invalidate all entries with a specific tag:
async Task InvalidateCustomerCacheAsync(HybridCache cache, int customerId)
{
await cache.RemoveByTagAsync($"customer:{customerId}");
}
You can also invalidate multiple tags at once:
async Task InvalidateAllCustomersAsync(HybridCache cache)
{
await cache.RemoveByTagAsync(new[] { "customer", "orders" });
}
Note
Tag-based invalidation is a logical operation. It doesn't actively remove values from the cache but ensures that tagged entries are treated as cache misses. The entries eventually expire based on their configured lifetime.
Remove cache entries
To remove a specific cache entry by key, use the RemoveAsync method:
async Task RemoveWeatherDataAsync(HybridCache cache, string city)
{
await cache.RemoveAsync($"weather:{city}");
}
To invalidate all cached entries, use the reserved wildcard tag "*":
async Task InvalidateAllCacheAsync(HybridCache cache)
{
await cache.RemoveByTagAsync("*");
}
Serialization
For distributed caching scenarios, HybridCache requires serialization. By default, it handles string and byte[] internally and uses System.Text.Json for other types. You can configure custom serializers for specific types or use a general-purpose serializer:
// Custom serialization example
// Note: This requires implementing a custom IHybridCacheSerializer<T>
var builderWithSerializer = Host.CreateApplicationBuilder(args);
builderWithSerializer.Services.AddHybridCache(options =>
{
options.DefaultEntryOptions = new HybridCacheEntryOptions
{
Expiration = TimeSpan.FromMinutes(10),
LocalCacheExpiration = TimeSpan.FromMinutes(5)
};
});
// To add a custom serializer, chain AddSerializer onto the AddHybridCache call:
// .AddSerializer<WeatherData, CustomWeatherDataSerializer>();
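The following is a hedged sketch of what a custom serializer might look like, assuming the WeatherData record from earlier and the IHybridCacheSerializer<T> abstraction from the Microsoft.Extensions.Caching.Hybrid package. Treat the JSON handling as illustrative rather than prescriptive:
using System.Buffers;
using System.Text.Json;
using Microsoft.Extensions.Caching.Hybrid;
public sealed class CustomWeatherDataSerializer : IHybridCacheSerializer<WeatherData>
{
    public WeatherData Deserialize(ReadOnlySequence<byte> source)
    {
        // Read the buffered payload as UTF-8 JSON.
        var reader = new Utf8JsonReader(source);
        return JsonSerializer.Deserialize<WeatherData>(ref reader)!;
    }
    public void Serialize(WeatherData value, IBufferWriter<byte> target)
    {
        // Write the value as UTF-8 JSON directly into the target buffer.
        using var writer = new Utf8JsonWriter(target);
        JsonSerializer.Serialize(writer, value);
    }
}
The serializer would then be registered by chaining onto the AddHybridCache call, for example: builderWithSerializer.Services.AddHybridCache().AddSerializer<WeatherData, CustomWeatherDataSerializer>();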
Configure distributed cache
HybridCache uses the configured IDistributedCache implementation for its distributed (L2) cache. Even without an IDistributedCache configured, HybridCache still provides in-memory caching and stampede protection. To add Redis as a distributed cache:
// Distributed cache with Redis
var builderWithRedis = Host.CreateApplicationBuilder(args);
builderWithRedis.Services.AddStackExchangeRedisCache(options =>
{
options.Configuration = "localhost:6379";
});
builderWithRedis.Services.AddHybridCache(options =>
{
options.DefaultEntryOptions = new HybridCacheEntryOptions
{
Expiration = TimeSpan.FromMinutes(30),
LocalCacheExpiration = TimeSpan.FromMinutes(5)
};
});
For more information about distributed cache implementations, see Distributed caching.
Distributed caching
In some scenarios, a distributed cache is required—such is the case with multiple app servers. A distributed cache supports higher scale-out than the in-memory caching approach. Using a distributed cache offloads the cache memory to an external process, but does require extra network I/O and introduces a bit more latency (even if nominal).
The distributed caching abstractions are part of the Microsoft.Extensions.Caching.Memory NuGet package, and there's even an AddDistributedMemoryCache extension method.
Caution
AddDistributedMemoryCache should only be used in development or testing scenarios and is not a viable production implementation.
Consider any of the available implementations of the IDistributedCache from the following packages:
- Microsoft.Extensions.Caching.SqlServer
- Microsoft.Extensions.Caching.StackExchangeRedis
- NCache.Microsoft.Extensions.Caching.OpenSource
Distributed caching API
The distributed caching APIs are a bit more primitive than their in-memory counterparts. The key-value pairs are more basic: in-memory cache keys are based on an object, whereas distributed cache keys are a string. With in-memory caching, the value can be any strongly typed generic, whereas values in distributed caching are persisted as byte[]. That's not to say that various implementations don't expose strongly typed generic values, but that would be an implementation detail.
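The snippets that follow assume a resolved IDistributedCache instance named cache, along with the letter and MillisecondsAbsoluteExpiration values from the in-memory sample. As a minimal setup sketch, you could register the development-only in-memory implementation:
using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
HostApplicationBuilder builder = Host.CreateApplicationBuilder(args);
// Development and testing only; see the preceding caution.
builder.Services.AddDistributedMemoryCache();
using IHost host = builder.Build();
IDistributedCache cache =
    host.Services.GetRequiredService<IDistributedCache>();
For production, replace AddDistributedMemoryCache with one of the implementations listed earlier, such as AddStackExchangeRedisCache.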
Create values
To create values in the distributed cache, call one of the Set APIs, such as IDistributedCache.Set or IDistributedCache.SetAsync.
Using the AlphabetLetter record from the in-memory cache example, you could serialize the object to JSON and then encode the string as a byte[]:
DistributedCacheEntryOptions options = new()
{
AbsoluteExpirationRelativeToNow =
TimeSpan.FromMilliseconds(MillisecondsAbsoluteExpiration)
};
AlphabetLetter alphabetLetter = new(letter);
string json = JsonSerializer.Serialize(alphabetLetter);
byte[] bytes = Encoding.UTF8.GetBytes(json);
await cache.SetAsync(letter.ToString(), bytes, options);
Much like in-memory caching, cache entries can have options to help fine-tune their existence in the cache—in this case, the DistributedCacheEntryOptions.
Create extension methods
There are several convenience-based extension methods for creating values, such as DistributedCacheExtensions.SetString and DistributedCacheExtensions.SetStringAsync. These methods help to avoid encoding string representations of objects into a byte[].
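As a minimal sketch, reusing the json and options values from the preceding snippet, SetStringAsync performs the UTF-8 encoding for you:
await cache.SetStringAsync(letter.ToString(), json, options);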
Read values
To read values from the distributed cache, call one of the Get APIs, such as IDistributedCache.Get or IDistributedCache.GetAsync:
AlphabetLetter? alphabetLetter = null;
byte[]? bytes = await cache.GetAsync(letter.ToString());
if (bytes is { Length: > 0 })
{
string json = Encoding.UTF8.GetString(bytes);
alphabetLetter = JsonSerializer.Deserialize<AlphabetLetter>(json);
}
Once a cache entry is read out of the cache, you can get the UTF-8-encoded string representation from the byte[].
Read extension methods
There are several convenience-based extension methods for reading values, such as DistributedCacheExtensions.GetString and DistributedCacheExtensions.GetStringAsync. These methods help to avoid decoding byte[] into string representations of objects.
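As a minimal sketch, GetStringAsync returns the decoded string, or null when the key isn't present:
string? json = await cache.GetStringAsync(letter.ToString());
AlphabetLetter? alphabetLetter = json is null
    ? null
    : JsonSerializer.Deserialize<AlphabetLetter>(json);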
Update values
There's no way to update values in the distributed cache with a single API call. Instead, values can have their sliding expirations reset with one of the Refresh APIs, such as IDistributedCache.Refresh or IDistributedCache.RefreshAsync.
If the actual value needs to be updated, you must delete the value and then re-add it.
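For example, the following sketch resets the sliding expiration (if one was set) for a letter's entry:
await cache.RefreshAsync(letter.ToString());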
Delete values
To delete values in the distributed cache, call one of the Remove APIs, such as IDistributedCache.Remove or IDistributedCache.RemoveAsync.
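For example, the following sketch removes a letter's entry by key:
await cache.RemoveAsync(letter.ToString());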
Tip
While there are synchronous versions of these APIs, consider the fact that implementations of distributed caches are reliant on network I/O. For this reason, it's usually preferable to use the asynchronous APIs.