Overview of caching in ASP.NET Core
By Rick Anderson and Tom Dykstra
In-memory caching
In-memory caching uses server memory to store cached data. This type of caching is suitable for a single server or multiple servers using session affinity. Session affinity is also known as sticky sessions. Session affinity means that the requests from a client are always routed to the same server for processing.
For more information, see Cache in-memory in ASP.NET Core and Troubleshoot Azure Application Gateway session affinity issues.
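As a minimal sketch, registering and using the in-memory cache might look like the following. The route, key, and 30-second lifetime are illustrative, and the factory lambda stands in for whatever expensive work the app caches:

```csharp
using Microsoft.Extensions.Caching.Memory;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddMemoryCache(); // Registers IMemoryCache as a singleton.
var app = builder.Build();

app.MapGet("/time/{city}", async (string city, IMemoryCache cache) =>
{
    // GetOrCreateAsync runs the factory only on a cache miss.
    return await cache.GetOrCreateAsync($"time:{city}", entry =>
    {
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(30);
        return Task.FromResult(DateTime.UtcNow); // Placeholder for expensive work.
    });
});

app.Run();
```

Entries registered this way are evicted when they expire or when the server is under memory pressure, so callers should always be prepared for a miss.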
Distributed Cache
Use a distributed cache to store data when the app is hosted in a cloud or server farm. The cache is shared across the servers that process requests. A client can submit a request that's handled by any server in the group if cached data for the client is available. ASP.NET Core works with SQL Server, Redis, and NCache distributed caches.
For more information, see Distributed caching in ASP.NET Core.
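For example, a Redis-backed IDistributedCache can be registered as sketched below. This assumes the Microsoft.Extensions.Caching.StackExchangeRedis package, and the connection-string name and instance prefix are illustrative placeholders:

```csharp
var builder = WebApplication.CreateBuilder(args);

// All servers in the farm that register the same backing store
// share the cached entries.
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = builder.Configuration.GetConnectionString("Redis");
    options.InstanceName = "SampleApp_"; // Key prefix to isolate this app's entries.
});
```

Swapping the provider (for example, to SQL Server via AddDistributedSqlServerCache) changes only this registration; consumers keep injecting IDistributedCache.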
HybridCache
The HybridCache API bridges some gaps in the IDistributedCache and IMemoryCache APIs. HybridCache is an abstract class with a default implementation that handles most aspects of saving to cache and retrieving from cache.
Features
HybridCache has the following features that the other APIs don't have:
- A unified API for both in-process and out-of-process caching. HybridCache is designed to be a drop-in replacement for existing IDistributedCache and IMemoryCache usage, and it provides a simple API for adding new caching code. If the app has an IDistributedCache implementation, the HybridCache service uses it for secondary caching. This two-level caching strategy allows HybridCache to provide the speed of an in-memory cache and the durability of a distributed or persistent cache.
- Stampede protection. Cache stampede happens when a frequently used cache entry is revoked and too many requests try to repopulate the same cache entry at the same time. HybridCache combines concurrent operations, ensuring that all requests for a given response wait for the first request to populate the cache.
- Configurable serialization. Serialization is configured as part of registering the service, with support for type-specific and generalized serializers via the WithSerializer and WithSerializerFactory methods, chained from the AddHybridCache call. By default, the service handles string and byte[] internally and uses System.Text.Json for everything else. It can be configured for other serializers, such as protobuf or XML.
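Registration might look like the following sketch. The Microsoft.Extensions.Caching.Hybrid package is assumed, and the option values are illustrative, not recommendations:

```csharp
using Microsoft.Extensions.Caching.Hybrid;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddHybridCache(options =>
{
    options.DefaultEntryOptions = new HybridCacheEntryOptions
    {
        Expiration = TimeSpan.FromMinutes(5),          // Overall lifetime, including L2.
        LocalCacheExpiration = TimeSpan.FromMinutes(1) // In-process (L1) lifetime.
    };
});
```

If an IDistributedCache implementation is also registered (for example, Redis), HybridCache picks it up automatically as the secondary cache.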
To see the relative simplicity of the HybridCache API, compare code that uses it to code that uses IDistributedCache. Here's an example of what using IDistributedCache looks like:
```csharp
public class SomeService(IDistributedCache cache)
{
    public async Task<SomeInformation> GetSomeInformationAsync
        (string name, int id, CancellationToken token = default)
    {
        var key = $"someinfo:{name}:{id}"; // Unique key for this combination.
        var bytes = await cache.GetAsync(key, token); // Try to get from cache.
        SomeInformation info;
        if (bytes is null)
        {
            // Cache miss; get the data from the real source.
            info = await SomeExpensiveOperationAsync(name, id, token);

            // Serialize and cache it.
            bytes = SomeSerializer.Serialize(info);
            await cache.SetAsync(key, bytes, token);
        }
        else
        {
            // Cache hit; deserialize it.
            info = SomeSerializer.Deserialize<SomeInformation>(bytes);
        }
        return info;
    }

    // This is the work we're trying to cache.
    private async Task<SomeInformation> SomeExpensiveOperationAsync(string name, int id,
        CancellationToken token = default)
    { /* ... */ }
}
```
That's a lot of work to get right each time, including things like serialization. And in the "cache miss" scenario, you could end up with multiple concurrent threads, all getting a cache miss, all fetching the underlying data, all serializing it, and all sending that data to the cache.
Here's equivalent code using HybridCache:
```csharp
public class SomeService(HybridCache cache)
{
    public async Task<SomeInformation> GetSomeInformationAsync
        (string name, int id, CancellationToken token = default)
    {
        return await cache.GetOrCreateAsync(
            $"someinfo:{name}:{id}", // Unique key for this entry.
            async cancel => await SomeExpensiveOperationAsync(name, id, cancel),
            token: token
        );
    }
}
```
The code is simpler, and the library provides stampede protection and other features that IDistributedCache doesn't.
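Behavior can also be tuned per call. As a hedged sketch within the same hypothetical SomeService class, an entry-specific expiration might be passed like this:

```csharp
public async Task<SomeInformation> GetSomeInformationAsync(
    string name, int id, CancellationToken token = default)
{
    return await cache.GetOrCreateAsync(
        $"someinfo:{name}:{id}",
        async cancel => await SomeExpensiveOperationAsync(name, id, cancel),
        // Per-entry override of the defaults set at registration.
        options: new HybridCacheEntryOptions
        {
            Expiration = TimeSpan.FromMinutes(10)
        },
        token: token
    );
}
```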
Compatibility
The HybridCache library supports older .NET runtimes, down to .NET Framework 4.7.2 and .NET Standard 2.0.
Response caching
The Response caching middleware:
- Enables caching server responses based on HTTP cache headers, implementing the standard HTTP caching semantics and caching like a proxy does.
- Is typically not beneficial for UI apps such as Razor Pages because browsers generally set request headers that prevent caching. Output caching, which is available in ASP.NET Core 7.0 and later, benefits UI apps. With output caching, configuration decides what should be cached independently of HTTP headers.
- May be beneficial for public GET or HEAD API requests from clients where the Conditions for caching are met.
To test response caching, use Fiddler or another tool that can explicitly set request headers. Setting headers explicitly is preferred for testing caching. For more information, see Troubleshooting.
For more information, see Response caching in ASP.NET Core.
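Wiring up the middleware can be sketched as follows; the endpoint and the 10-second lifetime are illustrative. Note that the middleware only caches responses that opt in via cache headers:

```csharp
using Microsoft.Net.Http.Headers;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddResponseCaching();
var app = builder.Build();

app.UseResponseCaching(); // Serves cached responses when HTTP semantics allow it.

app.MapGet("/ticks", (HttpContext context) =>
{
    // Mark the response as cacheable by the middleware and by proxies.
    context.Response.GetTypedHeaders().CacheControl =
        new CacheControlHeaderValue
        {
            Public = true,
            MaxAge = TimeSpan.FromSeconds(10)
        };
    return DateTime.UtcNow.Ticks.ToString();
});

app.Run();
```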
Output caching
The output caching middleware enables caching of HTTP responses. Output caching differs from response caching in the following ways:
- The caching behavior is configurable on the server. Response caching behavior is defined by HTTP headers. For example, when you visit a website with Chrome or Edge, the browser automatically sends a Cache-control: max-age=0 header. This header effectively disables response caching, since the server follows the directions provided by the client. A new response is returned for every request, even if the server has a fresh cached response. With output caching, the client doesn't override the caching behavior that you configure on the server.
- The cache storage medium is extensible. Memory is used by default. Response caching is limited to memory.
- You can programmatically invalidate selected cache entries. Response caching's dependence on HTTP headers leaves you with few options for invalidating cache entries.
- Resource locking mitigates the risk of cache stampede and thundering herd. Cache stampede happens when a frequently used cache entry is revoked and too many requests try to repopulate the same cache entry at the same time. Thundering herd is similar: a burst of requests for the same response that isn't already in a cache entry. Resource locking ensures that all requests for a given response wait for the first request to populate the cache. Response caching doesn't have a resource locking feature.
- Cache revalidation minimizes bandwidth usage. Cache revalidation means the server can return a 304 Not Modified HTTP status code instead of a cached response body. This status code informs the client that the response to the request is unchanged from what was previously received. Response caching doesn't do cache revalidation.
For more information, see Output caching middleware in ASP.NET Core.
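The server-side configuration described above can be sketched as a minimal setup; the policy name, endpoint, and 30-second lifetime are illustrative:

```csharp
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddOutputCache(options =>
{
    // A named policy; endpoints opt in to it explicitly.
    options.AddPolicy("Short", policy => policy.Expire(TimeSpan.FromSeconds(30)));
});

var app = builder.Build();
app.UseOutputCache();

// Output is cached on the server regardless of client request headers.
app.MapGet("/cached", () => DateTime.UtcNow.Ticks.ToString())
   .CacheOutput("Short");

app.Run();
```

Because the policy lives on the server, a client sending Cache-control: max-age=0 still receives the cached response until the policy expires it.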
Cache Tag Helper
Cache the content from an MVC view or Razor Page with the Cache Tag Helper. The Cache Tag Helper uses in-memory caching to store data.
For more information, see Cache Tag Helper in ASP.NET Core MVC.
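In a view, usage might look like the following sketch; the expiration interval and vary-by key are illustrative:

```cshtml
@* Rendered output inside <cache> is stored in memory and reused
   until it expires. *@
<cache expires-after="@TimeSpan.FromMinutes(10)" vary-by-query="page">
    Last rendered: @DateTime.Now
</cache>
```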
Distributed Cache Tag Helper
Cache the content from an MVC view or Razor Page in distributed cloud or web farm scenarios with the Distributed Cache Tag Helper. The Distributed Cache Tag Helper uses SQL Server, Redis, or NCache to store data.
For more information, see Distributed Cache Tag Helper in ASP.NET Core.
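A sketch of the markup, assuming a distributed cache is registered; the key name and interval are illustrative:

```cshtml
@* The name attribute is required and must be unique per cached fragment,
   since the content is shared across servers. *@
<distributed-cache name="footer-stats" expires-after="@TimeSpan.FromMinutes(10)">
    Last rendered: @DateTime.Now
</distributed-cache>
```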