Monitor Azure Managed Cache Service


Microsoft recommends that all new development use Azure Redis Cache. For current documentation and guidance on choosing an Azure Cache offering, see Which Azure Cache offering is right for me?

Azure Managed Cache Service, once incorporated into your application, plays a key part in maintaining your application's performance and availability. The availability of the service may occasionally be affected by maintenance tasks on the backend or by how your allocated service capacity is being consumed. Further, in certain scenarios you may need deeper insight into various service counters and parameters to debug specific issues your client application is facing.

Monitoring Azure Cache Service

The Management Portal provides the following functionality to help you monitor the health of the service and view performance counters to troubleshoot issues.

  • The caching dashboard provides an overview of the health and utilization of your cache, along with information about the various properties of your service.

  • The Monitor tab allows you to use the Management Portal for viewing a variety of service performance counters that can help you troubleshoot issues.

In this section

  • Monitoring the cache service using the dashboard

  • Monitoring the cache service using the Monitor tab

Monitoring the cache service using the dashboard

The dashboard for your service can be accessed from the Management Portal by selecting the service entry and opening the dashboard tab.

Windows Azure Cache Service Dashboard

The dashboard displays the cache endpoint, cache status (such as Running), the current cache offering, and a chart with metrics from the following six performance counters.

  • Bandwidth Used %

  • Cache Miss %

  • Compute Used %

  • Memory Used %

  • Read Requests /sec

  • Write Requests /sec

Using the information displayed and the performance counter metrics, you can monitor the performance of your cache in the following areas.

  • Availability notifications about your cache

  • Capacity usage for your cache

  • Measuring cache effectiveness for your client

Availability notifications about your cache

If your cache endpoint is currently unavailable due to maintenance activity or due to issues the service is facing, the dashboard indicates this as an error state at the top of the page. This error information is also displayed on the All Items page in the Management Portal, and it indicates whether the cache is completely unavailable due to an outage or whether there is a temporary issue that may impact your service intermittently.

The following are the various errors you may see on the dashboard related to unavailability.

  1. Cache Service is unavailable - This error appears when the service is experiencing problems and either the cache endpoint or the SSL (secure) cache endpoint is unavailable. Such situations are actively monitored by Microsoft so that they can be mitigated in a timely manner and service availability restored. If this situation persists, contact Microsoft support.

  2. Cache Service is partially impacted or you may see intermittent errors - This warning appears when service availability is not completely impacted, but because the service is undergoing maintenance activity, some calls to the service may experience errors or some keys may be temporarily inaccessible. As a best practice, we recommend that you implement retry logic in your client application for such cases.
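The retry recommendation above can be sketched generically. The following is a minimal example, not tied to any real client library: `cache_get` and `TransientCacheError` are hypothetical stand-ins for whatever call and transient failure your own cache client raises during maintenance windows.

```python
import random
import time


class TransientCacheError(Exception):
    """Hypothetical stand-in for an intermittent cache failure."""


def get_with_retry(cache_get, key, max_attempts=3, base_delay=0.1):
    """Call cache_get(key), retrying transient failures with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return cache_get(key)
        except TransientCacheError:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            # Back off exponentially, with jitter so clients don't retry in lockstep.
            time.sleep(base_delay * (2 ** (attempt - 1)) * (1 + random.random()))
```

Bounding the number of attempts matters here: during a full outage, unbounded retries would only add load without ever succeeding.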

Capacity usage for your cache

The cache service is deployed as a dedicated infrastructure. These capacity counters show how the dedicated infrastructure is being utilized with respect to memory, bandwidth and compute resources, as a percentage of the total available capacity. If any of these counters are close to 100%, we recommend that you add more capacity to your cache service. For more information about capacity planning, see Capacity Planning for Azure Managed Cache Service. The following counters are important for monitoring capacity.

Counter Details

Memory used %

The memory allocated for the cache stores both the user data and the metadata about those objects that the cache needs in order to serve requests. This counter reflects how much of the usable cache memory is already utilized.

Note that utilization may not exactly match the amount of data you put in; it varies depending on your object sizes and your use of features such as tags, regions, and notifications.

Bandwidth used %

The cache is hosted on a dedicated infrastructure that provides incoming and outgoing bandwidth to the system. The bandwidth utilization reflects how this bandwidth is being used. If your system has many large objects, you may see high bandwidth utilization.

Compute used %

The cache service uses compute resources to serve cache requests. For high-throughput applications, the compute resources available to your dedicated infrastructure can become exhausted. It is therefore important for such applications to monitor Compute Used % and keep it below 70-80%, which helps ensure that your cache remains responsive and data access continues with low latency.


You may notice some of the percentage counter values going beyond 100%. This happens when you use more than your purchased capacity. When it does, cache data availability and latencies can be impacted, or you may experience data loss. We recommend that you scale your cache immediately in such a situation to prevent disruption to your service. For more information, see Scale a Cache for Azure Managed Cache Service.
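As a rough illustration of the guidance above, a monitoring script could flag a cache for scaling when any capacity counter approaches its limit. The counter names and thresholds below come from this article (treat values near 100% as a scaling signal, and keep Compute Used % under roughly 80%); how you obtain the values is up to you, since the service reports them through the Management Portal.

```python
# Thresholds based on the guidance above: any counter near 100% means
# add capacity, and Compute Used % should stay below ~70-80%.
THRESHOLDS = {
    "Memory Used %": 100.0,
    "Bandwidth Used %": 100.0,
    "Compute Used %": 80.0,
}


def counters_needing_scale(counters, margin=0.9):
    """Return the capacity counters within `margin` of their threshold.

    `counters` maps counter names to current percentage values.
    """
    return sorted(
        name
        for name, value in counters.items()
        if name in THRESHOLDS and value >= THRESHOLDS[name] * margin
    )
```

For example, a reading of 95% memory and 75% compute would flag both counters, prompting a scale-up before the 100% mark where data loss becomes possible.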

Measuring cache effectiveness for your client

The key parameters that impact the effectiveness of the cache for your client application are the number of requests made to the cache and the cache hit/miss percentage for those. These parameters are also made available on the dashboard.

Counter Details

Read Requests/sec

The number of read requests (Bulk Get, Get, and Enumeration) received per second from all clients since the cache service started.

Write Requests/sec

The number of write requests received per second since the cache service started. Write operations include the Put, Add, and Lock methods.

Cache Miss %

The percentage of unsuccessful cache requests out of the total number of requests since the cache service started.
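The miss percentage described above is simple to state precisely. The sketch below computes it from hypothetical hit/miss counts; the portal reports the aggregated figure directly, so this is only for reasoning about what the number means.

```python
def cache_miss_percent(misses, total_requests):
    """Percentage of cache requests that were misses; 0.0 when there is no traffic."""
    if total_requests == 0:
        return 0.0
    return 100.0 * misses / total_requests


def cache_hit_percent(misses, total_requests):
    """The hit rate is simply the complement of the miss rate."""
    return 100.0 - cache_miss_percent(misses, total_requests)
```

A persistently high miss percentage usually means the cache is too small for the working set or items are expiring or being evicted before they are re-read.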


Azure collects and aggregates the data from your cache service at regular intervals. The data appears on the dashboard with a delay of a few minutes.

Monitoring the cache service using the Monitor tab

While the dashboard gives you a quick view of six common metrics, the full list of Managed Cache Service performance counters is available, and can be viewed in the Management Portal on the Monitor tab for Cache.

You can choose up to twelve metrics for the metrics table, and plot any six of them on the chart by selecting the check boxes next to their table headers. You can also choose the previous interval for which to view metrics: 1 Hour, 24 Hours, or 7 Days.

Cache Service Monitor Tab

To configure the desired metrics, click Add Metrics.

Cache Service Monitor tab Select Metrics

For a full list of the performance counters available for Managed Cache Service, see Azure Managed Cache Service Performance Counters.