Configure API for server-sent events

This article provides guidelines for configuring an API in API Management that implements server-sent events (SSE). SSE is based on the HTML5 EventSource standard for streaming (pushing) data automatically to a client over HTTP after a client has established a connection.
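To illustrate the client side of this pattern, the following minimal sketch (TypeScript, browser) shows how a client might subscribe to an SSE stream exposed through API Management by using the EventSource API. Because EventSource can't set custom request headers, the subscription key is passed in the subscription-key query parameter; the host name, path, and key value are hypothetical placeholders.

    // Minimal browser-side sketch (TypeScript): subscribe to an SSE stream exposed through API Management.
    // The URL, path, and subscription key below are hypothetical placeholders.
    const url =
      "https://contoso.azure-api.net/events/stream?subscription-key=<your-subscription-key>";

    const source = new EventSource(url);

    // Fired for each "data:" message pushed by the backend over the open connection.
    source.onmessage = (event: MessageEvent<string>) => {
      console.log("event received:", event.data);
    };

    // EventSource reconnects automatically after transient failures; log errors for diagnostics.
    source.onerror = () => {
      console.error("connection error or stream closed");
    };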

Tip

API Management also provides native support for WebSocket APIs, which keep a single, persistent, bidirectional connection open between a client and server.

Prerequisites

Availability

Important

This feature is available in the Premium, Standard, Basic, and Developer tiers of API Management.

Guidelines for SSE

Follow these guidelines when using API Management to reach a backend API that implements SSE.

  • Choose service tier for long-running HTTP connections - SSE relies on a long-running HTTP connection. Long-running connections are supported in the dedicated API Management tiers, but not in the Consumption tier.

  • Keep idle connections alive - If a connection between client and backend could be idle for 4 minutes or longer, implement a mechanism to keep the connection alive. For example, enable a TCP keepalive signal at the backend of the connection, or send traffic from the client side at least once every 4 minutes. A keep-alive sketch appears after this list.

    This configuration is needed to override the idle session timeout of 4 minutes that is enforced by the Azure Load Balancer, which is used in the API Management infrastructure.

  • Relay events immediately to clients - Turn off response buffering on the forward-request policy so that events are immediately relayed to the clients. For example:

    <forward-request timeout="120" fail-on-error-status-code="true" buffer-response="false"/>
    
  • Avoid other policies that buffer responses - Certain policies such as validate-content can also buffer response content and shouldn't be used with APIs that implement SSE.

  • Disable response caching - To ensure that notifications to the client are timely, verify that response caching isn't enabled. For more information, see API Management caching policies.

  • Test API under load - Follow general practices to test your API under load to detect performance or configuration issues before going into production. A simple concurrency check appears after this list.
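To illustrate the keep-alive guideline, the following sketch shows one way a backend could prevent the connection from sitting idle: it periodically writes an SSE comment line (a line that starts with a colon), which clients ignore but which keeps traffic flowing before the 4-minute idle timeout is reached. The endpoint path, interval values, and event payload are assumptions for illustration only.

    // Minimal SSE backend sketch (TypeScript on Node.js).
    // Sends a comment line every 30 seconds so the connection never sits idle for 4 minutes.
    // The path, intervals, and event payload are hypothetical.
    import { createServer } from "http";

    const server = createServer((req, res) => {
      if (req.url !== "/events/stream") {
        res.writeHead(404);
        res.end();
        return;
      }

      res.writeHead(200, {
        "Content-Type": "text/event-stream",
        "Cache-Control": "no-cache",
        "Connection": "keep-alive",
      });

      // SSE comment lines (starting with ":") are ignored by clients but keep the connection active.
      const keepAlive = setInterval(() => res.write(": keep-alive\n\n"), 30_000);

      // Example data event, pushed once per minute.
      const events = setInterval(
        () => res.write(`data: ${JSON.stringify({ time: new Date().toISOString() })}\n\n`),
        60_000
      );

      // Stop the timers when the client disconnects.
      req.on("close", () => {
        clearInterval(keepAlive);
        clearInterval(events);
      });
    });

    server.listen(8080);

Whether you keep the connection alive from the backend or from the client, the key point is that something crosses the connection more often than the 4-minute idle timeout.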
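As a rough way to exercise the load-testing guideline, the sketch below opens several concurrent connections to the stream and logs whatever arrives. A dedicated load-testing tool is usually a better fit for production validation; the URL and connection count here are assumptions.

    // Minimal concurrency sketch (TypeScript on Node.js 18+ with built-in fetch).
    // Opens several SSE connections in parallel and logs the chunks each one receives.
    // The URL and connection count are hypothetical.
    const url =
      "https://contoso.azure-api.net/events/stream?subscription-key=<your-subscription-key>";
    const connections = 50;

    async function consume(id: number): Promise<void> {
      const response = await fetch(url, { headers: { Accept: "text/event-stream" } });
      if (!response.ok || !response.body) {
        console.error(`connection ${id} failed with status ${response.status}`);
        return;
      }

      const reader = response.body.getReader();
      const decoder = new TextDecoder();

      // Read the stream chunk by chunk until the server closes it.
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        console.log(`connection ${id}:`, decoder.decode(value).trim());
      }
    }

    async function main(): Promise<void> {
      await Promise.all(Array.from({ length: connections }, (_, i) => consume(i)));
    }

    main().catch((err) => console.error(err));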

Next steps