I am using WebApiThrottle for rate limiting in my ASP.NET Web API 2 project, but I am having problems with concurrent requests. Instead of sending 429 Too Many Requests responses, is it possible to delay concurrent requests, for example allowing only one request per second? Here is my WebApiConfig:
public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Web API configuration and services
        config.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling = ReferenceLoopHandling.Ignore;
        config.Formatters.JsonFormatter.SerializerSettings.DateTimeZoneHandling = DateTimeZoneHandling.Local;

        // Web API routes
        config.MapHttpAttributeRoutes();

        config.Routes.MapHttpRoute(
            "DefaultApi",
            "api/{controller}/{id}",
            new { id = RouteParameter.Optional }
        );

        config.MessageHandlers.Add(new ThrottlingHandler()
        {
            Policy = new ThrottlePolicy(perSecond: 1, perMinute: 20)
            {
                IpThrottling = true,
                EndpointThrottling = true
            },
            Repository = new CacheRepository(),
            QuotaExceededMessage = "You may only perform this action every {0} seconds."
        });

        config.MessageHandlers.Add(new RequestResponseHandler());
        config.Filters.Add(new CustomExceptionFilter());

        var resolver = config.DependencyResolver;
        var basicAuth = (BasicAuthenticationAttribute)resolver.GetService(typeof(BasicAuthenticationAttribute));
    }
}
Delaying would stack up connections and could queue up more work than the server can handle. That is why throttling rejects requests rather than delaying them. The calling client should respect the API limits.
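For illustration, here is a minimal client-side sketch (not from the original post, so treat the class name, URL parameter, and retry count as placeholders) of what respecting the limits could look like: retry on 429 and wait out the Retry-After interval when the server supplies one, otherwise fall back to a one-second delay to match the perSecond: 1 policy above.

using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class ThrottledClient
{
    private static readonly HttpClient Client = new HttpClient();

    // Sketch only: retries a GET up to maxRetries times when the server answers 429.
    public static async Task<HttpResponseMessage> GetWithRetryAsync(string url, int maxRetries = 3)
    {
        for (var attempt = 0; ; attempt++)
        {
            var response = await Client.GetAsync(url);

            // Anything other than 429, or retries exhausted: hand the response back.
            if ((int)response.StatusCode != 429 || attempt >= maxRetries)
                return response;

            // Honor Retry-After if the server sent it; otherwise wait one second.
            var delay = response.Headers.RetryAfter?.Delta ?? TimeSpan.FromSeconds(1);
            await Task.Delay(delay);
        }
    }
}

This keeps the back-pressure on the caller instead of the server, which is the point of rejecting with 429 rather than holding connections open.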