Hi Team,
We recently started observing "Azure Cosmos DB request failed. HttpStatusCode:ServiceUnavailable" exceptions on our API.
We are in the process of migrating from the v2 SDK to the v3 SDK.
Here are the snippets of our connection settings:
V2:
var connectionPolicy = new ConnectionPolicy
{
    ConnectionMode = Microsoft.Azure.Documents.Client.ConnectionMode.Direct,
    ConnectionProtocol = Protocol.Tcp,
    RetryOptions = new RetryOptions
    {
        MaxRetryAttemptsOnThrottledRequests = 0,
    },
    EnableEndpointDiscovery = true,
    MaxConnectionLimit = environment == EnvironmentType.Prod ? 1000 : int.MaxValue,
    RequestTimeout = TimeSpan.FromSeconds(30)
};
connectionPolicy.PreferredLocations.Add(LocationNames.EastUS2);
connectionPolicy.PreferredLocations.Add(LocationNames.WestUS2);
connectionPolicy.EnableReadRequestsFallback = true;

string primaryAuthKey = await container.Resolve<ISecretManager>().GetSecretAsync(Configuration.PrimaryKeySecretName);
var docDBClient = new DocumentClient(new Uri(Configuration.ServiceUri), primaryAuthKey, connectionPolicy);
await docDBClient.OpenAsync();
V3:
var cosmosClientOptions = new CosmosClientOptions
{
    ConnectionMode = Microsoft.Azure.Cosmos.ConnectionMode.Direct,
    MaxRetryAttemptsOnRateLimitedRequests = 0,
    RequestTimeout = TimeSpan.FromSeconds(30),
    ApplicationPreferredRegions = new[]
    {
        LocationNames.EastUS2,
        LocationNames.WestUS2
    }
};

string primaryAuthKey = await container.Resolve<ISecretManager>().GetSecretAsync(Configuration.PrimaryKeySecretName);
var cosmosClient = new CosmosClient(Configuration.ServiceUri, primaryAuthKey, cosmosClientOptions);
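One difference worth noting between the two snippets: in v2 we call `docDBClient.OpenAsync()` to warm up the Direct-mode TCP connections before serving traffic, but the v3 snippet has no equivalent, so the first requests after startup may pay connection-establishment cost. If that turns out to matter, a possible v3 counterpart is the static `CosmosClient.CreateAndInitializeAsync` factory (available in SDK 3.18+), which opens connections to the listed containers up front. A sketch, assuming placeholder database/container names:

```csharp
// Sketch: warm up Direct-mode connections at client creation,
// analogous to v2's DocumentClient.OpenAsync().
// "MyDatabase" and "MyContainer" are placeholder names - substitute real ones.
var containersToWarmUp = new List<(string databaseId, string containerId)>
{
    ("MyDatabase", "MyContainer")
};

var cosmosClient = await CosmosClient.CreateAndInitializeAsync(
    Configuration.ServiceUri,
    primaryAuthKey,
    containersToWarmUp,
    cosmosClientOptions);
```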
The V3 changes are currently behind a feature flag and have not been rolled out yet, because we started seeing the above exception.
Here are the stack trace details:
Exception: Microsoft.Azure.Documents.GoneException (id: 34961477, outerId: 66836843)
Message: "The requested resource is no longer available at the server."
ActivityId: 2252d315-6232-493e-8150-2a91e76d9d74, documentdb-dotnet-sdk/2.16.1 Host/64-bit MicrosoftWindowsNT/6.2.9200.0
Parsed stack (Microsoft.Azure.Documents.Client, Version=2.16.1.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35):
  0: Microsoft.Azure.Documents.TimeoutHelper.ThrowGoneIfElapsed
  1: Microsoft.Azure.Documents.StoreReader+d__12.MoveNext
  2: System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (mscorlib, Version=4.0.0.0)
  3: System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (mscorlib, Version=4.0.0.0)
Though we think the v3 changes are not related to this exception, we would like to know whether there is anything you can trace with the ActivityId mentioned above to help us understand the issue.
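In the meantime we are also considering an application-level retry around individual operations, since transient GoneExceptions in Direct mode can surface as 503 ServiceUnavailable once the SDK's internal replica retries are exhausted. A minimal sketch (the attempt count and backoff delay are illustrative, not tuned values):

```csharp
// Sketch: retry a Cosmos operation a few times on transient 503s with a short backoff.
static async Task<T> ExecuteWithRetryAsync<T>(Func<Task<T>> operation, int maxAttempts = 3)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            return await operation();
        }
        catch (CosmosException ex) when (
            ex.StatusCode == HttpStatusCode.ServiceUnavailable && attempt < maxAttempts)
        {
            // Likely transient (replica movement / connectivity blip): back off and retry.
            await Task.Delay(TimeSpan.FromMilliseconds(200 * attempt));
        }
    }
}
```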