Lock Exception Error in Azure Service Bus

Poojak 1 Reputation point
2021-09-02T16:29:00.79+00:00

I have 3 queues and 3 Azure Functions reading messages from these queues, all in a single Function App. Below are my settings -

"extensions": {
    "serviceBus": {
        "prefetchCount": 2,
        "messageHandlerOptions": {
            "autoComplete": false,
            "maxConcurrentCalls": 2,
            "maxAutoRenewDuration": "00:05:00"
        }
    }
}
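For reference, this error can occur when processing outlasts the lock-renewal window. If CALLFUNCTIONTOPROCESS can ever run longer than five minutes, one thing I could try (just a sketch - the "00:10:00" value is my assumption for a worst-case processing time under ten minutes) is raising maxAutoRenewDuration:

"extensions": {
    "serviceBus": {
        "prefetchCount": 2,
        "messageHandlerOptions": {
            "autoComplete": false,
            "maxConcurrentCalls": 2,
            "maxAutoRenewDuration": "00:10:00"
        }
    }
}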

This is my Function Code -

public class ProcessPORelease
{
    private readonly IPOEventAction _eventAction;

    public ProcessPORelease(IPOEventAction pOEventAction)
    {
        this._eventAction = pOEventAction;
    }

    [FunctionName("ProcessPORelease")]
    [ExponentialBackoffRetry(3, "00:00:05", "00:00:30")]
    public async Task Run(
        [ServiceBusTrigger("release", Connection = "ServiceBusConnectionstring")] string myQueueItem,
        ILogger log,
        MessageReceiver messageReceiver,
        string lockToken,
        string messageId,
        DateTime enqueuedTimeUtc)
    {
        log.LogInformation($"C# ServiceBus queue trigger - po-release function processed message: {myQueueItem}");
        log.LogInformation($"MessageId={messageId}");
        log.LogInformation($"EnqueuedTimeUtc={enqueuedTimeUtc}");
        try
        {
            Model model = JsonConvert.DeserializeObject<Model>(myQueueItem);
            var response = await _eventAction.CALLFUNCTIONTOPROCESS(model);
            if (response.Status?.ToLower() == "success")
            {
                await messageReceiver.CompleteAsync(lockToken);
                log.LogInformation($"Message Complete {messageId}");
            }
            else
            {
                await messageReceiver.DeadLetterAsync(lockToken, response.Status);
                log.LogInformation($"Message Deadletter {messageId}");
            }
        }
        catch (Exception ex)
        {
            await messageReceiver.DeadLetterAsync(lockToken, ex.InnerException?.Message);
        }
    }
}

The issue is - my message is processed successfully, but I intermittently get: "The lock supplied is invalid. Either the lock expired, or the message has already been removed from the queue, or was received by a different receiver instance."

I have searched various Stack Overflow questions and applied the settings given above, but nothing works for me. Could you please let me know what exactly I am doing wrong here? I have two other Functions coded exactly like the one above. I believe the issue is similar to this one:

https://stackoverflow.com/questions/64036222/azure-service-bus-maxconcurrentcalls-1-the-lock-supplied-is-invalid-either
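That question suggests a possible interaction: ExponentialBackoffRetry can re-invoke the function after the message has already been settled (completed or dead-lettered) by an earlier attempt, so the second settlement call fails with the lock error. A defensive sketch - my assumption, using the MessageLockLostException type from the Microsoft.Azure.ServiceBus package the code already references - would be to treat that case as non-fatal:

try
{
    await messageReceiver.CompleteAsync(lockToken);
}
catch (MessageLockLostException ex)
{
    // The lock expired, or the message was already settled (for example by
    // an earlier retry attempt). The message is no longer ours to settle,
    // so log the condition and let the function return normally.
    log.LogWarning(ex, $"Lock lost for message {messageId}; skipping settlement.");
}

The same guard would apply around the DeadLetterAsync calls.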

I am also using these package versions:

<PackageReference Include="Microsoft.ApplicationInsights" Version="2.18.0" />
<PackageReference Include="Microsoft.ApplicationInsights.WorkerService" Version="2.15.0" />
<PackageReference Include="Microsoft.Azure.Functions.Extensions" Version="1.1.0" />
<PackageReference Include="Microsoft.Azure.ServiceBus" Version="5.1.3" />
<PackageReference Include="Microsoft.Azure.WebJobs.Extensions.EventGrid" Version="2.1.0" />
<PackageReference Include="Microsoft.Azure.WebJobs.Extensions.ServiceBus" Version="4.3.0" />
<PackageReference Include="Microsoft.Azure.WebJobs.Extensions.Storage" Version="3.0.10" />
<PackageReference Include="Microsoft.Extensions.Caching.Memory" Version="3.1.17" />
<PackageReference Include="Microsoft.Extensions.Http" Version="3.1.17" />
<PackageReference Include="Microsoft.NET.Sdk.Functions" Version="3.0.13" />


1 answer

  1. MayankBargali-MSFT 70,011 Reputation points
    2021-09-10T06:31:01.613+00:00

    @Poojak The code and configuration look good. If this is an intermittent issue, it was likely caused by a transient problem at the Function or Service Bus end; it should not affect your workflow, since the next instance will consume that message.

    If that is not the case, then I suggest removing prefetchCount to see if it helps. If you still observe the same behaviour, please let me know so we can connect offline and I can review the logs at the backend.
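    For example, the host.json extensions section would then look like this (a sketch with prefetchCount removed and the other values unchanged, so the trigger falls back to the default prefetch behaviour):

    "extensions": {
        "serviceBus": {
            "messageHandlerOptions": {
                "autoComplete": false,
                "maxConcurrentCalls": 2,
                "maxAutoRenewDuration": "00:05:00"
            }
        }
    }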
