multiple after the fact events

Prasad, Rakesh
I have a BlobTrigger listening on one particular container. A third-party process writes files into this container, and I want my trigger to do some processing based on these incoming files.

The logic inside the trigger works fine, but I can see these logs being printed long after all the files have been processed.

While monitoring the function from Application Insights Live Metrics, I can still see logs like this even after a couple of hours:

Blob 'trigger/******************.json' will be skipped for function 'BlobTrigger' because this blob with ETag '"0x8D93848EA14CA40"' has already been processed. PollId: 'f5726ca3-3608-4527-9b9d-8d23ebb10061'. Source: 'ContainerScan'.

I am sure no new files are being written to the incoming container; the last file was written a couple of hours ago.

I can also see one server instance (for the blob listener) still running.

I am afraid that because of these false positives I may be paying for unnecessary compute. Why is this happening, and how do I fix it?

I am using Python 3.9 and the latest dependencies.

Tags: Azure Functions, Azure Blob Storage

1 answer

  1. Pramod Valavala (Microsoft Employee)

    @Prasad, Rakesh The blob trigger polls the storage account for new blobs and keeps track of the ones it has already processed (via blob receipts). This is expected behavior, and it does put a constant polling load on your storage account.
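    To see why those "will be skipped" log lines are harmless, here is a minimal sketch (not the actual Functions host code) of how a container scan with blob receipts behaves: each scan re-lists the container, and any blob whose name/ETag receipt already exists is skipped and logged rather than reprocessed.

    ```python
    # Simplified model of the blob trigger's ContainerScan with blob receipts.
    # In the real host, receipts live in the azure-webjobs-hosts container;
    # here a dict stands in for them. Names below are illustrative only.
    processed_receipts = {}  # blob name -> ETag of the already-processed version

    def container_scan(blobs):
        """Poll once over (name, etag) pairs; return (triggered, skipped)."""
        triggered, skipped = [], []
        for name, etag in blobs:
            if processed_receipts.get(name) == etag:
                # Logged as "Blob '...' will be skipped ... already been processed"
                skipped.append(name)
            else:
                processed_receipts[name] = etag
                triggered.append(name)
        return triggered, skipped
    ```

    Because the scan repeats on an interval, the same already-processed blobs are skipped (and logged) on every pass, which is exactly the log spam observed hours after the last file arrived.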

    The alternative is to leverage events on your storage account and use the Event Grid trigger instead. This alternative is mentioned in the docs as well, along with its other benefits.
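    As a sketch of the event-based route: recent versions of the Blob storage extension let the existing blob trigger binding switch from polling to Event Grid delivery by adding `"source": "EventGrid"` in function.json (you still need to create an Event Grid subscription on the storage account pointing at the function's blob-trigger endpoint). The container name `trigger` is taken from the log line in the question; the connection name is a placeholder.

    ```json
    {
      "scriptFile": "__init__.py",
      "bindings": [
        {
          "name": "myblob",
          "type": "blobTrigger",
          "direction": "in",
          "path": "trigger/{name}",
          "source": "EventGrid",
          "connection": "AzureWebJobsStorage"
        }
      ]
    }
    ```

    With event-based execution the host no longer runs periodic container scans, so both the repeated "will be skipped" logs and the always-on listener instance go away.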
