Azure Standard Tier Logic App - Timing Out After 3.75 Minutes

Anonymous
2023-12-20T11:52:03.66+00:00

Hello

We are currently working on an Azure Logic App implementation that runs file format scans on large external client data files in Blob Storage. A single zip file can reach 25 GB or more.

For a 20 GB file, for example, the function app took 6 minutes to complete. We have a Logic App that checks for the function app's completion and then performs notification actions.

Our challenge lies in the default timeout limit for single-tenant apps: the Logic App times out after 3.75 minutes. Despite our efforts to extend the timeout, we are still encountering the same failure.

Here are the configurations we've attempted:

Workflows.RuntimeConfiguration.RetentionInDays=90.00:00:00
Runtime.Backend.FlowRunTimeout=90.00:00:00
Runtime.FlowRunRetryableActionJobCallback.ActionJobExecutionTimeout=00:10:00
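For reference, in a Standard (single-tenant) logic app these runtime settings are typically placed in host.json under extensions.workflow.settings rather than in app settings. A sketch of the shape (values copied from the attempts above; confirm exact setting names against the Logic Apps Standard host.json reference):

```json
{
  "version": "2.0",
  "extensions": {
    "workflow": {
      "settings": {
        "Runtime.Backend.FlowRunTimeout": "90.00:00:00",
        "Runtime.FlowRunRetryableActionJobCallback.ActionJobExecutionTimeout": "00:10:00"
      }
    }
  }
}
```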

Despite these adjustments, the Logic App still fails at the default interval.

We would appreciate your guidance on how we can effectively increase the timeout limit or find a workaround to overcome this limitation.

Thank you for your assistance.

Kind regards,


Answer accepted by question author
  1. Azar 31,630 Reputation points MVP Volunteer Moderator
    2023-12-20T12:03:41.6533333+00:00

    Hi Marc Hedgley

Dealing with large file processing really can be frustrating. You've already taken the right steps by adjusting those settings; there are a few additional approaches I can recommend.

Instead of processing the entire 25 GB file in a single run, consider breaking it into smaller chunks and processing them in parallel. Each smaller chunk then has a better chance of completing within the timeout limits.
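The chunking idea can be sketched in plain Python (the chunk size here is hypothetical; the Blob Storage SDK supports range reads via offset/length, so each range can be downloaded and scanned independently):

```python
def chunk_ranges(blob_size: int, chunk_size: int = 256 * 1024 * 1024):
    """Yield (offset, length) byte ranges covering a blob of blob_size bytes."""
    offset = 0
    while offset < blob_size:
        length = min(chunk_size, blob_size - offset)
        yield (offset, length)
        offset += length

# A 20 GB blob split into 256 MB ranges -> 80 independently processable chunks.
ranges = list(chunk_ranges(20 * 1024**3))
```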

Rather than processing the entire file synchronously within a single function call, design your Azure Function to handle chunks of the file asynchronously. Use Azure Queue Storage or Azure Service Bus to trigger subsequent processing for each chunk.
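A sketch of the fan-out step, with hypothetical message fields: one queue message is built per byte range, and a queue-triggered function would pick up each message and scan only that range. (The actual enqueue would be done with the azure-storage-queue `QueueClient.send_message` call.)

```python
import json

def build_chunk_messages(blob_name: str, blob_size: int, chunk_size: int):
    """Build one queue message (JSON) per byte range of the blob.
    Each message carries enough context for a worker to process
    its range independently of the others."""
    messages = []
    offset = 0
    while offset < blob_size:
        length = min(chunk_size, blob_size - offset)
        messages.append(json.dumps({
            "blob": blob_name,
            "offset": offset,
            "length": length,
        }))
        offset += length
    return messages

# 1 MB blob, 256 KB chunks -> 4 messages.
msgs = build_chunk_messages("client-data.zip", 1024 * 1024, 256 * 1024)
```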

Finally, Azure Durable Functions lets you write stateful functions in a serverless environment. It is designed for long-running operations and can implement workflows that span multiple functions.
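The fan-out/fan-in pattern Durable Functions provides can be sketched in plain Python (a simulation only, with a hypothetical pass/fail check; a real orchestrator would yield `context.call_activity` tasks and aggregate them with `context.task_all`):

```python
def scan_chunk(byte_range):
    """Stand-in activity function: pretend a chunk passes the format
    scan when its length is non-zero (hypothetical check)."""
    offset, length = byte_range
    return length > 0

def orchestrate_scan(ranges):
    """Fan-out: run one activity per chunk. Fan-in: aggregate verdicts.
    Durable Functions checkpoints after each awaited task, so the
    workflow survives timeouts that would kill one long-running call."""
    results = [scan_chunk(r) for r in ranges]  # fan-out
    return all(results)                        # fan-in

verdict = orchestrate_scan([(0, 1024), (1024, 1024)])
```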

Hopefully this helps. Kindly accept the answer if you find it useful. Thanks!

