504.0 GatewayTimeout & Invoking Azure function failed with HttpStatusCode - 499.

syamprasad kari 0 Reputation points
2024-11-12T09:44:30.44+00:00

We've developed an Azure Function in Python that connects to Blob Storage, reads files, and writes the data into Azure Tables. The process runs fine for small files (less than 100 MB). The problem is that when we try to process bigger files (more than 200 MB), we get the errors below. Can you please tell us how to fix this issue?

Call to provided Azure function 'csv-validator' failed with status-'GatewayTimeout' while invoking 'POST' on 'https://-.azurewebsites.net' and message - '504.0 GatewayTimeout'.

The other error we saw is: Call to provided Azure function 'csv-validator' failed with status-'499' while invoking 'POST' on 'https://-.azurewebsites.net' and message - 'Invoking Azure function failed with HttpStatusCode - 499.'.

 Thank you!

Azure Data Lake Storage
Azure Data Factory

1 answer

  1. Amira Bedhiafi 27,131 Reputation points
    2024-11-12T13:21:53.56+00:00

    The 504 Gateway Timeout and 499 errors often occur with Azure Functions when processing larger files due to the default execution timeout limits, network constraints, or resource throttling.

    For the Azure Function, you can increase the functionTimeout value in the host.json file (a sample is shown after the list below). The maximum allowed timeout depends on your hosting plan:

    • Consumption Plan: Up to 5 minutes by default, extendable to 10 minutes.
    • Premium Plan or Dedicated (App Service) Plan: 30 minutes by default, with no enforced upper limit, so you can set a longer duration if needed.
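    For example, a minimal host.json along these lines raises the timeout (00:10:00 is the ceiling on the Consumption plan; larger values, or "-1" for unbounded, only take effect on Premium/Dedicated plans):

```json
{
  "version": "2.0",
  "functionTimeout": "00:10:00"
}
```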

    The ADF activity that invokes the function (Azure Function or Web/HTTP activity) has its own timeout configuration, so adjust the timeout there as well.
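    As a rough sketch (the activity and linked-service names here are placeholders), the activity timeout lives under policy in the pipeline JSON; note that raising it only helps if the function itself is also allowed to run longer:

```json
{
  "name": "InvokeCsvValidator",
  "type": "AzureFunctionActivity",
  "policy": {
    "timeout": "0.00:30:00",
    "retry": 1,
    "retryIntervalInSeconds": 30
  },
  "typeProperties": {
    "functionName": "csv-validator",
    "method": "POST"
  },
  "linkedServiceName": {
    "referenceName": "AzureFunctionLinkedService",
    "type": "LinkedServiceReference"
  }
}
```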

    Reading a large file all at once may also cause memory pressure. Break the processing into smaller chunks, for example by reading the blob as a stream and writing rows to the table in batches, to reduce memory load and processing time per batch.
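    A minimal sketch of that idea, assuming a storage connection string and placeholder container/blob/table names, using azure-storage-blob to stream the file and azure-data-tables to write rows in batches of 100:

```python
from azure.storage.blob import BlobClient
from azure.data.tables import TableClient

# Hypothetical placeholders: adjust the connection string and names to your setup.
CONN_STR = "<storage-connection-string>"

def process_blob_in_chunks(blob_name: str = "data.csv") -> None:
    blob = BlobClient.from_connection_string(
        CONN_STR, container_name="input-container", blob_name=blob_name)
    table = TableClient.from_connection_string(CONN_STR, table_name="Validated")

    leftover = b""
    batch = []
    row_index = 0

    def flush():
        # Azure Table transactions accept at most 100 operations,
        # all sharing the same PartitionKey.
        if batch:
            table.submit_transaction(batch)
            batch.clear()

    def add_row(raw: bytes):
        nonlocal row_index
        text = raw.decode("utf-8").rstrip("\r")
        if not text:
            return
        batch.append(("upsert", {
            "PartitionKey": blob_name,          # one partition per source file
            "RowKey": f"{row_index:012d}",      # placeholder key scheme
            "Raw": text,
        }))
        row_index += 1
        if len(batch) == 100:
            flush()

    # Stream the blob chunk by chunk instead of loading it all into memory.
    for chunk in blob.download_blob().chunks():
        data = leftover + chunk
        lines = data.split(b"\n")
        leftover = lines.pop()                  # carry the partial last line forward
        for line in lines:
            add_row(line)

    add_row(leftover)                           # final line without a trailing newline
    flush()
```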

    I recommend using Durable Functions, which enable long-running workflows by letting the function checkpoint and resume; breaking the task into smaller orchestrated steps is a natural fit for processing large files.
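    A minimal sketch of the fan-out pattern with the Python (v1) Durable Functions programming model follows: the activity names ('get_chunk_ranges', 'process_chunk') and the chunk-splitting logic are hypothetical, and each function still needs its own function.json bindings. Moving the work behind an orchestration also avoids holding a single HTTP request open for the whole run (synchronous HTTP responses from Functions are cut off after roughly 230 seconds regardless of functionTimeout):

```python
# Orchestrator function, e.g. CsvValidatorOrchestrator/__init__.py
import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    blob_name = context.get_input()

    # Hypothetical activity that returns a list of byte ranges for the blob.
    ranges = yield context.call_activity("get_chunk_ranges", blob_name)

    # Fan out: one activity call per chunk; the framework checkpoints progress
    # and replays the orchestrator, so the overall run can exceed a single
    # request/response cycle.
    tasks = [
        context.call_activity("process_chunk", {"blob": blob_name, "range": r})
        for r in ranges
    ]
    results = yield context.task_all(tasks)

    return sum(results)  # e.g. total rows written across all chunks

main = df.Orchestrator.create(orchestrator_function)
```

    From ADF, one common pattern is to call the orchestration's HTTP starter and then poll the returned statusQueryGetUri with a Web activity inside an Until loop, instead of waiting on one long-running request.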

