Logic App failed to extract API data files exceeding 100 MB

AMJ 6 Reputation points
2022-01-24T23:42:31.897+00:00

Logic App failed to extract API data files exceeding 100 MB as below:

168008-extract-file.jpg

As you can see, a data file over 100 MB hit the limit set by Logic Apps. I followed the Microsoft instructions at https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-handle-large-messages, as below:

168056-help-instruction.jpg

It didn’t work. This is an API request that worked well to extract data (< 100 MB) from a client and save it to our data lake.
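For reference, what that article suggests is enabling chunked content transfer on the action. In the underlying workflow definition the setting looks roughly like the sketch below (the action name and URI are just examples, not my actual workflow):

```json
"actions": {
    "HTTP_Get_API_File": {
        "type": "Http",
        "inputs": {
            "method": "GET",
            "uri": "https://example.com/api/export"
        },
        "runtimeConfiguration": {
            "contentTransfer": {
                "transferMode": "Chunked"
            }
        }
    }
}
```

(The article also notes that chunked downloads rely on the endpoint supporting partial content requests, so this may depend on the client’s API.)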

I also tried the Copy activity in Data Factory on the same API request to pull the data across. It worked for data files > 100 MB, but it was not reliable: when working through a number of files, the transfer was often interrupted with an error saying the source file was not ready to be fetched and sunk to the destination. I increased the timeout as below:

168081-http.jpg

It didn’t help. In fact, no timeout increase was needed to get the response with the required data when working in Logic Apps.
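For reference, the timeout and retry values I adjusted correspond roughly to the Copy activity policy and HTTP source settings sketched below (the activity name and values are illustrative, not my exact configuration; the sink definition is omitted):

```json
{
    "name": "CopyApiFileToDataLake",
    "type": "Copy",
    "policy": {
        "timeout": "0.02:00:00",
        "retry": 3,
        "retryIntervalInSeconds": 60
    },
    "typeProperties": {
        "source": {
            "type": "HttpSource",
            "httpRequestTimeout": "00:10:00"
        }
    }
}
```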

In a nutshell (for this API request):
• The data transfer works well in Logic Apps as long as the data file is < 100 MB;
• The data transfer is intermittently interrupted in Data Factory, but the data file can be > 100 MB.


2 answers

  1. MayankBargali-MSFT 70,941 Reputation points Moderator
    2022-01-31T12:32:29.977+00:00

    @AMJ Apologies for the delay. It looks like there are two issues: one with the Logic App HTTP connector and the other with the Data Factory activity call. I have added the azure-data-factory tag so an expert can comment on your second issue, but it is always suggested to create two separate questions so it is easier for the community to find issues for different services. I will reach out to the Azure Data Factory team to comment on your second issue.

    On your first issue, with the Logic App HTTP connector: this is a hard limit of the Consumption logic app, as mentioned here under Maximum input or output size, and it holds true for the HTTP trigger. Depending on the connector, this can go up to 1 GB.

    This Maximum input or output size can be changed for a single-tenant (Standard) logic app, as mentioned here. As for the maximum limit in the single-tenant logic app, it depends on various factors, such as the compute and memory size (Workflow Standard plan type) you are using and how many such messages will be processed in parallel.
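    For example, in a Standard (single-tenant) logic app the setting lives in the project's host.json; a minimal sketch, assuming you want roughly a 200 MB cap (the value is in bytes and is only an example):

    ```json
    {
        "version": "2.0",
        "extensionBundle": {
            "id": "Microsoft.Azure.Functions.ExtensionBundle.Workflows",
            "version": "[1.*, 2.0.0)"
        },
        "extensions": {
            "workflow": {
                "settings": {
                    "Runtime.FlowRunActionJob.MaximumActionResultSize": "209715200"
                }
            }
        }
    }
    ```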

    1 person found this answer helpful.

  2. AMJ 6 Reputation points
    2022-02-14T13:01:13.987+00:00

    @MayankBargali-MSFT It has been turned on as below:

    174069-09-content-transfer.png

    The Runtime.FlowRunActionJob.MaximumActionResultSize setting only helped the HTTP action extract a data file exceeding 100 MB; it doesn’t help the Create blob (V2) action save a file exceeding 100 MB to the data lake, as below:

    174084-10-create-job-failed-on-100-mb.png

    I ran two files this time, 1 MB (succeeded) and 180 MB (failed), as below:

    174085-11-two-input-files.png

    The host.json used is as below:

    174117-12-host-json.png

    Any more thoughts?

