Hello Shanmukha Yarabham (EXT-Nokia), I'm glad you were able to resolve your issue, and thank you for posting your solution so that others experiencing the same problem can easily reference it!
Since the Microsoft Q&A community has a policy that "The question author cannot accept their own answer. They can only accept answers by others", I'll repost your solution in case you'd like to "Accept" the answer. Accepted answers appear at the top, improving discoverability for others.
Issue: Customer encountered a "BadRequest" error in a Logic App while trying to read Azure Cost Analysis data from a blob link using an HTTP action. The error indicates the file size exceeds the maximum buffer size limit of 100 MB. The customer's file is around 150 MB, and the issue persists despite toggling the HTTP action's chunking setting.
Cause: The 150 MB file exceeds the 100 MB maximum buffer size for HTTP actions in Azure Logic Apps, so the Logic App cannot read the entire blob in a single HTTP request.
Solution: Azure Data Factory (ADF) is an effective way to handle files well beyond the Logic App HTTP buffer limit. First, create an ADF instance in the Azure portal if you don't already have one. In the ADF UI, create a pipeline and add activities for the operations you need, such as reading and processing blob data. Define source and sink datasets pointing to your blob storage and to the destination for the processed data, respectively. Use the Copy Data activity to move data from source to sink, configuring any necessary transformations or mappings.
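As a rough illustration, a minimal Copy Data pipeline definition might look like the JSON below. This is only a sketch: the pipeline, activity, and dataset names (`CopyCostDataPipeline`, `SourceBlobDataset`, `SinkBlobDataset`) are placeholders, and the source/sink types assume delimited-text (CSV) cost export data; adjust them to your actual datasets.

```json
{
  "name": "CopyCostDataPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyCostData",
        "type": "Copy",
        "inputs": [
          { "referenceName": "SourceBlobDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "SinkBlobDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "DelimitedTextSink" }
        }
      }
    ]
  }
}
```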
You can trigger this pipeline manually, on a schedule, or from an external caller such as a Logic App. To integrate ADF with Logic Apps, add the Azure Data Factory connector's "Create a pipeline run" action in your Logic App, select the Data Factory instance and the pipeline to trigger, and provide any required parameters.
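Under the hood, the connector (or a plain HTTP action with a managed identity) calls the Azure Resource Manager "Create Run" endpoint for the pipeline. The small helper below is a hypothetical sketch that only builds that endpoint URL; `create_run_url` and its placeholder arguments are illustrative names, not part of any Azure SDK. An authenticated POST to this URL starts the pipeline run.

```python
def create_run_url(subscription_id: str, resource_group: str,
                   factory: str, pipeline: str,
                   api_version: str = "2018-06-01") -> str:
    """Build the ARM REST endpoint for starting an ADF pipeline run.

    A Logic App HTTP action (with an Azure AD token for
    https://management.azure.com) can POST to this URL to trigger
    the pipeline. All argument values here are placeholders.
    """
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{factory}"
        f"/pipelines/{pipeline}/createRun"
        f"?api-version={api_version}"
    )

# Example (hypothetical names):
# create_run_url("my-sub-id", "my-rg", "my-factory", "CopyCostDataPipeline")
```

In practice the built-in connector action is simpler, since it handles authentication for you; the REST route is mainly useful if you need to pass dynamic parameters in the request body or run from outside Logic Apps.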
ADF offers scalability, no strict size limitations, rich transformation capabilities, and seamless integration with other Azure services, making it a versatile data engineering platform.