Getting an error: "Message=The request body is too large and exceeds the maximum permissible limit." when trying to copy an Azure Storage table in ADF

OGINTZ Marina 175 Reputation points
2025-04-01T08:40:29.4633333+00:00

Hi,

I have a pipeline that copies Azure Storage tables from one location to another. It normally works fine, but in one of the runs I got this error:

Failure happened on 'Sink' side. 'Type=Azure.RequestFailedException,Message=The request body is too large and exceeds the maximum permissible limit. RequestId:32052a86-8002-012a-356f-a1fe6a000000 Time:2025-03-30T12:30:48.4084304Z Status: 413 (The request body is too large and exceeds the maximum permissible limit.) ErrorCode: RequestBodyTooLarge Content: {"odata.error":{"code":"RequestBodyTooLarge","message":{"lang":"en-US","value":"The request body is too large and exceeds the maximum permissible limit.\nRequestId:32052a86-8002-012a-356f-a1fe6a000000\nTime:2025-03-30T12:30:48.4084304Z"}}} Headers: x-ms-request-id: 32052a86-8002-012a-356f-a1fe6a000000 x-ms-client-request-id: ce261b4b-399f-4bd0-b3b3-551a478ecff3 x-ms-version: REDACTED x-ms-error-code: REDACTED Content-Length: 239 Content-Type: application/json Date: Sun, 30 Mar 2025 12:30:47 GMT Server: Windows-Azure-Table/1.0 Microsoft-HTTPAPI/2.0 ,Source=Azure.Data.Tables,'

What can I do to fix this issue?


Accepted answer
  1. J N S S Kasyap 3,780 Reputation points Microsoft External Staff Moderator
    2025-04-01T09:37:24.6633333+00:00

    Hi @OGINTZ Marina

    The error RequestBodyTooLarge (HTTP 413) means you are trying to write too much data in a single request to the Azure Table Storage sink.

    Azure Table Storage Limits

    Maximum request payload size: 4 MB

    Maximum batch size: 100 entities per batch (all entities in a batch must share the same PartitionKey)

    Maximum entity size: 1 MB (each property value up to 64 KB)

    In Azure Data Factory (ADF), when copying a large number of rows or large entities to Table Storage, a single write request can exceed these limits.
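
    As a quick sanity check, these limits interact: a full batch of 100 entities fits in a single 4 MB request only if the entities average under roughly 40 KB each (4 MB / 100).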

    To fix this issue, consider the options below:

    Option 1: Enable data partitioning in the ADF Copy Activity

    Go to your Copy Activity settings. In the Sink tab, enable “Data Partitioning” (if applicable). Configure partitioning by:

        Column (such as PartitionKey) – splits data based on distinct values.

        Round robin – spreads data evenly across requests.

        Static range – if you know the row ranges.

    This splits the data into smaller chunks that stay within the limits.

    Option 2: Reduce the batch size in the Sink

    In the Sink settings of the Copy Activity, look for Write batch size or Maximum insert batch size (the label depends on your ADF version). Set it to 100 or below to stay under the batch limit.

    If the entities are large (closer to 64 KB each), use an even smaller batch size to keep each request under the 4 MB payload limit.
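
    For reference, here is a minimal sketch of what the Copy Activity could look like in pipeline JSON, assuming hypothetical dataset names SourceTableDataset and SinkTableDataset; writeBatchSize is the property behind the “Write batch size” setting:

    ```json
    {
        "name": "CopyToTableStorage",
        "type": "Copy",
        "inputs": [ { "referenceName": "SourceTableDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SinkTableDataset", "type": "DatasetReference" } ],
        "typeProperties": {
            "source": { "type": "AzureTableSource" },
            "sink": {
                "type": "AzureTableSink",
                "writeBatchSize": 100
            }
        }
    }
    ```

    With writeBatchSize at 100, each request carries at most 100 entities, which also satisfies the batch cap; drop it lower if your entities are large.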

    I hope this information helps. Please do let us know if you have any further queries.

    Kindly consider upvoting the comment if the information provided is helpful. This can assist other community members in resolving similar issues.

    Thank you.


1 additional answer

  1. Nandan Hegde 36,151 Reputation points MVP Volunteer Moderator
    2025-04-01T14:35:33.0966667+00:00

    Hey @OGINTZ Marina

    As your file size is 19 MB, and per the MSFT doc https://learn.microsoft.com/en-us/troubleshoot/azure/azure-storage/blobs/connectivity/request-body-large,
    that size limit is what is causing the issue.

    So you can partition the storage table into chunks and copy it in iterations, rather than in one single copy that triggers the error.


    You can use the Timestamp column, or some other key, to filter the data into chunks.
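
    To illustrate (a sketch with placeholder dates), the AzureTableSource supports an azureTableSourceQuery property that takes an OData filter, so each iteration of the copy pulls only one slice of the table:

    ```json
    {
        "source": {
            "type": "AzureTableSource",
            "azureTableSourceQuery": "Timestamp ge datetime'2025-03-01T00:00:00Z' and Timestamp lt datetime'2025-03-08T00:00:00Z'"
        },
        "sink": {
            "type": "AzureTableSink",
            "writeBatchSize": 100
        }
    }
    ```

    Drive this from a ForEach (or parameterize the dates) to walk the table range by range; a PartitionKey filter such as PartitionKey ge 'A' and PartitionKey lt 'M' works the same way if the data is not time-based.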

