Hi @OGINTZ Marina
The error **RequestBodyTooLarge (HTTP 413)** means a single request to the Azure Table Storage sink exceeds the service's payload limit.
**Azure Table Storage limits**
- Maximum request payload size: 4 MB
- Maximum batch size: 100 entities per batch (all entities in a batch must belong to the same partition)
In Azure Data Factory (ADF), copying a large number of rows, or rows with large entities, to Table Storage can exceed these limits.
To fix this issue, consider the following options:
**Option 1: Enable data partitioning in the ADF Copy Activity**
1. Open your Copy Activity settings.
2. In the **Sink** tab, enable **Data Partitioning** (if applicable).
3. Configure partitioning by:
   - Column (e.g., PartitionKey) – splits data based on distinct values.
   - Round robin – spreads data evenly across requests.
   - Static range – if you know the row ranges.

This splits your data into smaller chunks that stay within the service limits.
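To see why partitioning by a column such as PartitionKey helps: Table Storage entity-group transactions only accept entities from a single partition, so rows must be grouped by PartitionKey before they can be batched at all. A minimal Python sketch of that grouping step (a hypothetical illustration, not ADF's internal logic):

```python
from itertools import groupby
from operator import itemgetter

def split_by_partition(rows):
    """Group rows by PartitionKey so each batch targets a single
    partition, as Table Storage batch transactions require."""
    rows = sorted(rows, key=itemgetter("PartitionKey"))
    return {key: list(group)
            for key, group in groupby(rows, key=itemgetter("PartitionKey"))}

rows = [
    {"PartitionKey": "A", "RowKey": "1"},
    {"PartitionKey": "B", "RowKey": "2"},
    {"PartitionKey": "A", "RowKey": "3"},
]
parts = split_by_partition(rows)
# parts["A"] holds two rows, parts["B"] holds one
```

Each value in `parts` can then be written as one or more single-partition batches.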
**Option 2: Reduce the write batch size in the sink**

In the Sink settings of the Copy Activity, look for **Write batch size** (or **Maximum insert batch size**, depending on your ADF version) and set it to 100 or below to stay under the batch limit.

If the entities are large (closer to the 64 KB per-entity maximum), use a smaller batch size to stay under the 4 MB request limit.
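The interaction between the two limits can be sketched in Python. The helper below (a hypothetical illustration, not ADF's implementation) shows why entities near 64 KB force batches well below 100: with an estimated 64 KB per entity, only about 61 entities fit under a 4 MB request.

```python
def chunk_batches(entities, max_entities=100, max_bytes=4_000_000, size_of=None):
    """Split one partition's entities into batches that respect both the
    100-entity and the 4 MB request limits (payload sizes are estimated)."""
    size_of = size_of or (lambda e: len(repr(e).encode("utf-8")))
    batches, current, current_bytes = [], [], 0
    for entity in entities:
        size = size_of(entity)
        # Start a new batch once either limit would be exceeded
        if current and (len(current) >= max_entities
                        or current_bytes + size > max_bytes):
            batches.append(current)
            current, current_bytes = [], 0
        current.append(entity)
        current_bytes += size
    if current:
        batches.append(current)
    return batches

# 250 small entities split into batches of 100, 100 and 50
small_batches = chunk_batches([{"RowKey": str(i)} for i in range(250)])

# Entities near the 64 KB per-entity maximum force much smaller batches
large_batches = chunk_batches([{"RowKey": str(i)} for i in range(200)],
                              size_of=lambda e: 65_536)
```

The batch-size setting in the ADF sink plays the role of `max_entities` here; lowering it is the simplest way to keep each request under 4 MB when entities are large.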
I hope this information helps. Please do let us know if you have any further queries.
Kindly consider upvoting the comment if the information provided is helpful. This can assist other community members in resolving similar issues.
Thank you.