Request size too large - CosmosDB


I'm trying to copy items from one Cosmos DB container to another. I've tried both the copy activity and a custom data flow, and in both cases I get errors saying the request size is too large, with sizes like "2208232" — just a little over 2 MB. Note that the items in our source Cosmos DB container are themselves very close to 2 MB, so it seems as if ADF is adding some extra size to the request. I've also set the write batch size to 1, and it still fails.

Example of copy activity configuration:

[screenshot of copy activity configuration]
As you can see, there is no transformation — the items are just moved to another container. So if an item fits within the size limit in the source container, it should not cause errors in the destination container... or am I missing something?
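One plausible cause, sketched below as an assumption rather than a confirmed explanation: Cosmos DB attaches system properties (`_rid`, `_self`, `_etag`, `_attachments`, `_ts`) to every item, and if the copy pipeline reads them from the source and includes them in the write request, an item that is just under 2 MB can exceed the request size limit on insert. The snippet is a minimal, self-contained sketch of checking a payload against the limit and stripping those properties before re-inserting; the property value sizes are exaggerated hypothetical placeholders, not real Cosmos DB values.

```python
import json

# Cosmos DB's ~2 MB limit applies to the request payload as sent, not just
# the logical item body.
MAX_REQUEST_BYTES = 2 * 1024 * 1024

# System properties Cosmos DB attaches to every stored item.
SYSTEM_PROPERTIES = {"_rid", "_self", "_etag", "_attachments", "_ts"}


def request_size(item: dict) -> int:
    """Bytes of the compact JSON payload as it would be sent on the wire."""
    return len(json.dumps(item, separators=(",", ":")).encode("utf-8"))


def strip_system_properties(item: dict) -> dict:
    """Drop Cosmos DB system properties before re-inserting into the sink."""
    return {k: v for k, v in item.items() if k not in SYSTEM_PROPERTIES}


# Simulated source item just under the limit before any metadata is added.
item = {"id": "doc1", "payload": "x" * (MAX_REQUEST_BYTES - 1000)}

# Hypothetical (deliberately oversized) system-property values, to show how
# carried-over metadata can push a near-limit item over the threshold.
item_with_meta = {
    **item,
    "_rid": "r" * 600,
    "_self": "s" * 600,
    "_etag": "e" * 100,
    "_ts": 0,
}

print(request_size(item_with_meta) > MAX_REQUEST_BYTES)                    # True: over the limit
print(request_size(strip_system_properties(item_with_meta)) <= MAX_REQUEST_BYTES)  # True: fits again
```

If this is what's happening, mapping only the user-defined columns in the copy activity (so the system properties are not written to the sink) would be the equivalent fix on the ADF side.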

Azure Data Factory