I am loading data from CSV files (~40 MB each) into Azure Cosmos DB using Azure Data Factory. If data mismatches occur during a load, I want to delete the data inserted from that file and reload it. The data is spread across multiple partitions. I tried deleting the data with an Azure Function, but it times out after 230 seconds.
Please suggest the preferred way to delete large amounts of data across multiple partitions, and kindly share any links that explain the method.
It depends on how many partitions we are talking about.
You can create a stored procedure that takes a partition key, queries all items in that partition, and deletes them. That way you don't have to fetch all the items in the Azure Function first.
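A minimal sketch of such a stored procedure, closely following the pattern in Microsoft's bulk-delete sample. Stored procedures run server-side and are scoped to the single partition key they are invoked with, so no items cross the wire. Note that the sproc is bounded by its execution budget: when `collection` calls stop being accepted, it returns the count so far with `continuation: true`, and the caller must re-invoke it until `continuation` is `false`. `getContext` is provided by the Cosmos DB runtime, not by Node.js; the name `bulkDeleteSproc` is an assumption.

```javascript
// Server-side stored procedure for Azure Cosmos DB (sketch).
// Invoked with a partition key, it deletes every matching item
// in that logical partition, resuming across invocations.
function bulkDeleteSproc(query) {
    var context = getContext();            // supplied by the Cosmos DB runtime
    var collection = context.getCollection();
    var response = context.getResponse();
    var responseBody = { deleted: 0, continuation: true };

    // Default: delete everything in the partition the sproc runs against.
    if (!query) query = "SELECT * FROM c";

    tryQueryAndDelete();

    function tryQueryAndDelete(continuationToken) {
        var accepted = collection.queryDocuments(
            collection.getSelfLink(),
            query,
            { continuation: continuationToken },
            function (err, documents, responseOptions) {
                if (err) throw err;
                if (documents.length > 0) {
                    tryDelete(documents);
                } else if (responseOptions.continuation) {
                    tryQueryAndDelete(responseOptions.continuation);
                } else {
                    // Nothing left in this partition: signal completion.
                    responseBody.continuation = false;
                    response.setBody(responseBody);
                }
            });
        // Out of time/RUs: report progress; caller re-invokes the sproc.
        if (!accepted) response.setBody(responseBody);
    }

    function tryDelete(documents) {
        if (documents.length > 0) {
            var accepted = collection.deleteDocument(
                documents[0]._self, {},
                function (err) {
                    if (err) throw err;
                    responseBody.deleted++;
                    documents.shift();
                    tryDelete(documents);  // delete the rest of this page
                });
            if (!accepted) response.setBody(responseBody);
        } else {
            // Page done; query for the next page of items.
            tryQueryAndDelete();
        }
    }
}
```

You would register this sproc on the container and, from the Azure Function, execute it once per affected partition key, looping while the returned `continuation` flag is `true`. Each execution stays inside the server's bounded-execution window instead of the Function's 230-second limit.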