Hello @Uthayakumar, Dinesh
Azure Data Factory's Copy Data activity can be used to ingest records in bulk into Dataverse.
However, it does not use batch requests such as ExecuteMultipleRequest to import or export bulk data.
The Copy Data activity uses a scalable compute infrastructure to move data between source and sink data stores.
It can read data from various sources, transform the data, and write it to various sinks. The activity is powered by a globally available service that copies data between data stores in a secure, reliable, and scalable way.

If you need batch requests such as ExecuteMultipleRequest to export or import bulk data, you can use the Microsoft Dataverse Web API directly. The Web API provides a programming model you can use to interact with Dataverse data from your own custom code.
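As a rough illustration of the batching approach, the Web API also exposes a `$batch` endpoint that groups multiple operations into one HTTP request. The sketch below only builds the multipart payload for a changeset that creates several records; the entity set name (`accounts`), the field (`name`), and the boundary IDs are placeholder assumptions, and sending the request would additionally need your org URL and an OAuth bearer token.

```python
# Sketch: build a Dataverse Web API $batch body containing a changeset
# that creates one record per POST. Entity set, field names, and
# boundary IDs are hypothetical placeholders; adjust for your org.
import json

def build_batch_payload(records, entity_set,
                        batch_id="batch_1", changeset_id="changeset_1"):
    """Return a multipart/mixed $batch body with one changeset."""
    lines = [
        f"--{batch_id}",
        f"Content-Type: multipart/mixed; boundary={changeset_id}",
        "",
    ]
    for i, record in enumerate(records, start=1):
        lines += [
            f"--{changeset_id}",
            "Content-Type: application/http",
            "Content-Transfer-Encoding: binary",
            f"Content-ID: {i}",
            "",
            # Relative URL to the entity set; version path is an assumption.
            f"POST /api/data/v9.2/{entity_set} HTTP/1.1",
            "Content-Type: application/json; type=entry",
            "",
            json.dumps(record),
        ]
    lines += [f"--{changeset_id}--", f"--{batch_id}--", ""]
    return "\r\n".join(lines)

payload = build_batch_payload(
    [{"name": "Contoso"}, {"name": "Fabrikam"}],
    entity_set="accounts",
)
print(payload)
```

You would then POST this body to `https://<yourorg>.crm.dynamics.com/api/data/v9.2/$batch` with the header `Content-Type: multipart/mixed; boundary=batch_1`. Operations inside a changeset succeed or fail together, which is what gives you ExecuteMultipleRequest-style bulk behavior over the Web API.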
I hope this response has addressed your query and helped you overcome your challenges. If so, please mark it as Answered; this acknowledges our efforts and also helps other community members who may be looking for similar solutions.