How can I bulk upload 17.6M records to a Cosmos DB for MongoDB API instance in under an hour?

15643986 5 Reputation points
2023-03-28T14:31:13.7833333+00:00

Currently I am using a serverless instance of Azure Cosmos DB for MongoDB API in the Canada East region. It takes almost a day to write 17.6M records to the Cosmos instance. I have enabled server-side retries, and I am using a single MongoClient instance throughout my application. The bulk executor library is not currently supported for the Cosmos DB for MongoDB API. Are there any other ways to write this much data faster?

Note: I only need to write this data once in the lifetime of the DB instance, so provisioning higher throughput is not an ideal solution, as I won't need it afterwards.
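One common way to speed up a client-side load like this is to split the records into batches and issue unordered bulk writes from a small pool of threads, all sharing the single MongoClient (pymongo clients are thread-safe). The sketch below shows the batching and fan-out logic with a stub in place of the actual insert; the batch size, worker count, and the `insert_many(batch, ordered=False)` call noted in the comment are assumptions to tune against your instance's throttling (429) behavior, not a verified recipe:

```python
from concurrent.futures import ThreadPoolExecutor

BATCH_SIZE = 1000      # documents per bulk write; tune for document size
MAX_WORKERS = 8        # parallel writers; lower this if you see 429 throttling

def batches(docs, size=BATCH_SIZE):
    """Yield fixed-size slices of the document list."""
    for i in range(0, len(docs), size):
        yield docs[i:i + size]

def write_batch(batch):
    # With pymongo, this stub would instead call, on a shared collection
    # handle from the single MongoClient:
    #     collection.insert_many(batch, ordered=False)
    # ordered=False lets the server keep going past individual failures
    # and gives it more freedom to process the batch in parallel.
    return len(batch)

def bulk_load(docs):
    # Fan the batches out across a small thread pool. The MongoClient is
    # still a single shared instance; only the writes are concurrent.
    with ThreadPoolExecutor(max_workers=MAX_WORKERS) as pool:
        return sum(pool.map(write_batch, batches(docs)))
```

At 17.6M records and a batch size of 1,000, this is roughly 17,600 bulk writes; whether it finishes in under an hour depends mostly on the RU budget the serverless instance will grant, not on the client.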

Azure Cosmos DB
An Azure NoSQL database service for app development.

1 answer

  1. ShaktiSingh-MSFT 13,436 Reputation points Microsoft Employee
    2023-03-28T16:00:26.2766667+00:00

    Hi 15643986 ,

    Welcome to Microsoft Q&A forum and thanks for using Azure services.

    As I understand it, you want to perform a bulk upload to Azure Cosmos DB for MongoDB API.

    Since you have already explored several options, I would recommend trying mongoimport if it fits your use case.

    Check here: Tutorial: Migrate MongoDB to Azure Cosmos DB's API for MongoDB offline using MongoDB native tools
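    For reference, a mongoimport run against a Cosmos DB for MongoDB account typically looks something like the sketch below. The `<account>` and `<password>` values are placeholders for your own connection settings (found under "Connection String" in the Azure portal), and the database, collection, and file names are made up for illustration; `--numInsertionWorkers` parallelizes the insert and is the main knob for speed:

    ```shell
    # Sketch only: replace <account>, <password>, and the db/collection/file
    # names with your own values. Cosmos DB for MongoDB requires TLS and
    # listens on port 10255.
    mongoimport \
      --host <account>.mongo.cosmos.azure.com:10255 \
      --ssl \
      --username <account> \
      --password "<password>" \
      --db mydb \
      --collection mycollection \
      --file records.json \
      --numInsertionWorkers 8
    ```

    If the tool starts getting throttled (request rate too large errors), reducing `--numInsertionWorkers` or splitting the file into smaller chunks usually helps.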


    Hope the above information helps. If not, please let us know so that we can assist you further.

    Thank you.
