question

KediPeng-1011 asked AnuragSharma-08 commented

Cosmos stored procedure continuation not working

Hi, I've been following this post to handle a bulk upload of over 10,000 docs to Cosmos DB. I followed the steps and executed the bulk upload stored procedure from the .NET Core SDK.

However, I always get a 408 timeout error when the first execution ends. Ideally, it should return the count of documents it has processed, so that I can call the procedure again from code.

Which part is wrong? Thanks in advance.

Stored procedure:

 function bulkUpload(docs) {
   var container = getContext().getCollection();
   var containerLink = container.getSelfLink();
   var count = 0;
   if (!docs) throw new Error("The array is undefined or null.");
   var docsLength = docs.length;
   if (docsLength == 0) {
     getContext().getResponse().setBody(0);
     return;
   }
   // Start the first write; each completed write chains the next one.
   tryCreate(docs[count], callback);

   // createDocument returns false when the server will no longer accept
   // work in this execution; in that case, report how many documents were
   // created so the client can resume from that offset.
   function tryCreate(doc, callback) {
     var isAccepted = container.createDocument(containerLink, doc, callback);
     if (!isAccepted) getContext().getResponse().setBody(count);
   }

   // Runs after each successful write: either finish, or chain the next create.
   function callback(err, doc, options) {
     if (err) throw err;
     count++;
     if (count >= docsLength) {
       getContext().getResponse().setBody(count);
     } else {
       tryCreate(docs[count], callback);
     }
   }
 }
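The continuation contract works like this: when the server stops accepting writes, `createDocument` returns `false`, the procedure responds with the count inserted so far, and the client resumes from that offset. A minimal simulation of that loop (the mock container below is hypothetical, standing in for the real Cosmos runtime, which caps each write batch):

```javascript
// Hypothetical mock: accepts only `capacity` writes per "execution",
// mimicking the server cancelling a long-running stored procedure.
function makeMockContainer(capacity) {
  let accepted = 0;
  const stored = [];
  return {
    stored,
    getSelfLink: () => "self-link",
    createDocument(link, doc, callback) {
      if (accepted >= capacity) return false; // server stopped accepting work
      accepted++;
      stored.push(doc);
      callback(null, doc, {}); // synchronous callback, for the simulation only
      return true;
    },
    reset() { accepted = 0; },
  };
}

// Same shape as the stored procedure: insert until rejected, return the count.
function bulkUpload(container, docs) {
  let count = 0;
  let body = null;
  function tryCreate(doc) {
    const ok = container.createDocument(container.getSelfLink(), doc, callback);
    if (!ok) body = count; // not accepted: report progress and stop
  }
  function callback(err) {
    if (err) throw err;
    count++;
    if (count >= docs.length) body = count;
    else tryCreate(docs[count]);
  }
  if (docs.length === 0) return 0;
  tryCreate(docs[0]);
  return body;
}

// Client-side resume loop, mirroring the while/Skip(pointer) in the .NET code.
const docs = Array.from({ length: 10 }, (_, i) => ({ id: String(i) }));
const container = makeMockContainer(4); // only 4 writes accepted per call
let pointer = 0;
while (pointer < docs.length) {
  container.reset(); // a fresh "execution" of the procedure
  pointer += bulkUpload(container, docs.slice(pointer));
}
console.log(container.stored.length); // 10
```

Each call makes partial progress (4, then 4, then 2 documents) and the outer loop stitches the batches together, which is exactly what the .NET `while (pointer < foods.Count)` loop is supposed to do.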


.NET code:


 public static async Task Main(string[] args)
  {
      using (CosmosClient client = new CosmosClient(_endpointUri, _primaryKey))
      {
          Database database = client.GetDatabase(_databaseId);
          Container container = database.GetContainer(_containerId);
    
          List<Food> foods = new Bogus.Faker<Food>()
          .RuleFor(p => p.Id, f => (-1 - f.IndexGlobal).ToString())
          .RuleFor(p => p.Description, f => f.Commerce.ProductName())
          .RuleFor(p => p.ManufacturerName, f => f.Company.CompanyName())
          .RuleFor(p => p.FoodGroup, f => "Energy Bars")
          .Generate(10000);
    
          int pointer = 0;
          while (pointer < foods.Count)
          {
              StoredProcedureExecuteResponse<int> result = await container.Scripts.ExecuteStoredProcedureAsync<int>("bulkUpload", new PartitionKey("Energy Bars"), new dynamic[] {foods.Skip(pointer)});
              pointer += result.Resource;
              await Console.Out.WriteLineAsync($"{pointer} Total Items\t{result.Resource} Items Uploaded in this Iteration");
          }
    
      }
  }






azure-cosmos-db


1 Answer

AnuragSharma-08 answered AnuragSharma-08 commented

Hi @KediPeng-1011, welcome to Microsoft QnA forum.

As the error message says, this is a request timeout issue. We are trying to upload 10,000 documents, and the default timeout for a request is 10 seconds. Uploading 10,000 documents can take longer than that, depending on various factors such as the proximity of the Azure Cosmos DB region to the code we are running, network latency, and so on.

If we increase the timeout to a larger value, it will work. I tried with one minute and it worked fine. Please check the code below:

 public static async Task Main(string[] args)
     {
         // Raise the client-side request timeout from the default 10 s to 1 min.
         CosmosClientOptions ops = new CosmosClientOptions();
         ops.RequestTimeout = new TimeSpan(0, 1, 0);
         using (CosmosClient client = new CosmosClient(_endpointUri, _primaryKey, ops))
         {
             Database database = client.GetDatabase(_databaseId);
             Container container = database.GetContainer(_containerId);
    
             List<Food> foods = new Bogus.Faker<Food>()
             .RuleFor(p => p.Id, f => (-1 - f.IndexGlobal).ToString())
             .RuleFor(p => p.Description, f => f.Commerce.ProductName())
             .RuleFor(p => p.ManufacturerName, f => f.Company.CompanyName())
             .RuleFor(p => p.FoodGroup, f => "Energy Bars")
             .Generate(10000);
    
             int pointer = 0;
             while (pointer < foods.Count)
             {
                 StoredProcedureExecuteResponse<int> result = await container.Scripts.ExecuteStoredProcedureAsync<int>("bulkUpload", new PartitionKey("Energy Bars"), new dynamic[] { foods.Skip(pointer) });
                 pointer += result.Resource;
                 await Console.Out.WriteLineAsync($"{pointer} Total Items\t{result.Resource} Items Uploaded in this Iteration");
             }
    
         }
     }

Please let me know whether this works, or if you still face any issues.



If this answer helps, please mark it 'Accept Answer'.





Hi @AnuragSharma-MSFT ,

Thanks for your reply, but it's still not working for me.

Actually, it's partially working. For example, I generated 100,000 docs and called the proc; the proc returned a 408 and the code exited, but when querying through the Portal I can see around 8,000 docs were inserted. Sometimes it iterates one or two rounds and still breaks at some point. I've tried increasing the RequestTimeout property too.

Does that have something to do with the RU throughput?


Hi @KediPeng-1011, thanks for replying back. Below are the settings I used:

  1. Manual Throughput : 4000 RUs

  2. Request Timeout: 1 min

  3. Number of documents to be uploaded: 10,000

  4. Region: West US

Could you please provide the above data for your account, so I can try with the same settings?

Most likely the requests are being throttled based on the provisioned RUs, but I can confirm once I have the above data.
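To see why throttling is plausible here, a rough back-of-the-envelope sketch (the ~5.5 RU per write is an assumed cost for a small indexed insert, not a measured figure):

```javascript
// Back-of-envelope throttling estimate; all figures are assumptions.
const ruPerInsert = 5.5;          // approx. cost of a ~1 KB indexed write
const docCount = 10000;           // documents to upload
const provisionedRuPerSec = 4000; // manual throughput on the container

const totalRu = ruPerInsert * docCount;            // 55,000 RU in total
const minSeconds = totalRu / provisionedRuPerSec;  // ~13.75 s at best

console.log(totalRu, minSeconds.toFixed(2));
```

Under these assumptions the upload needs at least ~14 seconds of sustained throughput, so with a 10-second request timeout (or a lower RU setting) some executions will be cut short or throttled; the stored procedure's continuation pattern exists precisely to absorb that.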





Manual Throughput : 30000 RUs (I set it to a very high RU for testing)

Request Timeout: 1 hour

Number of documents to be uploaded: 20,000

Region: Australia Southeast
