question

AnkitKumar-6861 asked Riteshkumar-6849 answered

How to retrieve a set of documents from Cosmos Db from a set of IDs

I want to retrieve items from my Cosmos DB collection given a list of IDs. All of this code runs in an Azure Function; below is the code where I build a List&lt;string&gt; named filteredResult that contains all the IDs. What is the best way to complete this code so that it retrieves all matching items from my Cosmos DB collection, considering 30-40 IDs at a time?

    public static void Run(
        [ServiceBusTrigger("testSB", "SubscriberName", Connection = "AzureServiceBusString")] string mySbMsg,
        [CosmosDB(
            databaseName: "DBName",
            collectionName: "CollectionName",
            ConnectionStringSetting = "CosmosDBConnection")] DocumentClient client,
        ILogger log)
    {
        try
        {
            log.LogInformation($"C# ServiceBus topic trigger function processed message: {mySbMsg}");

            var jsonSerializerSettings = new JsonSerializerSettings
            {
                MissingMemberHandling = MissingMemberHandling.Ignore
            };
            List<MyItem> lists = JsonConvert.DeserializeObject<List<MyItem>>(mySbMsg, jsonSerializerSettings);
            List<string> filteredResult = (from s in lists
                                           where s.DocType == "TEST"
                                           select s.Id).ToList();

            // TODO: retrieve the documents matching the IDs in filteredResult
        }
        catch (Exception ex)
        {
            log.LogError(ex, "Failed to process message");
            throw;
        }
    }
azure-cosmos-db

Please let us know if you have any additional questions or require further assistance, @AnkitKumar-6861. Regards, Mike.

AnuragSharma-08 answered

Hi @AnkitKumar-6861, welcome to the Microsoft Q&A forum. We really apologize for the delayed response.

If we understand correctly, you want to pass a list of ID strings and get all the matching documents from Azure Cosmos DB. There are several ways to achieve this; the code below is one of them. It can run after 'filteredResult' has been created (just replace the 'input' list with 'filteredResult'):

    List<string> input = new List<string> { "1", "2", "3" };

    var option = new FeedOptions { EnableCrossPartitionQuery = true };
    IQueryable<Family> queryable = client.CreateDocumentQuery<Family>(
        UriFactory.CreateDocumentCollectionUri("families", "items"),
        "SELECT * FROM books WHERE books.id IN ('" + string.Join("','", input) + "')",
        option);
    List<Family> posts = queryable.ToList();
    Console.WriteLine("Read count = {0}", posts.Count);

Also note that I created a model class for the document properties, as below:

    public class Family
    {
        // Cosmos DB document ids are strings, and the query above compares
        // against quoted string literals, so id is declared as string.
        public string id;
        public string city;
    }
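A note on the IN clause above: concatenating raw ID strings into the query text works, but a parameterized query is safer, and the question asks for batches of 30-40 IDs. Below is a minimal sketch of a helper that chunks the ID list and builds a parameterized IN-clause query per batch. IdBatcher is a hypothetical name, not part of any SDK; the batch size of 40 is just the upper end of the question's "30-40 IDs at a time".

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical helper (not part of the Cosmos DB SDK): splits an ID list
// into fixed-size batches and builds a parameterized IN-clause query for
// each batch. Named parameters (@id0, @id1, ...) avoid embedding raw
// strings in the query text.
public static class IdBatcher
{
    public static IEnumerable<List<string>> Chunk(List<string> ids, int size)
    {
        for (int i = 0; i < ids.Count; i += size)
            yield return ids.GetRange(i, Math.Min(size, ids.Count - i));
    }

    public static (string QueryText, List<KeyValuePair<string, string>> Parameters)
        BuildQuery(List<string> batch)
    {
        var parameters = new List<KeyValuePair<string, string>>();
        for (int i = 0; i < batch.Count; i++)
            parameters.Add(new KeyValuePair<string, string>($"@id{i}", batch[i]));

        string queryText = "SELECT * FROM c WHERE c.id IN ("
            + string.Join(", ", parameters.Select(p => p.Key)) + ")";
        return (queryText, parameters);
    }
}
```

With the v2 .NET SDK, each (QueryText, Parameters) pair can then be wrapped in a SqlQuerySpec (with a SqlParameter per entry) and passed to client.CreateDocumentQuery&lt;Family&gt;, one call per batch.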

Please let us know if this helps; otherwise we can discuss further.



If this answer helps, please select 'Accept Answer', as it could help other community members looking into similar issues.




Riteshkumar-6849 answered

I was wondering: if I have an ID list of, say, 500,000 entries (fetching the IDs from a parallel indexed persistence layer such as SQL or a warehouse), which would be faster and more optimized:

  1. Parallel single-document requests by ID, issued in batches of 5K (500,000 requests in total, RU-optimized)?

  2. Queries of the form "SELECT * FROM books WHERE books.id IN ('" + string.Join("','", input) + "')" per batch of 5K (500 requests in total, with very long query strings)?

And then iterating over all 500,000 IDs, i.e. 500 batches.
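For comparison, the shape of option 1 (parallel point reads in batches of 5K) can be sketched independently of the SDK. In the sketch below, readById is a stand-in for a per-document lookup such as the v2 SDK's client.ReadDocumentAsync; the class name, delegate, and batch size are all illustrative assumptions, not Cosmos DB API.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

// Illustrative helper: reads documents one ID at a time, with up to
// batchSize reads in flight per batch. readById stands in for a point
// read such as the v2 SDK's client.ReadDocumentAsync, which is cheap
// per document when the partition key is known.
public static class PointReadBatcher
{
    public static async Task<List<T>> ReadAllAsync<T>(
        IReadOnlyList<string> ids,
        Func<string, Task<T>> readById,
        int batchSize = 5000)
    {
        var results = new List<T>(ids.Count);
        for (int i = 0; i < ids.Count; i += batchSize)
        {
            // Start one read per ID in this batch, then await them together.
            var batch = ids.Skip(i).Take(batchSize)
                           .Select(readById)
                           .ToList();
            results.AddRange(await Task.WhenAll(batch));
        }
        return results;
    }
}
```

Which option wins likely depends on RU cost rather than request count alone: point reads are cheap per document but mean 500,000 round trips, while a 5K-item IN clause means far fewer requests but each query is expensive and may fan out across partitions. Measuring the RU charge on a sample batch of each shape would be the practical way to decide.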




