Is there any limitation on the number of files when indexing Azure DataLake Storage from Azure Cognitive Search indexer?

test29998411 281 Reputation points

My team is trying to index millions of PDF files stored in Azure Data Lake Storage containers using Form Recognizer and Azure Cognitive Search indexers.

Are there any limitations in Azure Cognitive Search when indexing such a large number of files? This is my main concern.

Is there a limit on the number of files an Azure Cognitive Search indexer can index from Azure Data Lake Storage?

If there is a limit, we are considering splitting the files across folders and indexing each folder separately.

Also, if indexing fails for only some files, do we need to re-run the indexer and re-index all files?

Azure AI Search

Accepted answer
  1. Grmacjon-MSFT 16,776 Reputation points

    Hi @test29998411 ,
    Thanks for your question.

    The service limits documentation states:

    "As of October 2018, there are no longer any document count limits for any new service created at any billable tier (Basic, S1, S2, S3, S3 HD) in any region. Older services created prior to October 2018 may still be subject to document count limits.

    To determine whether your service has document limits, use the GET Service Statistics REST API. Document limits are reflected in the response, with null indicating no limits."
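    To illustrate the check described above, here is a minimal sketch that calls the GET Service Statistics REST API and inspects the document count quota. The service name, admin key, and API version below are placeholders; substitute your own values. The exact response field names are an assumption based on the documented shape (`counters.documentCount.quota`, where `null` means no limit).

    ```python
    import json
    import urllib.request

    # Placeholder values -- replace with your own service name and admin key.
    SERVICE = "your-search-service"
    API_KEY = "your-admin-api-key"
    URL = f"https://{SERVICE}.search.windows.net/servicestats?api-version=2023-11-01"

    def document_quota(stats: dict):
        """Return counters.documentCount.quota from a service statistics
        response; None (JSON null) indicates no document count limit."""
        return stats.get("counters", {}).get("documentCount", {}).get("quota")

    def fetch_stats() -> dict:
        """Fetch live service statistics (requires a real service and key)."""
        req = urllib.request.Request(URL, headers={"api-key": API_KEY})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    # Example response fragment: "quota": null means the service is unlimited.
    sample = json.loads('{"counters": {"documentCount": {"usage": 123, "quota": null}}}')
    print(document_quota(sample))  # None -> no document limit
    ```

    On a post-October-2018 service you would expect `fetch_stats()` to return a `null` quota, matching the quoted documentation.
    
    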

    What tier are you currently using?
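    If you do decide to split indexing by folder as you mention, note that a blob or ADLS Gen2 data source can be scoped to a single folder using the `query` property of the `container` definition. A hedged sketch of such a data source (all names and the connection string are placeholders):

    ```json
    {
      "name": "adls-folder1-datasource",
      "type": "adlsgen2",
      "credentials": { "connectionString": "<your connection string>" },
      "container": { "name": "my-container", "query": "folder1" }
    }
    ```

    You could then attach one indexer per data source, which also narrows the scope of any re-run after a partial failure.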


    1 person found this answer helpful.
