Index data from Azure Cosmos DB for MongoDB for queries in Azure AI Search
Important
MongoDB API support is currently in public preview under supplemental Terms of Use. There is no SDK support at this time.
In this article, learn how to configure an indexer that imports content from Azure Cosmos DB for MongoDB and makes it searchable in Azure AI Search.
This article supplements Create an indexer with information that's specific to Cosmos DB. It uses the REST APIs to demonstrate a three-part workflow common to all indexers: create a data source, create an index, create an indexer. Data extraction occurs when you submit the Create Indexer request.
Because terminology can be confusing, it's worth noting that Azure Cosmos DB indexing and Azure AI Search indexing are different operations. Indexing in Azure AI Search creates and loads a search index on your search service.
Prerequisites
Register for the preview to provide scenario feedback. You can access the feature automatically after form submission.
An Azure Cosmos DB account, database, collection, and documents. Use the same region for both Azure AI Search and Azure Cosmos DB for lower latency and to avoid bandwidth charges.
An automatic indexing policy on the Azure Cosmos DB collection, set to Consistent. This is the default configuration. Lazy indexing isn't recommended and may result in missing data.
Read permissions. A "full access" connection string includes a key that grants access to the content, but if you're using Azure roles, make sure the search service managed identity has Cosmos DB Account Reader Role permissions.
A REST client to create the data source, index, and indexer.
Limitations
These are the limitations of this feature:
Custom queries aren't supported for specifying the data set.
The column name _ts is a reserved word. If you need this field, consider alternative solutions for populating an index.
The MongoDB attribute $ref is a reserved word. If you need this in your MongoDB collection, consider alternative solutions for populating an index.
As an alternative to this connector, if your scenario has any of those requirements, you could use the Push API/SDK or consider Azure Data Factory with an Azure AI Search index as the sink.
Define the data source
The data source definition specifies the data to index, credentials, and policies for identifying changes in the data. A data source is defined as an independent resource so that it can be used by multiple indexers.
For this call, specify a preview REST API version. You can use 2020-06-30-preview or later to create a data source that connects via the MongoDB API. We recommend the latest preview REST API.
Create or update a data source to set its definition:
POST https://[service name].search.windows.net/datasources?api-version=2024-05-01-preview
Content-Type: application/json
api-key: [Search service admin key]

{
    "name": "[my-cosmosdb-mongodb-ds]",
    "type": "cosmosdb",
    "credentials": {
        "connectionString": "AccountEndpoint=https://[cosmos-account-name].documents.azure.com;AccountKey=[cosmos-account-key];Database=[cosmos-database-name];ApiKind=MongoDb;"
    },
    "container": {
        "name": "[cosmos-db-collection]",
        "query": null
    },
    "dataChangeDetectionPolicy": {
        "@odata.type": "#Microsoft.Azure.Search.HighWaterMarkChangeDetectionPolicy",
        "highWaterMarkColumnName": "_ts"
    },
    "dataDeletionDetectionPolicy": null,
    "encryptionKey": null,
    "identity": null
}
Set "type" to
"cosmosdb"
(required).Set "credentials" to a connection string. The next section describes the supported formats.
Set "container" to the collection. The "name" property is required and it specifies the ID of the database collection to be indexed. For Azure Cosmos DB for MongoDB, "query" isn't supported.
Set "dataChangeDetectionPolicy" if data is volatile and you want the indexer to pick up just the new and updated items on subsequent runs.
Set "dataDeletionDetectionPolicy" if you want to remove search documents from a search index when the source item is deleted.
Supported credentials and connection strings
Indexers can connect to a collection using the connection string formats described in this section. For connections that target the MongoDB API, be sure to include "ApiKind" in the connection string.
Avoid port numbers in the endpoint URL. If you include the port number, the connection will fail.
Full access connection string

{ "connectionString" : "AccountEndpoint=https://<Cosmos DB account name>.documents.azure.com;AccountKey=<Cosmos DB auth key>;Database=<Cosmos DB database id>;ApiKind=MongoDb" }

You can get the Cosmos DB auth key from the Azure Cosmos DB account page in the Azure portal by selecting Connection String in the left navigation pane. Copy the Primary Password and use it as the Cosmos DB auth key value.
Managed identity connection string

{ "connectionString" : "ResourceId=/subscriptions/<your subscription ID>/resourceGroups/<your resource group name>/providers/Microsoft.DocumentDB/databaseAccounts/<your cosmos db account name>/;(ApiKind=[api-kind];)" }

This connection string doesn't require an account key, but your search service must be configured to connect using a managed identity, and you must have created a role assignment that grants Cosmos DB Account Reader Role permissions. For more information, see Setting up an indexer connection to an Azure Cosmos DB database using a managed identity.
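To keep the two formats straight, here's a small Python sketch that assembles both connection string variants. All account, database, subscription, and resource group values are placeholder assumptions, and ApiKind=MongoDb is included because this article targets the MongoDB API.

# Sketch: building the two supported connection string formats.
cosmos_account = "<cosmos-account-name>"
cosmos_database = "<cosmos-database-name>"

# Full access: account key authentication. Note ApiKind=MongoDb and no port number.
full_access = (
    f"AccountEndpoint=https://{cosmos_account}.documents.azure.com;"
    "AccountKey=<cosmos-account-key>;"
    f"Database={cosmos_database};ApiKind=MongoDb;"
)

# Managed identity: no account key; requires a role assignment on the Cosmos DB account.
managed_identity = (
    "ResourceId=/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    f"/providers/Microsoft.DocumentDB/databaseAccounts/{cosmos_account}/;ApiKind=MongoDb;"
)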
Add search fields to an index
In a search index, add fields to accept the source JSON documents. Ensure that the search index schema is compatible with the source data. For content in Azure Cosmos DB, your search index schema should correspond to the Azure Cosmos DB items in your data source.
Create or update an index to define search fields that will store data:
POST https://[service name].search.windows.net/indexes?api-version=2024-05-01-preview
Content-Type: application/json
api-key: [Search service admin key]

{
    "name": "mysearchindex",
    "fields": [{
        "name": "doc_id",
        "type": "Edm.String",
        "key": true,
        "retrievable": true,
        "searchable": false
    }, {
        "name": "description",
        "type": "Edm.String",
        "filterable": false,
        "searchable": true,
        "sortable": false,
        "facetable": false
    }]
}
Create a document key field ("key": true). For a search index based on a MongoDB collection, the document key can be "doc_id", "rid", or some other string field that contains unique values. As long as field names and data types are the same on both sides, no field mappings are required.
"doc_id" represents "_id" for the object identifier. If you specify a field of "doc_id" in the index, the indexer populates it with the values of the object identifier.
"rid" is a system property in Azure Cosmos DB. If you specify a field of "rid" in the index, the indexer populates it with the base64-encoded value of the "rid" property.
For any other field, your search field should have the same name as defined in the collection.
Create additional fields for more searchable content. See Create an index for details.
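The same Create Index request can be scripted. Here's a minimal Python sketch with the requests library; the service name and key are placeholders, and the field list mirrors the example above.

import requests

# Sketch: create the search index through the REST API. Placeholders throughout.
search_service = "my-search-service"
headers = {"Content-Type": "application/json", "api-key": "<search-admin-key>"}

index = {
    "name": "mysearchindex",
    "fields": [
        # Document key: holds the MongoDB object identifier (_id).
        {"name": "doc_id", "type": "Edm.String", "key": True,
         "retrievable": True, "searchable": False},
        # Full-text searchable content.
        {"name": "description", "type": "Edm.String", "searchable": True,
         "filterable": False, "sortable": False, "facetable": False}
    ]
}

response = requests.post(
    f"https://{search_service}.search.windows.net/indexes?api-version=2024-05-01-preview",
    headers=headers,
    json=index,
)
response.raise_for_status()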
Mapping data types
JSON data type | Azure AI Search field types |
---|---|
Bool | Edm.Boolean, Edm.String |
Numbers that look like integers | Edm.Int32, Edm.Int64, Edm.String |
Numbers that look like floating-points | Edm.Double, Edm.String |
String | Edm.String |
Arrays of primitive types such as ["a", "b", "c"] | Collection(Edm.String) |
Strings that look like dates | Edm.DateTimeOffset, Edm.String |
GeoJSON objects such as { "type": "Point", "coordinates": [long, lat] } | Edm.GeographyPoint |
Other JSON objects | N/A |
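To make the mapping concrete, the following sketch pairs a hypothetical MongoDB document with search field definitions that are compatible with it. The document shape and field names are invented for illustration only.

# Hypothetical MongoDB document as it might appear in the collection.
mongo_document = {
    "_id": "64a1f0c2e4b0a1b2c3d4e5f6",        # object identifier -> "doc_id"
    "inStock": True,                           # Bool -> Edm.Boolean
    "quantity": 12,                            # integer-like -> Edm.Int32 or Edm.Int64
    "price": 19.99,                            # floating-point -> Edm.Double
    "description": "Blue widget",              # String -> Edm.String
    "tags": ["sale", "blue"],                  # array of primitives -> Collection(Edm.String)
    "lastOrdered": "2024-01-15T08:30:00Z",     # date-like string -> Edm.DateTimeOffset
    "location": {"type": "Point",              # GeoJSON point -> Edm.GeographyPoint
                 "coordinates": [-122.12, 47.67]},
}

# Search fields compatible with the document above (a sketch, not a full index definition).
compatible_fields = [
    {"name": "doc_id", "type": "Edm.String", "key": True},   # populated from _id
    {"name": "inStock", "type": "Edm.Boolean"},
    {"name": "quantity", "type": "Edm.Int32"},
    {"name": "price", "type": "Edm.Double"},
    {"name": "description", "type": "Edm.String"},
    {"name": "tags", "type": "Collection(Edm.String)"},
    {"name": "lastOrdered", "type": "Edm.DateTimeOffset"},
    {"name": "location", "type": "Edm.GeographyPoint"},
]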
Configure and run the Azure Cosmos DB for MongoDB indexer
Once the index and data source are created, you're ready to create the indexer. Indexer configuration specifies the inputs, parameters, and properties that control run-time behaviors.
Create or update an indexer by giving it a name and referencing the data source and target index:
POST https://[service name].search.windows.net/indexers?api-version=2024-05-01-preview
Content-Type: application/json
api-key: [search service admin key]

{
    "name" : "[my-cosmosdb-indexer]",
    "dataSourceName" : "[my-cosmosdb-mongodb-ds]",
    "targetIndexName" : "[my-search-index]",
    "disabled": null,
    "schedule": null,
    "parameters": {
        "batchSize": null,
        "maxFailedItems": 0,
        "maxFailedItemsPerBatch": 0,
        "base64EncodeKeys": false,
        "configuration": {}
    },
    "fieldMappings": [],
    "encryptionKey": null
}
Specify field mappings if there are differences in field name or type, or if you need multiple versions of a source field in the search index.
See Create an indexer for more information about other properties.
An indexer runs automatically when it's created. You can prevent this by setting "disabled" to true. To control indexer execution, run an indexer on demand or put it on a schedule.
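If you're scripting the workflow, this Python sketch creates the indexer with the requests library. Names and the key are placeholders, and the field mapping is a purely hypothetical example of renaming a source field; omit fieldMappings when names already match.

import requests

# Sketch: create the indexer through the REST API. Placeholders throughout.
search_service = "my-search-service"
headers = {"Content-Type": "application/json", "api-key": "<search-admin-key>"}

indexer = {
    "name": "my-cosmosdb-indexer",
    "dataSourceName": "my-cosmosdb-mongodb-ds",
    "targetIndexName": "my-search-index",
    "disabled": False,                      # set to True to create the indexer without running it
    "schedule": None,                       # or, for example, {"interval": "PT2H"} to run every two hours
    "parameters": {"maxFailedItems": 0, "maxFailedItemsPerBatch": 0},
    "fieldMappings": [
        # Hypothetical mapping: source field "desc" feeds the index field "description".
        {"sourceFieldName": "desc", "targetFieldName": "description"}
    ]
}

response = requests.post(
    f"https://{search_service}.search.windows.net/indexers?api-version=2024-05-01-preview",
    headers=headers,
    json=indexer,
)
response.raise_for_status()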
Check indexer status
To monitor the indexer status and execution history, send a Get Indexer Status request:
GET https://myservice.search.windows.net/indexers/myindexer/status?api-version=2024-05-01-preview
Content-Type: application/json
api-key: [admin key]
The response includes status and the number of items processed. It should look similar to the following example:
{
"status":"running",
"lastResult": {
"status":"success",
"errorMessage":null,
"startTime":"2022-02-21T00:23:24.957Z",
"endTime":"2022-02-21T00:36:47.752Z",
"errors":[],
"itemsProcessed":1599501,
"itemsFailed":0,
"initialTrackingState":null,
"finalTrackingState":null
},
"executionHistory":
[
{
"status":"success",
"errorMessage":null,
"startTime":"2022-02-21T00:23:24.957Z",
"endTime":"2022-02-21T00:36:47.752Z",
"errors":[],
"itemsProcessed":1599501,
"itemsFailed":0,
"initialTrackingState":null,
"finalTrackingState":null
},
... earlier history items
]
}
Execution history contains up to 50 of the most recently completed executions, sorted in reverse chronological order so that the latest execution comes first.
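To watch a long-running indexer from a script, you can poll the same status endpoint. This is a simplistic sketch with placeholder names; for real use, add a timeout and handle failed runs.

import requests
import time

# Sketch: poll Get Indexer Status until the most recent run reports success.
search_service = "my-search-service"
indexer_name = "my-cosmosdb-indexer"
headers = {"api-key": "<search-admin-key>"}

url = (
    f"https://{search_service}.search.windows.net/indexers/{indexer_name}/status"
    "?api-version=2024-05-01-preview"
)

while True:
    response = requests.get(url, headers=headers)
    response.raise_for_status()
    status = response.json()
    last = status.get("lastResult") or {}   # null until the first run finishes
    print(f"indexer: {status['status']}, last run: {last.get('status')}, "
          f"items processed: {last.get('itemsProcessed')}")
    if last.get("status") == "success":
        break
    time.sleep(30)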
Indexing new and changed documents
Once an indexer has fully populated a search index, you might want subsequent indexer runs to incrementally index just the new and changed documents in your database.
To enable incremental indexing, set the "dataChangeDetectionPolicy" property in your data source definition. This property tells the indexer which change tracking mechanism is used on your data.
For Azure Cosmos DB indexers, the only supported policy is the HighWaterMarkChangeDetectionPolicy using the _ts (timestamp) property provided by Azure Cosmos DB.
The following example shows a data source definition with a change detection policy:
"dataChangeDetectionPolicy": {
"@odata.type": "#Microsoft.Azure.Search.HighWaterMarkChangeDetectionPolicy",
" highWaterMarkColumnName": "_ts"
},
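If the data source already exists without the policy, you can add it with a create-or-update (PUT) request. The following Python sketch shows one way to do that; the names, key, and connection string are placeholders.

import requests

# Sketch: update an existing data source to add the high-water-mark policy.
search_service = "my-search-service"
data_source_name = "my-cosmosdb-mongodb-ds"
headers = {"Content-Type": "application/json", "api-key": "<search-admin-key>"}

data_source = {
    "name": data_source_name,
    "type": "cosmosdb",
    "credentials": {"connectionString": "<same connection string as before>"},
    "container": {"name": "<cosmos-db-collection>"},
    "dataChangeDetectionPolicy": {
        "@odata.type": "#Microsoft.Azure.Search.HighWaterMarkChangeDetectionPolicy",
        "highWaterMarkColumnName": "_ts"
    }
}

response = requests.put(
    f"https://{search_service}.search.windows.net/datasources/{data_source_name}"
    "?api-version=2024-05-01-preview",
    headers=headers,
    json=data_source,
)
response.raise_for_status()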
Indexing deleted documents
When documents are deleted from the collection, you normally want to delete them from the search index as well. The purpose of a data deletion detection policy is to efficiently identify deleted data items. Currently, the only supported policy is the Soft Delete policy (deletion is marked with a flag of some sort), which is specified in the data source definition as follows:
"dataDeletionDetectionPolicy"": {
"@odata.type" : "#Microsoft.Azure.Search.SoftDeleteColumnDeletionDetectionPolicy",
"softDeleteColumnName" : "the property that specifies whether a document was deleted",
"softDeleteMarkerValue" : "the value that identifies a document as deleted"
}
Because custom queries aren't supported for this data source, the property referenced by softDeleteColumnName must exist directly on the documents in the collection.
The following example creates a data source with a soft-deletion policy:
POST https://[service name].search.windows.net/datasources?api-version=2024-05-01-preview
Content-Type: application/json
api-key: [Search service admin key]
{
"name": ["my-cosmosdb-mongodb-ds]",
"type": "cosmosdb",
"credentials": {
"connectionString": "AccountEndpoint=https://[cosmos-account-name].documents.azure.com;AccountKey=[cosmos-account-key];Database=[cosmos-database-name];ApiKind=MongoDB"
},
"container": { "name": "[my-cosmos-collection]" },
"dataChangeDetectionPolicy": {
"@odata.type": "#Microsoft.Azure.Search.HighWaterMarkChangeDetectionPolicy",
"highWaterMarkColumnName": "_ts"
},
"dataDeletionDetectionPolicy": {
"@odata.type": "#Microsoft.Azure.Search.SoftDeleteColumnDeletionDetectionPolicy",
"softDeleteColumnName": "isDeleted",
"softDeleteMarkerValue": "true"
}
}
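On the database side, a soft delete means updating the document rather than removing it. The following pymongo sketch flags a document so that the next indexer run removes the corresponding search document; the connection string, database, collection, and document ID are placeholders, and the flag matches the softDeleteColumnName and softDeleteMarkerValue shown above.

from pymongo import MongoClient

# Sketch: soft-delete a document by setting the marker the deletion policy looks for.
client = MongoClient("<your-cosmos-db-for-mongodb-connection-string>")
collection = client["<cosmos-database-name>"]["<my-cosmos-collection>"]

collection.update_one(
    {"_id": "<document-id>"},
    {"$set": {"isDeleted": "true"}}   # marker value "true" as a string, per the policy above
)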
Next steps
You can now control how you run the indexer, monitor status, or schedule indexer execution. The following articles apply to indexers that pull content from Azure Cosmos DB: