Write to Cosmos DB using ADF

Nagesh CL

Hi Experts,

I need to create a pipeline with "Azure SQL DB" as the source and Cosmos DB as the target/sink. There might be a few nested objects. Can someone provide links to sample pipelines for this?
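For example, I would need to turn flat SQL columns into a nested object before writing to Cosmos DB. Roughly something like this derived-column sketch in a mapping data flow (all stream and column names here are placeholders, just to illustrate the shape I am after):

```
source1 derive(address = @(street = Street,
                           city = City,
                           zip = Zip)) ~> BuildNested
```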


Azure Cosmos DB
Azure Data Factory

Accepted answer
  1. Nandan Hegde MVP

1 additional answer

  1. Nagesh CL

    Hi @Nandan Hegde,
    Hope you are doing well.

    I am back again with one more issue. To keep it short, I am trying to implement SCD Type 1 with the details below:

    Source - Azure SQL DB
    Sink - Cosmos DB

    I am using a mapping data flow. Within it, I generate a HashKey, compare it against the sink, and use an Alter Row transformation to upsert data into Cosmos DB. Although a different HashKey is generated, the data in Cosmos DB is not updated. A few articles I read suggested mapping the natural key to the sink's id column (instead of an auto-generated GUID). When I tried to map the primary key of my data to the id column, I got an error saying the id column does not accept integers, so I had to cast it to a string explicitly. After casting the PK to a string, inserts work fine, but updates fail with the error below:

    "StatusCode":"DFExecutorUserError","Message":"Job failed due to reason: java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.String","Details":"com.microsoft.azure.documentdb.DocumentClientException:"

    Not sure what the issue is now. Any help is appreciated.
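    For reference, a simplified sketch of the relevant part of my data flow script (stream and column names are placeholders, not my actual pipeline): the key is cast to a string in a derived column before the Alter Row check and the Cosmos DB sink, so that the sink should never compare an integer against the string id of an existing document.

```
source1 derive(id = toString(CustomerID)) ~> CastKey
CastKey alterRow(upsertIf(HashKey != SinkHashKey)) ~> MarkUpserts
MarkUpserts sink(deletable: false,
    insertable: true,
    updateable: true,
    upsertable: true) ~> CosmosSink
```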

    If not, is there any link/article I can refer to for implementing SCD Type 1 using a mapping data flow with any source, but with Cosmos DB as the target?

    Thanks in advance.

    Nagesh CL