Creating an ETL pipeline by filtering rows from Cosmos DB (MongoDB API)

I am new to the Azure cloud platform, and my task is to create an ETL pipeline to populate data from Cosmos DB into Snowflake. I have an Azure Cosmos DB for MongoDB API connection string, and so far I have created a linked service in ADF. I am not sure how to do an incremental load; I am only interested in picking up rows that have been inserted or changed.
Someone suggested looking at this article, but I don't want to write code and would like to work inside Azure Data Factory.
I can output the data in JSON format, and I know how to load it into Snowflake from Azure Blob Storage.
I was thinking of only grabbing records where UpdateDate = getdate() - 1, but I am not sure whether that is possible given the following sample data (see the sketch after the sample):
[{
    "_id": {
        "$binary": "H82Maue1G0y+3hmOnKYNoQ==",
        "$type": "03"
    },
    "Type": 0,
    "NoteId": 1518722702,
    "Number": 9,
    "PinnedInfo": null,
    "CreatedDate": [
        638071336835472472,
        180
    ],
    "UpdatedDate": [
        638137843666973190,
        60
    ],
    "ScheduledDate": [
        638071228585440000,
        0
    ],
    "ModifiedDate": null,
    "FollowUpDate": null,
    "Visibility": 0,
    "MailedTo": null,
    "ReschedReason": null,
What should my default value be so that it matches the format of UpdatedDate?
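Even though I would rather stay inside ADF without writing code, here is a small Python sketch of what I mean, assuming the UpdatedDate array is a .NET DateTimeOffset stored as [ticks, offset-in-minutes] (that is only my guess from the sample, not something I have confirmed):

from datetime import datetime, timedelta, timezone

# .NET ticks are 100-nanosecond intervals counted from 0001-01-01T00:00:00
DOTNET_EPOCH = datetime(1, 1, 1, tzinfo=timezone.utc)

def ticks_to_datetime(ticks, offset_minutes=0):
    # Assumption: the first array element is the DateTimeOffset's ticks and
    # the second is its UTC offset in minutes; subtract the offset to get UTC.
    local = DOTNET_EPOCH + timedelta(microseconds=ticks // 10)
    return local - timedelta(minutes=offset_minutes)

def datetime_to_ticks(dt):
    # Exact integer arithmetic to avoid floating-point rounding on large tick values.
    delta = dt - DOTNET_EPOCH
    return (delta.days * 86_400 + delta.seconds) * 10**7 + delta.microseconds * 10

# Decode UpdatedDate from the sample document: [638137843666973190, 60]
print(ticks_to_datetime(638137843666973190, 60))  # roughly 2023-03-07 10:12 UTC, if my guess is right

# The "UpdateDate = getdate() - 1" idea as a watermark: midnight yesterday, UTC
yesterday = datetime.now(timezone.utc).replace(hour=0, minute=0, second=0, microsecond=0) - timedelta(days=1)
watermark_ticks = datetime_to_ticks(yesterday)
print(watermark_ticks)

# The filter I would then try against the collection (for example in a quick
# mongo shell test) compares the first array element to that value:
#   { "UpdatedDate.0": { "$gte": <watermark_ticks> } }

If that assumption holds, the default value I start with would just be the ticks for the earliest date I want to load, and each run would only pick up documents whose first UpdatedDate element is greater than the last watermark.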
I am stuck at this point due to a lack of knowledge and would appreciate any assistance.
Thanks in advance