How to skip/ignore zip files that have been unzipped successfully before?
I want to use this system within my work environment by letting users put zip files into Azure Blob Storage and using Azure Data Factory to automate the process of extracting/unzipping the zip files every day at a specific time. I have followed the…
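One common pattern for this is to write a small marker blob next to each archive after a successful unzip and skip any zip that already has one. Below is a minimal sketch with the azure-storage-blob SDK; the container name, connection string, and the `.done` marker convention are assumptions, not part of the original setup:

```python
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    conn_str="<connection-string>",
    container_name="incoming-zips",   # hypothetical container name
)

# Snapshot the blob listing once so marker lookups are O(1).
existing = {b.name for b in container.list_blobs()}

for name in sorted(existing):
    if not name.endswith(".zip"):
        continue
    marker = name + ".done"           # hypothetical "already processed" marker
    if marker in existing:
        continue                      # unzipped on a previous run: skip it
    # ... trigger the unzip here (e.g. an ADF pipeline run) ...
    container.upload_blob(marker, b"", overwrite=True)  # record success
```

The same check can likely be expressed inside the pipeline itself with Get Metadata and Filter activities, with the marker lookup driving an If Condition.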


Azure Data Factory - replace expression in derived column transformation using Mapping Data Flow
Dear Team, I am trying to use Azure Data Factory for a derived column transformation; one of my tasks is described below. DESCRIPTION_TEXT: UNILEVER GROUP ##### GBR. Remove the trailing country code (only when it equals the country) and ##### if they…
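For prototyping the stated rule outside the data flow (the truncated "#####" part of the task is left out here), a minimal Python sketch of the logic might look like the following; in a Mapping Data Flow the same idea would likely map onto the regexReplace() expression function. The function and column handling below are assumptions, not the asker's code:

```python
import re

def strip_trailing_country(description: str, country: str) -> str:
    """Drop the last token only when it equals the row's country code."""
    return re.sub(r"\s+" + re.escape(country) + r"$", "", description)

print(strip_trailing_country("UNILEVER GROUP ##### GBR", "GBR"))
# -> 'UNILEVER GROUP #####'
print(strip_trailing_country("UNILEVER GROUP ##### FRA", "GBR"))
# -> unchanged, because the trailing code does not equal the country
```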


Why am I unable to display 300,000 records in Azure SQL Database with pricing tier 50/100 DTUs?
Hello there, I recently started using Azure SQL Database as part of my Azure DE learning. I tried to run the query 'select (*) from' over a table that has around 300,000 records, just to fetch the records; I tried this in the Query editor within…
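For what it's worth, browser-based result grids often struggle to render that many rows at once; fetching programmatically in batches sidesteps the display limit. A minimal sketch with pyodbc, with the server, database, and table names as placeholders:

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=<database>;"
    "UID=<user>;PWD=<password>"
)
cursor = conn.cursor()
cursor.execute("SELECT * FROM <table>")   # placeholder table name

while True:
    rows = cursor.fetchmany(10_000)       # page through instead of fetchall()
    if not rows:
        break
    # process/inspect the batch here
    print(f"fetched {len(rows)} rows")
```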


Pagination of REST API call with "Link" in Header and rel="next"
Hello, I'm having a hard time doing pagination in an ADF data flow activity. I tried many of the approaches below, but nothing worked. My Link header is like the below. Any idea how to work with it?
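Before wiring up pagination rules in ADF, it can help to confirm the API's contract in plain Python: the requests library already parses the Link header and exposes the rel="next" URL. The endpoint below is hypothetical:

```python
import requests

url = "https://api.example.com/items"     # hypothetical endpoint
while url:
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    # ... process resp.json() here ...
    # requests parses the Link header into resp.links, e.g.
    # {'next': {'url': 'https://...', 'rel': 'next'}}
    url = resp.links.get("next", {}).get("url")
```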


Where is the storage account key for Cosmos DB?
I have a Copy Data pipeline which loads data from a JSON file within a blob to a Cosmos DB collection. The sink is the Cosmos DB connection. At creation time, the account selection method is Azure subscription; the subscription is selected and then the Azure…


Change Data Capture from Data Lake source using Common Data Model?
I have some D365 sample data exported to Data Lake using the Common Data Model. In the Azure Data Factory data flow, I have configured a source to read from the manifest file under the Changefeed folder and selected, in this case, the "HcmWorker" entity.…


ADF is not able to access Key Vault secrets via ADB notebook
We have stored credentials in Azure Key Vault and have granted both ADF and ADB access to the Key Vault secrets (Get, List). We are able to access the secret directly from ADF if we create a sample pipeline to check the access. We are able to access the secret…
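As a point of comparison, here is a minimal sketch of reading a secret from a notebook directly with the Azure SDK, assuming the running identity has Get/List on the vault; the vault and secret names are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

client = SecretClient(
    vault_url="https://<vault-name>.vault.azure.net",
    credential=DefaultAzureCredential(),
)
secret = client.get_secret("<secret-name>")
print(secret.name)   # avoid printing secret.value to notebook output/logs

# Inside a Databricks notebook, the usual route is a Key Vault-backed
# secret scope instead:
# value = dbutils.secrets.get(scope="<scope-name>", key="<secret-name>")
```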


Copy Method in ADF - Polybase / Copy Command / Bulk Insert
Hi, we have millions of rows of data in delimited files placed in Azure Blob Storage. We need to transfer them to a dedicated SQL pool in Azure Synapse via Azure Data Factory. Which copy method would be suitable in ADF: COPY command, PolyBase, or Bulk Insert?…


ADF Copy Activity - Disable chunking when unchecked does not copy an xlsx file
Hi team, the issue occurs while using the Copy activity in ADF, when 'Disable chunking' is unchecked (the default value). As soon as we set 'Disable chunking' to true, the file is copied successfully. The file has about 36,000 records. We receive the…


My data flow window is empty even though the code is still there
Hello everyone! I was working on my data flow yesterday and it was all going well. When I reopened my browser this morning, I noticed that my data flow window is completely empty. It's all blank; not even the parameters show up, or the option in an empty…


There are substantial concurrent MappingDataflow executions which is causing failures due to throttling under Integration Runtime 'IR(Manage Private Network)'
There are substantial concurrent MappingDataflow executions causing failures due to throttling under the Integration Runtime. Error code: 4502. The pipeline had been running for the last 3 months and suddenly started failing consistently with the above error. I tried using a higher IR…


Script activity failed: Argument {0} is null or empty. Parameter name: paraKey
Hi, Please see this topic for reference: https://learn.microsoft.com/en-us/answers/questions/763892/script-activity-failed-argument-0-is-null-or-empty.html I'm getting the same error in the Script1 activity in the following JSON: { …


IndexError: string index out of range - ADF - Databricks - Python
I'm passing a UDF value from a JSON table through Azure Data Factory as follows: "schema_array":…
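A common cause of this exact error is indexing the parameter before parsing it: the value crosses the ADF/Databricks boundary as a plain string, so indexing it yields single characters (or raises IndexError on an empty string). A minimal sketch, with the widget handling as an assumption:

```python
import json

# In a Databricks notebook, dbutils is predefined; the parameter name
# "schema_array" mirrors the question, but the handling shown is an assumption.
schema_array_raw = dbutils.widgets.get("schema_array")  # arrives as a string

if not schema_array_raw:
    # Indexing an empty string is one way to hit "string index out of range".
    raise ValueError("schema_array parameter is empty")

schema_array = json.loads(schema_array_raw)  # parse before indexing
print(schema_array[0])  # now indexes list elements, not single characters
```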


Webhook activity for Python3 runbook execution
Hello, I created an Automation account and a Python 3 runbook. The Python script takes files from Blob Storage, makes some transformations on them, and then creates new files in Blob Storage. I am testing with a simple Python script like…
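For reference, a minimal sketch of reading the webhook payload inside a Python 3 runbook, under the assumption that Automation hands webhook details to Python runbooks as a JSON string in sys.argv:

```python
import json
import sys

# Assumption: Automation passes webhook details to a Python runbook as a
# JSON string argument (WebhookName / RequestHeader / RequestBody).
if len(sys.argv) > 1:
    webhook_data = json.loads(sys.argv[1])
    body = json.loads(webhook_data.get("RequestBody") or "{}")
    print("webhook:", webhook_data.get("WebhookName"))
    print("body:", body)
else:
    print("started without webhook data")
```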


Issue with dynamic schema mapping in Azure Data Factory for JSON data
Hi team, when I am doing dynamic schema mapping for JSON data in ADF, the expression shows me the error below. Please let me know what I am missing here; this is the way I am doing it. My source has multiple fields, and it shows the same error for…


[DataFactory] - Start/Stop tumbling triggers do not respect recurrence
Hello, I'm trying to make the following scenario work without errors. I have one pipeline that copies incremental data if the table exists, and if not, creates that table. I have two tumbling triggers, one running throughout the day and one running at night…


ADF unable to call internal API via web activity with self-hosted Integration Runtime
Hi, I am working on a project with Azure Data Factory. We have set up an internal application and provided an API for updating records, and we would like to trigger a POST call via a Web activity in ADF (with a self-hosted IR), but it failed with the follow…


Transferring .bak file from sftp_server to Blob Storage and storing as "PAGE BLOB", using Azure Data Factory
Hi, I'm trying to move a ~3 GB .bak file from an ftp_server to Azure Blob Storage using SFTP. I want the transfer to happen automatically once a day, as a new .bak file with the same name is placed on the FTP_server every night. I have looked into logical…
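One wrinkle worth noting: page blobs must be sized in 512-byte multiples, so an arbitrary .bak rarely fits as-is. If the built-in copy can't produce a page blob, a small script step is one workaround; a minimal sketch assuming the file has been staged locally, with all names as placeholders:

```python
from azure.storage.blob import BlobClient

PAGE = 512
with open("backup.bak", "rb") as f:       # hypothetical local staging copy
    data = f.read()                       # fine for a sketch; stream for ~3 GB

# Page blobs must be sized in multiples of 512 bytes, so pad the tail.
if len(data) % PAGE:
    data += b"\x00" * (PAGE - len(data) % PAGE)

blob = BlobClient.from_connection_string(
    "<connection-string>", container_name="backups", blob_name="backup.bak"
)
blob.upload_blob(data, blob_type="PageBlob", overwrite=True)
```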


Azure Data Factory IR cannot be stopped or deleted
I have created an IR in Azure Data Factory. The option to stop or delete the IR is missing. The IR type is Azure. Please advise on how I can proceed.


Can Azure Data Factory connect to Azure MySql Flexible via private endpoints?
I have an Azure MySQL Flexible Server instance and an Azure Data Factory instance. Both resources are in the same subscription, connected to private endpoints on the same VNet (each in its own /29 subnet). On my Data Factory instance, I followed the steps…

