ADF Copy Activity - Source Schema not fixed
Hi Experts, I am still in the learning phase of ADF. I have a scenario wherein I use a Copy activity with a source dataset connecting to an SFTP server to fetch CSV files and store them in Gen2 containers. Inside the Copy activity, the schema mapping has…
Azure Data Factory
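The excerpt cuts off, but when the incoming CSV schema is not fixed, one common workaround is to skip the Import Schemas step and supply the Copy activity's mapping explicitly. A minimal sketch of a TabularTranslator mapping, with hypothetical column names:

```json
"translator": {
    "type": "TabularTranslator",
    "mappings": [
        { "source": { "name": "cust_id" },   "sink": { "name": "CustomerId" } },
        { "source": { "name": "cust_name" }, "sink": { "name": "CustomerName" } }
    ]
}
```

The same object can also be built as a string in a pipeline parameter and fed to the mapping with @json(), so each file shape can carry its own mapping at runtime.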
Issue with Lookup activity in Data Factory
I am facing an issue while writing the error message of a previous activity to a SQL table using a Lookup activity. Below is the error message: Error: Failure happened on 'Source' side.…
Azure Monitor
Azure Data Factory
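As a hedged sketch of the usual pattern: connect the Lookup to the failing activity's failure output, then build its query with dynamic content. The activity, table, and column names below are hypothetical; note that a Lookup must return at least one row, hence the trailing SELECT.

```sql
-- Used as the Lookup's query via "Add dynamic content"
UPDATE dbo.PipelineLog
SET ErrorMessage = '@{activity('CopyData').error.message}'
WHERE RunId = '@{pipeline().RunId}';
SELECT 1 AS Done;
```

If the error text can contain single quotes, wrap it in replace(..., '''', '''''') first, or use a Stored Procedure activity and pass the message as a parameter to sidestep quoting entirely.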
Partition delta files based on a particular column as we write them into a sink.
I am extracting my changed data every night as Parquet and then merging (upserting) it into my delta lake (delta file). In the Optimize tab I am setting dynamic range partitioning and passing the column name as a parameter. When I check the input JSON, I see…
Azure Data Factory
How to implement logic if any one of five different boolean variables is true...
I have a daily ELT process that picks up files from blob storage. This process has five required files that need to be present for it to proceed. This is the part of the logic I am trying to implement. I already have the logic that checks to see…
Azure Data Factory
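For what it's worth, ADF's or() function takes exactly two arguments, so combining five flags means nesting. A minimal sketch of an If Condition expression, assuming five hypothetical Boolean variables:

```
@or(or(or(or(variables('file1Found'), variables('file2Found')),
    variables('file3Found')), variables('file4Found')), variables('file5Found'))
```

Swap or() for and() if all five files must be present, which the "five required files" wording in the excerpt suggests.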
Copy Activity erroring with 'received an invalid column length'
The copy activity fails when it encounters a record with an unexpected length. Is there a way to log the record instead of failing the activity? I have already configured 'Skip incompatible rows', but it only catches data type mismatches. If it could log data truncation…
Azure Data Factory
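A sketch of the fault-tolerance settings that redirect rejected rows to storage instead of failing the copy; the linked service and path names are hypothetical. As the asker notes, this catches rows ADF itself rejects (e.g. type mismatches), so it is worth verifying whether a given truncation error surfaces as a skippable row or as a bulk-load failure on the SQL side:

```json
"typeProperties": {
    "source": { "type": "DelimitedTextSource" },
    "sink":   { "type": "AzureSqlSink" },
    "enableSkipIncompatibleRow": true,
    "redirectIncompatibleRowSettings": {
        "linkedServiceName": { "referenceName": "BlobLS", "type": "LinkedServiceReference" },
        "path": "copyerrors"
    }
}
```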
Update issue
Can you please see what I am doing wrong that causes inserts as well as updates? I am simply trying to update the Delta sink (Parquet files) by setting the IsDeleted field to 1 when the source row is deleted. The pipeline calls the data flow, and the source…
Azure Data Factory
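A hedged data flow script sketch of the update-only pattern: the Alter Row transformation marks rows for update, and the sink has insert disabled, since rows are otherwise eligible for insert by default. Stream, key, and column names are hypothetical:

```
source1 alterRow(updateIf(IsDeleted == 1)) ~> MarkUpdates
MarkUpdates sink(insertable: false,
    updateable: true,
    upsertable: false,
    deletable: false,
    keys:['Id']) ~> DeltaSink
```

In the UI this corresponds to unchecking "Allow insert" on the sink and leaving only "Allow update" enabled, with the key column set.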
How to merge files by picking them dynamically
There are a lot of files with a timestamp suffixed to their filenames. I receive a few of those files each day, and some of them have multiple copies with different timestamps (the hh, mm, or ss component of the timestamp differs each day). …
Azure Data Factory
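For the merge part, a Copy activity can pick the files with a wildcard and combine them using the MergeFiles copy behavior; a minimal sketch with hypothetical names (deduplicating the multiple timestamped copies of the same file would still need a Get Metadata + filter step beforehand):

```json
"typeProperties": {
    "source": {
        "type": "DelimitedTextSource",
        "storeSettings": {
            "type": "AzureBlobFSReadSettings",
            "wildcardFileName": "report_*.csv"
        }
    },
    "sink": {
        "type": "DelimitedTextSink",
        "storeSettings": {
            "type": "AzureBlobFSWriteSettings",
            "copyBehavior": "MergeFiles"
        }
    }
}
```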
ADF SHIR, MySQL, and connectivity issues
tl;dr: The self-hosted integration runtime (SHIR) reports connection timeout errors against MySQL instances, whereas a query from HeidiSQL/MySQL Workbench to the same instance works fine. 2021-10-18 update: It appears to be a timeout issue. The SHIR drops the…
Azure Data Factory
Copy Data Import Schemas not working for API call in ADF
In the last two weeks, the "Import Schemas" button in the Mapping tab of the Copy Data activity has stopped working when the source dataset is an API. This obviously affects new development, but to test it, I attempted to import the schema of an…
Azure Data Factory
dBase DBF file to Azure SQL
Problem statement: dBase .dbf files need to be exported to Azure SQL using Azure Data Factory. Looking for a solution: how can we export .dbf files to Azure SQL using ADF? Are there any connectors and transformations for storing the data from dBase files…
Azure Data Factory
Can't parse JSON files stored as octet-stream
I am storing data in Azure by using the DataReader GetStream method from SQL Server. This allows me to upload the stream to an Azure blob via C# without incurring memory issues. I can read the file, which is a gzipped JSON file formatted as a list of JSON…
Azure Data Factory
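ADF datasets generally go by the format and compression declared on the dataset rather than the blob's Content-Type, so a hedged first step is a JSON dataset with gzip compression declared; container and file names here are hypothetical:

```json
{
    "name": "GzJsonFiles",
    "properties": {
        "type": "Json",
        "linkedServiceName": { "referenceName": "BlobLS", "type": "LinkedServiceReference" },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "raw",
                "fileName": "export.json.gz"
            },
            "compression": { "type": "gzip" }
        }
    }
}
```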
Copy Activity issue
Hello Team, I am using a copy activity with source: Blob storage, target: Snowflake table (and using a procedure to run the COPY command to transfer the data). The data is getting copied to the target table from the blob storage file. But the…
Azure Data Factory
Write semi-structured CSV from Azure Data Factory
I've been tasked with writing a CSV file in a certain format, NEM12 (see the link below): MDFF-Specification-NEM12--NEM13-v106.pdf. I'm stumped as to how I could do this in ADF. The general format has a different number of columns per row, depending…
Azure Data Factory
Parameterize Snowflake linked service
I am trying to parameterize the Snowflake linked service credentials by reading them from Key Vault. Based on the database name, I need to fetch different secrets from Key Vault. I am getting an error and am not sure how to do it.
Azure Key Vault
Azure Data Factory
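A sketch of one workable pattern, assuming a hypothetical secret-per-database naming convention where each secret holds the full Snowflake connection string: declare a parameter on the linked service and build the Key Vault secret name from it.

```json
{
    "name": "SnowflakeParameterized",
    "properties": {
        "type": "Snowflake",
        "parameters": { "dbName": { "type": "String" } },
        "typeProperties": {
            "connectionString": {
                "type": "AzureKeyVaultSecret",
                "store": { "referenceName": "KeyVaultLS", "type": "LinkedServiceReference" },
                "secretName": {
                    "value": "@concat('snowflake-conn-', linkedService().dbName)",
                    "type": "Expression"
                }
            }
        }
    }
}
```

Datasets built on this linked service then surface dbName and pass a value at runtime.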
ADF complex JSON file load
Hi Team, I have a really complex JSON file, as shown in the sample below. Kindly provide input on how to deal with this kind of file and load it into a SQL table using ADF. [ { "TBL": "Aisha_abc", "Row": [ …
Azure Synapse Analytics
Azure Data Factory
Data Factory using Key Vault for linked services information
I'm using the following information: https://learn.microsoft.com/en-us/azure/data-factory/store-credentials-in-key-vault. I go through it: get the managed identity object ID from Data Factory; in Key Vault I create an access policy with GET; and I used…
Azure Data Factory
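For reference, the shape the cited doc leads to looks roughly like the following, with hypothetical names: the credential field points at a Key Vault linked service, and the factory's managed identity needs at least Get on secrets in the vault's access policy.

```json
{
    "name": "AzureSqlLS",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": {
                "type": "AzureKeyVaultSecret",
                "store": { "referenceName": "AzureKeyVaultLS", "type": "LinkedServiceReference" },
                "secretName": "sql-connection-string"
            }
        }
    }
}
```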
SFTP/FTP source location and Azure Functions
Hi All, my source sits on FTP/SFTP and consists of quite a few Excel data sets in a zip folder. What I want to do is use Azure Functions to convert them to CSV format, store them in Blob Storage, and ultimately copy them into the ADLS sink. My question is, can I…
Azure Functions
Azure Blob Storage
Azure Data Factory
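On the orchestration side, ADF can call the function with an Azure Function activity and hand it the zip location; the function and parameter names below are hypothetical (the unzip/convert logic itself lives in the function):

```json
{
    "name": "ConvertZipToCsv",
    "type": "AzureFunctionActivity",
    "linkedServiceName": { "referenceName": "AzureFunctionLS", "type": "LinkedServiceReference" },
    "typeProperties": {
        "functionName": "UnzipExcelToCsv",
        "method": "POST",
        "body": {
            "value": "@string(pipeline().parameters)",
            "type": "Expression"
        }
    }
}
```

Here the body expression simply forwards the pipeline's parameters (e.g. the zip path) to the function as JSON.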
Mapping data flow expression for converting a simple datetime format to an epoch timestamp
There are existing functions with which we can convert a datetime from epoch to other formats. What combination of functions could convert a particular datetime, as below, to an epoch timestamp? Ex: lastconnected:…
Azure Data Factory
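A hedged sketch in data flow expression language, assuming the lastconnected value is a string like '2021-10-18 14:31:00' (adjust the format string to the real data): toTimestamp parses it, and toLong yields epoch milliseconds.

```
toLong(toTimestamp(lastconnected, 'yyyy-MM-dd HH:mm:ss'))
```

Divide the result by 1000 if epoch seconds are needed instead of milliseconds.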
Restrict ADF from converting NULL data in a source date field to 1900-01-01 in the target column
Hello All, I am not sure whether this was already asked; I did a search and didn't find any suitable replies related to this. ADF Copy Data returns '1900-01-01' by default for a NULL DateTime field when moving the data from the source to the target table…
Azure Synapse Analytics
Azure Data Factory
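1900-01-01 is what SQL Server's datetime degenerates to when an empty or zero value is converted, so one hedged thing to check is whether the source's null marker is actually being read as NULL. In a DelimitedText dataset that is controlled by nullValue, shown here with a hypothetical literal:

```json
"typeProperties": {
    "columnDelimiter": ",",
    "firstRowAsHeader": true,
    "nullValue": "NULL"
}
```

Also worth checking on the sink side: the target column must be nullable, and a column default can be substituted for NULLs during bulk loads depending on the keep-nulls behavior.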
How to automate the path location of an ADB notebook file using Azure Data Factory.
Hi, I am trying to automate the file location of a Databricks CSV file using the ADF Notebook pipeline activity, so we can avoid hardcoding the path while running the code. I have followed this MS doc: …
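A sketch of the activity definition, with hypothetical parameter and linked service names: notebookPath takes a pipeline expression, and the file path reaches the notebook through baseParameters (read inside the notebook with dbutils.widgets.get).

```json
{
    "name": "RunNotebook",
    "type": "DatabricksNotebook",
    "linkedServiceName": { "referenceName": "DatabricksLS", "type": "LinkedServiceReference" },
    "typeProperties": {
        "notebookPath": {
            "value": "@pipeline().parameters.notebookPath",
            "type": "Expression"
        },
        "baseParameters": {
            "csvPath": {
                "value": "@pipeline().parameters.csvPath",
                "type": "Expression"
            }
        }
    }
}
```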