Azure Data Factory - How do we get source and destination of Azure Data Factory pipeline via .net code?
How to get the activity name dynamically in ADF
I'm trying to capture the error logs and store them in ADLS. I don't have SQL or a stored procedure. In the log files the pipeline name comes through as the logging pipeline, which is not correct; I should display the activity name, like profile data load success, …
Change Region Azure Data Factory
Hi, I created an ADF in the region Germany West and now notice that Data Flows aren't available in this region. Would it be possible to change the region of my ADF or do I have to create the same Data Factory in the correct region?
dataflow - branch - collect all fields to go to sink
Hello, in the screenshot below, I am getting data from a .json file... flatten24 has a lot of the fields. Now I would like to get other fields from flatten23, 22, 21 so that I have them all to put into the sink. How do I do this please? Thank…
dataflow - joining flattened fields
Hi, I am flattening JSON file data... I am using a flatten activity for each array inside a parent. Basically, when I am happy with the flatten activities and I can see the columns coming through in each flatten activity, how do I put these…
expression
Hello, what is wrong with this expression please? I think it requires single quotes around surename? Note that I do not get an expression error, but when I run this in the pipeline copy sink I get an error. When I narrow it down it looks like…
Transpose File using ADF Mapping DataFlow
I have CSV file data like below: ![182916-image.png][1] I want to achieve data like below: ![182867-image.png][2] Here the first 4 columns have the same row values and the last 4 columns have separate row values. Please help me with how to achieve this using ADF…
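The target layout is only visible in the screenshots, but a transpose like this is usually built in a mapping data flow with the Unpivot transformation (wide to long). A rough sketch of the core reshaping, assuming the first columns repeat per row (keys) and the remaining ones vary; all column names here are invented for illustration:

```python
def unpivot(rows, key_cols, value_cols):
    """Wide -> long: emit one output row per (input row, value column),
    repeating the key columns -- the same reshaping ADF's Unpivot
    transformation performs in a mapping data flow."""
    out = []
    for r in rows:
        for col in value_cols:
            row = {k: r[k] for k in key_cols}
            row.update({"column": col, "value": r[col]})
            out.append(row)
    return out

# Hypothetical wide input: 'id'/'region' repeat, 'q1'/'q2' vary per row.
rows = [{"id": 1, "region": "EU", "q1": 10, "q2": 20}]
long_rows = unpivot(rows, ["id", "region"], ["q1", "q2"])
```

If the final shape needs the long rows regrouped under new headers, a Pivot transformation after the Unpivot completes the transpose.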
How to raise a bug in Azure Data Factory
I want to report a finding on Azure Data Factory. May I know the procedure for it?
Incremental Load in Azure Data Factory
Hi, Requirement: Fetch data from on-premises using ADF and load it into Synapse incrementally. Problem: We don't have a unique-value column to use as the key for an upsert. There is a 'change_date' column indicating the last modified…
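The usual ADF answer here is the delta-copy-with-watermark pattern: remember the highest `change_date` seen, and each run fetch only rows modified after it. A minimal sketch of that logic in plain Python (the row shapes and dates are invented; in a real pipeline the watermark would be read via a Lookup activity from a control table or file, not held in a variable):

```python
from datetime import datetime

# Hypothetical source rows; 'change_date' is the last-modified column
# mentioned in the question.
source_rows = [
    {"id": 1, "change_date": datetime(2024, 1, 1)},
    {"id": 2, "change_date": datetime(2024, 1, 5)},
    {"id": 3, "change_date": datetime(2024, 1, 9)},
]

def incremental_fetch(rows, last_watermark):
    """Return rows modified after the stored watermark, plus the new
    watermark to persist for the next run."""
    delta = [r for r in rows if r["change_date"] > last_watermark]
    new_watermark = max((r["change_date"] for r in delta), default=last_watermark)
    return delta, new_watermark

# First run: everything after the initial watermark is copied.
delta, wm = incremental_fetch(source_rows, datetime(2023, 12, 31))
```

Without a unique key column, each delta batch can be appended to a staging table in Synapse and deduplicated during the load step instead of upserted row by row.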
Unable to copy Delta Lake data to Synapse Dedicated SQL Pool with Azure Data Factory
When trying to use data factory to copy data to a Synapse Dedicated SQL Pool Table I received the following error: ErrorCode=AzureDatabricksCommandError,Hit an error when running the command in Azure Databricks. Error details:…
Unpivot and Summarize Data Flow in ADF
When making sense of a new data set, I often want to perform some basic transformations to understand which columns have a limited selection of values or are largely empty, so I often perform the following basic transformations. These steps are…
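The profiling described (unpivot columns, then summarize distinct and empty counts per column) can be sketched in plain Python to show what the data flow computes; the sample rows and column names are invented:

```python
def profile(rows):
    """Unpivot rows into (column, value) pairs, then summarize per column:
    number of distinct non-empty values and count of empty values."""
    summary = {}
    for row in rows:
        for col, val in row.items():          # the "unpivot" step
            stats = summary.setdefault(col, {"distinct": set(), "empty": 0})
            if val in (None, ""):
                stats["empty"] += 1
            else:
                stats["distinct"].add(val)
    return {c: {"distinct": len(s["distinct"]), "empty": s["empty"]}
            for c, s in summary.items()}

rows = [
    {"country": "DE", "note": ""},
    {"country": "DE", "note": None},
    {"country": "FR", "note": "x"},
]
result = profile(rows)
```

In a mapping data flow the same result comes from an Unpivot followed by an Aggregate grouped on the column name, using `countDistinct` and a conditional count for empties.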
distinct value of column
Can you please check whether what I am doing here is correct to get the distinct values of a column? I get the correct answer, but I'm not sure if this is the way to do it. Basically, what does this do? name != "venture" Thank you
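In a data flow, `name != "venture"` is a filter condition: it keeps only rows whose `name` column is not the literal string `"venture"`; it does not deduplicate anything by itself. A rough Python equivalent of the two-step flow (filter, then distinct), with invented sample data:

```python
rows = [{"name": "venture"}, {"name": "alpha"}, {"name": "alpha"}, {"name": "beta"}]

# Step 1: filter transformation -- drop rows where name == "venture".
filtered = [r for r in rows if r["name"] != "venture"]

# Step 2: distinct -- keep one value per unique name, which is what a
# group-by aggregate on the column achieves in a mapping data flow.
distinct_names = sorted({r["name"] for r in filtered})
```

So the distinct step is what removes duplicates; the `!=` expression only excludes the "venture" rows before that happens.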
Strange errors ADF
Hello, We encountered some strange errors in our Azure Data Factory pipeline runs. One is from a Notebook activity as seen in the following screenshot: Another error is from the Notebook activity as well: A third error occurred…
flatten json data
sample json data: ... "parcels": [ { "name": "kljkl", "state": "fgfgf", "quantities": [ { …
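Assuming the shape suggested by the snippet (each parcel carries a nested `quantities` array; any field inside `quantities` is an assumption, since the sample is truncated), flattening means emitting one output row per innermost array element with the parent fields repeated. A minimal sketch:

```python
data = {
    "parcels": [
        {"name": "kljkl", "state": "fgfgf",
         # 'qty' is an assumed field -- the real quantities fields are
         # truncated in the question's sample.
         "quantities": [{"qty": 1}, {"qty": 2}]},
    ]
}

def flatten(doc):
    """One row per quantity, carrying down the parent parcel fields --
    the effect of chaining flatten steps on 'parcels' then 'quantities'."""
    rows = []
    for parcel in doc["parcels"]:
        for q in parcel["quantities"]:
            row = {"name": parcel["name"], "state": parcel["state"]}
            row.update(q)
            rows.append(row)
    return rows

out = flatten(data)
```

In a mapping data flow this corresponds to a Flatten transformation unrolling `parcels`, followed by a second Flatten unrolling `quantities`.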
Incremental Changes from Integration Runtime to Data Factory
Hi- I am attempting to load data from an on-prem SQL Server into a data lake via Azure Data Factory. I will be scheduling the upload once per day. How do I keep the data together, but still have a way to separate the days? Basically I do not want to…
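A common way to keep the data together while still separating the days is a date-partitioned folder layout under one root. A sketch of the path-building logic (container and folder names are placeholders); in ADF the same path is typically produced in the sink dataset with an expression such as `formatDateTime(utcnow(), 'yyyy/MM/dd')`:

```python
from datetime import date

def daily_path(run_date, container="datalake", root="raw/sales"):
    """Build a lake folder path partitioned by load date, e.g.
    datalake/raw/sales/2024/01/15/ -- all days share one root, but
    each daily load lands in its own dated folder."""
    return f"{container}/{root}/{run_date:%Y/%m/%d}/"

path = daily_path(date(2024, 1, 15))
```

Downstream readers can then query the whole root for the full history or a single dated folder for one day's load.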
At the export step of an Azure Data Factory pipeline, getting error: at Sink 'Exportcustomerss': java.lang.NullPointerException
At the export step of an Azure Data Factory pipeline I am getting the error at Sink 'Exportcustomerss': java.lang.NullPointerException. I can see one record is available for export to the Azure Data Lake file. In the Sink settings, the Sink Type is…
Is data encrypted when using Self-Hosted Integration Runtime?
Hi- I am simply looking to know whether data transfers are encrypted when using the Self-Hosted Integration Runtime. I can't find it in any documentation; I only see references to encrypting credentials. Thanks!!
Synapse data flow: DFExecutorUserError (User configuration issue)
I have created a data flow which transforms one set of parquet files into other parquet files (I would like to change the schema). Both source and sink are configured as ADLS Gen2. As input I have multiple files, and some of them are processed correctly but a couple of…
The current user has exceeded the maximum number of concurrent queries allowed by the server.,Source=Microsoft Salesforce ODBC Driver
Hi team, we are running a SQL query against Salesforce as a source, but the activity is failing with this error. Why is it failing? What is the root cause and the solution?
How to do a full load of the BSEG table from an SAP ABAP Oracle environment using the SAP table connector
I'm new to SAP. Can anyone let me know how to implement a full load of the BSEG table from an SAP ABAP Oracle environment into Azure using the SAP table connector? As this table has a huge volume of data, which field can be used to pull the historical data?