Merge rows of the same file in Azure Data Factory
Hello, I need to merge multiple rows into one row in Azure Data Factory. For example, I have the following file: PolicyId Driver_name 0001 Adam 0001 Lucy 0002 peter. At the end I need to have PolicyId Driver_name1 Driver_name2 …
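In ADF itself this is typically done with a mapping data flow pivot transformation; the equivalent reshaping is sketched below in Python/pandas using the sample rows from the question (column names assumed as shown there):

```python
import pandas as pd

# Sample rows mirroring the file in the question.
df = pd.DataFrame({
    "PolicyId": ["0001", "0001", "0002"],
    "Driver_name": ["Adam", "Lucy", "peter"],
})

# Number the drivers within each policy, then pivot so each becomes a column.
df["n"] = df.groupby("PolicyId").cumcount() + 1
wide = df.pivot(index="PolicyId", columns="n", values="Driver_name")
wide.columns = [f"Driver_name{i}" for i in wide.columns]
wide = wide.reset_index()
# Policies with fewer drivers get NaN in the extra columns.
```
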
How to pull data from Dynamics 365 to a data lake
Hi @ChiragMishraMSFT-1092, can you please help me with the below error: {…
ADF - export to flat file with multiple record types
How to export to a flat file with multiple record types in Azure Data Factory from JSON. Example output file: Header, 1, H1 Detail1, 1, D11, Test1, 10 Detail1, 2, D12, Test2, 20 Detail2, 1, D21, 100 Detail2, 2, D22, 200 Trailer, 1, T1, 2,…
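One way to approach this is to flatten the JSON into positional record lines before (or instead of) the Copy activity. A minimal sketch in Python, using a hypothetical JSON layout inferred from the sample output above:

```python
import json

# Hypothetical JSON input shaped to match the question's sample output.
doc = json.loads("""{
  "header": "H1",
  "details1": [["D11", "Test1", 10], ["D12", "Test2", 20]],
  "details2": [["D21", 100], ["D22", 200]],
  "trailer": "T1"
}""")

lines = [f"Header, 1, {doc['header']}"]
for i, d in enumerate(doc["details1"], 1):
    lines.append(f"Detail1, {i}, {d[0]}, {d[1]}, {d[2]}")
for i, d in enumerate(doc["details2"], 1):
    lines.append(f"Detail2, {i}, {d[0]}, {d[1]}")
# Trailer carries a record count, as the sample output suggests.
lines.append(f"Trailer, 1, {doc['trailer']}, {len(doc['details1'])}")
flat = "\n".join(lines)
```
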
ADFV2 Copy Activity Support for more encodingNames
Hi, I am using the Copy activity to generate delimited text files by reading from Parquet and applying the respective encodings to the files. We see a list of encodings which are supported by the Copy activity (delimited text format), and still some…
Azure Data Factory add Tags in json files
Hello, in the Portal it is possible to add tags to an Azure Data Factory (I'm not talking about the annotations in pipelines), but I can't find a way to include them in the ARM template. I just created a blank ADF, added a tag, and exported the ARM…
Data flow error: 4501 - related to WranglingDataFlow
Hi guys, for some reason I'm getting this error message when I try to debug a pipeline with a wrangling data flow activity: { "errorCode": "4501", "message": "Failed to fetch…
JSON file node didn't load into SQL Server tables using Azure Data Factory
Hi Team, below is my JSON file for the source (Blob Storage JSON files). I am loading a SQL Server table using Azure Data Factory (Copy data activity). In the Mapping tab I selected the node checkbox for the resolutions node, and data loaded only…
Log shipping
Dear experts, I have a two-node SQL Server Standard failover cluster. I need to set up a single-node DR using the log shipping method. I want to ask: after switching over to DR, how can we set things up and plan so that the application can connect automatically to the DR database…
Azure Data Factory Dynamic Content
Hi there, I want to use a variable inside my Data Factory Copy activity. I am trying to use a variable which I got from an Azure Function. When I run the debug for the variable I can see that I'm getting the value, but I can't access it as a…
Azure Data Factory Data Copy from Cosmos
We have a scenario where we have to transfer/copy newly added/updated data from Cosmos DB to a SQL Server every 2 hours. Currently the data copy copies all the files; can I configure it to only copy the newly added/updated ones? Thanks
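A common pattern for this is an incremental (delta) copy driven by Cosmos DB's `_ts` last-modified system property: persist a high-watermark per run and select only documents above it. The watermark logic, sketched in Python with hypothetical in-memory documents standing in for a Cosmos query:

```python
# Hypothetical documents; in Cosmos DB every document carries a _ts
# system property (last-modified time as a Unix epoch).
docs = [
    {"id": "a", "_ts": 1000},
    {"id": "b", "_ts": 2000},
    {"id": "c", "_ts": 3000},
]

last_watermark = 1500  # persisted from the previous 2-hour run

# Select only documents modified since the last run.
changed = [d for d in docs if d["_ts"] > last_watermark]

# Store the new watermark for the next run.
new_watermark = max(d["_ts"] for d in changed)
```

In ADF this maps to a Lookup activity that reads the stored watermark, a Copy activity whose source query filters on `c._ts > @watermark`, and a final step that writes the new watermark back.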
Commas inside data fields impact CSV file creation
Hi Team, currently I have set up a copy activity that creates a .csv file from a SQL query output on the Data Lake destination. But in one table the address-level details contain a comma (,) inside the data fields, and for that reason the created…
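In the Copy activity this is usually addressed by setting a quote character on the delimited-text sink dataset so that fields containing the delimiter are quoted. The underlying behavior, sketched with Python's csv module (sample data is hypothetical):

```python
import csv
import io

rows = [
    ["0001", "12 Main St, Apt 4, Springfield"],  # address contains commas
    ["0002", "7 Oak Ave"],
]

buf = io.StringIO()
# QUOTE_MINIMAL wraps only the fields that contain the delimiter
# (or the quote character itself); other fields stay unquoted.
writer = csv.writer(buf, quoting=csv.QUOTE_MINIMAL)
writer.writerows(rows)
out = buf.getvalue()
```

A CSV reader configured with the same quote character then recovers the address as a single field instead of splitting it at the embedded commas.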
Doubt with new Azure IR
Hi there. I know this is maybe a trivial question for pretty much all of you, but I am confused by this. To reduce the pipeline runtime, the administrator has created a new Azure IR. I understand that I need to change the setting on its…
Databricks Scala: DataFrame column encoding from UTF-8 to Windows-1252
Hi, I am working with Databricks where I have the data in Parquet and I am generating smaller files out of it. I have a string column containing various characters, and I have to encode this string value to Windows-1252 or Windows…
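In Spark this is typically done per column with a UDF that transcodes the string's bytes. The core transcoding step, sketched in Python on a hypothetical sample string (unmappable characters are replaced rather than raising an error):

```python
# A string with characters inside and outside the Windows-1252 repertoire;
# the arrow has no cp1252 code point, so a lossy transcode replaces it.
s = "café → naïve"

encoded = s.encode("windows-1252", errors="replace")  # '→' becomes b'?'
decoded = encoded.decode("windows-1252")
```

In a Spark job the same `encode`/`decode` call would sit inside a UDF applied with `withColumn`; using `errors="strict"` instead would surface the unmappable characters as exceptions if silent replacement is unacceptable.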
How to replicate ongoing changes of an on-prem database to a data lake?
Just as AWS DMS handles ongoing replication and updates a bucket with the respective files, what is available in Azure to manage ongoing changes occurring in an on-prem SQL database? Initially it requires a full load into my data lake storage, which can be…
Extract the date hierarchy from previous data in Data Lake Partition
Hi Team, it would be of great help to get some clear assistance on the below query. The data sources are Dynamics 365 and SAP HANA. My requirement is to create a data lake partition (ADLS Gen2) with date…
Datetime format in ADF v2
Hello, I have a requirement wherein I need to convert the below datetime format: 2020-06-24T10:30:54.6451783Z into 2020-06-24T10:30:54.64Z, meaning yyyy-MM-ddTHH:mm:ss.ff with a trailing Z. Is there any direct format other than me concatenating the…
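In ADF's expression language this is usually handled with `formatDateTime` and a .NET-style custom format string, where `ff` keeps two fractional-second digits. The same truncation, sketched in Python:

```python
from datetime import datetime

raw = "2020-06-24T10:30:54.6451783Z"

# Python's %f accepts at most 6 fractional digits, so trim to microseconds
# before parsing (20 chars up to the '.', plus 6 digits).
dt = datetime.strptime(raw[:26], "%Y-%m-%dT%H:%M:%S.%f")

# Format, keep only 2 of the 6 fractional digits, and append the Z.
out = dt.strftime("%Y-%m-%dT%H:%M:%S.%f")[:-4] + "Z"
```
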
Mapping data flow does not load any rows to 1 out of 2 branches
I have this mapping data flow, with two branches from 'RemoveHdlColumns'. When I run this, the rows from MergingInput are only loaded through the bottom branch. All the rows are loaded for the 'Select2' transformation, but NO rows hit the…
Designing a Table Structure to Accommodate a Huge Amount of Data
I am working on a use case wherein I am expected to receive 80 million records every week and I have to load them into Azure SQL Database. Some of the data is dimensional and some is fact-type; I would be using Azure Data Factory to load the data…
Transactions in azure-data-factory
Hello, I am working on an ADF activity where I am generating 10 files dynamically from SP outputs using ForEach and Export Data. I am now trying to implement transactions, where the generation of files should be rolled back if any one of the…
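ADF has no cross-activity transaction scope, so a common workaround is stage-then-publish: write all files to a staging folder and move them to the final folder only after every generation succeeds. A minimal sketch of that pattern in Python, with the local filesystem standing in for the actual sink:

```python
import os
import shutil
import tempfile

def generate_all_or_nothing(names, final_dir, make_file):
    """Write every file into a staging dir; publish only if all succeed."""
    staging = tempfile.mkdtemp()
    try:
        # Generate every file in the staging area first.
        for name in names:
            make_file(os.path.join(staging, name))
        # All succeeded: publish to the final destination.
        os.makedirs(final_dir, exist_ok=True)
        for name in names:
            shutil.move(os.path.join(staging, name),
                        os.path.join(final_dir, name))
    finally:
        # On any failure the staged files are discarded,
        # leaving final_dir untouched.
        shutil.rmtree(staging, ignore_errors=True)
```

In ADF terms, the ForEach writes to the staging path, and a final Copy/Delete (or a failure-path activity) publishes or discards the batch as a unit.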
Why do my dataflow pipelines spend 5 minutes in acquiring compute state every time?
I have a bunch of pipelines using data flows for data transformations, each with different sources and sinks. Every pipeline takes exactly 5 minutes to spin up a cluster for these data flows, even though they are all triggered at almost the same time. Is there…