executing pipeline dynamically
Hello, there are files named as follows: pipelineName.DatasetName.txt. The Get Metadata activity reads this filename, extracts the pipeline name, and assigns it to a variable via a Set Variable activity, i.e. vPipelineName. Let's say the pipeline name is pRunContacts. Question: …
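A sketch of one way to wire this up (the activity name GetFileMetadata is an assumption; only vPipelineName comes from the question): split the filename on `.` and take the first segment. Note that the Execute Pipeline activity requires a static pipeline name, so invoking a pipeline whose name sits in a variable is usually done with a Web activity calling the pipelines/createRun REST API instead.

```json
{
  "name": "SetPipelineName",
  "type": "SetVariable",
  "typeProperties": {
    "variableName": "vPipelineName",
    "value": {
      "value": "@split(activity('GetFileMetadata').output.itemName, '.')[0]",
      "type": "Expression"
    }
  }
}
```

The Web activity URL would then end in `/pipelines/@{variables('vPipelineName')}/createRun?api-version=2018-06-01`, typically authenticated with the factory's managed identity.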
Some Activity Runs are Acquiring Compute after Already Acquiring Integration Runtime
I am running a pipeline in Azure Data Factory, and this pipeline executes multiple Data Flows. The first data flow executes and acquires compute for the Integration Runtime. However, once the next data flow begins…
File download using HTTP connector
Hello, I'm downloading a file using the HTTP connector from a URL similar to the one below: https://{website}/FileView2.aspx?IDFile=26a883a7-afb4-4d98-95e8-29cc63638c28 The link points to a zip file. The file downloads with a proper name in the…
Column 'ABC' does not allow DBNull.Value, but no null values in data source.
How does one troubleshoot this issue? "Operation on target XYZ CSV failed: Failure happened on 'Sink' side. ErrorCode=SqlOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A database operation failed. Please…
Azure data factory REST API throwing error
I am trying to fetch data through Data Factory using a REST API with OAuth2 client credentials, but I am getting the error {"error_description":"access_denied","error":"server_error"}, whereas it works through Postman and…
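For reference, a minimal REST linked service sketch using the client-credentials flow — property names follow the REST connector schema as I recall it, and all URLs/IDs are placeholders, so verify against the connector documentation. A common reason this works in Postman but not in ADF is a missing or mismatched scope/resource value in the token request:

```json
{
  "name": "RestServiceLS",
  "properties": {
    "type": "RestService",
    "typeProperties": {
      "url": "https://api.example.com/",
      "enableServerCertificateValidation": true,
      "authenticationType": "OAuth2ClientCredential",
      "clientId": "<client-id>",
      "clientSecret": { "type": "SecureString", "value": "<client-secret>" },
      "tokenEndpoint": "https://login.example.com/oauth2/v2.0/token",
      "scope": "<scope>",
      "resource": "<resource>"
    }
  }
}
```

Comparing the raw token request Postman sends against these fields usually pinpoints the difference.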
Publish Error via Azure Data Factory after Terraform Deployment Pointing to existing GitHub Repo
I have the error below when I try to publish from Azure Data Factory after it was deployed via Terraform. Publishing error: Invalid references or dependencies found. This is likely due to publishing outside of Git mode or editing and deleting …
Why do we use app registration from Azure Active Directory ?
I'm fairly new to Microsoft tools. I was working on copying a file from SharePoint to ADLS. While going through it, I saw we had to register an app in AAD. Why do we do that?
Databricks notebook output exception log to Azure data factory
Is there any way to send a Databricks notebook's exception log as output to Azure Data Factory? For example, set a variable and then log it to Azure Log Analytics?
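One common pattern (the RunNotebook activity and vErrorLog variable names are assumptions): have the notebook catch the exception and return it with `dbutils.notebook.exit(str(e))`; ADF then surfaces it as `runOutput` on the notebook activity, which a Set Variable activity can capture and a Web activity can forward to Log Analytics:

```json
{
  "name": "SetErrorLog",
  "type": "SetVariable",
  "dependsOn": [
    { "activity": "RunNotebook", "dependencyConditions": [ "Completed" ] }
  ],
  "typeProperties": {
    "variableName": "vErrorLog",
    "value": {
      "value": "@activity('RunNotebook').output.runOutput",
      "type": "Expression"
    }
  }
}
```

Note the `Completed` dependency condition, so the variable is set even when the notebook hits the error path.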
SQL Query in CSV File in blob storage
Hello Team, I have 3 queries in SQL Server. I need to run all 3 queries and copy the data into 3 different files in Azure Blob Storage (FileName1, FileName2, FileName3). Please advise how this can be done. Regards, Rohit
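A sketch of the usual pattern (the dataset names, the fileName dataset parameter, and the example queries are assumptions): put the query/filename pairs in an array, loop with ForEach, and run one parameterized Copy activity inside:

```json
{
  "name": "ForEachQuery",
  "type": "ForEach",
  "typeProperties": {
    "items": {
      "value": "@pipeline().parameters.queryList",
      "type": "Expression"
    },
    "activities": [
      {
        "name": "CopyQueryToBlob",
        "type": "Copy",
        "typeProperties": {
          "source": {
            "type": "SqlServerSource",
            "sqlReaderQuery": { "value": "@item().query", "type": "Expression" }
          },
          "sink": { "type": "DelimitedTextSink" }
        },
        "inputs": [ { "referenceName": "SqlServerDataset", "type": "DatasetReference" } ],
        "outputs": [
          {
            "referenceName": "BlobCsvDataset",
            "type": "DatasetReference",
            "parameters": {
              "fileName": { "value": "@item().fileName", "type": "Expression" }
            }
          }
        ]
      }
    ]
  }
}
```

The `queryList` parameter would be an array like `[{ "query": "SELECT …", "fileName": "FileName1.csv" }, …]`, one entry per output file.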
Load data from XML file to Azure SQL DB using Azure data factory
I am trying to load an XML file (adf.xml) from blob storage to Azure SQL DB. From the attached file (mappingadfxml1.pdf), I only want the columns that are checked to be loaded into the respective columns in Azure SQL DB. I have also…
Unrecognized extension in ADLS
Hi, I'm downloading files from HTTP using a Binary dataset in a Copy activity and saving the files to ADLS, also using a Binary dataset. I'm using a Binary dataset because the HTTP URL (source) will have different file types dynamically. I'm getting the filename from the URL…
Azure data factory copy large number of files from ADLS to blob
Hi, I want to perform a copy activity from Azure Data Lake Store to Azure Blob Storage while compressing the files. I have a folder in ADLS that has to be compressed into a zip file and stored in a blob. The throughput I achieve is very low. I could…
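For the zip part, a sketch of a sink Binary dataset with ZipDeflate compression (the linked service and container names are assumptions); for the throughput part, the usual knobs are the Copy activity's parallelCopies and data integration unit settings:

```json
{
  "name": "ZippedBlobDataset",
  "properties": {
    "type": "Binary",
    "linkedServiceName": { "referenceName": "BlobStorageLS", "type": "LinkedServiceReference" },
    "typeProperties": {
      "location": { "type": "AzureBlobStorageLocation", "container": "output" },
      "compression": { "type": "ZipDeflate", "level": "Optimal" }
    }
  }
}
```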
Passing the output of copy activity to a parameter in Azure Data Factory
Hi, I'm using a Copy activity to call an HTTP server, fetching data from a URL and storing it in a SQL database. At the same time, I must also use this fetched data to perform further operations in the same pipeline. So right now I'm storing data in…
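Worth noting: the Copy activity's output exposes only metadata (rowsCopied, throughput, etc.), not the copied rows themselves. A common workaround (the activity, dataset, and column names below are assumptions) is a Lookup activity after the copy that reads the landed data back from SQL, since a Lookup's output can feed later activities:

```json
{
  "name": "LookupFetchedData",
  "type": "Lookup",
  "typeProperties": {
    "source": {
      "type": "AzureSqlSource",
      "sqlReaderQuery": "SELECT TOP 1 Id FROM StagedData ORDER BY LoadedAt DESC"
    },
    "dataset": { "referenceName": "SqlStagingDataset", "type": "DatasetReference" },
    "firstRowOnly": true
  }
}
```

Downstream activities can then reference `@activity('LookupFetchedData').output.firstRow.Id` (or, with firstRowOnly set to false, iterate over `output.value`).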
convert string to float in azure data factory in copy activity
I have a CSV file; in the mapping, all source columns have string type, while in the destination some columns have float, varchar, or int types. When I run the pipeline, the activity fails and shows an error. Error message: ErrorCode=TypeConversionFailure,Exception occurred…
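One thing to check (a sketch; the setting values shown are illustrative, not verified defaults): the Copy activity's mapping can do explicit type conversion through the TabularTranslator's typeConversionSettings. Alternatively, a mapping data flow can cast explicitly, e.g. `toFloat(columnName)` in a Derived Column transformation.

```json
{
  "translator": {
    "type": "TabularTranslator",
    "typeConversion": true,
    "typeConversionSettings": {
      "allowDataTruncation": true,
      "treatBooleanAsNumber": false,
      "culture": "en-us"
    }
  }
}
```

The culture setting matters here: a value like "1,5" converts to a float only under a culture that uses comma decimal separators.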
Azure Data Factory / Self-Hosted Integration Runtime / ODBC / Connection String In a Parameter
Hey, here's what we're trying to do: we've got an ADF Self-Hosted Integration Runtime, hosted on a Windows VM with an ODBC driver installed, and we're trying to pass the ODBC driver connection string as a parameter. The JSON for the linked service looks like this: { …
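For comparison, a parameterized ODBC linked service sketch (the parameter name connString, the IR name, and the Anonymous auth type are assumptions): the connection string property picks up the parameter through a `linkedService()` expression:

```json
{
  "name": "OdbcLS",
  "properties": {
    "type": "Odbc",
    "parameters": {
      "connString": { "type": "String" }
    },
    "typeProperties": {
      "connectionString": {
        "value": "@{linkedService().connString}",
        "type": "Expression"
      },
      "authenticationType": "Anonymous"
    },
    "connectVia": { "referenceName": "SelfHostedIR", "type": "IntegrationRuntimeReference" }
  }
}
```

Datasets referencing this linked service then supply connString when they bind to it.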
ADF Self-Hosted IR performance issue.
Hello guys. The situation: I have 2 virtual machines with 2 SQL Servers installed. Self-Hosted IRs were installed on both of them for the connection between ADF and the SQL Servers. When I tried to run a Copy activity that took around 100…
Flowlet using join
Hi Team, I have a scenario: I am creating a flowlet with 2 inputs, 1 join, and 1 select component. When I try to use the created flowlet in a data flow, I am not able to see the data in the Data Preview tab. I get the error below: DF-COMP-003…
Azure Data Factory - Designer not picking up changes in environments from SSIS Catalog
I have added an additional Environment to my project deployed in the SSIS Catalog. However, the ADF Designer will not pick it up, nor will it execute the package if I enter the name of the Environment manually. I have tried: - Restarting the SSIS IR…
Error while publishing: Cannot read property 'linkedService' of undefined
When trying to publish an ADF from master to adf_publish I am getting an error: Error while publishing: Cannot read property 'linkedService' of undefined There are no other details shown. I tried looking through the master branch and do not see…
data factory pipeline visual
Hi, if I do not have an Azure account, is there a way to see the visuals of a pipeline if I have only received its .json file? Thanks