Cannot register Azure Integration Runtime
Hello, I am trying to register a self-hosted Azure integration runtime. This used to work before, but suddenly the existing runtime could not connect anymore. I have already: reinstalled the local runtime installation, removed the integration runtime in Azure…
Want to check whether any Excel file (or any file) is available in the File Share main folder and its multiple layers of subfolders
Hello all, I want to check in ADF whether any file is available in the File Share main folder and its multiple layers of hierarchical subfolders. Based on that I want to form an if/else condition.
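In ADF this kind of check is usually a Get Metadata activity (with childItems) feeding an If Condition, but the recursive "is any file present anywhere under this folder?" logic itself is easy to illustrate outside ADF. A minimal Python sketch, where the root path and extension are assumptions:

```python
from pathlib import Path

def any_file_exists(root: str, pattern: str = "*") -> bool:
    """Recursively check whether any file matching `pattern` exists
    under `root`, including all nested subfolders."""
    return any(p.is_file() for p in Path(root).rglob(pattern))

# Hypothetical mount point for the file share:
# any_file_exists("/mnt/fileshare/Main", "*.xlsx")
```

The result of a check like this is the boolean that would drive the If Condition's true/false branches.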
Use Azure Data Factory to copy files and place a CSV of the files copied
Hello, I am trying to implement the following flow in an Azure Data Factory pipeline: copy files from an SFTP server to a local folder, then create a comma-separated file in the local folder with the list of files and their sizes. The first step was easy…
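The second step (a manifest CSV of file names and sizes) could be done in ADF with a Get Metadata loop, but the target output is simple to sketch in plain Python; the manifest file name and column headers here are assumptions:

```python
import csv
from pathlib import Path

def write_manifest(folder: str, manifest_name: str = "manifest.csv") -> Path:
    """Write a CSV into `folder` listing each file there with its size in bytes."""
    folder_path = Path(folder)
    manifest = folder_path / manifest_name
    with manifest.open("w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["file_name", "size_bytes"])
        for p in sorted(folder_path.iterdir()):
            # Skip subfolders and the manifest itself.
            if p.is_file() and p.name != manifest_name:
                writer.writerow([p.name, p.stat().st_size])
    return manifest
```

Running this after the copy step yields one row per copied file, which matches the flow described in the question.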
Using regex to map input columns to one output column
I need to ingest data from CSV files into SQL; however, the column name for a particular value could differ. For example, "name" could come in as "name" or "fullname". Assuming my database table has a column called "name", how can I map the input columns using reg…
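One way to handle this outside ADF's fixed mappings is to normalise the CSV headers with a regex before the copy runs. A minimal sketch — the pattern alternatives listed are assumptions for illustration:

```python
import re

# Map of target column -> regex matching the header variants we expect.
# The variants here ("fullname", "full name", "name") are assumptions.
COLUMN_PATTERNS = {
    "name": re.compile(r"^(full[_ ]?name|name)$", re.IGNORECASE),
}

def normalize_headers(headers: list[str]) -> list[str]:
    """Replace any header matching a known pattern with its target column name;
    headers with no match pass through unchanged."""
    out = []
    for h in headers:
        for target, pattern in COLUMN_PATTERNS.items():
            if pattern.match(h.strip()):
                out.append(target)
                break
        else:
            out.append(h)
    return out
```

With the headers rewritten to the canonical names, a plain one-to-one column mapping in the copy activity is enough.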
Copy from CosmosDB to CosmosDB Error: "Request size is too large" using Data Factory
I am using the basic Copy wizard in ADF v2. I have a source Cosmos DB in one subscription and am moving to a new Cosmos DB in a new subscription. Databases and containers are configured the same. I have one container that copies 212 of 216 documents and…
Remove-AzureRmDataFactoryPipeline is not deleting ADF pipeline present under GIT branch
I have a requirement to delete 50 pipelines from my ADF. I tried the PowerShell route using the command below: Remove-AzureRmDataFactoryV2Pipeline -Name "XYZ_DEF" -ResourceGroupName $resourceGroupName -DataFactoryName $dataFactoryName. I am…
Auto-create table error in Copy activity for SQL-to-SQL pipeline
Hi, I'm getting an error trying to preview data in the sink process. I selected the auto-create table option for the source data, which should automatically create a table in my sink SQL data warehouse, but it's not. Here's the error I get when trying to…
How to convert binary data retrieved via HTTP into a parsable type
I am connecting to an HTTP source (SOAP API) via POST and retrieving binary data. How can I get this data into a parsable format that can be used to copy to a data warehouse? The goal is to use this data for Power BI reports. Thank you
Excel Mapping Data Flow - Date Time Text Values are loading month as minute
When importing an XLSX file into the mapping data flow (EDIT: tested with both inline and dataset, and the same happens), a column containing date times is loading with the incorrect text value. It looks like the process has attempted to parse the date time,…
[Mapping Data Flow] 0 Total columns - Source Excel - Bug Found
Hi, my source is Excel and I am trying to read the data in a Mapping Data Flow in Azure Data Factory. I am able to preview the data, but the count is displayed as "Columns: 0 total". I checked and tried the options given in this thread: …
How to get output in one object instead of multiple objects?
Hi, I am using a REST API as the source with a pagination rule to get the entire dataset (see the first image). But my data has nested arrays (second image), and the output also comes back as arrays across all the pages, which I am unable to flatten in…
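Before reaching for Flatten in a mapping data flow, it helps to see the shape problem: each page returns an object whose records sit in a nested array, and the pages themselves arrive as a list. A minimal Python sketch of the desired flattening — the field names `value` and `items` are assumptions, not the API's actual schema:

```python
def flatten_pages(pages, outer_key="value", inner_key="items"):
    """Merge records from every page into one flat list, expanding one level
    of nested array by copying the parent fields onto each nested item."""
    flat = []
    for page in pages:
        for record in page.get(outer_key, []):
            nested = record.pop(inner_key, None)
            if nested:
                for item in nested:
                    flat.append({**record, **item})
            else:
                flat.append(record)
    return flat
```

The single flat list this produces is the shape a sink table (or a data flow Flatten transformation) ultimately needs.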
How to move compressed Parquet files using ADF or Databricks
Hi, I have a requirement to move Parquet files from AWS S3 into Azure, then convert them to CSV using ADF. I tried downloading a few of the files to my local file system and copying them via a Copy activity within ADF. The files are in this format …
Trying to run a cloned pipeline
How can I run a cloned pipeline in Data Factory? From inside the Author section I can only find the option to debug runs. Where and how can I run such cloned pipelines from inside the Author section, so that an entry shows up in the Monitor section? …
Supplier Data Processing/Analytical Services
Scenario: Company A has an Azure tenant and regulatory approval to store and process sensitive data. Company B provides data processing and analytical services in Azure (SQL database/warehouse, ADF, Blob Storage, etc.) and does not have regulatory…
Getting the below error while running a custom activity in an ADF pipeline
{ "errorCode": "2010", "message": "Hit unexpected exception, please retry. If the problem persists, please contact Azure Support\"", "failureType": "UserError", "target":…
How to transform files in subfolders with one script in Databricks
I have an ADLS Gen2 folder with subfolders containing Parquet files. My requirement is to transform all Parquet files in the subfolders and load them into another ADLS Gen2 folder with the same folder structure, using one script. Is it possible to do, or…
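In Spark the usual approach is a single recursive read plus a partitioned write, but the "mirror the folder structure" part of the requirement is worth sketching on its own. A minimal Python version where the transform itself is a placeholder you would swap for the real Parquet-to-Parquet logic:

```python
from pathlib import Path
from typing import Callable

def transform_tree(src_root: str, dst_root: str,
                   transform: Callable[[bytes], bytes],
                   pattern: str = "*.parquet") -> int:
    """Apply `transform` to every file matching `pattern` under `src_root`,
    writing each result under `dst_root` at the same relative path.
    Returns the number of files processed."""
    src, dst = Path(src_root), Path(dst_root)
    count = 0
    for f in src.rglob(pattern):
        rel = f.relative_to(src)          # preserves the subfolder layout
        out = dst / rel
        out.parent.mkdir(parents=True, exist_ok=True)
        out.write_bytes(transform(f.read_bytes()))
        count += 1
    return count
```

The key idea is `relative_to` on each discovered file, so one script walks every subfolder and reproduces the hierarchy in the destination.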
Generic Error in Synapse Studio Pipeline
Hello, I am just starting to use Synapse Studio. However, I have been using Azure Data Factory for a few months. I have a pipeline that is just a single dataflow activity, copying data from our Azure SQL to Synapse SQL Pool. But I keep getting this…
Validation Activity - Timeouts even when a file is available
I have a Validation activity which is mapped to a file folder dataset. Child items is set to true and the timeout to 10 seconds. It has both a success activity and a failure activity. Now, even though the Validation activity output shows the file in the…
Just want to validate if a file exists in my data lake before I try and move it.
I am new to Azure and I want to do something that seems really simple, but I am just not familiar enough with the commands. Below, I will just use pseudo code to ask my question: If File A.json Then Move A.json to OLD/08172020_A.json …
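The pseudo code above maps onto a Get Metadata "exists" check followed by Copy and Delete activities in ADF; as plain Python, the same exists-then-archive logic looks like this (the OLD folder name and MMDDYYYY date prefix are taken from the question's example):

```python
import shutil
from datetime import datetime
from pathlib import Path
from typing import Optional

def archive_if_exists(file_path: str, archive_dir: str = "OLD") -> Optional[str]:
    """If `file_path` exists, move it into `archive_dir` with an MMDDYYYY_ prefix
    (e.g. A.json -> OLD/08172020_A.json). Returns the new path, or None."""
    src = Path(file_path)
    if not src.exists():
        return None
    dest_dir = src.parent / archive_dir
    dest_dir.mkdir(exist_ok=True)
    stamp = datetime.now().strftime("%m%d%Y")
    dest = dest_dir / f"{stamp}_{src.name}"
    shutil.move(str(src), str(dest))
    return str(dest)
```

The guard-then-move shape is the important part: the existence check happens first, so a missing file is a no-op rather than an error.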
Snowflake Data Copy Error
I have a Copy data activity which is using a Snowflake source. In the source, under "Additional Snowflake copy options", I have added a parameter with the property name set to "SINGLE" and the value set to "FALSE". I am able…