How should I connect Data Factory to an SFTP server?
Hi, I'm trying to connect my Data Factory to an SFTP server. I have all the details necessary to connect, but I'm currently facing: Invalid Sftp credential provided for 'SshPublicKey' authentication type. openssh key type: ssh-rsa is not supported …
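This error usually means the private key file is in the newer OpenSSH key format, while the ADF SFTP connector expects a classic PEM-encoded key. A minimal sketch of the conversion, assuming OpenSSH's `ssh-keygen` is available; the `/tmp/adf_demo_key` path and the throwaway key are purely for illustration, you would run the conversion step on your real key file:

```shell
# Remove any leftover demo key so ssh-keygen does not prompt to overwrite
rm -f /tmp/adf_demo_key /tmp/adf_demo_key.pub

# Generate a throwaway RSA key pair (recent OpenSSH writes the new OPENSSH container format by default)
ssh-keygen -t rsa -b 2048 -f /tmp/adf_demo_key -N "" -q

# Re-encode the private key in place as classic PEM, which the ADF SFTP connector accepts
ssh-keygen -p -f /tmp/adf_demo_key -m PEM -N "" -P "" -q

# The first line of the converted file should now read: -----BEGIN RSA PRIVATE KEY-----
head -n1 /tmp/adf_demo_key
```

After converting, re-upload the PEM-encoded private key to the linked service (or to Key Vault, if that is where the credential lives).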
Trying to send a file from Azure Blob Storage to an SFTP server using ADF but getting an error
Getting an error - SocketErrorCode: 'ConnectionReset'.,Source=Microsoft.DataTransfer.ClientLibrary.SftpConnector,''Type=System.Net.Sockets.SocketException,Message=An existing connection was forcibly closed by the remote host,Source=Renci.SshNet,' while…
How to handle multiple different output streams in ADF
Hi team, I have a SCOPE script that calls a module with a function that has multiple output rowsets, and I was wondering if this is supported in ADF. Ultimately, we want to push the data to Kusto as separate output streams without mixing them.…
Google BigQuery updated linked service error: "Message=No current element is available" for empty result
I started to receive a failure with the error "Message=No current element is available" for a Copy activity with the new Google BigQuery linked service. The legacy one doesn't throw an error/failure when the BigQuery SQL query returns an empty result. I…
Formatting a datetime using a 24-hour clock
Hi, in an ADF pipeline I'm trying to use this expression: @formatDateTime(utcNow(),'yyyy-MM-ddTHH:mm:ss') but this returns a value that shows the AM/PM indicator. I'd like to show the time on a 24-hour clock. Any suggestions, please?…
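ADF expressions use .NET-style custom date format strings, where uppercase `HH` is the 24-hour hour and lowercase `hh` is the 12-hour hour; the AM/PM designator only appears when `tt` is in the format string. A sketch of the two variants:

```
@formatDateTime(utcNow(), 'yyyy-MM-ddTHH:mm:ss')
@formatDateTime(utcNow(), 'yyyy-MM-ddThh:mm:ss tt')
```

The first yields a 24-hour value such as 2024-01-05T17:30:00, the second a 12-hour value with AM/PM. If the output shows an AM/PM indicator, the format string actually being evaluated likely contains `hh` or `tt` somewhere, so it is worth re-checking the expression as saved in the pipeline.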
CopyData in ForEach
Dear all, I want to use the Copy Data activity in my pipeline. I have a CSV file in my storage account which contains the list of data I want to pass into the copy activity. I create a Lookup activity first, and then I try to use a ForEach and put the copy
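A minimal sketch of the usual Lookup → ForEach → Copy wiring; the activity name `Lookup1` and the column name `fileName` are assumptions, not the asker's actual names:

```
ForEach "Items" setting:
  @activity('Lookup1').output.value

Inside the ForEach, reference the current row in the Copy activity's
dataset parameters, e.g.:
  @item().fileName
```

Each row returned by the Lookup becomes one iteration of the ForEach, and `@item()` exposes that row's columns to the Copy activity nested inside it. Note that the Lookup activity caps the number of rows it returns, so very large driver files need a different pattern.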
Transform a regular array into an array of objects inside a Mapping Data Flow in Azure Data Factory
In Azure Data Factory, I have a column which is an array of strings like this (image below): costGroup = ["GH", "APT"] Because I need to output in JSON format later on, I need to transform it to an array of objects like…
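In a Mapping Data Flow, a Derived Column transformation can do this with the `map()` function and the `@()` complex-column constructor; the output column name `costGroupObjects` and the key name `value` below are assumptions for illustration:

```
costGroupObjects = map(costGroup, @(value = #item))
```

Under those assumptions, ["GH", "APT"] becomes [{"value":"GH"},{"value":"APT"}] when the stream is written to a JSON sink.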
Event based approach for SharePoint list in ADF
Hi all, I have a SharePoint list, and I am able to connect to it, get the data, and load it into Blob Storage. I am now interested to know whether an event-based approach is possible for loading the data into Blob.
Does Airflow on ADF (Managed Workflow Orchestration) require Data Factory Contributor role with AAD?
I want to use Airflow on ADF (Managed Workflow Orchestration), but when I try to add a user, they do not get access even though they have been given "Viewer", "User", and "Op" roles in the Airflow UI. I integrated AAD during…
ErrorCode=ODataFailedClientCreation
We would like to transfer data from OData to SQL. Our linked service is working; only the ADF pipeline fails with the following error…
How to sync/upload files from Sharepoint to Azure Blob Storage
I am currently trying to create a chatbot that has access to an Azure Blob Storage account, so that I can ask questions about the files I upload. Most of the files are located in SharePoint, so I would like the files to automatically be updated in the Blob…
Copy Activity - ADF
Hi, I am new to Azure. While learning, I am facing an issue where a Copy activity runs twice in a loop. Use case: copy files from the raw to the silver container based on file extension. For this I have created an If Condition (true == CSV, false == JSON), but false is…
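For this pattern, the If Condition inside a ForEach typically tests the current item's file extension; a sketch, assuming the loop iterates over Get Metadata `childItems` (the names here are assumptions):

```
If Condition expression:
  @endsWith(toLower(item().name), '.csv')
```

The CSV Copy activity then goes in the True branch and the JSON copy in the False branch. A common cause of a copy running twice is that the same Copy activity (or an equivalent one) also exists outside the If Condition, so each file is copied once by the branch and once by the outer activity; checking the pipeline for a duplicate activity is a good first step.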
Is there any way to make Azure Data Factory pipelines wait in a queue, rather than fail, when the infrastructure reaches maximum capacity?
I have an event-based trigger in Azure Data Factory which executes ETL pipelines every 5 minutes. The pipeline has some Databricks notebook activities which execute via a cluster pool. In some cases, the pipelines are failing with this error:…
How is the Batch service charged, and why does my Python job fail in Azure Batch because the required directories are not present on the Batch node?
I would like to know how the Batch service is charged. When I run my Python jobs through ADF's Batch service, I run into some trouble. For example: my test.py script and the utils folder are located in the run_python folder. The test.py script calls the…
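On the directory question: a Batch task only sees the files that were shipped to the node, typically as task resource files, and it runs in its own task working directory on the node (exposed as `AZ_BATCH_TASK_WORKING_DIR`). Assuming both test.py and the utils folder are uploaded as resource files, a sketch of the task's command line so that relative paths like ./utils resolve:

```
cd $AZ_BATCH_TASK_WORKING_DIR && python3 test.py
```

On the costing question: to my understanding Azure Batch itself adds no service charge; you pay for the underlying compute resources in the pool (VM hours, disks, and data transfer) for as long as the nodes are allocated, so a pool left running between jobs keeps accruing cost.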
ADF Pipeline -> "Azure Key Vault"
Hi, I have created a web activity in a pipeline, I want to type the password and not select the password from Azure Key Vault. I'm not able to type the password. Can you please guide? Thanks and Regards, Pratisti Pratap Satatdekar
Which is the best option for cloud-based analytic reporting that requires on-premises data and frequent updates during the day?
Hi, I would like to understand the best solution to implement for the following requirements: a cloud environment for analytical reporting where data will be sourced from on-premises and then combined with data loaded from Excel, along with have…
Unroll By button for flatten is grayed out
I'm new to Azure Data Factory. In Azure Data Factory I have an XML file I want to flatten and convert to a CSV file. I add a Data Flow, and to the Data Flow I add a "source1" where I have already done an "Import projection". I added…
ODBC Error [HY104] during Data Transfer to Oracle via Azure Data Factory: Character, decimal, and binary parameters cannot have a precision of zero
Hello, I am encountering an issue while transferring data to an Oracle database using Azure Data Factory copy activity. The operation fails with the following error…
Issue with copy data from Azure Synapse Link for Dataverse using ADF Dataflow
Hi there, we use the Export to Data Lake option to copy the data from D365 and want to move to Azure Synapse Link for Dataverse, because the Export to Data Lake option is being retired on 1st Nov 2024. We have configured Azure Synapse Link for Dataverse in…
Facing issue while using ADF Copy data activity from Azure to Kusto Bugs table
Dear Team, We are experiencing an issue with data transfer from an Azure Storage Table to a Kusto Table using Azure Data Factory (ADF). Specifically, the System_AssignedTo field, which is blank in the Azure Storage Table, is being populated with…