Why is the public IP range download blocked?
Hey, I use a script to download the public IP ranges from Azure (https://www.microsoft.com/en-us/download/details.aspx?id=56519). If I view that URL in my browser, it displays and I am able to download the JSON. However, if I run my script that would…
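A frequent cause of this symptom is that the details page serves HTML (with the real JSON link embedded in it) and rejects requests that don't look like a browser. A minimal sketch of the workaround, assuming the direct link appears on the page as a `download.microsoft.com` URL — the regex and the User-Agent string here are illustrative, not a documented API:

```python
import re
import urllib.request
from typing import Optional

def extract_json_url(html: str) -> Optional[str]:
    """Find the first download.microsoft.com JSON link embedded in the page HTML."""
    m = re.search(r'https://download\.microsoft\.com/[^"\']+?\.json', html)
    return m.group(0) if m else None

def fetch_ip_ranges(details_url: str) -> bytes:
    # Default script user agents are sometimes blocked; send a browser-like one.
    headers = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}
    req = urllib.request.Request(details_url, headers=headers)
    with urllib.request.urlopen(req) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    json_url = extract_json_url(html)
    if json_url is None:
        raise RuntimeError("No JSON link found on the details page")
    with urllib.request.urlopen(urllib.request.Request(json_url, headers=headers)) as resp:
        return resp.read()
```

If the block in the script is actually at the network layer (firewall, proxy) rather than the User-Agent, this sketch won't help; comparing the script's headers against the browser's request is the first diagnostic step.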
ADF Snowflake V2 connector throws GenericAdoNetReadError during Copy activity
The problem: We're using a Copy activity in ADF to move data from a storage account into a Snowflake database using the V2 Snowflake connector. But just after the data has been copied, the Copy activity fails with the following error: Operation on…
I have a CDC connector in ADF connected to an SAP BW system and need to read only the data for a particular country from the ODQ into ADF. How can I send a selection filter through the CDC connector to the SAP BW system so that only the filtered records are read?
I have tried the Optimize option on the data flow source with the conditions shown in the attached screenshot, and I know these conditions help with partitioning, but I need some filters applied on the ODQ first and then the filtered data flowing into the…
In Azure Data Factory, is it possible to define a linked service and dataset with a dynamic type?
In Data Factory, is it possible to define a linked service and dataset with a dynamic type? Since the legacy Salesforce connector will soon be deprecated, I want to create a single linked service and dataset to support both the legacy and new connectors instead…
Reading 1 million records from a single Excel file in Blob storage + Azure Functions/ADF/Cosmos change feed/WebJob
Hi everyone, I have a use case of reading 1 million records from a single Excel file in Blob storage. Basically, when someone uploads this file to Blob, processing should start. Please let me know the best option to choose from the options below. …
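Whichever compute option is chosen, a file of this size should be streamed and written to the sink in bounded batches rather than loaded into memory at once (reading the Excel itself would need a streaming reader, e.g. openpyxl in read-only mode). A storage-agnostic sketch of just the batching step — the batch size and row shape are placeholders:

```python
from typing import Iterable, Iterator, List

def batched(rows: Iterable[dict], batch_size: int = 1000) -> Iterator[List[dict]]:
    """Group a row stream into fixed-size batches so the sink (e.g. Cosmos DB)
    receives bounded bulk writes instead of a million single inserts."""
    batch: List[dict] = []
    for row in rows:
        batch.append(row)
        if len(batch) >= batch_size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

# Example: 2500 synthetic rows -> batches of sizes 1000, 1000, 500
sizes = [len(b) for b in batched(({"id": i} for i in range(2500)), 1000)]
```

The same generator works unchanged whether the consumer is an Azure Function, a WebJob, or a Databricks task; only the reader and the sink client differ.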
SAP latency data
Hi experts, how can we load only the modified data (updated or inserted fields) into Databricks using ADF or Databricks on a trigger, instead of loading it multiple times? For example: when a table is updated or inserted with new records, how does the table change and…
How can I use Lookup to get data from multiple Salesforce objects in ADF?
I need to select some data from multiple Salesforce objects into another database. Currently I am trying a pipeline for this because Data Flow does not support Salesforce. But the problem is that while creating the source/dataset it is showing only one object and doesn't…
Azure Data Factory - Generate custom Guid and copy to SQL table
I need to generate a unique GUID-like identifier that contains only numeric characters and has a length of 20, for every row in the table. Currently, I am running one Lookup activity (name: FetchIDs) to run the stored procedure. The output of the Lookup activity is as shown below: { …
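Strictly speaking a GUID is hexadecimal, so "numeric-only, length 20" means rolling your own identifier. One common approach (sketched here in Python; this is not the poster's stored procedure, and the prefix/suffix split is an assumption) is a microsecond timestamp plus a random suffix:

```python
import secrets
import time

def numeric_id_20() -> str:
    """20 numeric characters: ~16 digits of microsecond epoch time + 4 random digits.
    The time prefix keeps ids roughly sortable; the random suffix reduces
    collisions for rows generated within the same microsecond."""
    micros = int(time.time() * 1_000_000)  # ~16 digits for current epochs
    suffix = secrets.randbelow(10_000)     # 4 random digits
    return f"{micros:016d}{suffix:04d}"[:20].zfill(20)
```

With only 4 random digits, two rows stamped in the same microsecond collide with probability 1/10,000; widen the suffix (and shorten or drop the timestamp) if stronger uniqueness is needed. The same expression logic can be reproduced in a data flow derived column or in the stored procedure itself.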
Design strategy for Data lake
Hi friends, we need to design an Azure data lake from scratch for a solution that has multiple, complex data sources (databases, REST APIs, and custom applications), and the data lake solution should be scalable and high-performance in terms of data…
Data Factory to Salesforce using JWT
Hi all, we've been given the requirement of authenticating with Salesforce via JWT. For the most part this is set up and we can get an access token returned; however, how do we access Salesforce once we have the token? Does it have to be through a…
How can we connect Blackboard DDA (a SaaS that uses PostgreSQL to store data) via Azure Synapse Analytics, using an SSL certificate for authentication on a SHIR, in a private-VNet-enabled network?
Blackboard DDA is a SaaS application which uses PostgreSQL as its database. The database uses an SSL certificate key for authentication, and we have a private VNet enabled. How do we create a linked service for this database, and where do we store the certificate on the SHIR? Are…
Schema Drift Issue in Dataflow with Sink-Delta
Hi team, I was testing the schema drift option for my Delta sink and am getting an error while running my data flow after adding one new column to the existing source schema. The pipeline ran successfully when I didn't change the schema. Error: I have selected Allow…
Split single column data into multiple columns in datafactory
Issue: I have some CSV files in an SFTP location. I was planning to use a Copy activity to load files from the SFTP location to the data lake for archiving, as well as upserting into the Delta lake. However, I am getting an error while reading from the source, as some files have…
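A common workaround when the Copy activity cannot parse such files is to read the whole line as one string column and split it downstream (in a data flow derived column, or in code). A minimal sketch of the split step — the secondary delimiter `|` and the column layout are assumptions for illustration:

```python
import csv
import io
from typing import List

def split_single_column(raw_csv: str, delimiter: str = "|") -> List[List[str]]:
    """Treat each CSV line as a single field, then split that field into real
    columns on a secondary delimiter, as a post-copy processing step."""
    out: List[List[str]] = []
    for (field,) in csv.reader(io.StringIO(raw_csv)):
        out.append(field.split(delimiter))
    return out

rows = split_single_column("a|b|c\n1|2|3\n")
```

In a data flow the equivalent is a derived column using `split(col, '|')` followed by mapping the array elements to named columns.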
Copy activity succeeds but there is no data in the sink
Hello, I have created a data pipeline that copies data from MongoDB to an Azure SQL data source. The problem is that half the time, no data is copied even though the pipeline succeeds. I have to restart the flow manually a dozen times before any…
Connect to Linked Server Database using Azure Data Factory
Hi! I have a database that is a linked server of another. I can connect to and copy data from the main database in ADF. But is it possible to also copy data from the linked-server database? If yes, how would I do this?
Activity stuck in queue status
Starting from this morning, I am facing a problem where no pipeline run starts, whether triggered, run manually, or in debug mode. To be precise, the pipeline starts ("In progress"), but the first activity stays in "Queued" status. It doesn't matter…
Azure Data Factory - Copy data activity failed: Failure happened on 'Source' side
Operation on target Copy data1 failed: Failure happened on 'Source' side. ErrorCode=UserErrorFailedFileOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The file operation is failed, upload file failed at path:…
When we deploy ADF, we get "TooManyFactoryUpdaterequests" in Create or Replace GlobalParameter through Azure DevOps
When we deploy ADF, we get "TooManyFactoryUpdaterequests" in Create or Replace GlobalParameter using CI/CD in Azure DevOps
Azure Data Factory pipelines stuck on first activity for 7+ hours
Hi, I just woke up to monitor all the runs and suddenly saw that all of my runs were stuck on the first activity, or had not even started the first activity, some for more than 7 hours. This happened overnight, as yesterday everything was fine and I didn't…
Custom libraries (wheel) for ADF Databricks Python activity run on serverless compute
I want to be able to execute Python scripts (via the Databricks Python activity) from Azure Data Factory using serverless compute. Serverless compute does not support cluster-level (compute-scoped) libraries. In Databricks workflows, this is done as…