How to fix a pipeline that runs as expected in ADF Dev but fails after being published (with its associated objects) to ADF Prod?
Hello Team, I have built an ADF pipeline to load data from a source (SQL Server) to ADLS. The pipeline runs as expected in ADF Dev, after which I published the pipeline (and associated objects) to ADF Prod. But while running the pipeline (Debug),…
Azure Data Factory error when trying to connect to Azure SQL Database. How do I resolve it, and why did it happen?
Spark job failed: { "text/plain":…
A CSV file with 200 columns is in ADLS. Fetch only the list of columns from the config table. The config table is in Azure SQL and has only 10 column names.
A CSV file with 200 columns is in ADLS. Fetch only the list of columns from the config table. The config table is in Azure SQL and has only 10 column names. How do I create a pipeline for this scenario in Azure Data Factory?
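In ADF this is typically a Lookup activity reading the config table, feeding the column list into the Copy activity's mapping. The pruning logic itself can be sketched in plain Python; the column names and sample data below are hypothetical, not from the question:

```python
import csv
import io

def select_configured_columns(csv_text, config_columns):
    """Keep only the columns named in the config table (simulated here as a list)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [{col: row[col] for col in config_columns if col in row}
            for row in reader]

# Wide source file (200 columns in the real scenario; 4 here for brevity).
source = "id,name,city,notes\n1,Ana,Lisbon,x\n2,Bo,Oslo,y\n"
# Column list that a Lookup activity would fetch from the Azure SQL config table.
config = ["id", "city"]
print(select_configured_columns(source, config))
```

In the pipeline, the same idea is expressed by passing the Lookup output to a dynamic column mapping rather than running Python.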
How to parse a nested JSON array of documents in an ADF data flow
Hi all, I am trying to fetch the values from a nested JSON array of documents. I have used aggregate to convert it into objects, but I am not able to fetch the values of all child nodes, such as the following: itOffer.item itOffer.item.SplOfr itOffer.item.buy …
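The flatten transformation in an ADF data flow is usually the tool for this; the transformation it performs can be sketched in Python to show the dotted-path output the question is after. The document shape below is a guess based on the paths mentioned (itOffer.item.SplOfr, itOffer.item.buy):

```python
import json

def flatten(doc, parent_key=""):
    """Recursively flatten nested dicts/lists into dotted keys, similar to a flatten transform."""
    items = {}
    if isinstance(doc, dict):
        for k, v in doc.items():
            items.update(flatten(v, f"{parent_key}.{k}" if parent_key else k))
    elif isinstance(doc, list):
        for i, v in enumerate(doc):
            items.update(flatten(v, f"{parent_key}[{i}]"))
    else:
        items[parent_key] = doc
    return items

# Hypothetical document shaped like the itOffer.item structure in the question.
doc = json.loads('{"itOffer": {"item": [{"SplOfr": "A", "buy": 2}]}}')
print(flatten(doc))
```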
Azure Data Factory - Generate a custom GUID and copy it to a SQL table
I need to generate a unique GUID which contains only numeric characters and has a length of 20, for every row in the table. Currently, I am running one Lookup activity (name - FetchIDs) to run the stored procedure. The output of the Lookup activity is as shown below: { …
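A standard GUID is hexadecimal, so a "numeric-only, length 20" identifier isn't really a GUID; one option is to generate 20 random digits per row, which this sketch illustrates (the function name is mine, and this is random-digit generation rather than a true GUID, so uniqueness is probabilistic, not guaranteed):

```python
import secrets

def numeric_id_20():
    """Generate a 20-character ID containing only digits.
    Unlike a true GUID this is just random digits; collisions are merely
    astronomically unlikely (~1 in 10**19 per pair), not impossible."""
    # First digit 1-9 so the ID keeps 20 digits even if stored as a number.
    return str(secrets.randbelow(9) + 1) + "".join(
        str(secrets.randbelow(10)) for _ in range(19))

ids = [numeric_id_20() for _ in range(3)]
print(ids)
```

If uniqueness must be guaranteed, a database sequence or identity column padded to 20 digits is the safer design than random generation.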
REST API End Condition in Copy Activity is not working
I'm working on a paginated GET request in ADF. As soon as an End Condition is set, the request does not work anymore. I'm using pagination rules in order to loop the request until all data is loaded. The range rule works fine, but as soon as I add the…
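The loop that ADF's pagination rules automate can be written out explicitly, which helps pin down what the end condition should test. This sketch uses a stand-in fetch function instead of the real API (the names and the empty-page end condition are assumptions, since the actual API's contract isn't shown in the question):

```python
def fetch_page(offset, page_size):
    """Stand-in for the real GET request; returns an empty list past the data set."""
    data = list(range(10))  # hypothetical 10-record source
    return data[offset:offset + page_size]

def fetch_all(page_size=4):
    """Loop pages until an empty page arrives - the role ADF's end condition plays."""
    results, offset = [], 0
    while True:
        page = fetch_page(offset, page_size)
        if not page:          # end condition: API returned no more rows
            break
        results.extend(page)
        offset += page_size
    return results

print(fetch_all())
```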
Syntax error in the metadata-driven control table script generated by the copy data wizard.
ADF ControlTable Script.txt In ADF I have selected the Metadata-driven Copy Task in the Ingestion wizard. My source database is Azure SQL (WorldWideImporters), and my destination database is also Azure SQL (AdventureWorks2014). I'm using the destination…
I am using Azure Policy to whitelist the domains for outbound connectivity from Azure Data Factory to other services, but I am facing connectivity issues due to throttling applied to the policy.
I am using the Azure policy (https://learn.microsoft.com/en-us/azure/data-factory/configure-outbound-allow-list-azure-policy), which is applied at the resource group level. This policy is working as expected and is only allowing outbound connectivity to the…
How do I use the Script activity in ADF so that it uses an Azure Databricks SQL warehouse?
I want to be able to use ADF Script activity to execute SQL statements on the Azure Databricks SQL warehouses (including the serverless kind). https://learn.microsoft.com/en-us/azure/data-factory/transform-data-using-script Azure Databricks SQL…
How to translate database content with Azure Translator via ADF or a Synapse notebook?
There is an Azure database table. Some of the columns need to be translated from one language to another into additional columns, such as from English to Spanish, or Portuguese to English, etc. I am exploring how I can use ADF or a Synapse notebook to…
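From a Synapse notebook, this usually means calling the Azure Translator REST API per batch of rows. As a sketch, the snippet below only builds the request URL and JSON body (it does not send anything, and the caller would still add the `Ocp-Apim-Subscription-Key` header and POST it); the function name is mine:

```python
import json

# Public Azure Translator endpoint (regional endpoints also exist).
TRANSLATOR_ENDPOINT = "https://api.cognitive.microsofttranslator.com/translate"

def build_translate_request(texts, to_lang, from_lang=None, api_version="3.0"):
    """Build the URL and JSON body for an Azure Translator 'translate' call."""
    params = {"api-version": api_version, "to": to_lang}
    if from_lang:
        params["from"] = from_lang
    query = "&".join(f"{k}={v}" for k, v in params.items())
    body = json.dumps([{"Text": t} for t in texts])
    return f"{TRANSLATOR_ENDPOINT}?{query}", body

url, body = build_translate_request(["Hello"], to_lang="es", from_lang="en")
print(url)
print(body)
```

In ADF without a notebook, the same call can be made from a Web activity inside a ForEach over the rows, though batching in a notebook scales better for wide tables.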
ADF pipeline using a global parameter fails when triggered but succeeds when debugged
I'm working with ADF, using global parameters, and facing an issue. It works fine if I debug the pipeline. But if I trigger the pipeline (manually or scheduled), it fails saying the property doesn't exist, and it shows the name of the global parameter I'm…
Reading 1 million records from a single Excel file in Blob storage + Azure Functions/ADF/Cosmos Change Feed/WebJob
Hi everyone - I have a use case of reading 1 million records from a single Excel file in Blob storage. Basically, when someone uploads this file to Blob, it should start processing. Please let me know the best option to choose from below for the same. …
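Whichever host is chosen (Functions, ADF, WebJob), the shared requirement is processing the file in fixed-size batches rather than materializing a million rows at once. A stdlib-only sketch of that batching pattern, with a 10-item iterable standing in for the streamed Excel rows:

```python
from itertools import islice

def batched(iterable, size):
    """Yield successive lists of at most `size` items; keeps memory flat for huge files."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

# Simulate streaming rows from the Excel file (1,000,000 in the real case).
rows = range(10)
batches = [b for b in batched(rows, 4)]
print(batches)
```

Each batch would then be written to the sink (e.g. Cosmos DB) before the next one is read, so peak memory depends on the batch size, not the file size.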
How do I copy data from a public SharePoint URL to Azure Blob Storage?
Hi everyone! I am new to using Azure services. My co-author at Stanford has uploaded several files (in CSV format) to a public link. This link starts with: office365stanford-my.sharepoint.com/personal It has 40 files, ranging from a few hundred MB to 100 GB. I…
When we deploy ADF, we are getting "TooManyFactoryUpdaterequests" in Create or Replace GlobalParameter through Azure DevOps
When we deploy ADF, we are getting "TooManyFactoryUpdaterequests" in Create or Replace GlobalParameter using CI/CD in Azure DevOps
No MediaTypeFormatter is available to read an object of type 'ClusterInfo' from content with media type 'text/html'.
Hi, in Azure Data Factory I wanted to test the connection of our new Databricks workspace URL and cluster ID. I parameterized the linked service (LS) with the global parameters databricksURL and databricksClusterID. When I want to test the connection…
Why is the public IP range download blocked?
Hey, I use a script to download the public IP ranges from Azure (https://www.microsoft.com/en-us/download/details.aspx?id=56519). If I view that URL in my browser, it displays and I am able to download the JSON. However, if I run my script that would…
I have a CDC connector in ADF connected to an SAP BW system and need to read only the data for a particular country from the ODQ into ADF. How can I send a selection filter through the CDC connector to the SAP BW system so that only the filtered records are read?
I have tried the Optimize option on the data flow source with the conditions shown in the attached screenshot, and I know that these conditions help with partitioning, but I need to have some filters applied on the ODQ and then the filtered data flowing into the…
In Azure Data Factory, is it possible to define a linked service and dataset with a dynamic type?
In Data Factory, is it possible to define a linked service and dataset with a dynamic type? Since the legacy Salesforce connector will soon be deprecated, I want to create a single linked service and dataset to support both the legacy and the new connector instead…
SAP latency data
Hi experts, how can we load only the modified data (updated or inserted fields) into Databricks using ADF or Databricks at the trigger level, instead of loading the full table multiple times? For example: when a table is updated or inserted with new records, how do we capture the table change and…
How can I use Lookup to get data from multiple Salesforce objects in ADF?
I need to select some data from multiple Salesforce objects into another database. Currently I am trying a pipeline for this because Data Flow does not support Salesforce. But the problem is that while creating the source/dataset, it shows only one object and doesn't…