Schema Drift Issue in Dataflow with Sink-Delta
Hi Team, I was testing the schema drift option for my Delta sink and am getting an error when running my dataflow after adding one new column to the existing source schema. The pipeline ran successfully when I didn't change the schema. Error: I have selected Allow…
Renew Certification - Microsoft Azure Fundamentals
Hi Team, My Microsoft Azure Fundamentals certification expires in June 2024. I would like to renew the certification; can you please provide me with the steps? I don't see an option to take the renewal assessment. I tried to go through this video, however it…
How to update and delete a row using Azure Data Factory Change Data Capture
I am exploring the Azure Data Factory CDC feature and am trying to perform CDC from one SQL table to another; the source SQL table has a primary key. Whenever a new row is added to the source table, it gets added to the…
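For the CDC scenario above, the update/delete behavior boils down to applying change events against the target, keyed by the source table's primary key. A minimal Python sketch of that logic, assuming a hypothetical event shape with `op`, `key`, and `data` fields (not an actual ADF API):

```python
# Apply CDC change events to an in-memory "target table" keyed by primary key.
# The event shape ({"op", "key", "data"}) is an illustrative assumption.
def apply_cdc_events(target, events):
    for ev in events:
        if ev["op"] in ("insert", "update"):
            target[ev["key"]] = ev["data"]   # upsert on the primary key
        elif ev["op"] == "delete":
            target.pop(ev["key"], None)      # remove the row if present
    return target

target = {1: {"name": "Ann"}}
apply_cdc_events(target, [
    {"op": "update", "key": 1, "data": {"name": "Anna"}},
    {"op": "insert", "key": 2, "data": {"name": "Bo"}},
    {"op": "delete", "key": 1, "data": None},
])
```

With a primary key on the source, updates and deletes map cleanly onto the matching target row rather than appending duplicates.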
A CSV file with 200 columns in ADLS: fetch only the list of columns from a config table. The config table is in Azure SQL and has only 10 column names
A CSV file with 200 columns is in ADLS. I need to fetch only the list of columns from the config table; the config table is in Azure SQL and has only 10 column names. How do I create a pipeline for this scenario in Azure Data Factory?
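A common pattern for this scenario is a Lookup activity that reads the 10 column names from the Azure SQL config table, followed by a copy or dataflow step that keeps only those columns from the 200-column CSV. The Python sketch below illustrates just the selection logic; the sample CSV and the hard-coded column list stand in for the real ADLS file and the SQL Lookup result:

```python
import csv
import io

def select_configured_columns(csv_text, config_columns):
    """Keep only the columns listed in the config table, in config order."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [{col: row[col] for col in config_columns} for row in reader]

# Illustrative stand-ins: a tiny CSV and a column list that in the real
# pipeline would come from ADLS and the Azure SQL config table respectively.
sample_csv = "id,name,age,city\n1,Ann,30,Oslo\n2,Bo,25,Riga\n"
config_columns = ["id", "city"]
rows = select_configured_columns(sample_csv, config_columns)
```

In ADF itself the same idea is usually wired up by passing the Lookup output into a parameterized column mapping or a dataflow select, so the column list stays data-driven.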
CDC Start button disabled. Cannot start CDC in Azure Data Factory
I created a CDC in ADF and set up a Git configuration to connect to Azure DevOps Repos, with a new branch created. I then checked out the new branch. The CDC is also there in the new branch, but I cannot start it. It says Please publish to enable the…
InsertDate and UpdateDate column addition in Sink-Delta from dataflow.
I am using a dataflow to upsert data into my Delta Lake. My requirement is to have two columns: 1. insertdate 2. updatedate. If a record is new, both the insertdate and updatedate columns should get the current_date() value. And, if…
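The requested behavior can be expressed as derived-column logic in the upsert: new keys get both dates set to today, while matched keys keep their original insertdate and only updatedate advances. A minimal Python sketch of that rule, with illustrative key and column names:

```python
from datetime import date

def apply_audit_columns(incoming, existing_by_key, key="id", today=None):
    """New rows: insertdate = updatedate = today.
    Existing rows: keep insertdate, advance only updatedate."""
    today = today or date.today().isoformat()
    out = []
    for row in incoming:
        row = dict(row)
        prev = existing_by_key.get(row[key])
        row["insertdate"] = prev["insertdate"] if prev else today
        row["updatedate"] = today
        out.append(row)
    return out

# Illustrative data: id 1 already exists in the Delta table, id 2 is new.
existing = {1: {"id": 1, "v": "a", "insertdate": "2024-01-01", "updatedate": "2024-01-01"}}
result = apply_audit_columns([{"id": 1, "v": "b"}, {"id": 2, "v": "c"}],
                             existing, today="2024-06-01")
```

In a mapping dataflow the same split is typically done by checking whether the key matches an existing row before assigning the derived columns.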
Split single column data into multiple columns in Data Factory
Issue: I have some CSV files in an SFTP location. I was planning to use a Copy Activity to load the files from the SFTP location to the data lake for archiving, as well as upserting into Delta Lake. However, I am getting an error while reading from the source, as some file has…
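Independent of where the split happens (e.g. a Derived Column with a split expression in a mapping dataflow), the transformation itself is simple: break the single field on a delimiter and map the pieces to named columns. A sketch in Python, with an assumed `|` delimiter and made-up column names:

```python
def split_column(rows, source_col, new_cols, delimiter="|"):
    """Replace one delimited column with several named columns."""
    out = []
    for row in rows:
        parts = row[source_col].split(delimiter)
        new_row = {k: v for k, v in row.items() if k != source_col}
        new_row.update(dict(zip(new_cols, parts)))
        out.append(new_row)
    return out

# Illustrative row: one combined field that should become three columns.
rows = [{"raw": "2024-06-01|42|ok"}]
split = split_column(rows, "raw", ["date", "count", "status"])
```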
How to move data from one table to another table within same workspace of LOG ANALYTICS using Azure Data Factory ?
1. How to move data from one table to another table within the same Log Analytics workspace using Azure Data Factory? 2. How to move data from one table in Log Analytics workspace 1 to another table in Log Analytics workspace 2 using Azure Data Factory?
How to copy the data from Amazon S3 to Azure SQL database incrementally.
Hello team, I want to copy data from Amazon S3 (JSON format) to an Azure SQL database, and I would like to learn the mechanism for incrementally updating the data from Amazon S3 to Azure SQL. Also, the environment has only a SHIR deployed, so data flows…
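Incremental loading is typically done with a watermark: each run copies only records whose modification timestamp is later than the stored watermark, then advances it. A minimal Python sketch of that filter, with an assumed `last_modified` field on the S3 JSON records:

```python
def incremental_batch(records, watermark, ts_field="last_modified"):
    """Return records newer than the watermark, plus the advanced watermark."""
    fresh = [r for r in records if r[ts_field] > watermark]
    new_watermark = max((r[ts_field] for r in fresh), default=watermark)
    return fresh, new_watermark

# Illustrative records; ISO-8601 timestamps compare correctly as strings.
records = [
    {"id": 1, "last_modified": "2024-05-01T00:00:00Z"},
    {"id": 2, "last_modified": "2024-05-03T00:00:00Z"},
]
fresh, wm = incremental_batch(records, "2024-05-02T00:00:00Z")
```

In ADF the watermark usually lives in a small control table: a Lookup reads it, the copy filters on it, and a final activity writes the new value back after a successful run.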
How to fix this error in Azure Data Factory? My dataflow worked fine, now I receive this error.
Error: Spark job failed: { "text/plain":…
How to keep ARM Template in sync when modifying Azure Data Factory?
I'm unclear on the process for maintaining the ARM Template for deployments when modifying an Azure Data Factory (ADF). I see two options: Modify the ADF in the browser-based drag-n-drop editor and use the generated JSON ARM Template as is. Manually…
Can you assist with a Synapse issue involving memory allocation in SQL DW Copy Command in Data Factory, without solely resorting to increasing DWU?
I'm encountering an issue while using the Copy Activity in Data Factory with the Polybase copy option. The error message I receive is as follows: “…..SQL DW Copy Command operation failed with error 'Unable to allocate 257707 KB for columnstore…
COPY ACTIVITY FROM SOURCE LINKED REST TO DESTINATION LINKED REST SERVICE
I want to copy from a source table to a destination table in the same Log Analytics workspace, using Azure Data Factory. My configuration: Source Linked REST Service to read data from Log Analytics workspace w1, table t1. Destination Linked REST Service to…
Job failed due to reason: at Sink 'xxx': Connection string is invalid. Unable to parse.
Hi, In a pipeline there are 5 dataflows. When we debug each dataflow separately it works perfectly fine. But the moment we debug the pipeline, we get the error "Job failed due to reason: at Sink 'xxx': Connection string is invalid. Unable to…
ADF + SAP Tableconnector: Function Modules
Hi everyone, We have a setup to copy table data from our SAP system to our Azure Data Factory via the SAP table connector interface. This works well as long as we access SAP tables. Now there is a need to access function modules in SAP with parameters to…
How to Automatically create a Data Cycle diagram
Currently, the entire data cycle of a project has been developed in Azure Data Factory. The main objective is to have an automated visual scheme of the entire data cycle at a low level (from the basic activities, not the sequential execution of…
Why is the public IP range download blocked?
Hey, I use a script to download the public IP ranges from Azure (https://www.microsoft.com/en-us/download/details.aspx?id=56519). If I view that URL in my browser, it displays and I am able to download the JSON. However, if I run my script that would…
ADF Copy Activity : Data load from ADLS (parquet files) to Azure Synapse Table Failing after running for 24 hours
The ADF pipeline copy failed after running for 24 hours. The timeout is set to 7 days. The Copy Activity is trying to copy data from ADLS (Parquet files) to an Azure Synapse table in the production environment. Could you please suggest what could be the possible issue for…
ADF Global Parameters error after trigger (debug works fine)
Hi, I have set a Global Parameter "maximale_ouderdom" which I use in an If Condition. When I debug the pipeline all works fine. The global parameter is being used in the condition. However, when the actual…
How to configure ADF pipeline run, linked service, so it uses Databricks serverless compute
Databricks has recently announced serverless compute for workflows: https://learn.microsoft.com/en-us/azure/databricks/workflows/jobs/run-serverless-jobs I would like to be able to execute Azure Data Factory (ADF) jobs using this…