Import filenames from Azure Storage container to SQL Table using Azure Data Factory
Hi All, I am trying to import filenames from an Azure Storage container into a table in a Dedicated SQL Pool (formerly SQL DW) using Azure Data Factory. For example, I have around 2,000 JSON files in an Azure Storage container, and I would like to…
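A common pattern for this is a Get Metadata activity that lists the container's childItems, feeding a ForEach that writes each file name to the table. A minimal activity sketch, assuming a dataset named ContainerDataset pointing at the container (the dataset and activity names are assumptions, not from the question):

```json
{
  "name": "GetFileList",
  "type": "GetMetadata",
  "typeProperties": {
    "dataset": { "referenceName": "ContainerDataset", "type": "DatasetReference" },
    "fieldList": [ "childItems" ]
  }
}
```

The ForEach would then iterate `@activity('GetFileList').output.childItems`, and each item's `name` property can be inserted into the SQL table via a Script or Stored Procedure activity.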
About relationships and data types in Power BI
In Power BI, is it true that you should always ensure both columns participating in a relationship share the same data type? Your model will never work if you try to build a relationship between two columns where one column has a…
Import or export an ADF pipeline
How can I import or export an ADF pipeline and related objects, such as datasets and linked services, from one subscription to another?
ADF Lookup activity not returning correct datetime value
A pipeline uses a Lookup activity to run a script against Snowflake, returns the first row of data, and we use it in an UPDATE statement. The lookup gets MAX(load_date) from a Snowflake table. In Snowflake it is "2024-04-25 17:26:01.548…
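A frequent cause of this symptom is the Lookup returning the datetime as a string whose fractional-seconds format differs from what the UPDATE expects, so milliseconds get silently dropped. A plain-Python sketch of preserving the millisecond precision when re-serializing the value (the sample value is from the question; the round-trip is an assumption about where the precision is lost):

```python
from datetime import datetime

# Value as stored in Snowflake, with millisecond precision.
snowflake_value = "2024-04-25 17:26:01.548"

# Parse it; %f accepts 1-6 fractional digits.
dt = datetime.strptime(snowflake_value, "%Y-%m-%d %H:%M:%S.%f")

# Re-serialize with explicit millisecond precision for the UPDATE
# statement, so the fractional seconds are not silently dropped.
formatted = dt.strftime("%Y-%m-%d %H:%M:%S.") + f"{dt.microsecond // 1000:03d}"
print(formatted)  # 2024-04-25 17:26:01.548
```

Comparing the lookup's raw JSON output against this canonical form should show whether the truncation happens in the Lookup or in the UPDATE.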
Errors when trying to use decimal type arguments in a Data Flow Library Function
I am receiving errors when trying to use a decimal argument in a Data Flow Library Function and based on my troubleshooting I believe it may be an issue with ADF itself. I have gone through these troubleshooting steps: Create a new Data Flow Library…
When I ran the pipeline I got the below error. Could you please help me out?
Operation on target Execute pl_sap_dataload_full failed: Operation on target df_commonload_copy1 failed: The request failed with status code '"BadRequest"'.
Split column values in a data flow in Azure Data Factory
I have a few records with the following columns: Id, Person, housenumber. 1, Mark, 101,102,103,104,105,106; 2, Alice, 107; 3, Bob, 108,109,110.
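In an ADF data flow this is typically a derived column using split() on the comma-separated value, followed by a flatten transformation to emit one row per house number. The logic those two transformations perform can be sketched in plain Python (the sample rows come from the question):

```python
# Split the comma-separated housenumber column, then emit one row per value,
# mirroring a derived column + flatten in an ADF data flow.
records = [
    {"Id": 1, "Person": "Mark", "housenumber": "101,102,103,104,105,106"},
    {"Id": 2, "Person": "Alice", "housenumber": "107"},
    {"Id": 3, "Person": "Bob", "housenumber": "108,109,110"},
]

flattened = [
    {"Id": r["Id"], "Person": r["Person"], "housenumber": n.strip()}
    for r in records
    for n in r["housenumber"].split(",")
]

print(flattened[0])   # {'Id': 1, 'Person': 'Mark', 'housenumber': '101'}
print(len(flattened)) # 10
```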
Error using Azure Data Factory to Copy data (Using Upsert) from Azure Blob to Azure SQL database
Hi all, I keep getting the following error when I perform an upsert under the Copy activity: Failure happened on 'Sink' side. 'Type=System.NullReferenceException,Message=Object reference not set to an instance of an…
Pipelines that require a scale-up of the DB are not scaling up
Hi, I have a couple of pipelines with a stored procedure activity that scales up a database before the actual processing of the data starts. This has been working fine for almost a year; however, in the past 2 days the scale-up times out and…
Issue with Parameterized CDS View in ForEach Loop During Copy Activity
When we load a CDS view using the copy activity, it successfully copies from the source SAP HANA. However, when the same CDS view is parameterized within a ForEach loop, the copy operation fails and returns the following error:
Cosmos DB copy in ADF comes from two regions instead of one region
I have one ADF instance in the West US region and a Cosmos DB in East US. When running a copy activity in ADF from Cosmos to Azure SQL, I see the copy comes from both the East US and West US regions. The copy should come only from East US. Could this behavior…
JIRA Tempo REST API with Copy Data Activity connection issue
Dear all, I would like to store data from a JIRA Tempo account in my Azure Storage account via the Copy Data activity in an Azure Synapse workspace. It would be great if someone could help me with this. I created a new linked service with REST API and put…
Dynamic aggregation column name in Azure Data Factory
Hi, @Anonymous. I've seen you give some great answers on similar topics before, so I'm linking this to you :-) I have a data flow that I want to keep as dynamic as possible. To do this, I have defined an array parameter that is used to specify…
SAP CDC Connector with Fabric Lakehouse
Hi All, we have SAP SLT as the source and are working on creating the data flow pipeline in Azure Data Factory with the CDC connector. Our understanding is that the CDC connector needs the target to support CRUD transaction operations. Is this understanding…
Create or replace a view from a DataFrame
Hey, how can we run a CREATE OR REPLACE VIEW statement in Spark SQL from a DataFrame in Databricks? create or replace view as (select * from temp1)
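The statement in the question is missing a view name, and a view defined over a DataFrame needs the DataFrame registered as a table or temp view first (for example with `df.createOrReplaceTempView("temp1")`). A minimal Spark SQL sketch, assuming the names `temp1` and `my_view` (both are illustrative, not from the question):

```sql
CREATE OR REPLACE TEMP VIEW my_view AS
SELECT * FROM temp1;
```

Equivalently, the DataFrame API call `df.createOrReplaceTempView("my_view")` creates the same kind of session-scoped view without writing any SQL.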
Azure Data Factory ADLS Gen1 structured-stream linked service fails to read file: Method not found
We created an ADLS Gen1 structured-stream linked service to read a structured-stream file. When we created the linked service and clicked 'Test connection', it succeeded, but previewing the stream failed and the error was…
Failure happened on 'destination' side. ErrorCode=DeltaInvalidCharacterInColumnName.
This exception occurred when I used a Data Factory pipeline to copy data from SQL Server to a lakehouse, but I didn't find any problems with the raw data.
How to fix an "out of memory exception" while processing a pipeline of around 43 GB of data using a copy activity?
I am processing around 43 GB of data in a pipeline using a copy activity, and I am getting the error: ErrorCode=SystemErrorOutOfMemory,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A task failed with out of…
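A commonly suggested mitigation for large copies is to enable staged copy, so data is buffered in Blob storage instead of in the integration runtime's memory, and to cap parallelism. A sketch of the relevant Copy activity typeProperties (the linked service name StagingBlobLS and the path are assumptions):

```json
"typeProperties": {
  "enableStaging": true,
  "stagingSettings": {
    "linkedServiceName": { "referenceName": "StagingBlobLS", "type": "LinkedServiceReference" },
    "path": "staging"
  },
  "parallelCopies": 4
}
```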
Writing to the sink failing with error: Invalid column name 'true'
Hi, one of our data flows is failing in production with the following error: 'Job failed due to reason: at Sink 'sink1': Invalid column name 'true'. Invalid column name 'true'. The data flow inserts records into a Synapse SQL table. This was working…
Fabric Data Factory - Dynamic Jobs for data ingestion
Hi Team, we have a list of customized SQL queries as sources, and the output is to be loaded into a lakehouse as new tables. Is there a way we could do this with parameters, where we could pass the entire query as a parameter along with the target table name…
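Passing an entire query as a pipeline parameter is supported by binding the Copy activity's source query to an expression. A sketch of the source side (the parameter name sourceQuery and the source type are assumptions; the target table name can be bound the same way on the sink dataset):

```json
"source": {
  "type": "SqlServerSource",
  "sqlReaderQuery": {
    "value": "@pipeline().parameters.sourceQuery",
    "type": "Expression"
  }
}
```

Each run of the pipeline (for example from a ForEach over the query list) then supplies a different query and target table through its parameters.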