How to flatten dynamic JSON in ADF/Synapse
I have an issue: my JSON is dynamic and I have enabled schema drift on my source. To flatten the JSON I am using array(byName('Columnname')) as dynamic content in the Unroll by section, where Columnname is the array that needs to be unrolled. I am getting…
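Outside of ADF, the unroll (flatten) operation the question describes can be sketched in plain Python for clarity. This is an illustration of the semantics only, not the data flow engine itself; the record shape and the `orders` column name are hypothetical:

```python
import json

def flatten_by_name(record, column_name):
    """Unroll (explode) the array stored under column_name into one flat
    record per array element, keeping the other top-level fields -- the
    same shape an Unroll by / Flatten transformation produces."""
    base = {k: v for k, v in record.items() if k != column_name}
    return [{**base, **item} for item in record.get(column_name, [])]

record = json.loads('{"id": 1, "orders": [{"sku": "a"}, {"sku": "b"}]}')
flattened = flatten_by_name(record, "orders")
# one output row per element of "orders", each still carrying "id"
```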
Copy activity throws an error when an auto-increment column exists on the sink side
Hello Team, I am trying to copy data from Amazon S3 to an Azure SQL database. On the Azure SQL side I have set an auto-increment column (Row_Index) in a table, but the copy activity throws the error "Column 'Row_Index' does not allow…
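The usual fix for this class of error is an explicit column mapping on the Copy activity that simply omits the identity column, so the sink lets SQL Server generate it. A minimal sketch of the translator section, with hypothetical source/sink column names `col_a` and `col_b` (note that `Row_Index` is deliberately not listed):

```json
"translator": {
  "type": "TabularTranslator",
  "mappings": [
    { "source": { "name": "col_a" }, "sink": { "name": "col_a" } },
    { "source": { "name": "col_b" }, "sink": { "name": "col_b" } }
  ]
}
```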
Failed to get access token from your token endpoint. Error message: No MediaTypeFormatter is available to read an object of type 'OAuth2AccessToken' from content with media type 'text/html
In Azure Data Factory I am trying to connect to a REST API that uses OAuth 2.0 authentication. When I supply Data Factory with all the necessary credentials and test the connection, I get the error: Failed to get access token from your token…
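A `text/html` reply from a token endpoint usually means the endpoint returned a login or error page instead of a JSON token. A small diagnostic sketch of that check, using simulated response fields rather than a live call (the function and inputs are illustrative, not part of ADF):

```python
import json

def check_token_response(status_code, content_type, body):
    """Diagnose a token-endpoint reply. An OAuth 2.0 client expects a
    JSON body containing an access_token field; an HTML body typically
    indicates a wrong endpoint URL, a redirect, or an error page."""
    if "application/json" not in content_type:
        return (f"HTTP {status_code} with Content-Type {content_type}: "
                "not a token response; check the endpoint URL and grant type")
    payload = json.loads(body)
    if "access_token" not in payload:
        return "JSON response but no access_token field"
    return "ok"
```

Testing the raw endpoint with a tool like curl or Postman and inspecting the Content-Type it returns is often the quickest way to see what Data Factory is actually receiving.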
DF-SRC-002 at Sink 'DatasetSink'(Line 176/Col 13): 'tableName' (Table Name) is required
Hi, I have a data flow that reads data from a source (Azure dataset); here is the source config: I have the desired output before the sink. The sink configuration is as follows: The problem is that when I try to preview the data, it gives…
Failed to convert the value in 'xxxx' property to 'System.String' type in Data Lake Connector
Environment: Azure Synapse. I am developing a pipeline with a Lookup against SQL that passes values to a Get Metadata activity. When using @dataset().<name of parameter> in the dataset, it fails with "Failed to convert the value in <name of parameter>…
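Errors like this often come down to expression syntax: dataset parameters are referenced as `@dataset().<parameterName>` (with parentheses), not `@dataset.<parameterName>`. A minimal sketch of a dataset that does this, assuming a hypothetical string parameter named `tableName`:

```json
{
  "name": "SqlTableDataset",
  "properties": {
    "type": "AzureSqlTable",
    "parameters": { "tableName": { "type": "String" } },
    "typeProperties": {
      "table": { "value": "@dataset().tableName", "type": "Expression" }
    }
  }
}
```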
Azure AD Token Customization
I have registered an application in Azure AD through which I obtain the token used for authentication. I have added some app roles to the application, which are mapped at the group level in the enterprise application. While decoding the token in…
How to ignore records by applying an auditing field column condition using ADF data flows
Hi all, I am building a data transformation using mapping data flows. I have a timestamp field, TimeStampUpdated, in the target table. I want to look up historical data against the incremental load and ignore the records coming in the…
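One common pattern for this is a Filter transformation that compares the auditing column against a watermark supplied as a data flow parameter. A sketch of the filter condition, assuming a hypothetical timestamp parameter `$LastLoadTime` populated upstream (for example, by a Lookup that reads max(TimeStampUpdated) from the target):

```
TimeStampUpdated > $LastLoadTime
```

Rows at or before the watermark are dropped, so only genuinely new or changed records continue to the sink.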
Moving files after a copy from blob storage
Hi, I've implemented a pipeline that reads a set of txt files from blob storage, filtered by a certain string, and writes them to an on-premises shared folder. For such a task I've used a copy activity. Now, how can I move the copied files from the…
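ADF has no native "move", but the usual pattern is Copy followed by a Delete activity on the same source dataset, chained on success. A sketch of the Delete step, with hypothetical activity and dataset names:

```json
{
  "name": "DeleteCopiedFiles",
  "type": "Delete",
  "dependsOn": [
    { "activity": "CopyToOnPremShare", "dependencyConditions": [ "Succeeded" ] }
  ],
  "typeProperties": {
    "dataset": { "referenceName": "BlobSourceTxtFiles", "type": "DatasetReference" }
  }
}
```

Because the delete only runs when the copy succeeds, a failed copy leaves the source files in place for a retry.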
Azure Data Factory pagination using QueryParameters and body field
Hello everyone, I am currently working with a pagination system and facing a challenge with the URL and parameter concatenation. I am trying to append a specific string to the URL retrieved from the first page of an API response. Here's the setup: API…
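The URL-plus-token concatenation that a pagination rule performs can be sketched in Python to make the intended behavior concrete. The `pageToken` parameter name is hypothetical; substitute whatever field the API returns:

```python
def next_page_url(base_url, token):
    """Append a continuation token from the previous response to the
    base URL, choosing '?' or '&' depending on whether the URL already
    carries query parameters -- the same string an ADF pagination rule
    would need to construct."""
    sep = "&" if "?" in base_url else "?"
    return f"{base_url}{sep}pageToken={token}"

url = next_page_url("https://api.example.com/items?limit=50", "abc123")
# -> https://api.example.com/items?limit=50&pageToken=abc123
```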
How to fix these errors and warnings with Integration Runtime (Self-hosted) Host Service on Windows
Our Windows server administrator found that the Windows event log fills so quickly that it can only hold entries for the last few hours. The following error occurs three times in a row in Event Viewer every couple of hours: Log Name: Integration…
How to add the "Azure Machine Learning workspace name" parameter from the data factory's Azure Machine Learning linked service to the override parameters in the release pipeline?
I have two data factories, dev and prod. I have the "Azure Machine Learning" linked service. Inside this linked service I have some parameters such as subscriptionID, resourceGroupName, and the Azure Machine Learning workspace name. In my pipeline…
Missing certificate
I passed the Azure Data Fundamentals exam (2nd certificate) via Certiport, but the certificate isn't linking to my account even though my email ID is linked. I've also already completed the Azure Fundamentals exam (1st certificate), and only that…
How can I find help validating a certificate?
Indeed, I am a cloud computing student in Morocco and I do not have the means to obtain the AZ-900 certification, nor does my family have the financial means. Could someone help me validate this certificate, please?
How to call an Oracle procedure in the ADF Lookup activity
We need to call an Oracle procedure from Azure Data Factory. It seems that the Lookup activity only supports SELECT. How can we call the Oracle procedure without modifications on the Oracle side?
Azure Data Factory simple copy file and rename
Hello: I have a simple requirement. I have a couple of files in an Azure Blob folder (a receiving folder), and I need to copy them to another folder (also an Azure Blob folder) and rename them. This posting is really close to what I need to do: …
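Renaming during a copy is typically done by parameterizing the sink dataset's file name and passing the new name from the Copy activity. A sketch of such a sink dataset, with a hypothetical `fileName` parameter and container name:

```json
{
  "name": "BlobSinkFile",
  "properties": {
    "type": "DelimitedText",
    "parameters": { "fileName": { "type": "String" } },
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "target-container",
        "fileName": { "value": "@dataset().fileName", "type": "Expression" }
      }
    }
  }
}
```

The pipeline then supplies the desired new name (for example, built from the source name plus a timestamp) when it invokes the Copy activity.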
Delta to Parquet mapping data flow resulting in one of two partitions being empty
Hi, I've been working on something but can't get it to work. I seem to have found the issue, but I'm not able to fix it. I've isolated it in a single run, which I'll explain below: I have a Delta table on ADLS Gen2 (based on one partition) that I want…
How to check who created a Log Analytics workspace in Azure when it was created more than 90 days ago and no longer appears in the activity log?
How can we check who created a Log Analytics workspace in Azure? It was created more than 90 days ago, so we can't find it in the activity log. Is there another way to check? Kindly share how we can check that.
how to fix Error: Spark job failed: { "text/plain": "{\"runId\":\"2325e724-f898-471d-b9b3-1f28fc560b44\",\"sessionId\":\"9c4abfad-3cc6-4429-b30a-d70b25537d29\",\"status\":\"Failed\",\"payload\":{\"statusCode\":400,\"shortMessage\":\"java.lang.Exception:
I am getting this error when I try to preview the data.
Bypass custom logic execution in Azure Data Factory copy activity
Hi, I am planning a data-refresh project using Azure Data Factory and would like to disable custom logic execution. Is there any way I can do this in the ADF copy activity? Like…
Unable to download files completely using the copy activity; downloads from a website are always partial
Whenever I try to download files from a website using the copy activity, only 124 KB is downloaded instead of 240 MB–350 MB…