How to call an Oracle procedure in the ADF Lookup activity
We need to call an Oracle procedure from Azure Data Factory. It seems that the Lookup activity only supports SELECT statements. How can we call the Oracle procedure without any modification on the Oracle side?
How to check who created a Log Analytics workspace in Azure when it was created more than 90 days ago and no longer appears in the activity log?
How can we check who created a Log Analytics workspace in Azure? It was created more than 90 days ago, so we are not able to check it in the activity log. Is there another way to find this out?
How to fix Error: Spark job failed: { "text/plain": "{\"runId\":\"2325e724-f898-471d-b9b3-1f28fc560b44\",\"sessionId\":\"9c4abfad-3cc6-4429-b30a-d70b25537d29\",\"status\":\"Failed\",\"payload\":{\"statusCode\":400,\"shortMessage\":\"java.lang.Exception:
I am getting this error while trying to preview data.
Copy activity only partially downloads files from a website
Whenever I try to download files from a website using the Copy activity, they are downloaded only partially: 124 KB instead of 240 MB - 350 MB.
How to apply a wildcard/regex filename filter from SFTP using ADF?
I have a list of files on an SFTP server that I need to copy and move to a blob storage container. The filenames should match the following wildcard pattern: (FullName)[0-9](-)[0-9](-)[0-9]*(.)(CSV). Here are the steps I've taken: Created a new…
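One reading of the wildcard pattern above (a digit between each hyphen, then anything, then a `.CSV` extension) can be checked with glob-style matching; the sample filenames below are made up for illustration:

```python
import fnmatch

# Glob equivalent of (FullName)[0-9](-)[0-9](-)[0-9]*(.)(CSV),
# interpreting * as "any characters" (one possible reading).
PATTERN = "FullName[0-9]-[0-9]-[0-9]*.CSV"

def matches(filename: str) -> bool:
    """Case-sensitive glob match against the expected filename pattern."""
    return fnmatch.fnmatchcase(filename, PATTERN)

files = ["FullName1-2-3_extra.CSV", "FullName1-2-34.CSV",
         "Other1-2-3.CSV", "FullName1-2-.CSV"]
selected = [f for f in files if matches(f)]
```

Note that the ADF Copy activity's wildcard file name supports only `*` and `?`, so a digit-class pattern like this may need the Get Metadata + Filter activity pattern instead.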
Azure Synapse Analytics: "Failed to setup debug session"
Right now I'm not able to run a data flow task in our Azure Synapse environment. When I try to start a debug session I get the message "Failed to setup debug session". When I trigger the pipeline instead, the task fails with "Operation on target…
Understanding the cost of Azure Databricks jobs
I have an Azure Data Factory pipeline that executes multiple Databricks notebooks using job clusters. I need to track the cost of these job clusters, including both the Databricks and the underlying VM costs, specifically for this set of jobs. Currently,…
Call an API with a dynamic URL and store the JSON results in Data Lake Storage
I am trying to implement the following scenario using Data Factory: I am making multiple API calls relying on a dynamic URL (e.g. "url.com/api/{ID}" for a list of IDs). The resulting JSON from each of the API calls should be stored as a…
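The usual pattern here is a ForEach over the ID list, with each iteration building the request URL and a per-ID sink path. A minimal sketch of that per-iteration logic, where the base URL comes from the question and the sink folder layout is an assumption:

```python
# Placeholder endpoint from the question; the sink layout is hypothetical.
BASE_URL = "https://url.com/api/{id}"
SINK_PATH = "raw/api-results/{id}.json"

def build_requests(ids):
    """Return (url, sink_path) pairs, one per ID, as ADF's ForEach would."""
    return [(BASE_URL.format(id=i), SINK_PATH.format(id=i)) for i in ids]

pairs = build_requests([101, 102])
```

In ADF itself the same substitution is done with dynamic content, e.g. `@concat('https://url.com/api/', item())` on the REST dataset's relative URL.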
Choosing an Approach for Incremental Loading with Watermark in Azure Data Factory: Efficiency and Cost Considerations
Hi all, I'm working on implementing an Azure Data Factory pipeline for incremental data loading using a watermark table approach. I have identified two different approaches but am unsure which one is considered the best practice in terms of…
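Whichever variant is chosen, the core watermark logic is the same: select only rows modified after the stored watermark, then advance the watermark to the newest timestamp loaded. A minimal sketch with hypothetical column names:

```python
from datetime import datetime

# Sample source rows; "modified" stands in for the change-tracking column.
rows = [
    {"id": 1, "modified": datetime(2024, 1, 1)},
    {"id": 2, "modified": datetime(2024, 2, 1)},
    {"id": 3, "modified": datetime(2024, 3, 1)},
]

def incremental_load(rows, watermark):
    """Return rows changed after the watermark, plus the new watermark."""
    delta = [r for r in rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in delta), default=watermark)
    return delta, new_watermark

delta, wm = incremental_load(rows, datetime(2024, 1, 15))
```

In ADF this typically maps to a Lookup (read watermark), a Copy with a parameterized source query, and a Stored Procedure or Script activity to update the watermark table.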
How to compare two dates in ADF to see which is greater?
Hi there, I am using Azure Data Factory to compare two dates (one is stored in a string variable and the other is utcNow()), and I need to know when utcNow() is greater than the date in the stored string variable. Both are just dates, no timestamps. I looked at…
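The comparison itself is straightforward once both values are reduced to dates; a sketch assuming the variable holds a `yyyy-MM-dd` string (the date format is an assumption):

```python
from datetime import datetime, timezone

def utcnow_is_greater(date_string: str, now=None) -> bool:
    """True when the current UTC date is later than the stored date."""
    stored = datetime.strptime(date_string, "%Y-%m-%d").date()
    current = (now or datetime.now(timezone.utc)).date()
    return current > stored

# Injecting a fixed "now" keeps the check deterministic for testing.
fixed_now = datetime(2024, 6, 1, tzinfo=timezone.utc)
result = utcnow_is_greater("2024-05-01", now=fixed_now)
```

In ADF the equivalent is an expression like `@greater(formatDateTime(utcNow(), 'yyyy-MM-dd'), variables('myDate'))`, since `yyyy-MM-dd` strings compare correctly lexicographically.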
Data flow failing at Sink Step when pipeline is scheduled
I have a pipeline with a data flow where data is sunk into a dataset. When run manually it works fine and the parquet file is created. However, when scheduled, the pipeline gives the following error. The pipeline runs fine manually and only gives the error on schedule…
"Expect unique" Assert transformation in data flow
I am trying to use the Assert transformation in a data flow to check for duplicates in the source. I am not able to get the correct result when using parameters that hold single or multiple columns as primary keys (example shared below) with the expression below. …
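The duplicate check the Assert transformation is being asked to do amounts to counting occurrences of a composite key built from the parameterized column list. The same logic as a sketch (column names and rows are made up):

```python
from collections import Counter

# Hypothetical source rows; "cust" and "region" form the composite key.
rows = [
    {"cust": 1, "region": "EU", "amt": 10},
    {"cust": 1, "region": "EU", "amt": 20},
    {"cust": 2, "region": "US", "amt": 30},
]

def find_duplicates(rows, key_columns):
    """Return the composite key values that occur more than once."""
    counts = Counter(tuple(r[c] for c in key_columns) for r in rows)
    return [key for key, n in counts.items() if n > 1]

dups = find_duplicates(rows, ["cust", "region"])
```

Note that when the key list comes from a single string parameter, it usually has to be split into real column references before the assert can evaluate each column separately.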
Azure HTTP linked service in ADF with username only, no password
I am trying to use an Azure HTTP linked service to connect to an API that uses just a username and no password for basic auth. The password input is mandatory in Azure. Is there anything I can do?
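For basic auth with an empty password, the `Authorization` header is just `Basic base64(username + ':')`. One workaround when the linked service insists on a password is to use anonymous authentication and supply this header yourself (e.g. via the dataset's additional headers); the username below is illustrative:

```python
import base64

def basic_auth_header(username: str, password: str = "") -> str:
    """Build a Basic auth header; password may be empty."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"

header = basic_auth_header("myuser")
```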
Unexpected behavior of dynamic mapping in ADF
I have a single Copy activity that moves data from a source parquet file to an Azure SQL database. When I used auto-mapping and tried to change the name of a source or sink field, it failed; however, when I copy the generated JSON mapping and change one of…
Split column data into separate columns
Hi there, I want to extract data from a column that has key-value pairs separated by commas into respective columns using an Azure Data Factory data flow. Example data of the column containing key-value…
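The parsing step amounts to splitting on commas, then on the first `=` in each pair; a sketch with hypothetical keys and values:

```python
def split_key_values(cell: str) -> dict:
    """Turn 'k1=v1, k2=v2' into {'k1': 'v1', 'k2': 'v2'}."""
    pairs = (item.split("=", 1) for item in cell.split(","))
    return {k.strip(): v.strip() for k, v in pairs}

row = split_key_values("Name=John, Age=30, City=Oslo")
```

In a mapping data flow the same effect is usually achieved in a derived column with `split()` on `,` and then on `=`, one expression per target column.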
ADF does not allow MI based auth for Azure table storage. Are there any alternatives for this?
I am trying to move away from key-based access for Azure Table Storage in ADF, but the only auth options I get are account key and SAS URI. Is there no way to use a managed identity or service principal for this? Blob Storage has MI-based auth but Table Storage does not; why is this?
How to migrate a large repo to Azure DevOps
Hello, I am migrating from Git to Azure DevOps. My repo is quite large, and while doing git push I am getting the error "HTTP 413 curl 22 The requested URL returned error: 413". I am using HTTPS to do the migration. Somewhere I saw that SSH can solve this issue.…
Default timezone in the data flow
Hi Team, Is there any way to change the default timezone in Data Factory? Currently, when I create a column (loadtime) with currentTimestamp() in a derived column transformation in the data flow, I get the time in UTC. But I want the default…
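As far as I know there is no factory-wide default timezone setting; the usual workaround is to convert the UTC timestamp in the derived column itself. The equivalent conversion in Python, with the target zone as an assumption:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def to_local(utc_dt: datetime, tz_name: str) -> datetime:
    """Convert an aware UTC datetime to the named timezone."""
    return utc_dt.astimezone(ZoneInfo(tz_name))

# Fixed timestamp so the conversion is deterministic; zone is illustrative.
loadtime_utc = datetime(2024, 6, 1, 12, 0, tzinfo=timezone.utc)
loadtime_ist = to_local(loadtime_utc, "Asia/Kolkata")
```

In the data flow the corresponding expression would be along the lines of `fromUTC(currentTimestamp(), 'Asia/Kolkata')` (timezone name assumed for illustration).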
HubSpot connector doesn't work for deal properties: 414 Request-URI Too Large
I am getting this error message when trying to copy deal properties data from HubSpot: Failure happened on 'Source' side. ErrorCode=UserErrorOdbcOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ERROR [HY000]…
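A 414 typically means too many property names end up in a single request URL. A generic mitigation, if the connector or a REST source allows selecting properties per call, is to split the property list into batches and issue one request per batch; the length budget below is illustrative, not a HubSpot limit:

```python
MAX_URL_LEN = 2000  # hypothetical per-request budget for the query string

def batch_properties(props, max_len=MAX_URL_LEN):
    """Group property names so each comma-joined batch stays under max_len."""
    batches, current = [], []
    for p in props:
        candidate = ",".join(current + [p])
        if current and len(candidate) > max_len:
            batches.append(current)
            current = [p]
        else:
            current.append(p)
    if current:
        batches.append(current)
    return batches

batches = batch_properties([f"prop_{i}" for i in range(500)], max_len=100)
```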
How can I force a failure when testing an ADF Webhook?
I would like to force a failure when testing an ADF Webhook activity so that it follows the failure path. Is there a way I can easily do this? I could swap the on-success and on-failure paths to test the path, but I'd really like to force an…
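One way to force the failure path is to have the called endpoint invoke the `callBackUri` with a failing status in the callback body; my understanding is that a `statusCode` of 400 or above marks the Webhook activity as failed. A sketch of such a payload (error code and message are illustrative):

```python
import json

def failure_callback_body(message: str) -> str:
    """Build a webhook callback body that reports failure to ADF."""
    return json.dumps({
        "output": {},
        "statusCode": "500",  # >= 400 should mark the activity as failed
        "error": {"ErrorCode": "ForcedFailure", "Message": message},
    })

body = failure_callback_body("Deliberate test failure")
```

For a quicker test without touching the endpoint, setting a short activity timeout and pointing the webhook at a URL that never calls back also drives the failure path.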