Data Factory doesn't store credentials in Git, yet I can see credentials saved
This paragraph states that Data Factory doesn't store credentials in Git: https://learn.microsoft.com/en-us/azure/data-factory/source-control#using-passwords-from-azure-key-vault But when I create/publish a linked service in a feature branch, I can see that…
Azure Data Factory with Dataverse Sink (mapping data flow) Upsert broken?
Simple ADF data flow with a Dataverse sink (Dynamics 365). I created an alternate key on a custom Dataverse table and am attempting an upsert. I have an Alter Row transform with "Upsert if" -> true() and I have set the sink to Upsert…
Microsoft Fabric Copy Data Activity - Filter Out Files that Start With "~"
I need to copy files from a network folder to a Lakehouse and I am building a pipeline to handle that task. Some of these files are temporary Excel files that Microsoft generates automatically while other users are making edits. I need to exclude files that…
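The exclusion rule being asked for can be sketched in plain Python; the `~` prefix convention for Office temporary files is the one described above, and any pipeline-expression equivalent should be verified against your environment.

```python
def files_to_copy(filenames):
    """Keep only files that are not Office temporary files, whose
    names start with "~" (e.g. "~$Report.xlsx")."""
    return [name for name in filenames if not name.startswith("~")]

print(files_to_copy(["~$Report.xlsx", "Report.xlsx", "Data.csv"]))
```

In a pipeline, a similar effect can often be achieved with a Get Metadata activity (childItems) feeding a Filter activity whose condition is something like `@not(startswith(item().name, '~'))`; treat that expression as an assumption and test it against your Data Factory / Fabric version.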
Read sharepoint list data and convert to csv using ADF
Is it possible to read SharePoint list data using ADF across 60 sites and 100+ lists, and to automate converting the XML response to CSV files? We already did a POC using Databricks and it works fine, but we want to know whether ADF can handle this without…
Azure Data Factory ServiceNow Connector Copy Activity Dynamic Date Filter Query Builder
Hello all, we recently received a notice to update our legacy ServiceNow connection driver, and while updating our copy pipeline activities I am struggling to add a filter on a date column. With the legacy connection driver, we were able to use a query…
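As a sketch of the kind of dynamic date filter involved, here is the string-building logic in Python; the field name `sys_updated_on` and the date format are assumptions to check against your ServiceNow instance and the new connector's query options.

```python
from datetime import date, timedelta

def encoded_query(days_back, field="sys_updated_on", today=None):
    """Build a ServiceNow-style encoded query that keeps records
    updated within the last `days_back` days."""
    today = today or date.today()
    cutoff = today - timedelta(days=days_back)
    return f"{field}>={cutoff.isoformat()}"

print(encoded_query(7, today=date(2024, 6, 15)))  # sys_updated_on>=2024-06-08
```

In ADF the same string would typically be assembled with pipeline expressions, e.g. `@concat('sys_updated_on>=', formatDateTime(adddays(utcnow(), -7), 'yyyy-MM-dd'))`; treat that expression as an assumption and validate it in your pipeline.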
How to reference JSON fields in Copy Data (ADF)?
I am currently integrating an API response into a SQL Server database using Azure Data Factory. The response from the API is structured in JSON format, as shown below: The goal is to extract the id values from each batch and insert them into a SQL…
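Since the JSON body is truncated above, here is a minimal Python sketch under an assumed shape (a `batches` array whose items hold `records` with an `id` field) showing the flattening the copy needs to perform:

```python
import json

payload = json.loads("""
{"batches": [
  {"records": [{"id": 1}, {"id": 2}]},
  {"records": [{"id": 3}]}
]}
""")

# Flatten every batch and pull out the id values.
ids = [rec["id"] for batch in payload["batches"] for rec in batch["records"]]
print(ids)  # [1, 2, 3]
```

In the Copy activity itself, the analogous step is pointing the mapping's collection reference at the repeated array (something like `$['batches'][*]['records']`) so each record becomes a SQL row; the exact JSONPath depends on the real response shape.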
How to override an ARMTemplateParametersForFactory.json [Array as value] in GitHub Actions
I already got help with this: "You can override the parameters in the parameter file by explicitly calling them out next to the parameter file. Example: parameters: repo/ARM/azuredeploy.resourcegroup.parameters.json rgName=${{env.RESOURCEGROUP_NAME}}" This…
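One workaround, sketched in Python under assumed names (the parameter `factoryArrayParam` and the file shape are hypothetical), is to rewrite the array value inside ARMTemplateParametersForFactory.json in a workflow step before the deployment action runs, rather than passing the array inline:

```python
import json

def override_array_param(params_text, name, new_values):
    """Return the parameters-file text with the array parameter
    `name` replaced by `new_values`."""
    doc = json.loads(params_text)
    doc["parameters"][name]["value"] = list(new_values)
    return json.dumps(doc, indent=2)

original = '{"parameters": {"factoryArrayParam": {"value": ["a"]}}}'
print(override_array_param(original, "factoryArrayParam", ["x", "y"]))
```

A step like this can run in the GitHub Actions job before the ARM deployment picks up the file, sidestepping the question of how to encode an array on the command line.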
How do I make a dataset based on an SQL statement with multiple tables?
How do I make a dataset based on an SQL statement with multiple tables, rather than a single table? I'm using Azure Data Factory online via a web browser. I'm not a total beginner; I know a few things and have completed a tutorial on Azure. This…
Switch from Legacy BigQuery to new BigQuery Linked Service
I have a BigQuery routine that takes a start date and an end date as input parameters, both declared as type String in the routine. The routine works fine without errors. I want to invoke it from Azure Data Factory.…
What does the ADF SSIS error "Disk Full Exception Caught" mean?
Hi, we are running an Azure-SSIS integration runtime in Azure Data Factory. This morning the IR suddenly became Unavailable for some reason. Fortunately we had the ADFSSISIntegrationRuntimeLogs diagnostic setting enabled, and I found the following…
Error during batch endpoint invoke: Access to this resource is denied. Please check your ACL rules on the resource
Hello, I am getting an "Access to this resource is denied. Please check your ACL rules on the resource." error when attempting to invoke a batch endpoint. I have tried invoking it with both the Azure CLI and Azure Data Factory, and with both…
How to increase throughput in an ADF copy activity from Table Storage to Table Storage
Hello, short background: I'm using ADF to create Table Storage backups for specific storage accounts (each account is a separate pipeline with separate copy activities for each table), creating new tables from the old ones using copy…
Power Query merge query on two columns
Hello: I have two datasets (UserQueries) in Power Query, and I now want to merge them on two common columns. Should be easy, right? I use Merge Queries on the toolbar and get a dialogue box as follows: The columns are correct and the order is correct.…
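To show what a two-column merge computes, here is a minimal pure-Python sketch of the same inner join Power Query's Merge Queries performs when both key columns are selected in the same order on each side (the column names are made up for illustration):

```python
def merge_on_keys(left, right, keys):
    """Inner-join two lists of row-dicts on a composite key."""
    index = {tuple(row[k] for k in keys): row for row in right}
    out = []
    for row in left:
        match = index.get(tuple(row[k] for k in keys))
        if match is not None:
            out.append({**row, **match})
    return out

left = [{"Region": "EU", "Product": "A", "Sales": 10}]
right = [{"Region": "EU", "Product": "A", "Target": 12}]
print(merge_on_keys(left, right, ["Region", "Product"]))
```

In Power Query the equivalent is Ctrl+clicking both key columns in each table, in the same order, before choosing the join kind; a mismatch in column order or data type is a common reason the merge finds no matches.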
Data Factory Pipeline source pointing to Azure Data Lake failing
Hello: I have a Source in a pipeline which points to a dataset. If I test the dataset, the connection tests fine and I can preview the data. The schema is also being picked up correctly. There is an extra attribute in the first position, called Prop_0.…
Automating execution of ".bat" file using ADF / LogicApp
Hello team, currently we execute a .bat file manually on one of the client VMs, and we need to automate this using ADF / Logic Apps. The .bat file is inside a folder and it executes a .jar file that resides in the same folder. The .jar file…
ADF - converting list of lists into a proper JSON format
Hello, I'm pretty new to ADF and I can't wrap my head around one case. To keep it simple: I have a source (a REST API response) which is theoretically JSON, but it doesn't have a "proper" JSON format. It's supposed to be a simple table, but it…
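Since the response itself is truncated, here is a Python sketch under an assumed layout (the first inner list is the header row, the rest are data rows) showing the list-of-lists to proper JSON conversion:

```python
import json

raw = [["id", "name"], [1, "alpha"], [2, "beta"]]  # assumed shape

header, *rows = raw
records = [dict(zip(header, row)) for row in rows]
print(json.dumps(records))  # [{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]
```

In ADF itself this kind of reshaping typically ends up in a mapping data flow or a small Azure Function step, since the Copy activity expects the source to already be well-formed objects.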
Salesforce connector: TaskCanceledException error while doing a full load for the Service__c and Service_Step__c entities
Hello, We have recently transitioned to the Salesforce Bulk API 2.0 Connector in Azure Data Factory (ADF) for our integration processes. As part of our use case, we perform a full data load for all Salesforce entities once a month. However, we are…
When to use deployment jobs vs. normal jobs in Azure Pipelines for Bicep deployments?
As a data engineer working on infrastructure-as-code using Bicep, I'm trying to understand the best practices for structuring my Azure Pipeline. I have separate environments for DEV and PRD, and I'm not sure when to use a deployment job versus a normal…
Cannot connect to SQL Database. Please contact SQL server team for further support.
Hello support, from time to time I get an error that the username/password is invalid for the database. However, the database is up, running, and in use. Below is the information: ActivityId 701ef079-7056-4d6b-b941-c2e2f011e57a
How to parameterize ADF linked services and triggers per environment when creating a pull request
How can we parameterize ADF linked services and triggers for the respective environment (e.g. via a dev-parameter.json) when we create a pull request?