Recommended technologies for creating a dashboard of ongoing jobs.
I'm looking for recommendations on technologies to use for the following project. I need to create a dashboard of summarized job data from a network of hundreds of users. Our current application is used in the field for inputting and tracking job costs…
Obtaining details for creating an Azure subscription using the REST API and Terraform
Hello! I have been trying to create an Azure subscription using the REST API and Terraform, but I am not sure what the following parameters mean: billingAccountName, billingProfileName, invoiceSectionName. Could someone point me in the right direction on how…
How to fix a "task was canceled" error from an Azure Data Factory pipeline
Hello, the error below occurred when a pipeline trigger was executed in Azure Data Factory. Please let me know what I should do. Error: 'Type=System.Threading.Tasks.TaskCanceledException,Message=A task was canceled.,Source=mscorlib,' Thank you.
How to maintain the same folder structure as the source when sinking the processed file
I have a requirement to convert JSON to Parquet on a daily basis. I have folders A, B, and C, and I need to sink the files to another container with the same A, B, C structure. For example, if I'm processing a file from folder A, it should sink to the output container's folder…
Connection issue with Oracle Autonomous Cloud (ADW) and Azure Data Factory
Hi, I am trying to connect Oracle Autonomous Database (ADW) to Azure Data Factory using a self-hosted integration runtime. Even after setting up the Oracle DataDirect Wire Protocol 8.0 driver, I am still unable to connect. I am getting the error below: ERROR…
Azure Data Factory Oracle to Parquet datatype mismatch
Hi Team, I have built a dynamic pipeline to load data from Oracle tables to the data lake as Parquet files. Now, when I try to copy the data from source to sink through the Copy activity, I am not able to convert the decimal datatype of the…
Cannot read an Excel file in ADLS using openpyxl's load_workbook in Databricks
I cannot read an Excel file from ADLS using openpyxl's load_workbook, but I can read it if it is copied to DBFS.
Set up ADF authentication with a personal GitHub account
I get an error when I try to set up Git repository integration with a new credential; it does not work. Checking the Network tab, the request returns a 401, but I don't see any step to input a credential or token. Why is there a token here, and where does it come from?
Azure Data Factory pagination using QueryParameters and body field
Hello everyone, I am currently working with a pagination system and facing a challenge with the URL and parameter concatenation. I am trying to append a specific string to the URL retrieved from the first page of an API response. Here's the setup: API…
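One way to sketch the concatenation step this question describes: take a value from the first page of an API response and append it to the base URL as a query parameter. This is a minimal stdlib-only illustration; the URL, the `nextToken` field, and the `continuationToken` parameter name are all hypothetical, not taken from the question's actual API.

```python
import json
from urllib.parse import urlencode, urlsplit, urlunsplit, parse_qsl

def append_token(base_url: str, token: str, param: str = "continuationToken") -> str:
    """Append a value from a previous response as a query parameter on the URL."""
    scheme, netloc, path, query, frag = urlsplit(base_url)
    params = parse_qsl(query)          # keep any existing query parameters
    params.append((param, token))
    return urlunsplit((scheme, netloc, path, urlencode(params), frag))

# Simulated first-page response body (hypothetical shape).
first_page = json.loads('{"items": [1, 2], "nextToken": "abc123"}')
next_url = append_token("https://api.example.com/v1/orders?limit=100",
                        first_page["nextToken"])
print(next_url)  # https://api.example.com/v1/orders?limit=100&continuationToken=abc123
```

Building the URL with `urllib.parse` instead of plain string concatenation keeps existing parameters intact and handles encoding of special characters in the token.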
Cosmos DB analytical store: capturing delete changes in a SQL sink
Hi, I am trying to capture delete changes from the Azure Cosmos DB analytical store using change data capture in Azure Data Factory. The source is Cosmos DB and the sink is an Azure SQL database; in between, I am flattening my file using a data flow. My inserts and updates are…
ADF lost access to Dynamics CRM / Dataverse after MFA was turned on
Today my ADF pipelines started throwing errors: Operation on target Process order failed: ErrorCode=DynamicsFailedToConnect,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to connect to Dynamics: Unable to Login to…
Mapping Address Values to Relative URLs in Azure Data Factory
Hello, I am working in ADF to post data to a Business Central temp table. I have set up my source dataset to read from blob, with REST as the sink. The issue I am having is that I have to send invoices to the respective temp tables in each company in Business…
The Integration Runtime (Self-hosted) node has encountered an error during registration.
I am not able to register the Self-Hosted Integration Runtime on my machine. This is the error I am getting: The Integration Runtime (Self-hosted) node has encountered an error during registration. Unable to determine the Integration Runtime (Self-hosted)…
Error in a linked service connected to Azure SQL DB using Azure Key Vault
The linked service is not working after performing CI/CD from the dev to the QA branch. All the resources are copied, along with the linked service. However, the linked service could not be set up correctly and fails with this error. What does this error indicate, and how to…
Split single-column data into multiple columns in Data Factory
Issue: I have some CSV files in an SFTP location. I was planning to use the Copy activity to load files from the SFTP location to the data lake for archiving, as well as upserting into Delta Lake. However, I am getting an error while reading from the source, as some files have…
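The symptom described, every field landing in one column, often means the file's real delimiter differs from the one the dataset expects. A minimal sketch of the repair step, assuming a hypothetical pipe-delimited file read with a comma-configured reader:

```python
import csv
import io

# Hypothetical input: the reader expects commas, but the file uses pipes,
# so every row arrives as a single packed column.
raw = "id|name|amount\n1|widget|9.99\n2|gadget|4.50\n"

reader = csv.reader(io.StringIO(raw))  # comma delimiter by default
# Split any single-column row on the actual delimiter; leave normal rows alone.
rows = [row[0].split("|") if len(row) == 1 else row for row in reader]

header, records = rows[0], rows[1:]
print(header)   # ['id', 'name', 'amount']
print(records)  # [['1', 'widget', '9.99'], ['2', 'gadget', '4.50']]
```

In ADF itself the equivalent fix is usually just setting the correct column delimiter on the source dataset, or using a data flow derived column with `split()` when files are mixed.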
Unable to resolve Azure Database for MySQL name from Data Factory
I am creating a linked service from my Azure Data Factory environment to access my Azure Database for MySQL database. I select "Azure Database for MySQL" and can see all my MySQL servers. I select "From Azure subscription" as the…
Error from ADF pipeline loading data into Dynamics 365 table
I have an Azure pipeline that copies data from a table in an Azure SQL DB to a table in Dynamics 365 CE. The D365 instance has MFA enabled, so I am using an Azure service principal in the linked service connection for the pipeline. The service principal is…
How to get REST API data when more than 30,000 rows are returned, and store them in a single CSV
How do I get REST API data when more than 30,000 rows are returned and store them in a single CSV? I tried a pagination rule with AbsoluteUrl set to $.Response. How can I make the Copy activity go through each record with offset and limit automatically? How do I write all…
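The offset/limit loop the question asks about can be sketched outside ADF as follows. This is a self-contained illustration against a fake in-memory API, not the real endpoint: `fetch_page`, the page size, and the record shape are all assumptions. The loop stops when a page comes back shorter than the limit, and all pages are appended to one CSV.

```python
import csv
import io

# Hypothetical API: returns at most `limit` records starting at `offset`.
DATA = [{"id": i, "value": i * 10} for i in range(75)]

def fetch_page(offset: int, limit: int) -> list[dict]:
    return DATA[offset:offset + limit]

def fetch_all(limit: int = 30) -> list[dict]:
    """Advance the offset page by page until a short page signals the end."""
    offset, rows = 0, []
    while True:
        page = fetch_page(offset, limit)
        rows.extend(page)
        if len(page) < limit:   # last (possibly empty) page
            return rows
        offset += limit

# Write every page into one CSV with a single header row.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "value"])
writer.writeheader()
writer.writerows(fetch_all())
```

In ADF's REST connector the same idea is expressed with pagination rules (e.g. QueryParameters bound to an offset), with the Copy activity appending results to a single sink file; the loop above just makes the termination condition explicit.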
Getting "Found invalid data while decoding" error while running Copy Activity for REST API
Hi, I am trying to get data from a REST API and save it as CSV. However, each time I run the Copy activity I get the error below. Any help would be greatly appreciated. This REST API requires a header value, as below, to get uncompressed…
How to parameterize a dataset so I can pass multiple Excel files in a blob folder to Power Query one by one?
I am in Data Factory and I need help parameterizing a dataset so I can process multiple files in a blob folder through Power Query. I do not want to create 100 datasets to process 100 Excel files in the same way. I have successfully…
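The pattern behind this question, one parameterized step driven by a file listing instead of one dataset per file, can be sketched like this. Everything here is a stand-in: `list_blob_files` plays the role of a Get Metadata activity returning childItems, `process_file` plays the role of the Power Query step receiving a file-name parameter, and the file names are invented.

```python
def list_blob_files(folder: str) -> list[str]:
    # Stand-in for a Get Metadata activity listing a blob folder's childItems.
    return [f"report_{i}.xlsx" for i in range(3)]

def process_file(name: str) -> str:
    # Stand-in for the parameterized activity: same logic, different file name.
    return f"processed:{name}"

# The ForEach: one listing call, then the same step applied per file.
results = [process_file(name) for name in list_blob_files("input/")]
print(results)
```

In ADF terms, the dataset gets a single `fileName` string parameter referenced in its file path, and a ForEach activity iterates the Get Metadata output, passing each name into that parameter, so 100 files need one dataset, not 100.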