Oracle Service Cloud error when setting up ADF linked service
We have already set up a VPN tunnel and all the necessary network configuration to Oracle Service Cloud. I created a linked service to Oracle Service Cloud, but when I test it I get this error: ERROR [HY000] [Microsoft][OSvC] (20) Error while…
ADF Lookup activity not returning correct datetime value
A pipeline uses a Lookup activity that runs a script against Snowflake and returns the first row of data, which we use in an update statement. The lookup gets MAX(load_date) from a Snowflake table. In Snowflake it is "2024-04-25 17:26:01.548…
How to find connection string for email communication services?
Why is this not easy and intuitive? There are no tabs or anything that easily leads to the connection string, so where can I find it? Every other site makes this as easy as possible, but for whatever reason M$ really doesn't want your business here.…
Slow Data Pipeline Performance - ADF Data Flow to Azure SQL Database
I'm having some performance issues with an Azure Data Factory (ADF) data flow pipeline. The pipeline is designed to move data from a Parquet file and insert/update it into an Azure SQL database table. The data volume is moderate, with batches of 50,000…
How to send parallel REST API requests through the Copy data activity in ADF with the AbsoluteUrl pagination rule
The pipeline flow is to send requests to a REST API and load the data into a JSON file in ADLS Gen2. Due to the huge volume of data requested, the Copy task is slow to complete. To improve the performance of the ADF pipeline, I wanted to send 4 requests in parallel…
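As background for the fan-out the asker describes, the idea can be sketched outside ADF, assuming the API's pages can be partitioned into ranges up front (the function names and the 4-worker split are illustrative, not ADF's actual AbsoluteUrl mechanics):

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_partitions(page_ranges, fetch_range, workers=4):
    """Pull several page ranges concurrently, one worker per range,
    mirroring the idea of 4 parallel copy branches.
    `fetch_range` maps a (first_page, last_page) tuple to a list of records."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        chunks = list(pool.map(fetch_range, page_ranges))
    # Flatten the per-range results, preserving range order
    return [record for chunk in chunks for record in chunk]
```

The same effect inside ADF is usually approximated with a ForEach activity (parallelism enabled) over a list of page-range parameters, each iteration running its own Copy activity.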
Data Factory auto create table in Copy activity doesn't seem to work, or isn't very useful
Hi there, I'm trying to create copy activities where the source table is replicated into the sink database, with the table created according to what is in the source. I know there is the "Auto create table" option when making the copy…
ADF | ADB Activity Execution Time on Job Clusters
Has anyone noticed ADB notebooks (on job clusters) running faster in ADF? We have sequential notebook activities and are seeing cluster start-up times as low as 2 minutes.
V2 Snowflake connector: How to copy data into non-uppercase database objects?
I just switched to the new Snowflake V2 connector in ADF but ran into problems when trying to copy data into a table that has a non-uppercase name. To give some context: I'm using a single dataset for my connection to Snowflake with parametrized schema…
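Background that may help frame this question: Snowflake folds unquoted identifiers to UPPERCASE, so a generated statement targeting `myschema.mytable` resolves to `MYSCHEMA.MYTABLE`; only double-quoted identifiers keep their exact case. A small helper for building case-preserving identifiers (the helper is illustrative, not part of the connector):

```python
def quote_ident(name: str) -> str:
    """Double-quote a Snowflake identifier so its case is preserved.
    Embedded double quotes are escaped by doubling them, per SQL rules."""
    return '"' + name.replace('"', '""') + '"'

def qualified_name(schema: str, table: str) -> str:
    """Build a schema-qualified, case-preserving object name,
    e.g. for splicing into a COPY INTO statement."""
    return f"{quote_ident(schema)}.{quote_ident(table)}"
```

Whether the V2 connector quotes the parametrized schema/table values for you, or passes them through unquoted, is exactly the behavioral difference questions like this tend to hinge on.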
Invalid property name error in ADF Copy Data Activity
We have created pipelines for copying data from SharePoint. We have configured our Copy data activity to store the file along with some metadata, like author name, title, link, last modified, etc. Now for some files I'm getting the below…
How to fix ErrorCode=UserErrorInvalidPluginType,'Type=Microsoft.DataTransfer.Common.Shared.PluginNotRegisteredException,Message=Invalid type 'GoogleBigQueryV2 Azure Data Factory
I am trying to copy data from Google Cloud BigQuery datasets to Azure Blob Storage as parquet files. I have followed this documentation to set up the Linked Service to Google Cloud. The connection to BQ is successful. Then, I created a Dataset in ADF…
SAP CDC and SAP Tables connectors regarding SAP Note 3255746 ?
Dear Team, has anyone gone through SAP Note 3255746? What impact should we expect on the SAP CDC and SAP Tables connectors already implemented for our customers? Thanks for your input. Tarik
‘Failed to execute script. Exception: 'Odbc Operation Failed.’ Error in Script activity in ADF
Hi Team, we have a few pipelines in ADF where we fetch data from multiple API endpoints using the Copy activity. We connect to the API by leveraging a ‘Rest API’ linked service and ‘AutoResolveIntegrationRuntime’ as the Integration Runtime (IR). We are…
How to copy files from a SharePoint drive that has a huge hierarchy of folders and subfolders
We want to get all files from SharePoint into our ADLS container. The SharePoint drive has around 60k files, but through our existing pipeline we are getting only 600 files. We have a web activity with the below…
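One likely cause worth noting for this class of problem: Microsoft Graph returns drive children in pages and puts the next page's URL in `@odata.nextLink`, so a single web-activity call only ever sees the first page. The loop below sketches the follow-the-link pattern, with the HTTP call abstracted behind a callable so the control flow stays clear (the `fetch` parameter is illustrative):

```python
def collect_all_items(first_url, fetch):
    """Accumulate Graph results across pages by following @odata.nextLink.
    `fetch` maps a URL to the parsed JSON body of the Graph response."""
    items, url = [], first_url
    while url:
        body = fetch(url)
        items.extend(body.get("value", []))
        url = body.get("@odata.nextLink")  # absent on the final page
    return items
```

In ADF itself the equivalent is an Until loop: call the web activity, append the results, and repeat while the response still contains `@odata.nextLink`.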
How to store an array in a json file in ADF
I have a web activity with the URL drive/drive-id/items/folder-id/children, then a filter activity that filters all the folders, and then I append them to an array. Now I want to store all the folder IDs in a JSON file; later I will load that…
Need help in understanding times in ADF's tumbling window trigger
Hi Team, I'm confused by the many timestamps in the tumbling window trigger. There are at least four: @trigger().outputs.windowStartTime @trigger().outputs.windowEndTime @trigger().scheduledTime @trigger().startTime Besides, there…
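As background for the four timestamps listed: each tumbling window is a fixed, contiguous interval, and the trigger fires once the window closes. The sketch below models that relationship, assuming zero trigger delay (the function is an illustration, not ADF internals):

```python
from datetime import datetime, timedelta

def tumbling_windows(origin, interval, until):
    """Yield the window boundaries a tumbling window trigger would produce:
    windowEndTime = windowStartTime + interval, and (with zero delay) the run
    is scheduled when the window closes. @trigger().startTime is the actual
    run start, so it is only known at execution time and is not modeled here."""
    start = origin
    while start + interval <= until:
        end = start + interval
        yield {
            "windowStartTime": start,  # @trigger().outputs.windowStartTime
            "windowEndTime": end,      # @trigger().outputs.windowEndTime
            "scheduledTime": end,      # @trigger().scheduledTime (zero delay assumed)
        }
        start = end
```

Consecutive windows tile the timeline with no gaps and no overlap, which is why `windowStartTime`/`windowEndTime` (not `startTime`) are the values to use for incremental-load filters.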
How to resolve invalid property name error in ADF copy activity?
ErrorCode=AdlsGen2OperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ADLS Gen2 operation failed for: Operation returned an invalid status code 'BadRequest'. Account: 'rmsdatalakestracc1mqa'. FileSystem:…
Can use a dataset in a pipeline but can't use it in a data flow in Azure Data Factory
I created a dataset for Azure SQL and it works fine in any pipeline in Azure Data Factory. However, it gives me a connection error from Data Flows (refer to the below). The Azure SQL server is configured to be accessible from any Azure service. Spark…
Azure Data Factory still generating a delete log even with "After completion" set to "No action"
Hi, I’m facing the issue below. I have a Data Flow that filters and inserts data into an Azure SQL DB. I’ve set it to “Delete source files” after completion, but it’s still generating a log in the Blob storage, which I don’t want. So, I changed it to “No action…
Hubspot connector doesn't work for deal properties: 414 Request-URI Too Large
I am getting this error message when trying to copy data from Deal properties from Hubspot: Failure happened on 'Source' side. ErrorCode=UserErrorOdbcOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ERROR [HY000]…
How to pass the correct audience when calling mssparkutils.credentials.getToken on Azure China (Mooncake) cloud?
I'm using Microsoft Spark Utilities (MSSparkUtils) with a linked service to authenticate to Azure SQL using a System Assigned Managed Identity (Synapse workspace) on the Azure China (Mooncake) cloud. However, when I call getToken with the audience type…