Logic app to retrieve the latest file from a blob folder
How can I create a logic app that retrieves the latest file from a blob folder containing multiple files when an HTTP request is received, and sends it as an attachment? Are there any specific steps or configurations required for this process?
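One common approach is to list the blobs in the folder and pick the one with the newest last-modified timestamp (in Logic Apps, the Blob Storage connector's "List blobs" action exposes a LastModified property you can sort on). A minimal sketch of the selection logic in Python, assuming the listing yields (name, last_modified) pairs as azure-storage-blob's `list_blobs()` results would:

```python
from datetime import datetime, timezone

def pick_latest(blobs):
    """Return the name of the blob with the newest last-modified timestamp.

    `blobs` is an iterable of (name, last_modified) pairs, e.g. built from
    azure-storage-blob's ContainerClient.list_blobs() results.
    """
    blobs = list(blobs)
    if not blobs:
        raise ValueError("no blobs in folder")
    name, _ = max(blobs, key=lambda b: b[1])
    return name

# Fabricated example timestamps for illustration:
blobs = [
    ("reports/a.csv", datetime(2024, 1, 1, tzinfo=timezone.utc)),
    ("reports/b.csv", datetime(2024, 3, 1, tzinfo=timezone.utc)),
    ("reports/c.csv", datetime(2024, 2, 1, tzinfo=timezone.utc)),
]
print(pick_latest(blobs))  # → reports/b.csv
```

The selected blob's content can then be passed to an email action as an attachment.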
Synapse Serverless CETAS fails with error "Fatal exception occurred: bad allocation".
Hello, I am trying to create an external table (CETAS) from a large number of fairly small JSON files, so that they can be queried more efficiently. The JSON files are stored on ADLS. Previously this worked fine, when I let the query run for 1 - 1.5…
In ADF, my copy activity using the HDFS linked service throws the following error
Hi, I have an issue using ADF with the HDFS linked service. I created an HDFS connection and then a copy activity from HDFS to Azure Data Lake Gen2. The source is a CSV file and the copy format is binary. When I run the pipeline I get the following error: …
Azure Data Factory out-of-memory error when reading from Salesforce
I use the Data Factory Copy activity to copy data from Salesforce to ADLS, and I am facing an out-of-memory error. The file has 129k rows and is 800 MB. I set the block size to 100 MB and Max rows per file to 100,000, but the error still exists. What can you…
ADF pipeline to read data from a UC table into an ADLS Gen2 account
Hello Team, We have a requirement to create an Azure Data Factory pipeline to read data from a UC table (access on the table is granted to the Azure Data Factory managed identity) and copy the data into ADLS Gen2. Is there a way, or an article, to implement this?…
Consistent data in data lake gen2
Hi friends, I need to understand how data consistency works in ADLS. I have found this old…
Can I use a wildcard (*) in the middle of a file path?
Can I use a wildcard (*) in the middle of a file path when I load files from ADLS into a notebook? I have a file path like the one below …
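For what it's worth, Spark-style path globbing does allow `*` in the middle of a path, e.g. `spark.read.csv("abfss://container@account.dfs.core.windows.net/data/*/2024/*.csv")`. The matching semantics can be sketched locally with `fnmatch`, using a hypothetical folder layout:

```python
from fnmatch import fnmatch

# Hypothetical layout: a region folder sits between "data" and the year.
pattern = "container/data/*/2024/sales.csv"

paths = [
    "container/data/regionA/2024/sales.csv",
    "container/data/regionB/2024/sales.csv",
    "container/data/regionA/2023/sales.csv",
]

# Note: fnmatch's "*" can cross "/" boundaries, whereas Hadoop's glob "*"
# stops at a path separator; for this single-level example they agree.
matches = [p for p in paths if fnmatch(p, pattern)]
print(matches)  # the two 2024 files match; the 2023 file does not
```

So a wildcard segment mid-path selects all sibling folders at that level while still constraining the segments after it.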
Copy Dataverse data into Azure SQL using Synapse Link - Initial data is not loaded
The intended setup is to link a Dynamics environment to PowerApps and use Synapse Link to copy data to ADLS. From there, an ADF template is used to incrementally load data into Azure SQL. In short: Dynamics -> Synapse Link -> ADLS -> ADF -> ASQL. I…
Copy Data activity for a very small file takes far too long
I encountered a timeout problem with a Copy Data activity trying to copy a 0-byte CSV file from ADLS Gen2 to ADLS Gen2. How is it possible that the activity takes 28 minutes to copy a 0-byte file?
Getting this error while copying 3 files from an ADLS Gen2 container to the same ADLS Gen2 container.
Failure happened on 'Sink' side. ErrorCode=AdlsGen2OperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ADLS Gen2 operation failed for: Operation returned an invalid status code 'BadRequest'. Account:…
Last modified time in ADLS storage
Our data collector app is set to use UTC time. It writes to ADLS Gen2 storage with a directory structure based on the current UTC time (i.e. year/month/day). The Azure region we selected is East US 2. We access the Azure portal from the west coast (i.e. PST). …
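The usual source of confusion here is that the service records Last Modified in UTC while the portal renders it in the browser's local time, so a west-coast viewer sees timestamps offset by 7-8 hours from the UTC-based folder names. A small sketch of that conversion, with a fabricated timestamp:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Hypothetical last-modified timestamp as stored by the service (UTC).
stored_utc = datetime(2024, 3, 1, 2, 30, tzinfo=timezone.utc)

# What a viewer on the US west coast sees in the portal (browser-local time).
seen_pacific = stored_utc.astimezone(ZoneInfo("America/Los_Angeles"))

print(stored_utc.isoformat())    # 2024-03-01T02:30:00+00:00
print(seen_pacific.isoformat())  # 2024-02-29T18:30:00-08:00
```

Note the date itself shifts: a file written shortly after UTC midnight lands in "tomorrow's" folder from the PST viewer's perspective, which is expected behavior rather than a storage error.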
How can I efficiently download files from various subfolders and nested folders within different levels of hierarchy in SharePoint, and then transfer them to Azure Data Lake Storage (ADLS) using Azure Data Factory (ADF)?
Despite attempting various…
Access Azure Synapse Link Data Through PowerBI
We are using Synapse Link to export D365 FnO tables to Azure Data Lake. When creating the Synapse Link by configuring the Synapse workspace, Spark pool, and storage, a lake database gets created in the Synapse workspace with the tables. Then I have…
How can I upload a CSV from a local drive to the Lakehouse programmatically, so I can schedule the pickup?
Hi, We are storing CSVs on a local drive to support reporting and analysis. I would like to pull these CSVs into the Lakehouse programmatically on a scheduled basis. All I can find so far is a manual upload. Can I code a Notebook, or leverage a…
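The scheduling half of this usually reduces to "find the CSVs changed since the last run, then push them". A minimal sketch of the selection step, assuming the directory scan yields (path, mtime-epoch) pairs; the upload itself (e.g. via the ADLS/OneLake SDK) is environment-specific and omitted:

```python
def pick_new_csvs(entries, last_run_epoch):
    """Filter (path, mtime_epoch) pairs down to CSVs modified since the
    last scheduled run; the result feeds the (omitted) upload step."""
    return sorted(
        path
        for path, mtime in entries
        if path.endswith(".csv") and mtime > last_run_epoch
    )

# Fabricated scan results for illustration:
entries = [
    ("reports/jan.csv", 100.0),      # older than last run, skipped
    ("reports/feb.csv", 2000.0),     # new since last run, picked up
    ("reports/readme.txt", 3000.0),  # not a CSV, skipped
]
print(pick_new_csvs(entries, last_run_epoch=1000.0))  # → ['reports/feb.csv']
```

Persisting `last_run_epoch` between runs (a watermark file, a database row) is what makes the pickup incremental rather than a full re-upload.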
Azure Data Factory self-hosted IR cannot connect to MongoDB: timeout with no exception
I have a self-hosted (on-premises) integration runtime. On this VM I have installed a MongoDB Compass client, and I can connect to a target MongoDB cluster. The connection is okay and I can see the DBs/collections. I use Azure Data Factory to create a…
Map ADLS Gen2 storage to a Windows drive
A colleague of mine has indicated that it is not possible to map an Azure Data Lake Storage Gen2 store to a mapped drive in Windows. I'm not a specialist on this, but I'm very skeptical of this statement, as I see elsewhere that this should be possible:…
How to dynamically access files from a mounted data lake in a Databricks notebook?
Hello everyone, I have a Databricks notebook running some Python code for ETL transformation of data from a CSV file. I have the CSV files in blob storage and have mounted that storage in my notebook using dbutils.fs.mount. Now, the CSV files are…
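When the files under a mount follow a predictable naming scheme, the usual pattern is to build the path at runtime instead of hard-coding it. A small sketch, assuming a hypothetical mount point and a date-partitioned layout (adjust both to whatever your dbutils.fs.mount target actually looks like):

```python
from datetime import date

# Hypothetical mount point and folder layout; adjust to your environment.
MOUNT = "/mnt/datalake"

def daily_csv_path(day: date) -> str:
    """Build the path for a given day's CSV under the mounted lake."""
    return f"{MOUNT}/raw/{day:%Y/%m/%d}/data.csv"

print(daily_csv_path(date(2024, 3, 5)))  # → /mnt/datalake/raw/2024/03/05/data.csv
```

In the notebook, the result would be passed straight to the reader, e.g. `spark.read.csv(daily_csv_path(date.today()), header=True)`, or the parent folder could be listed with `dbutils.fs.ls` to discover files whose names are not known in advance.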
Not able to run a Synapse pipeline with the %run notebook magic command
I am trying to run a pipeline in Azure Synapse Analytics that executes a notebook containing a %run magic command. When I run the command via a manual notebook run it works, but when I run the same notebook via the pipeline I get the error below.
Azure Data Factory: "String was not recognized as a valid DateTime." Couldn't store <> in the Date_column column
I have a Copy Data activity which pulls data from a source Parquet file to a sink Azure SQL database. While doing so, it throws the error "Failure happened on 'Sink' side.…
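Errors like this usually mean the source column holds date strings in a format the sink's implicit DateTime conversion does not accept. One way to diagnose is to normalize or flag values before the copy; a sketch with a few hypothetical source formats (the format list is an assumption, not taken from the question):

```python
from datetime import datetime

def normalize_date(value, fmts=("%Y-%m-%d", "%d/%m/%Y", "%m/%d/%Y")):
    """Try several candidate formats in order; return an ISO 8601 date
    string, or None for values that would fail the sink's conversion.

    Order matters for ambiguous inputs like 05/03/2024, which the first
    matching format wins.
    """
    for fmt in fmts:
        try:
            return datetime.strptime(value, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    return None

print(normalize_date("05/03/2024"))  # → 2024-03-05 (matched %d/%m/%Y)
print(normalize_date("not a date"))  # → None
```

Rows where this returns None are the ones tripping the sink; routing them aside (or fixing the source format) typically clears the error.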
ADLS Gen2 backup options
We have set up our data lake using an ADLS Gen2 storage account. Because we use Data Factory and Databricks, we cannot enable the soft-delete feature on these accounts. What are our options for backing up the data against accidental deletion? We do have…