Does Synapse serverless SQL pool OPENROWSET support the "parquet based structured stream" format?
I can read from a data lake structured stream using Synapse serverless SQL pool OPENROWSET by specifying FORMAT = 'SSTREAM'. Does OPENROWSET support the "parquet based structured stream" format? If yes, what value should I specify for FORMAT…
Data Mesh architecture implementation on Azure data lake
Hi friends, data mesh architecture is a decentralized approach that organizes data by business domain (e.g., marketing, sales, HR). I have the following questions: Is it required to build a separate data lake for each department? When data…
Design strategy for Data lake
Hi friends, we need to design an Azure data lake from scratch for a solution with multiple complex data sources (databases, REST APIs, and custom applications). The data lake solution should be scalable and high-performance in terms of data…
401 Unauthorized "Audience validation failed" from ADLS endpoint
We are using service principal credentials to authenticate with an OAuth2 token. Fetching the access token succeeds; however, the request fails when hitting the ADLS endpoint. Error response: Exact error message:…
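An "Audience validation failed" 401 usually means the token was requested for the wrong resource; for the ADLS data plane the expected audience is `https://storage.azure.com/` (an assumption worth verifying against your token request). A minimal Python sketch that inspects the unverified `aud` claim of a JWT — the demo token below is entirely fabricated for illustration:

```python
import base64
import json

def decode_jwt_claims(token: str) -> dict:
    """Decode the (unverified) payload segment of a JWT access token."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

def audience_ok(claims: dict, expected: str = "https://storage.azure.com/") -> bool:
    """Check whether the token was issued for the storage data-plane audience."""
    return claims.get("aud", "").rstrip("/") == expected.rstrip("/")

# Fabricated demo token: a management-plane audience, which ADLS rejects.
demo_payload = base64.urlsafe_b64encode(
    json.dumps({"aud": "https://management.azure.com/"}).encode()
).decode().rstrip("=")
demo_token = "eyJhbGciOiJSUzI1NiJ9." + demo_payload + ".sig"

claims = decode_jwt_claims(demo_token)
print(audience_ok(claims))  # False: wrong audience for an ADLS request
```

If the `aud` claim is anything other than the storage audience, re-request the token with the storage resource/scope rather than the management one.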
Data size of databricks delta tables
It has been observed that the size of Delta tables is much smaller than the size of the underlying Delta files in the storage account. Suppose a Databricks Delta table raw.deltaTableA has a size of 2 MB; if we check the size of the underlying delta…
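This gap is expected: the table size reported by Databricks (e.g., `sizeInBytes` from `DESCRIBE DETAIL`) counts only the files referenced by the current snapshot, while the storage account holds every historical file until `VACUUM` removes it. A toy Python sketch over a hypothetical file inventory (all names and sizes invented for illustration):

```python
# Hypothetical inventory: (path, size_bytes, referenced_by_current_snapshot)
files = [
    ("part-0000.parquet", 1_000_000, True),
    ("part-0001.parquet", 1_097_152, True),
    ("part-old-0000.parquet", 3_000_000, False),  # superseded by an UPDATE/MERGE
    ("part-old-0001.parquet", 2_500_000, False),  # retained for time travel
]

# What the Delta table reports: only files in the live snapshot.
table_size = sum(size for _, size, active in files if active)

# What the storage account bills: every file still on disk.
storage_size = sum(size for _, size, _ in files)

print(table_size, storage_size)  # 2097152 7597152
```

Running `VACUUM` (after the retention window) would bring the storage footprint back toward the reported table size.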
Error trying to validate storage account name while creating a new synapse workspace
I am unable to create a new Data Lake Storage Gen2 account name in the Basics tab when creating an Azure Synapse workspace. The error I get is: "There was an error trying to validate storage account name. Please try again." The error message is…
Blob Capacity by Tier
Hi, I was wondering if there is a way to get the blob or usage capacity for storage accounts, broken down by tier, through the REST API?
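One approach is to list blobs (each listing entry carries an access-tier property) and aggregate sizes per tier yourself. A minimal Python sketch over a hypothetical listing result — the blob names, sizes, and field names below are invented for illustration, not the exact REST response schema:

```python
from collections import defaultdict

# Hypothetical result of a "List Blobs" call, reduced to the fields we need.
blobs = [
    {"name": "a.csv", "size": 120, "tier": "Hot"},
    {"name": "b.csv", "size": 300, "tier": "Cool"},
    {"name": "c.bak", "size": 980, "tier": "Archive"},
    {"name": "d.csv", "size": 80,  "tier": "Hot"},
]

# Sum blob sizes per access tier.
capacity_by_tier = defaultdict(int)
for blob in blobs:
    capacity_by_tier[blob["tier"]] += blob["size"]

print(dict(capacity_by_tier))  # {'Hot': 200, 'Cool': 300, 'Archive': 980}
```

For large accounts, Azure Monitor storage metrics or a blob inventory report would avoid enumerating every blob; the aggregation logic stays the same.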
How to store set variable json array output into csv file in data factory
I used a Lookup activity inside a ForEach loop with a stored procedure that returns some records as output when a condition fails. I stored that output in a Set Variable activity, and now I need to write those values to a CSV file whose file…
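In ADF itself this is typically done by passing the variable into a Copy activity source; the transformation is just "JSON array in, CSV rows out". A Python sketch of that shape, using a hypothetical variable value invented for illustration:

```python
import csv
import io
import json

# Hypothetical Set Variable content: a JSON array of failed records.
variable_value = json.dumps([
    {"id": 1, "country": "DE", "status": "failed"},
    {"id": 2, "country": "FR", "status": "failed"},
])

# Parse the JSON array and write it out as CSV with a header row.
rows = json.loads(variable_value)
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
writer.writeheader()
writer.writerows(rows)

print(buf.getvalue())
```

The CSV file name itself (e.g., with a date suffix) would come from a pipeline expression on the sink dataset path.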
Azure Synapse Link with F&O missing tables
Hello, I'm facing this problem with Synapse Link. I did the following: connected the F&O environment with Power Apps, and established an incremental link with the data lake. I can see D365 Finance and Operations in the list of tables under Manage Tables of the link. My problem…
Unable to create Storage Event trigger in Synapse Data Factory
I want to create a pipeline triggered by an event in my storage account: when a blob is created inside. When I try to publish that event trigger in Synapse, I get this error: The client 'd4d9f262-75fa-4138-845c-019afa12cf7a' with object id…
Copy LOB based data from DB2 to Azure Data Lake Gen2 storage using ADF
Hi team, I am working on a scenario to copy data from a table in a DB2 database, which has a column of datatype LOB (CLOB), to an Azure Data Lake Gen2 storage container using Azure Data Factory. However, it's taking a considerable amount of time (a day)…
Filtering data for last 24 months in Mapping Data Flows
How can I filter data using Mapping Data Flows for the last 24 months, starting from the max date and going back? The date column is in the mm/dd/yyyy format.
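In Mapping Data Flows this is usually an Aggregate (or Window) transformation to get the max date, followed by a Filter using `addMonths(maxDate, -24)`. The equivalent logic, sketched in plain Python with stdlib month arithmetic and invented sample dates:

```python
from datetime import datetime

def months_back(d: datetime, months: int) -> datetime:
    """Shift a date back by whole calendar months (day clamped to 1 for simplicity)."""
    total = d.year * 12 + (d.month - 1) - months
    return datetime(total // 12, total % 12 + 1, 1)

# Hypothetical column values in mm/dd/yyyy format.
raw = ["01/15/2024", "03/02/2022", "06/30/2021", "11/09/2023"]
dates = [datetime.strptime(v, "%m/%d/%Y") for v in raw]

# Cutoff = 24 months before the max date; keep rows on or after it.
cutoff = months_back(max(dates), 24)
kept = [d for d in dates if d >= cutoff]

print(cutoff.date(), len(kept))
```

In the data flow itself, `toDate(dateCol, 'MM/dd/yyyy')` converts the string column before the comparison; the max date must be joined or broadcast back onto each row before the filter can use it.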
How to fix error Job failed due to reason: com.microsoft.dataflow.broker.InvalidPayload
Operation on target Country failed: {"StatusCode":"DFExecutorUserError","Message":"Job failed due to reason: com.microsoft.dataflow.broker.InvalidPayloadException: Fail to validate with reason: Retry Request, fail to…
How to send a mail notification for a failed pipeline in Azure Synapse Analytics?
How can I send a notification email to a specific email address without using a logic app when one of my Synapse Analytics pipelines fails? I would like to include the error message in the email notification.
How to maintain same folder structure as source while sink the processed file
I have a requirement to process JSON files to Parquet on a daily basis. I have folders A, B, and C, and the processed files must sink to another container with the same A, B, C structure. For example, if I'm processing a file from folder A, it should sink to the output container folder…
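In a Copy activity this is typically wildcard source paths plus the "Preserve hierarchy" copy behavior; in a data flow, the sink path is parameterized from the source path. The path mapping itself is simple string logic, sketched here in Python with hypothetical container and folder names:

```python
from pathlib import PurePosixPath

def sink_path(source_path: str, source_container: str, sink_container: str) -> str:
    """Mirror the source folder layout under the sink container, swapping .json for .parquet."""
    rel = PurePosixPath(source_path).relative_to(source_container)
    return str(PurePosixPath(sink_container) / rel.with_suffix(".parquet"))

print(sink_path("input/A/2024/file1.json", "input", "output"))
# output/A/2024/file1.parquet
```

In ADF the same idea is expressed with pipeline expressions such as `@replace(item().path, 'input', 'output')` on the sink dataset parameters (expression shown as an illustrative assumption, adapt to your dataset setup).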
Azure Blob Storage Gen2
I need to restrict file uploads to Blob Storage to a maximum size of 18 MB and allow only the .txt extension.
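Blob Storage has no built-in per-extension or per-size upload policy, so this check typically lives in the uploading application (or in an Event Grid-triggered function that deletes non-compliant blobs after the fact). A minimal sketch of the gatekeeper check, with the 18 MB limit from the question:

```python
MAX_BYTES = 18 * 1024 * 1024  # 18 MB limit

def upload_allowed(filename: str, size_bytes: int) -> bool:
    """Allow only .txt files (case-insensitive) up to 18 MB."""
    return filename.lower().endswith(".txt") and size_bytes <= MAX_BYTES

print(upload_allowed("notes.txt", 5 * 1024 * 1024))   # True
print(upload_allowed("dump.csv", 1024))               # False: wrong extension
print(upload_allowed("big.txt", 20 * 1024 * 1024))    # False: over 18 MB
```

If clients upload directly with SAS tokens, the size limit cannot be enforced by the SAS itself; the validation must run before issuing the token or after the blob lands.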
How can I create a linked service in ADF for SharePoint Online?
I want to extract files from SharePoint to ADLS using only ADF. I followed these steps. Step 1: Azure Active Directory -> registered a new app -> created a new secret key; I have the Tenant ID, Client ID (App ID), and Secret Key. Step 2: SharePoint Online ->…
Azure Logic App: extract file from SharePoint to ADLS
Hi, I am working on extracting a file from SharePoint to ADLS, and the file size is 1 GB. I need to copy the file to ADLS. I am currently using a Standard logic app because the file size exceeds the default limit. After that I tried adding the file size in…
Download data from Cosmos via Visual Studio
When I download data from Cosmos via the Azure Data Lake tooling in Visual Studio, it shows an error: 流 URL 无效 ("the stream URL is invalid").