How can I get all files inside a drive irrespective of folder structure in ADF?
I want to copy files from a SharePoint drive that has lots of nested folders; the hierarchy goes up to 12 levels deep. Currently I'm using the endpoint below in ADF's Web activity, as some articles mentioned that it provides every…
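As a starting point, a deep folder tree can be flattened by walking the Microsoft Graph "children" endpoint recursively instead of hard-coding 12 levels. A minimal sketch follows; `fetch_children` is a hypothetical stand-in for an authenticated GET to `https://graph.microsoft.com/v1.0/drives/{drive-id}/items/{item-id}/children`, shown here against a tiny in-memory drive so the traversal itself is clear.

```python
# Sketch: recursively flatten a nested SharePoint drive via the
# Microsoft Graph "children" endpoint, regardless of folder depth.
# `fetch_children(item_id)` stands in for an authenticated HTTP call.

def list_all_files(fetch_children, item_id="root"):
    """Return every file (not folder) under item_id, at any depth."""
    files = []
    for item in fetch_children(item_id):
        if "folder" in item:  # Graph marks folders with a 'folder' facet
            files.extend(list_all_files(fetch_children, item["id"]))
        else:
            files.append(item["name"])
    return files

# Tiny in-memory "drive" to demonstrate the traversal (3 levels deep).
drive = {
    "root": [{"id": "f1", "name": "Level1", "folder": {}},
             {"id": "a", "name": "a.pdf"}],
    "f1":   [{"id": "f2", "name": "Level2", "folder": {}},
             {"id": "b", "name": "b.pdf"}],
    "f2":   [{"id": "c", "name": "c.pdf"}],
}
print(list_all_files(drive.get))  # → ['c.pdf', 'b.pdf', 'a.pdf']
```

In ADF this recursion would map to an Until/ForEach loop over queued folder IDs, since Web activities cannot call themselves recursively.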
Having problem with Azure sandbox storage account
Hi, I am having an issue accessing a storage account in the Azure sandbox environment. I have logged in to the Azure sandbox environment, but I am not able to access the Azure storage account even though my subscription is selected as sandbox.
Unable to copy SharePoint files through ADF
I am facing an issue. I followed all the steps in https://learn.microsoft.com/en-us/azure/data-factory/connector-sharepoint-online-list?tabs=data-factory and tried to read PDF files from SharePoint through ADF. Source: SharePoint (PDF files) (I selected…
Special character handling for file processing
Hello, I have some CSV files as feeds into Data Lake storage. These files contain records with special characters, e.g. '/'. We need to process each file from one container to another and remove some of these special…
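One way to do this outside of an ADF data flow is a small cleansing step that strips a configurable character set from every field while copying the CSV. A minimal sketch, assuming the characters to remove are `/`, `\`, `#`, and `%` (adjust to your actual feed):

```python
import csv
import io
import re

# Characters to strip — an assumption, adjust to the real feed spec.
SPECIAL = re.compile(r"[/\\#%]")

def clean_csv(text):
    """Return CSV text with special characters removed from every field."""
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    for row in csv.reader(io.StringIO(text)):
        writer.writerow([SPECIAL.sub("", field) for field in row])
    return out.getvalue()

print(clean_csv("id,name\n1,a/b\n2,c%d\n"))  # → id,name / 1,ab / 2,cd
```

Going through `csv.reader`/`csv.writer` (rather than a plain string replace) keeps quoting intact when a special character appears inside a quoted field.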
How to move 3 months of file history from one directory to another according to the date in the file name
Hello, I'm having problems moving the history of some files stored in a data lake according to the date in the file name; in the end I only manage to move either one specific day or all of them. Example: I have files October -…
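The selection logic (neither "one day" nor "everything", but a 3-month window) can be isolated in a small function. A sketch follows, assuming file names embed a date as `YYYY-MM-DD` and that "3 months of history" means the three full calendar months before the current one; both the pattern and the window definition are assumptions to adapt.

```python
import re
from datetime import date

# Assumed naming pattern: something like 'report_2023-10-05.csv'.
DATE_RE = re.compile(r"(\d{4})-(\d{2})-(\d{2})")

def first_of_month_back(today, n):
    """First day of the month n months before today's month."""
    m = today.year * 12 + (today.month - 1) - n
    return date(m // 12, m % 12 + 1, 1)

def files_to_move(names, today):
    """Names whose embedded date falls in the previous 3 full months."""
    lower = first_of_month_back(today, 3)  # e.g. Aug 1 when today is in Nov
    upper = today.replace(day=1)           # start of the current month
    picked = []
    for name in names:
        match = DATE_RE.search(name)
        if match and lower <= date(*map(int, match.groups())) < upper:
            picked.append(name)
    return picked
```

In ADF, the same idea translates to a Get Metadata activity listing the folder, then a Filter activity whose expression compares the date substring of each item name against the window bounds, feeding a ForEach + Copy.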
How to mount dataset or datalake with read/write permission in Notebook of Azure Machine Learning Studio?
I've successfully mounted a dataset with read permission to a notebook in my Machine Learning studio account. But when I try to write back to the data lake/dataset, it throws "[Errno 30] Read-only file system". How can I mount the dataset with…
Data Mesh architecture implementation on Azure data lake
Hi friends, data mesh architecture is a decentralized approach that organizes data by business domain (e.g., marketing, sales, HR). I have the following questions: Is it required to build a separate data lake for each department? When data…
Why is a Data asset not supported when trying to create an AutoML job?
Hi, I successfully created a data asset (folder_URI type) with uploaded images in an Azure Blob storage (registered storage source), but when I try to create a job in Azure AutoML the data asset shows (not supported). Any idea what the issue is? Thanks
While fetching data from a Cosmos DB container and persisting the JSON file in ADLS Gen2 through a Synapse pipeline, some objects in my JSON file appear as blank strings, causing data loss. This happens in PROD only, not in UAT and DEV.
Hi, the issue I am facing is with the Synapse Analytics pipeline service. I have created a dataflow that pulls data from a Cosmos DB container in JSON format and stores that JSON file in ADLS Gen2. When I check the JSON file in ADLS I see that…
Cannot read an Excel file stored in ADLS using openpyxl's load_workbook in Databricks
I cannot read the Excel file using openpyxl's load_workbook directly from ADLS, but I can read it if it is copied to DBFS.
401 Unauthorized "Audience validation failed" from ADLS endpoint
We are using service principal credentials to authenticate with an OAuth2 token. Fetching the access token succeeds; however, the request fails when hitting the ADLS endpoint. Error response / exact error message:…
How to maintain the same folder structure as the source while sinking the processed file
I have a requirement to process JSON to Parquet on a daily basis. I have folders A, B, C and need to sink the files to another container with the same A, B, C structure. For example, if I'm processing a file from folder A it should sink to the output container's folder…
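For a plain copy, the Copy activity's "Preserve hierarchy" copy behavior keeps the source tree without any custom work. When the path must be computed (e.g. because the format changes from JSON to Parquet), the mapping is just "relative path under source root, re-rooted under the sink". A minimal sketch, with illustrative container names:

```python
from pathlib import PurePosixPath

# Sketch: map a source blob path to a sink path that preserves the
# folder structure under a new container, swapping .json for .parquet.
# 'input' and 'output' are illustrative container/root names.
def sink_path(source_path, source_root, sink_root):
    rel = PurePosixPath(source_path).relative_to(source_root)
    return str(PurePosixPath(sink_root) / rel.with_suffix(".parquet"))

print(sink_path("input/A/2024/file1.json", "input", "output"))
# → output/A/2024/file1.parquet
```

In a data flow, the equivalent is parameterizing the sink dataset's folder path with the source item's folder path from Get Metadata/ForEach, so each file lands under the same A/B/C subfolder.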
How to connect ADLS Gen2 from Synapse Notebook, using SAS Token.
Hello, I'm trying to connect to Data Lake Storage Gen2 in a Synapse notebook. I'm trying to send a Parquet file from my subscription's ADLS to another subscription's ADLS. So I got a SAS token from the other subscription's ADLS, and I tried to connect to ADLS…
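One approach is to point the ABFS driver at a fixed SAS token for the foreign account via Spark configuration. A sketch of the settings follows; the account name and token are placeholders, and `FixedSASTokenProvider` assumes a recent hadoop-azure version (3.3+, as shipped in current Synapse Spark runtimes), so verify it exists in your pool before relying on it.

```python
# Sketch: Hadoop ABFS settings that make Spark use a fixed SAS token
# for another subscription's ADLS Gen2 account. Account name and token
# are placeholders.
def sas_conf(account, sas_token):
    host = f"{account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{host}": "SAS",
        f"fs.azure.sas.token.provider.type.{host}":
            "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider",
        f"fs.azure.sas.fixed.token.{host}": sas_token,
    }

conf = sas_conf("targetaccount", "<sas-token>")
# In a Synapse notebook, apply with:
#   for k, v in conf.items():
#       spark.conf.set(k, v)
# then write with an abfss:// URI against targetaccount.
```

Make sure the SAS grants write/create/list on the target container; an account-level SAS with those permissions is the simplest to test with.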
Does Synapse serverless SQL pool OPENROWSET support the "parquet based structured stream" format?
I can read from a data lake structured stream using Synapse serverless SQL pool OPENROWSET by specifying FORMAT = 'SSTREAM'. Does OPENROWSET support the "parquet based structured stream" format? If yes, what value should I specify for FORMAT…
Design strategy for Data lake
Hi friends, we need to design an Azure data lake from scratch for a solution with complex, multiple data sources from databases, REST APIs, and custom applications, and the data lake solution should be scalable and high-performance in terms of data…
Data size of Databricks delta tables
It has been observed that the size of a delta table is much smaller than the size of the underlying delta files in the storage account. Suppose a Databricks delta table raw.deltaTableA has a size of 2 MB; if we check the size of the underlying delta…
Blob Capacity by Tier
Hi, I was wondering if there is a way to get blob or usage capacity for storage accounts, broken down by access tier, through the REST API?
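One candidate is the Azure Monitor metrics REST API: the `BlobCapacity` metric on the blob service can be split by a tier dimension. A sketch of building that request URL follows; the dimension name `Tier` and the api-version are assumptions to verify against the Microsoft.Storage metrics schema, and the call itself still needs an ARM bearer token.

```python
from urllib.parse import urlencode

# Sketch: build the Azure Monitor metrics call that returns BlobCapacity
# split by the blob tier dimension for one storage account. Dimension
# name and api-version are assumptions — check the Storage metrics docs.
def blob_capacity_by_tier_url(subscription, resource_group, account):
    resource = (
        f"/subscriptions/{subscription}/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Storage/storageAccounts/{account}"
        "/blobServices/default"
    )
    params = {
        "metricnames": "BlobCapacity",
        "aggregation": "Average",
        "$filter": "Tier eq '*'",      # one time series per tier
        "api-version": "2018-01-01",
    }
    return (f"https://management.azure.com{resource}"
            f"/providers/Microsoft.Insights/metrics?{urlencode(params)}")

url = blob_capacity_by_tier_url("<sub-id>", "<rg>", "<account>")
```

If the tier split turns out not to be exposed this way, the fallback is Blob Inventory or a List Blobs enumeration summing sizes per `AccessTier`.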
How to store Set Variable JSON array output into a CSV file in Data Factory
I used a Lookup activity under a ForEach loop with a stored procedure that returns some records as output if a condition fails. I stored that output in a Set Variable activity, and now I need to write those values into a CSV file whose file…
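The shape of the transformation is "JSON array of records in, CSV text out". A minimal sketch of that flattening, assuming the variable holds a JSON array string and that field names can be taken from the first record:

```python
import csv
import io
import json

# Sketch: flatten a JSON-array string (as a pipeline variable would
# hold it) into CSV text. Column order comes from the first record.
def json_array_to_csv(json_text):
    rows = json.loads(json_text)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=list(rows[0]), lineterminator="\n")
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

print(json_array_to_csv('[{"id": 1, "status": "failed"},'
                        ' {"id": 2, "status": "failed"}]'))
# → id,status / 1,failed / 2,failed
```

Inside ADF itself, the no-code route is to land the variable as JSON (e.g. via a Copy activity with an inline JSON source mapped to a DelimitedText sink) rather than building the CSV string by hand in expressions.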
Azure Synapse Link with F&O missing tables
Hello, I'm facing this problem with Synapse Link. I did the following: connected the F&O environment with Power Apps; established an incremental link with the data lake. I can see D365 Finance and Operations in the list of tables under Manage tables of the link. My problem…