Having a problem with an Azure sandbox storage account
Hi, I am having an issue accessing a storage account in the Azure sandbox environment. I have logged in to the Azure sandbox environment, but I am not able to access the Azure storage account even though my subscription is set to the sandbox.
Unable to copy SharePoint files through ADF
I am facing an issue now! I followed all the steps at https://learn.microsoft.com/en-us/azure/data-factory/connector-sharepoint-online-list?tabs=data-factory and tried to read PDF files from SharePoint through ADF. Source: SharePoint (PDF files) (I selected…
How can I get all files inside a drive irrespective of folder structure in ADF?
I want to copy files from a SharePoint drive that has lots of nested folders; the maximum folder hierarchy is 12 levels deep. Currently I'm using the below endpoint in ADF's Web activity, as some articles mentioned that it provides every…
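One common shape for this problem is walking the folder tree recursively and collecting full file paths at every depth. Below is a minimal sketch of that traversal in plain Python, assuming a hypothetical listing shape of `{"name": ..., "files": [...], "folders": [...]}` (the real API response shape from the question is truncated); in a pipeline the same logic would drive the per-folder listing calls.

```python
# Recursively flatten a nested folder tree (e.g. up to 12 levels) into a
# flat list of file paths. The tree shape here is a hypothetical stand-in
# for whatever the listing endpoint actually returns.
def flatten(node, prefix=""):
    """Return the full paths of all files under `node`, at any depth."""
    path = f"{prefix}/{node['name']}" if prefix else node["name"]
    files = [f"{path}/{f}" for f in node.get("files", [])]
    for child in node.get("folders", []):
        files.extend(flatten(child, path))
    return files

tree = {
    "name": "root",
    "files": ["a.pdf"],
    "folders": [{"name": "sub", "files": ["b.pdf"], "folders": []}],
}
print(flatten(tree))  # -> ['root/a.pdf', 'root/sub/b.pdf']
```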
How to move 3 months of file history from one directory to another according to the date in each file's name
Hello, I'm having problems: I want to move the history of some files stored in a data lake according to the date in each file's name, but in the end I only manage to move either one specific day or all of them. Example: I have files October -…
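The selection step can be done by parsing the date out of each name and filtering on a window, rather than matching one fixed day. A minimal sketch, assuming a hypothetical filename pattern like `report_2023-10-15.csv` (the real pattern in the question is truncated) and approximating "3 months" as 90 days; the actual move would still be done by the pipeline or storage SDK:

```python
# Select the file names whose embedded date falls inside a trailing window.
import re
from datetime import date, timedelta

DATE_RE = re.compile(r"(\d{4})-(\d{2})-(\d{2})")

def parse_date(name):
    """Extract the first YYYY-MM-DD date found in a file name, or None."""
    m = DATE_RE.search(name)
    return date(int(m.group(1)), int(m.group(2)), int(m.group(3))) if m else None

def files_in_window(names, end, days=90):
    """Return names whose embedded date lies within `days` days before `end`."""
    start = end - timedelta(days=days)
    return [n for n in names if (d := parse_date(n)) and start <= d <= end]

files = ["report_2023-10-01.csv", "report_2023-07-15.csv", "report_2023-05-01.csv"]
print(files_in_window(files, end=date(2023, 10, 31)))
# -> ['report_2023-10-01.csv']
```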
Special character handling for file processing
Hello, I have some CSV files as feeds into data lake storage; these files contain records with some special characters, e.g. '/'. We need to process the files from one container to another, and we will need to remove some of these special…
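The cleansing step itself is simple to express with the `csv` module and `str.translate`. A minimal sketch, assuming the goal is to strip a configurable set of characters (here just `/`, since the full list in the question is truncated) from every field while copying; the container-to-container move would be handled by the pipeline or storage SDK:

```python
# Copy CSV rows from src to dst, deleting a given set of characters
# from every field along the way.
import csv
import io

def clean_csv(src, dst, specials="/"):
    """Stream rows from src to dst, removing the characters in `specials`."""
    table = str.maketrans("", "", specials)
    writer = csv.writer(dst)
    for row in csv.reader(src):
        writer.writerow([field.translate(table) for field in row])

src = io.StringIO("id,path\n1,a/b/c\n2,plain\n")
dst = io.StringIO()
clean_csv(src, dst)
print(dst.getvalue())  # row "1,a/b/c" becomes "1,abc"
```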
VPN and networking config for secure upload to ADLS Gen2 storage account?
I'm looking into network architectures and security for implementing secure access from user laptops into Azure to upload files to ADLS Gen2 data lake blob containers. We have no on-prem network or AD - just individual user laptops. We do have MS Entra…
How to mount a dataset or data lake with read/write permission in a notebook of Azure Machine Learning Studio?
I've successfully mounted a dataset with read permission to a notebook in my Machine Learning Studio account. But when I try to write back to the data lake/dataset, it throws "[Errno 30] Read-only file system". How can I mount the dataset with…
How to sync Azure Data Lake Storage with a SharePoint drive?
We have copied files from multiple drives of a SharePoint site, with nested folders, into an Azure Data Lake Storage container, maintaining the same folder structure. Now I want to create a pipeline in Azure Data Factory to delete any file from the ADLS container which is not…
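The comparison step boils down to a set difference over relative paths. A minimal sketch, assuming you can list relative paths on both sides (the listing and the actual delete, e.g. via an ADF Delete activity, are outside this snippet); the hypothetical paths below are for illustration only:

```python
# Files present in ADLS but no longer in SharePoint are the deletion
# candidates. Both sets hold container-relative paths.
sharepoint = {"A/x.pdf", "A/B/y.pdf"}
adls = {"A/x.pdf", "A/B/y.pdf", "A/B/stale.pdf"}

to_delete = sorted(adls - sharepoint)
print(to_delete)  # -> ['A/B/stale.pdf']
```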
My dev, test, and prod environments are in different resource groups of the same subscription. How do I create a DevOps pipeline in this case?
Hi, my dev, test, and prod environments are in different resource groups of the same subscription. I am involved in a data engineering project where I will primarily be using the below resources: ADLS - data storage; ADF - orchestration; Azure Databricks - QC…
How to copy files from a SharePoint drive that has a huge hierarchy of folders and subfolders
We want to get all files from SharePoint into our ADLS container. In the SharePoint drive there are around 60k files, but through our existing pipeline we are getting only 600 files. We have a Web activity with the below…
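Getting only the first few hundred of tens of thousands of items is the classic symptom of reading a single page of a paginated response. A sketch of the pagination logic, assuming the endpoint follows the Microsoft Graph convention of returning a `value` array plus an `@odata.nextLink` URL for the next page; `fetch` stands in for the authenticated HTTP GET (e.g. `requests.get(url, headers=...).json()`), and the two fake pages are for illustration:

```python
# Follow @odata.nextLink until exhausted, collecting every item
# instead of only the first page.
def all_items(first_url, fetch):
    """Collect items across all pages of a Graph-style paginated endpoint."""
    items, url = [], first_url
    while url:
        page = fetch(url)
        items.extend(page.get("value", []))
        url = page.get("@odata.nextLink")  # None on the last page
    return items

pages = {
    "p1": {"value": [1, 2], "@odata.nextLink": "p2"},
    "p2": {"value": [3]},
}
print(all_items("p1", pages.__getitem__))  # -> [1, 2, 3]
```

In ADF this usually translates to the Web/Copy activity's pagination rules, or an Until loop that keeps calling the `@odata.nextLink` URL.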
Data mesh architecture implementation on Azure Data Lake
Hi friends, data mesh architecture is a decentralized approach that organizes data by business domain (e.g., marketing, sales, HR). I have the following questions: Is it required to build a separate data lake for each department? When data…
Why is the data asset not supported when trying to create an AutoML job?
Hi, I successfully created a data asset (folder_URI type) with uploaded images in Azure Blob storage (a registered storage source), but when I try to create a job in Azure AutoML the data asset shows (not supported). Any idea what the issue is? Thanks
While fetching data from a Cosmos DB container and persisting the JSON file in ADLS Gen2 through a Synapse pipeline, some objects in my JSON file appear as blank strings, causing data loss. This happens in PROD only, not in UAT or DEV.
Hi, the issue I am facing is with the Synapse Analytics pipeline service. I have created a dataflow which pulls data from a Cosmos DB container in JSON format and stores that JSON file in ADLS Gen2. When I check the JSON file in ADLS I see that…
Cannot read an Excel file in ADLS using openpyxl's load_workbook in Databricks
Cannot read an Excel file from ADLS using openpyxl's load_workbook, but can read it if it is copied to DBFS first.
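A common workaround, sketched below: openpyxl cannot open an `abfss://` path directly, but `load_workbook` accepts any file-like object, so one approach is to download the blob's bytes first (via the storage SDK or a Databricks utility, both assumed here) and wrap them in a `BytesIO`. The round-trip below builds an in-memory workbook just to demonstrate that `load_workbook` works on a `BytesIO`:

```python
# Demonstrate loading a workbook from an in-memory buffer instead of a path.
import io
from openpyxl import Workbook, load_workbook

# In Databricks, `buf` would instead come from downloading the ADLS blob,
# e.g. via azure.storage.filedatalake (assumed); here we fabricate one.
wb = Workbook()
wb.active["A1"] = "hello"
buf = io.BytesIO()
wb.save(buf)
buf.seek(0)

wb2 = load_workbook(buf)       # file-like object; no filesystem path needed
print(wb2.active["A1"].value)  # -> hello
```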
How can I create a linked service in ADF for SharePoint Online?
I want to extract files from SharePoint to ADLS using only ADF. I followed a few steps. Step 1: Azure Active Directory -> registered a new app -> created a new secret key. I have the Tenant ID, Client ID (App ID), and Secret Key. Step 2: SharePoint Online ->…
401 Unauthorized "Audience validation failed" from ADLS endpoint
We are using service principal credentials to authenticate with an OAuth2 token. Fetching the access token succeeds; however, the request fails when hitting the ADLS endpoint. Error response: Exact error message:…
How to maintain the same folder structure as the source when sinking the processed files
I have a requirement to process JSON to Parquet on a daily basis. I have folders A, B, and C and need to sink the files to another container with the same A, B, C structure. For example, if I'm processing a file from folder A, it should sink to the output container's folder…
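The path computation behind this is just "swap the container root, keep the relative path, change the extension". A minimal sketch in plain Python, assuming hypothetical roots `input` and `output` (in ADF the same result is usually achieved with a parameterized dataset path):

```python
# Derive the sink path from the source path, preserving the folder
# structure and switching .json to .parquet.
from pathlib import PurePosixPath

def sink_path(source, src_root="input", dst_root="output", ext=".parquet"):
    """Map input/<rel>/<name>.json to output/<rel>/<name>.parquet."""
    rel = PurePosixPath(source).relative_to(src_root)  # e.g. A/2024/file.json
    return str(PurePosixPath(dst_root) / rel.with_suffix(ext))

print(sink_path("input/A/2024/file.json"))  # -> output/A/2024/file.parquet
```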
How to connect to ADLS Gen2 from a Synapse Notebook using a SAS token
Hello, I'm trying to connect to Data Lake Storage Gen2 in a Synapse Notebook. I'm trying to send a Parquet file from my subscription's ADLS to another subscription's ADLS, so I got a SAS token for the other subscription's ADLS and tried to connect to ADLS…
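A minimal configuration sketch for the Spark session in the notebook, assuming a hypothetical account name `otheraccount` and a SAS token held in `sas_token`; these Hadoop ABFS settings (the `FixedSASTokenProvider` requires a recent hadoop-azure build) tell Spark to authenticate to that specific account with a fixed SAS token, so this is an assumption-laden sketch rather than the only way to wire it up:

```python
# Hypothetical account name and token; in practice fetch the token from
# Azure Key Vault rather than hard-coding it.
account = "otheraccount.dfs.core.windows.net"
sas_token = "<sas-token-without-leading-question-mark>"

spark.conf.set(f"fs.azure.account.auth.type.{account}", "SAS")
spark.conf.set(f"fs.azure.sas.token.provider.type.{account}",
               "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set(f"fs.azure.sas.fixed.token.{account}", sas_token)

# Afterwards, abfss:// paths on that account resolve using the SAS token:
# df.write.parquet("abfss://<container>@otheraccount.dfs.core.windows.net/dir")
```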
Error trying to validate storage account name while creating a new Synapse workspace
I am unable to create a new Data Lake Storage (Gen2) account name in the Basics tab when creating an Azure Synapse workspace. The error I get is: "There was an error trying to validate storage account name. Please try again." The error message is…