Azure ML | Support for ADLS Gen2 Datastore with SAS Token Authentication
Hello, I’m currently working on an application where we’re connecting various data sources—such as file shares and ADLS Gen2—to Azure Machine Learning. While we can create file-share datastores with SAS token authentication, I noticed that this option…
Data Quality issue in the Purview
Hello Team, We have configured ADLS Gen2 as a source and scanned it. For Data Quality, we followed these steps: Create the governance domain and publish it. Create the data product and add the tables. In the Data Quality section, add the…
Failure happened on 'destination' side. ErrorCode=DeltaInvalidCharacterInColumnName.
This exception occurred when I used a Data Factory pipeline to copy data from SQL Server to a Lakehouse, but I didn't find any problems with the raw data.
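The error name suggests the Delta sink rejected a column name. Delta Lake documents the characters ` ,;{}()\n\t=` as invalid in column names, so a renaming step before the copy often resolves it. A minimal sketch of that kind of sanitizer (the function name and replacement character are illustrative, not from Data Factory itself):

```python
# Characters Delta Lake documents as invalid in column names.
INVALID_CHARS = set(' ,;{}()\n\t=')

def sanitize_column_name(name: str, replacement: str = "_") -> str:
    """Replace any Delta-invalid character so the sink accepts the schema."""
    return "".join(replacement if ch in INVALID_CHARS else ch for ch in name)

print(sanitize_column_name("Order Date"))  # -> Order_Date
```

In a Copy activity the equivalent fix is usually a column mapping or a staging query that aliases the offending columns.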
error code 2011
I am testing a pipeline. I introduced a repeated row in one of the files that I want to upload, expecting that the pipeline would run anyway, uploading the correct files and not the incorrect one. Instead, the entire pipeline did not work…
In MS Fabric, users are able to view data but cannot download it
In MS Fabric, I want users to be able to view data in the workspace but not be able to download it. Please provide clear steps, along with links to verify the authenticity of the solution provided.
Copy Files from sharepoint online site to azure datalake storage
Hello, We are trying to set up a flow that will copy files from a SharePoint Online site to Azure Data Lake Storage. As per my understanding, there are 2 options: using ADF to pull the files as mentioned in the link below…
504.0 GatewayTimeout & Invoking Azure function failed with HttpStatusCode - 499.
We've developed an Azure Function in Python that connects to Blob Storage, reads files, and writes into Azure Tables. The process runs fine for small files (less than 100 MB). The problem is that, when…
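Timeouts on large blobs are commonly mitigated by streaming the download in fixed-size chunks instead of reading the whole file into memory. A sketch of that pattern, using an in-memory stream as a stand-in for the blob download stream (the chunk size and helper name are assumptions, not Azure SDK API):

```python
import io

CHUNK_SIZE = 4 * 1024 * 1024  # read 4 MiB at a time instead of the whole blob

def process_in_chunks(stream, handle_chunk):
    """Consume a file-like stream chunk by chunk to keep memory flat."""
    total = 0
    while True:
        chunk = stream.read(CHUNK_SIZE)
        if not chunk:
            return total
        handle_chunk(chunk)
        total += len(chunk)

# In-memory stand-in for a blob download stream:
fake_blob = io.BytesIO(b"x" * (10 * 1024 * 1024))
chunks = []
total = process_in_chunks(fake_blob, chunks.append)
```

For runs that still exceed the HTTP trigger window, moving the work to a queue-triggered function or Durable Functions is the usual next step.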
Alternative Methods for Capturing Data Lake Size in Less Time
Need assistance in capturing the size of the data lake per environment (e.g., Dev, SIT, Prod). Currently, a PowerShell script is used to fetch details, generating a CSV file for each environment with the medallion, folder, subfolder, and size. The…
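The aggregation itself (medallion / folder / subfolder rollup) is cheap; the slow part is enumerating the lake. A sketch of the rollup logic over a local directory tree standing in for the lake (real enumeration would go through the ADLS SDK or a storage inventory export; all names here are illustrative):

```python
import os
import tempfile
from collections import defaultdict

def folder_sizes(root: str) -> dict:
    """Aggregate file sizes (bytes) per top-level subfolder under root."""
    totals = defaultdict(int)
    for dirpath, _dirs, files in os.walk(root):
        rel = os.path.relpath(dirpath, root)
        top = rel.split(os.sep)[0] if rel != "." else "."
        for name in files:
            totals[top] += os.path.getsize(os.path.join(dirpath, name))
    return dict(totals)

# Tiny demo with a throwaway local tree standing in for a medallion layout:
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "bronze", "sales"))
with open(os.path.join(root, "bronze", "sales", "a.bin"), "wb") as f:
    f.write(b"x" * 1024)
sizes = folder_sizes(root)
```

Azure Storage's blob inventory feature can produce the file listing as a scheduled export, which avoids walking the hierarchy from a script at all.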
Transforming JSON files using data flow
Hello! I currently have about 60 JSON files inside a blob container, most of which have different fields and values. I have created a pipeline with a Get Metadata activity that points to the container, with the field list set to Child items. I have…
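When files carry different fields, a common normalization step before any data flow is to project every document onto the union of all fields, filling gaps with nulls. A minimal sketch of that idea (the helper names and sample documents are illustrative):

```python
import json

def union_fields(docs):
    """Collect the union of top-level fields across heterogeneous JSON docs."""
    fields = set()
    for doc in docs:
        fields.update(doc.keys())
    return sorted(fields)

def normalize(doc, fields):
    """Project a document onto the full field list, filling gaps with None."""
    return {f: doc.get(f) for f in fields}

docs = [json.loads(s) for s in ('{"id": 1, "name": "a"}',
                                '{"id": 2, "price": 9.5}')]
fields = union_fields(docs)
rows = [normalize(d, fields) for d in docs]
```

In a mapping data flow, the analogous mechanism is schema drift handling with late-bound columns rather than a fixed projection.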
Data lake solutions
We are in the process of building a data lake, and going further down the line we are getting genuinely confused about whether to go for Delta Lake, a data lakehouse, or Synapse Analytics. Subtle nuances are not making things easier, such as "A Data Lake House merges…
Why function could not find file
Hi there, I built an Azure Function to process JSON data from external requests, save the JSON to a local file, and upload it to the container through the storage client. It worked well locally, but once deployed to Azure, it would prompt that…
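A frequent cause of "file not found" after deployment is that the function app's content directory is read-only (e.g. when running from a deployment package), while the platform temp directory stays writable. A sketch of writing the intermediate file there instead of a relative path (the file name is an assumption):

```python
import json
import os
import tempfile

def save_payload_locally(payload: dict) -> str:
    """Write JSON to the platform temp directory, which remains writable
    on Azure Functions even when the app content itself is read-only."""
    path = os.path.join(tempfile.gettempdir(), "payload.json")
    with open(path, "w", encoding="utf-8") as f:
        json.dump(payload, f)
    return path

path = save_payload_locally({"ok": True})
```

Uploading the bytes directly from memory to the storage client, skipping the local file entirely, is an equally valid fix.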
Data lake schema enforcement
Hello, In a data lake, data is ingested and processed with schema-on-read, that is, data is read in whatever format it comes in from the source. But I read an article that says schema enforcement makes data lakes high-performance and data readable. Please…
How to send a mail notification for a failed pipeline in Azure Synapse Analytics?
How can I send a notification email to a specific email address without using a logic app when one of my Synapse Analytics pipelines fails? I would like to include the error message in the email notification.
How can I restrict someone from downloading any data from a Fabric Lakehouse or Fabric Warehouse
As a data admin, I want to control data access for a user in a Microsoft Fabric Warehouse. The goal is to allow this user, who has a Contributor role, to view data directly in the workspace without being able to download it as a file. This scenario…
Getting Issue while upload files on azure data lake storage
I have two applications. The first, the frontend, is responsible for taking files from local storage and sending them to Azure Functions via an API. The second, the backend, is responsible for taking the files from the multipart form data and…
Azure resource type - microsoft.datalakestore/accounts
Hi Team, There is an Azure resource type, "microsoft.datalakestore/accounts". What exactly are these resources? How are they different from storage accounts (microsoft.storage/storageaccounts)? Thanks. Regards, Nagesh CL
Copy activity failed because you have multiple concurrent copy activities runs writing to the same file
Hi All, I am migrating 1000s of SQL tables from an on-premises SQL Server to Azure Blob Storage. I am using ADF ForEach and Copy activities to do so. However, while processing the tables in parallel/concurrently, I am getting the error below: Failure happened…
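This error typically appears when the sink dataset's file name is static, so every iteration of the ForEach writes to the same blob. Parameterizing the sink path per table avoids the collision. A hedged sketch of the dynamic-content expression for the sink file name, assuming the ForEach iterates over objects with a `name` property (the `.parquet` extension is an assumption):

```
@concat(item().name, '_', formatDateTime(utcNow(), 'yyyyMMddHHmmss'), '.parquet')
```

The timestamp suffix also keeps reruns from clobbering earlier output; dropping it gives one stable file per table instead.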
PATCH method not allowed for Storage Services REST API "Path - Update"
I am trying to use the Set Access Control option described in https://learn.microsoft.com/en-us/rest/api/storageservices/data-lake-storage-gen2?view=rest-storageservices-datalakestoragegen2-2019-12-12. Although I was able to run other methods (such as Path - List)…
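Per the linked documentation, Path - Update is a `PATCH` request with `?action=setAccessControl` and the ACL in the `x-ms-acl` header; a "method not allowed" response often means the HTTP client or an intermediary is refusing the PATCH verb before the request reaches the service. A sketch that just constructs such a request without sending it (account, filesystem, path, and the ACL string are placeholders; authentication headers are omitted):

```python
from urllib.parse import urlencode
import urllib.request

account, filesystem, path = "myaccount", "myfs", "dir/file.txt"  # placeholders
url = (f"https://{account}.dfs.core.windows.net/{filesystem}/{path}?"
       + urlencode({"action": "setAccessControl"}))

req = urllib.request.Request(url, method="PATCH", data=b"")
req.add_header("x-ms-version", "2019-12-12")
req.add_header("x-ms-acl", "user::rwx,group::r-x,other::---")
# urllib.request.urlopen(req) would send it; the chosen HTTP client must
# actually support the PATCH verb for this API to work.
```

If a given tool only offers GET/POST/PUT, switching to a client that supports PATCH (or to the Azure SDK's data lake client) is the usual way out.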
Azure data lake folder structure
Hi, the link provided in the thread below is not working. Is there any way to find the information, or a new URL/link? FAQs About Organizing a Data…