@Henrique Girundi Thank you for reaching out to Microsoft Q&A. I understand that you are having issues reading/writing data from Databricks to an Azure storage account, where you see the error: Operation failed: "This endpoint does not support BlobStorageEvents or SoftDelete."
As per the Managed identities for Azure resources authentication documentation:
- If soft delete is enabled on your blob account, system-assigned/user-assigned managed identity authentication is not supported in Data Flow.
- If you access blob storage through a private endpoint using Data Flow, note that when system-assigned/user-assigned managed identity authentication is used, Data Flow connects to the ADLS Gen2 endpoint instead of the Blob endpoint. Make sure you create the corresponding private endpoint in ADF to enable access.
- System-assigned/user-assigned managed identity authentication is supported only by the "AzureBlobStorage" type linked service, not by the legacy "AzureStorage" type linked service.
As per this related thread: https://learn.microsoft.com/en-us/answers/questions/472879/azure-data-factory-data-flow-with-managed-identity.html, here are two possible solutions to this issue:
Option 1: Disable the soft delete option on the storage account.
Option 2: Change the linked service type for the source from Azure Data Lake Storage Gen2 to Azure Blob Storage.
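For Option 2, here is a minimal sketch of what the updated linked service definition might look like, assuming a system-assigned managed identity; the linked service name and the `<yourStorageAccount>` value are placeholders you would replace with your own:

```json
{
    "name": "AzureBlobStorageLinkedService",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            "serviceEndpoint": "https://<yourStorageAccount>.blob.core.windows.net/"
        }
    }
}
```

When only `serviceEndpoint` is supplied (no account key or SAS), Data Factory authenticates with its managed identity; make sure that identity has been granted an appropriate role (such as Storage Blob Data Contributor) on the storage account.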
Hope this helps. Please let us know if you have any more questions and we will be glad to assist you further. Thank you!
Remember:
Please accept an answer if correct. Original posters help the community find answers faster by identifying the correct answer. Here is how.
Want a reminder to come back and check responses? Here is how to subscribe to a notification.