DP-203 Labs
Is there any way to do all the labs for DP-203 without costing a lot of money AND without waiting 10 minutes for setup each time? For each lab, the ./setup.ps1 script takes 10 or 15 minutes to run. I'd like to not have to delete the resources…
SAS token generation by Databricks to access CSV files from ADLS container folder
Hi Team, there are some zipped CSV files inside an ADLS container folder. These zip files need to be downloaded for data correction. Downloading a file requires a SAS token embedded with the zip file's path. Databricks has been used to generate the token and…
Azure data lake storage and evaluation
Hi friends, what do the metrics operations applied in Data Lake Storage calculations mean? There are a bunch of metrics to calculate, which is confusing. Is there an easy way to understand them?
ADLS Storage access forbidden 403 in synapse
Hello, I tried to run the SQL code on the Synapse platform, but it seems I don't have access to it, although I can access the data when running activities like Copy Data. What should I do next to open up the access?
ADLS Gen2 Access Logs via Diagnostic Settings
Hi, I was wondering if there is any way to enable access logs like read, write, delete on Azure Data Lake Storage Gen2 (ADLS Gen2). For Azure Blob Storage we achieved this via Diagnostic Settings, but to me it looks like this doesn't cover any operations…
To Read Delta tables from ADLS Gen2 from Azure Analysis Services (AAS)
I'm not able to read Delta tables from ADLS Gen2 in Azure Analysis Services (AAS), even though the Microsoft Azure page at https://learn.microsoft.com/en-us/azure/analysis-services/analysis-services-datasource#tab1400a lists an option to read Delta…
Getting the size of parquet files from azure blob storage
I have a blob container abcd. The folder structure is like this: abcd/Folder1/Folder a, Folder b, …, Folder z. Inside a particular folder there are files such as Folder a/v1/full/20230505/part12344.parquet and similarly Folder b/v1/full/20230505/part9385795.parquet. The scenario is that I need to get…
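One way to total parquet sizes per folder: `list_blobs` in the `azure-storage-blob` SDK returns blob properties that include a `size` in bytes, so the aggregation itself can be a small pure function over (name, size) pairs. A hedged sketch (the connection string, container name, and sample sizes are placeholders):

```python
from collections import defaultdict


def parquet_sizes_by_folder(blobs):
    """Sum sizes (bytes) of .parquet blobs, grouped by the first path segment.

    `blobs` is any iterable of (name, size) pairs, e.g. derived from list_blobs.
    """
    totals = defaultdict(int)
    for name, size in blobs:
        if name.endswith(".parquet"):
            top = name.split("/", 1)[0]
            totals[top] += size
    return dict(totals)


# With the azure-storage-blob SDK the pairs could be produced like this
# (placeholder connection string and container name):
#   from azure.storage.blob import ContainerClient
#   client = ContainerClient.from_connection_string("<conn-string>", "abcd")
#   pairs = ((b.name, b.size) for b in client.list_blobs())
#   print(parquet_sizes_by_folder(pairs))

# Offline demonstration with made-up sizes:
sample = [
    ("Folder a/v1/full/20230505/part12344.parquet", 104_857_600),
    ("Folder b/v1/full/20230505/part9385795.parquet", 52_428_800),
    ("Folder b/v1/full/20230505/readme.txt", 10),
]
print(parquet_sizes_by_folder(sample))
```

Grouping by a deeper path segment (e.g. `Folder x/v1`) is a one-line change to the `split` call.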
Azure data lake storage and strategy
Hi Friends, we need to store a huge amount of data, around 150 TB, in a data lake. I have the following questions: we might not store the whole 150 TB at one point, but if we do, and use one data lake storage account, would that cause performance issues? What are the…
Encrypting data in transit from Oracle to Azure Data Lake
Hello, we have this implementation set up: we use Synapse Analytics pipelines to extract data from an Oracle OCI database, copy the data to Azure Data Lake Storage, and transform the data from ADLS Gen2 into Azure SQL Database. We want to make sure that: data…
ADLS Gen2 Query Acceleration
Hello, As a follow up to this thread: Query acceleration using parquet does not work with double fields - Microsoft Q&A I would like to know if and when Microsoft plans to enable the query acceleration feature for Parquet files as well for ADLS Gen2…
How to fix a copy activity error while copying data from a Databricks Delta table to the data lake in CSV format
There are some error tables in Databricks Delta. Those tables need to be extracted as CSV and loaded into Azure Data Lake, inside a folder of the container. Staging has been enabled in the copy activity since it is a two-step process. The approximate row count of…
How to read Excel file data (stored in ADLS Gen2) in an Azure Data Factory pipeline
I have a source Excel file which arrives in the format shown. Please help with how to read an Excel file in that specific format. Thank you.
Custom Logs.
I have a CSV file (say "empfile.csv") in my data lake, which is then loaded into Azure Synapse's dedicated SQL pool. Now my data exists in both the data lake and the SQL pool. Suppose I update some field values in this data (say "Manager"…
Where do I find Azure HTTP status 429 response-code examples
Where do I find Azure HTTP status 429 response-code examples for: Azure Service Bus, Azure Event Hubs, Azure Event Grid, Azure Blob Storage, Microsoft Fabric? Azure Blob Storage and MS Fabric use the Azure Data Lake Storage Gen2 API.
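There does not appear to be one page collecting 429 payloads across all of those services, but they signal throttling the same way: an HTTP 429 status, usually with a `Retry-After` header giving a wait in seconds. A generic, hedged retry sketch; the `send` callable is a stand-in for whatever client call is being throttled, and is assumed to return an object with `status_code` and `headers` (e.g. a `requests.Response`):

```python
import time


def call_with_retry(send, max_attempts=5, default_backoff=1.0):
    """Call send() and retry on HTTP 429, honouring the Retry-After header.

    Falls back to exponential backoff when no Retry-After header is present.
    Any non-429 response is returned immediately; after max_attempts the
    last (still throttled) response is returned for the caller to handle.
    """
    resp = None
    for attempt in range(max_attempts):
        resp = send()
        if resp.status_code != 429:
            return resp
        wait = float(resp.headers.get("Retry-After", default_backoff * 2 ** attempt))
        time.sleep(wait)
    return resp
```

The same pattern is what the official Azure SDKs implement internally via their retry policies, so when using those SDKs this handling usually comes for free.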
How to copy files from storage account to another storage account with the same last modified date using ADF
Hi Team, the request is that when we run an ADF copy job to copy data from the landing storage account to the prod storage account, the last-modified date comes out as the time of the copy, but we want to keep the last-modified date we have in the landing…
"This request is not authorized to perform this operation using this permission", 403, HEAD: Synapse connecting to ADLS
I am trying to select data from an ADLS Gen2 delta table and keep receiving this error. I added the Synapse service principal as Storage Blob Data Contributor and added ACLs to the container, with no luck. The firewall is set to allow all networks as well. Please…
Want to migrate from one Synapse workspace to another Synapse workspace
Hi team, we want to migrate from one Synapse workspace to another, one being the dev environment and the other the test environment. Please provide leads. Regards, NagaSri
whitelist the serverless data plane subnets in the cloud region
I am following the instructions below from the documentation to whitelist the serverless data plane subnets in the cloud region of my Databricks workspace, but I am unable to find the ARM resource ID of the serverless compute subnet …
Unable to connect Fivetran with public IP to ADLS Gen2 in Vnet with Private end point
Hi, I'm encountering an issue while trying to connect Fivetran to ADLS Gen2 in the UK South region. Here are the details: I have configured ADLS Gen2 as a destination in Fivetran and completed the prerequisites, including creating an SPN with appropriate…
Accessing Fabric Lakehouse via Logic App
Has anyone had any luck connecting a logic app to a lakehouse in Microsoft Fabric? Some additional background: We have a data pipeline that is ultimately generating a summary of some actions taken within the pipeline. This summary is stored as an Excel…