I need to verify my storage account's total capacity limit and check if I am reaching my quota. How can I confirm the maximum storage allocation and expand it if necessary?
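One way to check how close an account is to its limits is to query its UsedCapacity metric through Azure Monitor. A minimal sketch, assuming the azure-mgmt-monitor and azure-identity packages; the subscription ID, resource group, and account name are placeholders:

```python
# Sketch: read a storage account's UsedCapacity metric (in bytes) via
# Azure Monitor. Subscription, resource group, and account are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient

client = MonitorManagementClient(DefaultAzureCredential(), "<subscription-id>")
resource_uri = (
    "/subscriptions/<subscription-id>/resourceGroups/my-rg"
    "/providers/Microsoft.Storage/storageAccounts/mystorageacct"
)
metrics = client.metrics.list(
    resource_uri,
    metricnames="UsedCapacity",
    aggregation="Average",
)
for metric in metrics.value:
    for series in metric.timeseries:
        for point in series.data:
            print(point.time_stamp, point.average)
```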
I want to restrict Azure ADLS SFTP access to the directory level.
I want to create SFTP access for 5 users and maintain all the SFTP folders in one container.
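For reference, SFTP access scoped to a folder can be set up by creating local users whose home directory points inside one shared container. A hedged sketch, assuming the azure-mgmt-storage package; the resource group, account, container, and user names are placeholders:

```python
# Sketch: create an SFTP local user whose home directory is a folder
# inside one shared container. All names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import LocalUser, PermissionScope

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")
user = LocalUser(
    home_directory="sftp-container/user1",  # per-user folder in one container
    has_ssh_password=True,
    permission_scopes=[
        PermissionScope(
            permissions="rwld", service="blob", resource_name="sftp-container"
        )
    ],
)
client.local_users.create_or_update("my-rg", "mystorageacct", "user1", user)
```

Repeating this for each of the five users with a different home_directory keeps all SFTP folders in a single container; note the permission scope itself is container-level, so the per-folder separation comes from the home directory.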
A PySpark dataframe is taking too long to save to ADLS from Databricks.
I'm running a notebook on Azure Databricks using a multi-node cluster with 1 driver and 1-8 workers (each with 16 cores and 56 GB RAM), reading source data from Azure ADLS that has 30K records. The notebook consists of a few transformation steps, also…
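For a dataset this small, write time is often dominated by the number of output partitions rather than the data volume. A minimal sketch of one common mitigation; the abfss paths and the filter stand in for the real source and transformations:

```python
# Sketch: reduce the partition count before writing a small dataset to
# ADLS. Paths and the filter expression are placeholders.
df = spark.read.parquet("abfss://data@mystorageacct.dfs.core.windows.net/source/")
transformed = df.filter("amount > 0")  # stand-in for the real transformations

(transformed
    .coalesce(8)          # avoid writing hundreds of tiny files for 30K rows
    .write
    .mode("overwrite")
    .parquet("abfss://data@mystorageacct.dfs.core.windows.net/output/"))
```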
How to use a linked service in a Notebook with PySpark
I have a PySpark script in a Notebook that reads and writes data in ADLS Gen2. Below is a sample of the PySpark script. But in Synapse I only have a linked service, created with a service principal, that can connect to the ADLS Gen2, so I need to specify in…
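Synapse's token library can route storage authentication through a linked service via Spark configuration. A sketch of that pattern; the linked service name, container, and account are placeholders:

```python
# Sketch: point Spark at a Synapse linked service for ADLS Gen2 auth.
# "MyLinkedService" and the abfss path are placeholders.
spark.conf.set("spark.storage.synapse.linkedServiceName", "MyLinkedService")
spark.conf.set("fs.azure.account.auth.type", "SAS")
spark.conf.set(
    "fs.azure.sas.token.provider.type",
    "com.microsoft.azure.synapse.tokenlibrary.LinkedServiceBasedSASProvider",
)

df = spark.read.csv("abfss://mycontainer@mystorageacct.dfs.core.windows.net/path/")
```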
How to specify a custom catalog name for Azure Databricks Delta Lake Dataset in ADF
Hello, I am creating an Azure Databricks Delta Lake Dataset in ADF and I am only able to choose the database name that links to Databricks's hive_metastore. How can I specify a custom catalog name that I created in Databricks instead of…
How to control access to a folder in an ADLS Gen2 container while storage account IAM role assignments are in effect
Hi, I have a Synapse pipeline that saves an output file in a folder (e.g., salary) in an ADLS container (e.g., employee). Now Mr. X wants the data saved in the folder to be accessible only to him, but storage-account-level IAM role assignments have already given access to…
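Worth noting: Azure evaluates RBAC role assignments before ACLs, so an account-level data-plane role bypasses folder ACLs; ACLs only restrict callers not already authorized by RBAC. With that caveat, a folder ACL can be set as in this sketch, assuming the azure-storage-file-datalake package; the account, container, folder, and object ID are placeholders:

```python
# Sketch: grant one AAD principal rwx on a folder via POSIX ACLs.
# Account URL, container, folder, and object ID are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://mystorageacct.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
directory = service.get_file_system_client("employee").get_directory_client("salary")
directory.set_access_control(
    acl="user::rwx,group::---,other::---,user:<mr-x-object-id>:rwx"
)
```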
Medallion architecture in ADLS
I am trying to find the most suitable storage architecture for the following use case: I have several clients and need isolated storage so data cannot be mixed up; I work with 3 different environments for each client (dev, pre, pro); and I need to…
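One plausible layout under those constraints (purely illustrative names): a storage account per environment, a container per client for isolation, and medallion layers as top-level folders:

```
mystorageacct-dev            # one account per environment: dev / pre / pro
├── client-a                 # one container per client, isolating their data
│   ├── bronze/
│   ├── silver/
│   └── gold/
└── client-b
    ├── bronze/
    ├── silver/
    └── gold/
```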
Connecting to Azure Data Lake Storage Gen2 from Tableau Desktop throws an error
Error received in Tableau Desktop: Tableau received an OAuth error from your request. Please see the error message for more information: User authorization failed (invalid_client). When I checked the auth URL, it seems to be like the one below -…
Unable to create a directory named with a single space character using the ADLS Gen2 REST API
I am unable to create a directory in "Azure Data Lake Storage Gen2" that is named just a single space character, despite reviewing the documentation and finding no indication that this is a disallowed name. My primary access to "Azure…
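For reproduction, the attempt presumably looks something like this Path Create call with the name percent-encoded; the account, container, and token are placeholders:

```python
# Sketch: attempt to create a directory whose name is a single space
# via the ADLS Gen2 Path Create REST API. All values are placeholders.
import requests

url = "https://mystorageacct.dfs.core.windows.net/mycontainer/%20?resource=directory"
response = requests.put(
    url,
    headers={"Authorization": "Bearer <token>", "x-ms-version": "2021-08-06"},
)
print(response.status_code, response.text)
```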
Generate SAS tokens from service principal credentials
I am working on creating a Java client that generates SAS tokens for given service principal credentials. I am taking a reference from…
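The asker targets Java, but the flow is the same in any SDK: authenticate as the service principal, obtain a user delegation key, then sign a SAS with it. A Python sketch of that flow; the tenant, client, account, container, and blob names are placeholders:

```python
# Sketch: user delegation SAS from service principal credentials.
# All IDs, secrets, and names are placeholders.
from datetime import datetime, timedelta, timezone
from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobServiceClient, BlobSasPermissions, generate_blob_sas

cred = ClientSecretCredential(
    tenant_id="<tenant-id>", client_id="<client-id>", client_secret="<secret>"
)
service = BlobServiceClient(
    "https://mystorageacct.blob.core.windows.net", credential=cred
)

start = datetime.now(timezone.utc)
expiry = start + timedelta(hours=1)
key = service.get_user_delegation_key(start, expiry)  # signed by the SP's token

sas = generate_blob_sas(
    account_name="mystorageacct",
    container_name="data",
    blob_name="report.csv",
    user_delegation_key=key,
    permission=BlobSasPermissions(read=True),
    expiry=expiry,
)
```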
Need to extract zreports from SAP HANA
Hi, I have a use case where I want to extract data from SAP HANA. The use case is as follows: I have a SAP HANA deployment from which I need to extract data. The data is stored in zreports, which are extracted using T-Codes. Now, I want to extract the data…
We want to know details about Azure Local: can we use Databricks, ADLS Gen2, Data Factory, and Machine Learning on Azure Local?
ADF Copy Data JSON Source dynamic schema mapping
Hi, I am working on an ADF Copy Data activity. An HTTP Dataset returns JSON with the following sample output: { "totalRowCount": 1, "data": [ { "ProductCode": "P - 1", …
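When the rows sit under a nested array like data, ADF's tabular translator can point at it with collectionReference. A sketch of such a mapping; any field names beyond the sample are assumptions:

```json
{
    "type": "TabularTranslator",
    "collectionReference": "$['data']",
    "mappings": [
        {
            "source": { "path": "$['ProductCode']" },
            "sink": { "name": "ProductCode" }
        }
    ]
}
```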
How to fix the "Failed to upload block This request is not authorized to perform this operation." error while writing a CSV to ADLS Gen2 using a Synapse notebook
I am trying to write a dataframe into a CSV file on ADLS Gen2 using a Synapse Notebook. I am using pandas to_csv as below. The 'linkedService': 'xxxxx' uses a system-assigned managed identity which has the "Storage Blob Data Contributor" role…
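For context, the write pattern in question presumably resembles this sketch (the path and linked-service key mirror the excerpt and are placeholders); the error itself typically points at missing RBAC on the storage account or a firewall blocking the Synapse workspace rather than the code:

```python
# Sketch mirroring the question: pandas writing straight to ABFSS from
# a Synapse notebook. Path and linked service name are placeholders.
import pandas as pd

df = pd.DataFrame({"id": [1, 2], "name": ["a", "b"]})
df.to_csv(
    "abfss://mycontainer@mystorageacct.dfs.core.windows.net/out/result.csv",
    storage_options={"linkedService": "xxxxx"},
    index=False,
)
```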
I am unable to load a table to a Fabric Lakehouse
I am unable to load tables to a Fabric Lakehouse. I was having no issues until recently, but now when I publish a table from a Gen2 dataflow, the table in the Lakehouse has all blank rows. In the SQL endpoint they are all null values.
How to write a query on Azure to get the information on which SPN is linked with which folders present in storage containers
Hi, there are multiple folders present in our Azure storage containers, and each folder is linked with some SPN. I am looking for a script to find which SPN is associated with which folders.
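A starting point could be to enumerate each directory's ACL and match the object IDs in its entries against service principals in Entra ID. A hedged sketch, assuming the azure-storage-file-datalake package; the account and container names are placeholders:

```python
# Sketch: print every directory's ACL in a container; the object IDs in
# the ACL entries can be resolved to SPNs. Names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    "https://mystorageacct.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("mycontainer")
for path in fs.get_paths(recursive=True):
    if path.is_directory:
        acl = fs.get_directory_client(path.name).get_access_control()["acl"]
        print(path.name, "->", acl)
```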
Monitor AutoResolve Integration Runtime in Azure Data Factory
Hi all, I need to monitor Integration Runtime performance for a pipeline that uses a Stored Procedure activity to load data from ADLS to Snowflake. The stored procedure in this case resides in Snowflake itself and it's called from the Snowflake tenant to…
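One programmatic angle is to pull activity-run durations for the pipeline through the ADF management SDK as a proxy for runtime performance. A sketch, assuming the azure-mgmt-datafactory package; the subscription, resource group, factory, and run ID are placeholders:

```python
# Sketch: query activity-run durations for one pipeline run via the ADF
# management SDK. All names and IDs below are placeholders.
from datetime import datetime, timedelta, timezone
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
filters = RunFilterParameters(
    last_updated_after=datetime.now(timezone.utc) - timedelta(days=1),
    last_updated_before=datetime.now(timezone.utc),
)
runs = client.activity_runs.query_by_pipeline_run(
    "my-rg", "my-adf", "<pipeline-run-id>", filters
)
for run in runs.value:
    print(run.activity_name, run.status, run.duration_in_ms)
```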
ErrorCode=AdlsGen2OperationFailed...Failure happened on 'Sink' side.
Getting the below error while copying data to ADLS from SAP HANA; however, the error occurs randomly, with different tables failing daily. When I try to load each of the failed tables from the same source and with the same file name, there is no issue and it works as…
I am integrating a Data Flow activity that removes duplicates from CSV files into my existing pipeline. I need help configuring the parameters.
I created a Data Flow activity that removes duplicates from the source files (CSV-formatted files). I have now integrated this Data Flow activity into my existing pipeline, but I am not able to parameterize the source files. It would be helpful if I get…
DP-203 Labs
Is there any way to do all the labs for DP-203 without spending a lot of money AND without waiting around 10 minutes for setup each time? For each lab, the ./setup.ps1 file takes 10 or 15 minutes to run. I'd like to not have to delete the resources…