Connect to Blob storage from Azure Databricks SQL
I would like to read a CSV file from Azure Blob Storage in my own account and load it into a table in Unity Catalog on Databricks (ideally using SQL). I have tried this SQL command: CREATE TABLE IF NOT EXISTS <table_name>; COPY…
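For this pattern, Databricks SQL's `COPY INTO` can load a CSV from Blob Storage into a schemaless Unity Catalog table. A minimal sketch, assuming hypothetical catalog/schema/table names and a hypothetical `abfss://` path (substitute your own, and note the storage location must be reachable via an external location or storage credential):

```sql
-- Hypothetical names: my_catalog.my_schema.my_table and the abfss:// path below
CREATE TABLE IF NOT EXISTS my_catalog.my_schema.my_table;

COPY INTO my_catalog.my_schema.my_table
FROM 'abfss://my-container@mystorageaccount.dfs.core.windows.net/path/to/file.csv'
FILEFORMAT = CSV
FORMAT_OPTIONS ('header' = 'true', 'inferSchema' = 'true')
-- mergeSchema lets COPY INTO populate the schema of the empty, schemaless table
COPY_OPTIONS ('mergeSchema' = 'true');
```

When the target table is created without columns, `COPY_OPTIONS ('mergeSchema' = 'true')` is what allows the schema to be taken from the file.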
Integrating Databricks notebooks in Azure ML using SDK V2
Hi all, We currently have some Azure Databricks notebooks in production which we would like to integrate into Azure ML using the v2 SDK. I found resources for integrating these notebooks using the databricks_step in the v1 SDK. The official documentation…
Databricks Dev/Prod setup
We are a data team of 4 people. To make the process easier and more productive, can we separate dev/prod environments at the Databricks catalog level rather than the workspace level? Can anyone share any thoughts on this? Thanks
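Catalog-level separation in a single workspace is typically done with one catalog per environment plus Unity Catalog grants. A rough sketch, assuming hypothetical catalog, group, and service-principal names:

```sql
-- Hypothetical catalog and principal names
CREATE CATALOG IF NOT EXISTS dev;
CREATE CATALOG IF NOT EXISTS prod;

-- Engineers get broad rights in dev only
GRANT ALL PRIVILEGES ON CATALOG dev TO `data-engineers`;

-- In prod, engineers can read; only a deployment service principal writes
GRANT USE CATALOG, USE SCHEMA, SELECT ON CATALOG prod TO `data-engineers`;
GRANT ALL PRIVILEGES ON CATALOG prod TO `deploy-sp`;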
When creating a second external location to the same path in Azure Databricks Unity Catalog, it gives a conflicting-path error. Is there any way to solve this?
Hello Team, Creating a second external location/external volume pointing to the same path (a different folder, or the root location) in Azure Databricks Unity Catalog gives a path conflict error (see below for details). Is there any…
org.apache.hadoop.fs.FileAlreadyExistsException: Failed to rename temp file
[Repeat question due to old thread] We have built a streaming pipeline with Spark Auto Loader. The source folder is an Azure Blob container. We've encountered a rare issue (we could not replicate it). Below is the exception…
Issues while writing into bad_records path in Databricks
Hello All, I would like to get your input on a scenario I see while writing to the bad_records file. I am reading a ‘Ԓ’-delimited CSV file against a schema that I have already defined. I have enabled error handling while reading the file to…
My dev, test, and prod environments are in different resource groups of the same subscription. How do I create a DevOps pipeline in this case?
Hi, My dev, test, and prod environments are in different resource groups of the same subscription. I am involved in a data engineering project where I will primarily be using the resources below: ADLS (data storage), ADF (orchestration), Azure Databricks (QC…
The SCIM API is by default adding users to the admins group in Azure Databricks
Hi, When we invoke the SCIM API in Azure Databricks, it adds users to the admins group by default, and after deleting users from only the admins group, they are created again. Calling the SCIM API to add groups as users also adds them…
Azure Databricks - User doesn't have permission to perform this action while connecting to Azure Synapse Dedicated Pool
We are connecting to an Azure Synapse Analytics dedicated pool using PySpark code that runs from Azure Databricks with SQL authentication. While running, we get the error below when we use a user with the db_datawriter and db_datareader…
How to ignore records in ADF Data Flows
Hi All, I am building a data transformation using mapping data flows. I have a timestamp field, like TimeStampUpdated, in the target table. I want to look up historical data against the incremental data and ignore the records coming in the…
How can I connect Azure Databricks to Neo4j?
Hello, I want to connect to Neo4j from Azure Databricks. What different approaches do I have? I am trying to connect here and I get the following error. Do I need to do anything before running the code? I mean, set up a managed identity or enable…
"Premium Automated Serverless Compute - Promo DBU" expenses arise from what, how can I disable it, and why are the costs so high?
"Premium Automated Serverless Compute - Promo DBU" expenses arise from what, how can I disable it, and why are the costs so high? detail in below
Databricks Simba Spark ODBC .NET 8 C# driver parameters in SQL queries
Hello, I'm using Simba ODBC driver v2.8.0 to query data from my Azure Databricks SQL warehouse into a .NET 8 ASP.NET API app. The ODBC driver works fine with a plain-text query, but I need to parameterize the query. Searching around, I found that it…
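ODBC drivers generally take positional `?` ("qmark") parameter markers, with the values supplied separately from the SQL text; in .NET that means `OdbcCommand` with `OdbcParameter`s added in order (whether the Simba Spark driver executes these as true server-side prepared statements depends on the driver version). The sketch below illustrates the qmark pattern itself using Python's built-in sqlite3 as a stand-in database; the table and values are hypothetical:

```python
import sqlite3

# Hypothetical in-memory table standing in for the SQL warehouse;
# the '?' positional markers are the same qmark style ODBC uses.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "EU", 10.0), (2, "US", 20.0), (3, "EU", 30.0)],
)

# Parameter values are passed separately from the SQL text,
# so there is no string concatenation and no injection risk.
rows = conn.execute(
    "SELECT id, amount FROM orders WHERE region = ? AND amount > ?",
    ("EU", 15.0),
).fetchall()
print(rows)  # [(3, 30.0)]
```

The key point is that parameters are bound by position, so the order of the supplied values must match the order of the `?` markers in the statement.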
Access issue with app registration
I've created a Databricks workspace and a new notebook, but I don't have access to the secret keys under the app registration, which are disabled for me. How can I solve this issue? Warning message: "You do not have access. Your administrator has disabled the…"
Azure Databricks exercise error
Keep receiving the error "No such file or directory /your_correct_source_value/wikipedia/pagecounts/staging_parquet_en_only_clean". When I checked Wikipedia, it appears this dataset has been deprecated since 2016-08-01. Could a new dataset be…
Deploying Azure Databricks with a data lake
Deploying Azure Databricks creates an additional resource group in the background, which includes a data lake. Is it possible to use the data lake that I have already deployed in Azure instead of the one provisioned by Azure Databricks?
Access to C:\Data not allowed. Error code 22853
Access to C:\Data is not allowed, error code 22853. Is there any way to work around this?
How to specify a custom catalog name for Azure Databricks Delta Lake Dataset in ADF
Hello, I am creating an Azure Databricks Delta Lake dataset in ADF, and I am only able to choose a database name that links to Databricks' hive_metastore. How can I specify a custom catalog name that I created in Databricks instead of…
How to Create a Delta Table in Azure Synapse Analytics with an Auto-Increment Identity Column?
I have created Delta tables in ADLS using a Synapse notebook, and in one of those tables I want to add an identity column (auto-increment 1,1), but I am not able to. Below are my CREATE TABLE script and the error I am facing. Table…
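For reference, Delta Lake identity columns use `GENERATED ALWAYS AS IDENTITY`; this syntax is supported on Databricks Runtime, but whether it works in a Synapse Spark pool depends on the Delta version that runtime ships, which may explain the error. A sketch with hypothetical table and path names:

```sql
-- Databricks Delta syntax (hypothetical names); Synapse's Spark runtime
-- may reject identity columns, in which case monotonically_increasing_id()
-- or a row_number() window is the usual workaround.
CREATE TABLE my_schema.my_table (
  Id BIGINT GENERATED ALWAYS AS IDENTITY (START WITH 1 INCREMENT BY 1),
  Name STRING
) USING DELTA
LOCATION 'abfss://my-container@mystorageaccount.dfs.core.windows.net/delta/my_table';
```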
Restricting file/folder uploads to external volumes in an Azure Databricks UC workspace
Hello Team, Is there a way to restrict files or folders from being uploaded to or downloaded from external volumes, as with DBFS? Is there any option to disable the upload-files/folders feature in external volumes of an Azure Databricks workspace with Unity Catalog?…