How to connect to an on-prem SQL Server database using PySpark
Hi Team, could you please tell us how to connect to an on-prem SQL Server using PySpark code? We are trying with both Azure Synapse notebooks and Databricks notebooks. Could you help us connect from both?
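A minimal JDBC read sketch that should work in both a Synapse and a Databricks notebook, assuming the cluster has network line-of-sight to the on-prem server (e.g. via VPN or ExpressRoute); the hostname, database, and credentials below are placeholders:

```python
# A minimal sketch; host, port, database, and credentials are placeholders.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://<onprem-host>:1433;databaseName=<db>;encrypt=true;trustServerCertificate=true")
    .option("dbtable", "dbo.my_table")          # table (or subquery) to read
    .option("user", "<sql-user>")
    .option("password", "<sql-password>")
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)
df.show(5)
```

The Microsoft SQL Server JDBC driver is typically already available on both Databricks runtimes and Synapse Spark pools, so no extra library install is usually needed.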
How does an ADF Linked Service for Azure Databricks align with a job compute policy defined inside Azure Databricks
This is a two-part question. First part context: I have an ADF instance that contains several data pipelines. Some pipelines also include a Databricks notebook as an activity. I have created a Linked Service in Azure Data Factory to facilitate the pipeline, which…
Load data from ADLS to Unity Catalog tables using Azure Data Factory?
What is the best way to load data from ADLS to Unity Catalog tables using Azure Data Factory?
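One common pattern is to have ADF orchestrate a Databricks notebook activity that does the actual write; a minimal sketch of that notebook step, with placeholder paths and table names:

```python
# Hypothetical ADLS path and Unity Catalog table; the cluster must have
# access to the storage location (e.g. via a UC external location).
df = spark.read.format("parquet").load(
    "abfss://raw@<storage-account>.dfs.core.windows.net/sales/2024/"
)
df.write.mode("append").saveAsTable("main.bronze.sales")
```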
I want to give a static public IP address to my Databricks clusters
I could go over the "private" Microsoft network backbone using something like ExpressRoute, connecting to a firewall outside of Databricks. But for now I just need to connect over the public internet to build a proof of concept. I would need to…
Virtual Network settings on a Storage account and their interaction with Databricks Table Monitoring
I have set up my Databricks Unity Catalog on an Azure Data Lake storage account which uses my company's virtual network to allow access. I have all privileges on my account, so I am able to create, alter, or delete catalogs, schemas, and tables using a…
Azure Databricks Too Many Requests errors
We are getting many errors when loading notebooks, and now also when running jobs on clusters, because Databricks reports it has too many requests. For example, we get the error message below: run failed with error message Cluster '0724-103023-f2llqh3p' was…
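For programmatic calls that hit these 429 throttling limits, a common mitigation is retry with exponential backoff; a minimal sketch against the Jobs API (workspace URL and token are placeholders):

```python
# A minimal backoff sketch; workspace URL and token are placeholders.
import time
import requests

def get_with_backoff(url, token, max_retries=5):
    """GET with exponential backoff on HTTP 429 (Too Many Requests)."""
    delay = 1.0
    for _ in range(max_retries):
        resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp.json()
        time.sleep(delay)   # back off before retrying
        delay *= 2
    raise RuntimeError("Still throttled after retries")

runs = get_with_backoff(
    "https://adb-<workspace-id>.azuredatabricks.net/api/2.1/jobs/runs/list",
    "<databricks-pat>",
)
```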
Efficient Log Handling and Data Retention in Azure Data Factory and Databricks
I need to create a solution that sends logs from Azure Data Factory to the Databricks Unity Catalog. I'm considering the following structure: whenever an activity run results in either failure or success, the corresponding log will be sent to Azure Logic…
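For the final landing step, a minimal sketch of a Databricks notebook that appends an incoming log payload to a Unity Catalog Delta table (the payload shape and table name are assumptions, to be matched to the Logic App output):

```python
# Assumed payload shape and table name; adjust to your Logic App output.
import json

payload = '{"run_id": "run-001", "activity": "CopyCustomers", "status": "Succeeded"}'
row = json.loads(payload)

spark.createDataFrame(
    [(row["run_id"], row["activity"], row["status"])],
    "run_id string, activity string, status string",
).write.mode("append").saveAsTable("main.ops.adf_logs")
```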
Creating a Zip File from Blob Storage Using Python in Azure Databricks
Hello, I am working on a task where I need to create a zip file from multiple files stored in blob storage, without re-reading the files or using local storage. I am using Python in Azure Databricks and would like to leverage its capabilities…
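A minimal in-memory sketch using the azure-storage-blob SDK and Python's zipfile module, streaming each blob once into an in-memory buffer (the connection string, container, and prefix are placeholders; note the whole archive is held in driver memory, so this suits moderately sized files):

```python
import io
import zipfile
from azure.storage.blob import ContainerClient

# Placeholders: connection string, container, and source prefix.
container = ContainerClient.from_connection_string(
    conn_str="<connection-string>",
    container_name="source-container",
)

buffer = io.BytesIO()
with zipfile.ZipFile(buffer, mode="w", compression=zipfile.ZIP_DEFLATED) as zf:
    for blob in container.list_blobs(name_starts_with="folder/"):
        # Stream each blob once, straight into the in-memory archive.
        zf.writestr(blob.name, container.download_blob(blob.name).readall())

buffer.seek(0)
container.upload_blob(name="archive.zip", data=buffer, overwrite=True)
```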
Cluster Not Created in Pay-As-You-Go Subscription
I have a pay-as-you-go subscription and have created my Databricks workspace in the US East region. Whatever cluster I create, single node or multi node, I face a quota-limit-exceeded error every time. What is the best available cluster in the US East region,…
Transformation changes in the silver and gold layers
Hi, what transformations take place between the silver and gold layers? That is, I have loaded data into the bronze layer and transformed it there... then what happens between the silver and gold layers, apart from PK/FK joins and the like?
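By way of illustration, silver-to-gold is typically where business-level aggregates and dimensional modeling happen on top of the cleaned silver data; a minimal sketch with hypothetical table and column names:

```python
# Hypothetical table names: aggregate cleaned silver data into a gold table.
from pyspark.sql import functions as F

silver = spark.table("main.silver.orders")

gold = (
    silver
    .groupBy("customer_id", F.date_trunc("month", "order_ts").alias("order_month"))
    .agg(
        F.sum("amount").alias("monthly_revenue"),        # business metric
        F.countDistinct("order_id").alias("order_count"),
    )
)

gold.write.mode("overwrite").saveAsTable("main.gold.monthly_revenue")
```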
CORS Issues Between Azure Static Web Apps and Azure Databricks
Hi, I'm currently facing some CORS issues in my application setup and would like your opinion on how to solve them. Front end: an Angular application deployed on Azure Static Web Apps (.azurestaticapps.net). Endpoint to access: model serving…
What is the best way to access data in Databricks by using an Azure Function?
I just tried to load data from Databricks by using the Databricks Jobs API and an Azure Function. Could you let me know whether there is another way to do the same thing based on an Azure Function?
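One alternative to the Jobs API is the Databricks SQL Statement Execution API, which an Azure Function can call directly to fetch query results; a minimal sketch with placeholder workspace URL, warehouse ID, and token:

```python
import requests

# Placeholders: workspace URL, personal access token, SQL warehouse id.
WORKSPACE = "https://adb-<workspace-id>.azuredatabricks.net"
TOKEN = "<databricks-pat>"

resp = requests.post(
    f"{WORKSPACE}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "statement": "SELECT * FROM main.gold.monthly_revenue LIMIT 100",
        "warehouse_id": "<sql-warehouse-id>",  # must be a running SQL warehouse
        "wait_timeout": "30s",                 # wait synchronously up to 30s
    },
)
resp.raise_for_status()
print(resp.json().get("result"))
```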
Getting the size of Parquet files from Azure Blob Storage
I have a blob container abcd. The folder structure is like below: abcd/Folder1/Folder a, Folder b, …, Folder z. Inside a particular folder: Folder a/v1/full/20230505/part12344.parquet, and similarly Folder b/v1/full/20230505/part9385795.parquet. The scenario is that I need to get…
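A minimal sketch that walks the folder tree from a Databricks notebook and sums Parquet file sizes via dbutils.fs.ls (the abfss path and account name are placeholders):

```python
# Placeholder abfss root; requires the cluster to have storage access.
def parquet_bytes(path):
    """Recursively sum the sizes of all .parquet files under path."""
    total = 0
    for info in dbutils.fs.ls(path):
        if info.isDir():
            total += parquet_bytes(info.path)
        elif info.path.endswith(".parquet"):
            total += info.size
    return total

root = "abfss://abcd@<storage-account>.dfs.core.windows.net/Folder1"
print(f"Total parquet size: {parquet_bytes(root) / 1024**2:.1f} MiB")
```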
Azure Databricks Billing
I am confused about how the Databricks service is billed under Azure. The documentation says that Databricks is fully integrated with Azure billing: one bill covering both the Azure infrastructure (VMs, storage, network traffic, etc.) and the Databricks…
Kafka Connector for Databricks
Can you forward me documentation on using the Kafka connector in an Apache Spark Streaming job to connect to Azure Event Hubs? I am looking for the Maven library version, etc.
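The built-in Spark Kafka source (Maven coordinate org.apache.spark:spark-sql-kafka-0-10_2.12, matched to your Spark version) can talk to the Event Hubs Kafka-compatible endpoint on port 9093; a minimal Structured Streaming sketch with placeholder namespace, hub name, and connection string:

```python
# Placeholders: Event Hubs namespace, event hub name, connection string.
connection = "Endpoint=sb://<namespace>.servicebus.windows.net/;..."

df = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
    .option("subscribe", "<event-hub-name>")   # the event hub acts as the Kafka topic
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option(
        "kafka.sasl.jaas.config",
        'org.apache.kafka.common.security.plain.PlainLoginModule required '
        f'username="$ConnectionString" password="{connection}";',
    )
    .load()
)

df.selectExpr("CAST(value AS STRING)").writeStream.format("console").start()
```

Note that on Databricks the Kafka source is bundled with the runtime, so the Maven coordinate mainly matters for plain Spark deployments.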
Capturing logs from Azure Data Factory and inserting them into a Delta table in Databricks
Good morning, I need assistance in creating a project that captures logs from Azure Data Factory and inserts them into a Delta table in Databricks. The key requirements for this project are as follows: No duplicate logs: ensuring that the logs are not…
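For the no-duplicates requirement, a Delta MERGE keyed on the activity run id is one way to make the insert idempotent; a minimal sketch with a placeholder target table, schema, and sample incoming row:

```python
from delta.tables import DeltaTable

# Placeholder target table and a sample incoming batch.
logs = DeltaTable.forName(spark, "main.ops.adf_activity_logs")
incoming = spark.createDataFrame(
    [("run-001", "CopyCustomers", "Succeeded")],
    "activity_run_id string, activity_name string, status string",
)

(
    logs.alias("t")
    .merge(incoming.alias("s"), "t.activity_run_id = s.activity_run_id")
    .whenNotMatchedInsertAll()   # insert only unseen run ids: no duplicates
    .execute()
)
```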
How to fetch data from Azure Active Directory (AD) by using either ADF or Databricks
Please let me know in detail how to fetch data from Azure Active Directory (AD) using either Azure Data Factory (ADF) or Azure Databricks. Thanks.
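On the Databricks side, one route is querying Microsoft Graph from a notebook with a client-credentials app registration; a minimal sketch (tenant and client ids are placeholders, and the app registration needs an application permission such as User.Read.All):

```python
import msal
import requests

# Placeholders: tenant id, app (client) id, client secret.
app = msal.ConfidentialClientApplication(
    client_id="<app-client-id>",
    client_credential="<client-secret>",
    authority="https://login.microsoftonline.com/<tenant-id>",
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

resp = requests.get(
    "https://graph.microsoft.com/v1.0/users",   # first page; follow @odata.nextLink for more
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
users = resp.json()["value"]
df = spark.createDataFrame(
    [(u["id"], u.get("displayName")) for u in users],
    "id string, display_name string",
)
```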
Data Factory monitoring by inserting into a data table
Hello, I would like to know the best way to insert Data Factory activity logs into my Databricks Delta table, so that I can use a dashboard and build monitoring in Databricks itself. Can you help me? I would like, every 5 minutes, for all activity logs in…
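For the 5-minute cadence, one option is a Databricks job scheduled every 5 minutes that polls ADF for recently updated runs via the azure-mgmt-datafactory SDK and then merges them in (all names below are placeholders; the deduplicating write can reuse the MERGE pattern sketched earlier):

```python
from datetime import datetime, timedelta, timezone
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

# Placeholders: subscription, resource group, and factory name.
client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

now = datetime.now(timezone.utc)
runs = client.pipeline_runs.query_by_factory(
    "<resource-group>",
    "<factory-name>",
    RunFilterParameters(
        last_updated_after=now - timedelta(minutes=5),
        last_updated_before=now,
    ),
)
rows = [(r.run_id, r.pipeline_name, r.status) for r in runs.value]
spark.createDataFrame(rows, "run_id string, pipeline_name string, status string") \
    .createOrReplaceTempView("new_runs")   # then MERGE into the Delta log table
```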
How to use Databricks AI to auto-generate data definitions for all the tables in my database?
I know we can go to the catalog in Databricks and generate data definitions for columns in our database using AI, but is there a way to generate these definitions automatically, without having to manually generate them and click accept on every…
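One possible automation is scripting the comments yourself with the ai_query() SQL function instead of the catalog UI; a heavily hedged sketch, assuming a workspace with a model serving endpoint available (catalog, schema, and endpoint names are placeholders):

```python
# Placeholders: catalog/schema and the model serving endpoint name.
tables = [r.tableName for r in spark.sql("SHOW TABLES IN main.silver").collect()]

for t in tables:
    for c in spark.table(f"main.silver.{t}").columns:
        comment = spark.sql(f"""
            SELECT ai_query(
                '<model-serving-endpoint>',
                'Write a one-sentence data definition for column {c} of table {t}.'
            ) AS comment
        """).first()["comment"]
        safe = comment.replace("'", "''")   # escape quotes before embedding in DDL
        spark.sql(f"ALTER TABLE main.silver.{t} ALTER COLUMN {c} COMMENT '{safe}'")
```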
SAS token generation in Databricks to access CSV files in an ADLS container folder
Hi Team, there are some zipped CSV files inside an ADLS container folder. These zip files need to be downloaded for data correction. Downloading a file requires a SAS token embedded with the zip file path. Databricks has been used to generate the token and…
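A minimal token-generation sketch with the azure-storage-blob SDK, assuming account-key auth is available to the notebook (the account, container, key, and blob path are placeholders):

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

# Placeholders: account, container, key, and the zip file's blob path.
account = "<storage-account>"
container = "<container>"
blob_path = "folder/data_20230505.zip"

sas = generate_blob_sas(
    account_name=account,
    container_name=container,
    blob_name=blob_path,
    account_key="<account-key>",
    permission=BlobSasPermissions(read=True),   # read-only download token
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)
print(f"https://{account}.blob.core.windows.net/{container}/{blob_path}?{sas}")
```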