Captures logs from Azure Data Factory and inserts them into a Delta Table in Databricks
Good morning, I need assistance in creating a project that captures logs from Azure Data Factory and inserts them into a Delta Table in Databricks. The key requirements for this project are as follows: No Duplicate Logs: Ensuring that the logs are not…
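The "No Duplicate Logs" requirement can be handled by filtering fetched runs against the run IDs already loaded into the Delta table. A minimal sketch, assuming runs carry ADF's `runId` field (the helper name `filter_new_runs` is hypothetical):

```python
def filter_new_runs(fetched_runs, existing_run_ids):
    """Keep only runs whose runId has not been loaded yet.

    fetched_runs: list of dicts as returned by the ADF pipeline-runs API.
    existing_run_ids: set of runId values already in the Delta table.
    """
    return [r for r in fetched_runs if r["runId"] not in existing_run_ids]

runs = [
    {"runId": "a1", "pipelineName": "ingest", "status": "Succeeded"},
    {"runId": "b2", "pipelineName": "ingest", "status": "Failed"},
]
new_runs = filter_new_runs(runs, existing_run_ids={"a1"})
print([r["runId"] for r in new_runs])  # → ['b2']
```

In practice the same idempotency can also be enforced at write time with a Delta `MERGE` keyed on `runId`, so reruns of the job never insert a row twice.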
How to fetch data from Azure Active Directory (AD) using either ADF or Databricks
I would like to know in detail how to fetch data from Azure Active Directory (AD) using either Azure Data Factory (ADF) or Azure Databricks. Thanks.
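From Databricks, the usual route is the Microsoft Graph API with an app registration and the client-credentials flow. A sketch under those assumptions (tenant/client values are placeholders, and the `fetch_users` helper is illustrative; `msal` and `requests` are imported lazily so the sketch loads without them):

```python
GRAPH = "https://graph.microsoft.com/v1.0"

def graph_users_url(select=("id", "displayName", "mail")):
    """Build the Graph /users URL with a $select projection."""
    return f"{GRAPH}/users?$select={','.join(select)}"

def fetch_users(tenant_id, client_id, client_secret):
    """Acquire an app-only token and list users from Azure AD."""
    import msal      # lazy imports: only needed when actually calling Graph
    import requests

    app = msal.ConfidentialClientApplication(
        client_id,
        authority=f"https://login.microsoftonline.com/{tenant_id}",
        client_credential=client_secret,
    )
    token = app.acquire_token_for_client(
        scopes=["https://graph.microsoft.com/.default"])
    resp = requests.get(
        graph_users_url(),
        headers={"Authorization": f"Bearer {token['access_token']}"})
    resp.raise_for_status()
    return resp.json()["value"]

print(graph_users_url())
# → https://graph.microsoft.com/v1.0/users?$select=id,displayName,mail
```

ADF can call the same endpoint with a REST or HTTP linked service; the app registration needs a Graph application permission such as `User.Read.All` with admin consent.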
Data Factory monitoring by inserting data table
Hello, I would like to know the best way to insert Data Factory activity logs into my Databricks Delta table, so that I can build a dashboard and create monitoring in Databricks itself. Can you help me? I would like, every 5 minutes, all activity logs in…
How to use databricks ai to auto generate data definitions for all the tables in my database?
I know we can go to the Catalog in Databricks and generate data definitions for columns inside our database using AI, but is there a way of automatically generating these definitions without having to manually generate them and click accept on every…
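One way to avoid clicking through the UI is to script the comments: enumerate columns (e.g. from `information_schema.columns`), obtain a description per column (for instance via a model call such as Databricks' `ai_query` SQL function), and apply it with `ALTER TABLE … ALTER COLUMN … COMMENT`. A sketch of the statement builder (the helper name is hypothetical):

```python
def comment_stmt(table, column, description):
    """Build the SQL that attaches a description to one column.
    Escaping here is minimal; a real script should sanitize properly."""
    safe = description.replace("'", "''")
    return f"ALTER TABLE {table} ALTER COLUMN {column} COMMENT '{safe}'"

# In a notebook, descriptions could come from a per-column model call and
# be applied in a loop:
#   for col, desc in generated: spark.sql(comment_stmt(tbl, col, desc))
print(comment_stmt("main.sales.orders", "order_ts", "Time the order was placed"))
# → ALTER TABLE main.sales.orders ALTER COLUMN order_ts COMMENT 'Time the order was placed'
```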
Azure Databricks - Timeout error after 60 minutes when launching an Azure Databricks cluster
When I attempt to start a cluster through the Azure Databricks portal/UI, after 30 minutes I receive the following error in the event log: Failed to add 3 containers to the compute. Will attempt retry: true. Reason: Cloud provider launch failure Azure…
How to add parent group for one specific group in databricks?
The setting in question is under "databricks" -> "Settings" -> "Identity and access" -> "Groups". There is an "admin" group that is system managed, but we wonder whether a new group can be assigned as a parent…
My Python code uses the AzureCliCredential() function, but it gives an error when run in a Synapse/ADF/Databricks notebook
My Python code uses the AzureCliCredential() function, but when run in a Synapse workspace it gives the error that the Azure CLI command was not found on the path. I have tried other functions like DefaultAzureCredential() and ClientSecretCredential() also…
Workflow that logs the completion of certain pipelines into a table
I'm having a lot of difficulty implementing the solutions I had in mind. Previously, I asked for help to create an architecture that would be efficient, easy to maintain, and cost-effective. I received various suggestions, but I can't decide which one to…
Data Factory logs for a dashboard on Databricks
Hello! I need help solving a problem that consists of taking completed-pipeline logs with specific names and inserting them into a Delta table within Databricks to create dashboards on it. This solution needs to be built entirely with Azure tools. I'm…
Data Factory Azure integration runtime with private endpoints
I want to create a Data Factory with private endpoints. Is it possible to use the Azure integration runtime with private endpoints, or is "self-hosted" the ONLY possible option…
Data Factory Logs --> Catalog Databricks
Good morning, I need assistance in creating a project that captures logs from Azure Data Factory and inserts them into a Delta Table in Databricks. The key requirements for this project are as follows: No Duplicate Logs: Ensuring that the logs are not…
Pipeline Executing Databricks Notebook Successfully Despite Stopped Cluster
In Azure Data Factory (ADF), I have a pipeline that executes a notebook in Azure Databricks. I noticed that even when the Databricks cluster is stopped, the ADF pipeline still completes successfully, and the notebook runs without any issues. Is this…
I am unable to create a compute cluster in Databricks from an Azure free trial subscription
I am unable to create a compute cluster in Databricks from an Azure free trial subscription. Failed to add 1 container to the compute. Will attempt retry: false. Reason: Azure Quota Exceeded Exception
How to create Synapse Serverless SQL Pool External Table using Databricks Notebook?
Hello, Can we create a Synapse Serverless SQL Pool external table using a Databricks notebook? E.g., the script to create a Synapse Serverless SQL Pool external table from within Synapse is as follows: CREATE EXTERNAL TABLE [SchemaName].[TableName]…
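Databricks cannot create Synapse objects directly, but a notebook can render the same T-SQL and send it to the serverless SQL endpoint over pyodbc or JDBC. A sketch of the DDL rendering (all names below are placeholders):

```python
def external_table_ddl(schema, table, columns, location, data_source, file_format):
    """Render a Synapse serverless CREATE EXTERNAL TABLE statement so it
    can be sent to the SQL endpoint from a notebook connection."""
    cols = ",\n    ".join(f"[{name}] {dtype}" for name, dtype in columns)
    return (
        f"CREATE EXTERNAL TABLE [{schema}].[{table}] (\n    {cols}\n) WITH (\n"
        f"    LOCATION = '{location}',\n"
        f"    DATA_SOURCE = {data_source},\n"
        f"    FILE_FORMAT = {file_format}\n)"
    )

ddl = external_table_ddl(
    "dbo", "Sales",
    [("Id", "INT"), ("Amount", "DECIMAL(18,2)")],
    "/sales/", "MyDataSource", "ParquetFormat",
)
# A notebook would then execute `ddl` against the serverless endpoint,
# e.g. with pyodbc and an ODBC Driver 18 connection string.
print(ddl.splitlines()[0])  # → CREATE EXTERNAL TABLE [dbo].[Sales] (
```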
Data Factory Logs to Databricks
I need to create a way to send logs from Data Factory to the Databricks Catalog. What is the most cost-effective and efficient method to achieve this?
How to Correctly Pass and Use Boolean Values from Data Factory to a Databricks Notebook
How can I correctly pass a Boolean value from Data Factory to a Databricks notebook and use it in conditional logic? I configured a pipeline in Data Factory that calls a Databricks notebook. I attempted to pass a Boolean parameter from Data Factory as a…
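ADF base parameters arrive in the notebook as strings via `dbutils.widgets.get()`, so a value like `"False"` is truthy unless it is parsed. A minimal sketch of a parser (the helper name is hypothetical):

```python
def parse_bool(value):
    """Interpret the string an ADF base parameter delivers to the notebook."""
    if isinstance(value, bool):
        return value
    text = str(value).strip().lower()
    if text in ("true", "1", "yes"):
        return True
    if text in ("false", "0", "no", ""):
        return False
    raise ValueError(f"not a boolean: {value!r}")

# In the notebook: flag = parse_bool(dbutils.widgets.get("myFlag"))
print(parse_bool("True"), parse_bool("false"))  # → True False
```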
Validate Task - Data Factory npm Build Fails bundle.manager.js:53 exit code 255
I have two DevOps accounts. In the 1st account I configured a build pipeline for Azure Data Factory with the attached build.yaml (132228-buildyaml.txt). I then copied the repo and the build yaml to the 2nd DevOps account. In the 1st DevOps the build…
Best Practices for Automating Pipeline Execution Data Collection in Azure Data Factory
Hello everyone, I am looking for the best practices to create an automated workflow for collecting execution data from Azure Data Factory (ADF) pipelines, storing this data in Azure Data Lake Storage Gen2, and consolidating it into a single table for…
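A common way to collect this execution data is the ADF management-plane operation "Pipeline Runs - Query By Factory": POST a `RunFilterParameters` body to its URL with a bearer token, land the JSON in ADLS Gen2, then consolidate. A sketch of the URL construction (subscription, resource group, and factory names are placeholders):

```python
def query_runs_url(subscription_id, resource_group, factory_name,
                   api_version="2018-06-01"):
    """Management-plane URL for the ADF 'Pipeline Runs - Query By Factory'
    operation; POST a RunFilterParameters body to it with a bearer token."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{factory_name}"
        f"/queryPipelineRuns?api-version={api_version}"
    )

url = query_runs_url("0000-sub", "my-rg", "my-adf")
print(url.endswith("/factories/my-adf/queryPipelineRuns?api-version=2018-06-01"))  # → True
```

Alternatively, ADF diagnostic settings can stream run logs to a storage account or Log Analytics with no polling code at all; the REST route gives more control over the schema that lands in the lake.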
How to replace a Data Flow activity containing transformations in ADF with another activity
I have an Azure Data Factory pipeline that has a Data Flow activity. The Data Flow activity points to a source file in a storage account, gets data from it as a source, and then performs different transformations on the data using conditional split, derived column, flatten…
Managed storage account's compliance
Azure Databricks managed storage accounts need to have key access disabled. But since these have a deny assignment, I am unable to see / influence the configuration. How do I make these storage accounts show green for this compliance check?