Run Databricks notebook from ADF - error finding the Azure module to save the data in Blob Storage
Hi guys, the requirement is: call a REST API, read the records in JSON Lines format, and load them into a table in Azure SQL Server. I used Databricks to read the JSON Lines from the Open API using a Python script. It can read the data and keep it in a file in Azure Blob…
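Parsing a JSON Lines payload before loading it is straightforward with the Python standard library. A minimal sketch, assuming a hypothetical payload and field names (the actual API response and target table are not shown in the question):

```python
import json

# Hypothetical JSON Lines payload as returned by the REST API (field names are assumptions)
raw = '{"id": 1, "name": "alice"}\n{"id": 2, "name": "bob"}\n'

def parse_jsonlines(text: str) -> list:
    # Each non-empty line is an independent JSON document
    return [json.loads(line) for line in text.splitlines() if line.strip()]

records = parse_jsonlines(raw)
print(len(records))  # → 2
```

Once parsed into a list of dicts, the records can be staged as a file in Blob Storage or bulk-inserted into Azure SQL with whatever loader the pipeline already uses.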
Self-hosted machine unable to access Data Lake Storage account when running a pipeline using Synapse
Hello - I need your help again. Here's the story: I have an Azure Synapse workspace. I created a managed private endpoint for ADLS - working fine. I created a private endpoint for ADLS - working fine. ADLS has set the public access…
Error trying to validate storage account name while creating a new synapse workspace
I am unable to create a new Data Lake Storage (Gen2) account name in the Basics tab when creating an Azure Synapse workspace. The error I get is: "There was an error trying to validate storage account name. please try again". The error message is…
Secured Kusto connection during data ingestion from Fabric notebook to Lakehouse
Hi team, I'm looking for SN+I auth or secured auth for Kusto in a Fabric notebook.
Copy LOB based data from DB2 to Azure Data Lake Gen2 storage using ADF
Hi team, I was trying to copy data from a table in a DB2 database that has a column with datatype LOB (CLOB) to an Azure Data Lake Gen2 storage container using Azure Data Factory. However, it's taking a considerable amount of time (a day)…
org.apache.hadoop.fs.FileAlreadyExistsException: Failed to rename temp file
[Repeat question due to old thread] We have built a streaming pipeline with Spark Auto Loader. The source folder is an Azure Blob container. We've encountered a rare issue (we could not replicate it). Below is the exception…
Data Lake as a storage/database for Express Angular Application?
So currently I'm using SQL Server for our structured data. A client uploads a file with a minimum of a million records, which gets uploaded to Blob Storage, and then those million records get inserted into different tables in SQL. What I want to…
Unable to connect ADLS Gen2 storage from Purview
Unable to connect ADLS Gen2 storage from Purview; getting the following error: ADLS Gen2 operation failed for: Storage operation '' on container 'bloblblob' get failed with 'Operation returned an invalid status code 'Forbidden''. Possible root causes: (1).…
I have previously been able to map a received dataset to a data store, but have been unable to do so today - the OK button does not become enabled after selecting a data store folder. Has something changed?
I received some (Azure Data Lake Storage Gen2 Folder) datasets from an outside (partner) organization through a data share invite. Previously I was able to map the dataset to a folder in one of my (Azure Blob Storage) data stores, but today the OK button…
How to assign a Static Public IP to Azure Storage Account
I have a storage account in Azure, and I need to assign a static public IP because we use a dedicated internet link that requires the endpoint to have a public IP address. The dynamic IP addresses from Azure change over time, so I need a fixed IP. Is…
My dev, test, and prod environments are in different resource groups of the same subscription. How do I create a DevOps pipeline in this case?
Hi, my dev, test, and prod environments are in different resource groups of the same subscription. I am involved in a data engineering project where I will primarily be using the resources below: ADLS - data storage; ADF - orchestration; Azure Databricks - QC…
Can I copy data from ADL to a Neo4j graph database using ADF?
Hello, I want to load data from ADL to a Neo4j database using ADF. Can I do that using ADF? If yes, what are the different options I have? My first priority is to use ADF activities; if that's not possible, I can use Python coding.
Calling Power BI reports from ADF
Hello, I have one report with multiple client folders. This is the ADLS Gen2 folder location for one client, and the data source connection is AzureStorageDataLake for different files: …
How to specify a custom catalog name for Azure Databricks Delta Lake Dataset in ADF
Hello, I am creating an Azure Databricks Delta Lake dataset in ADF, and I am only able to choose the database name that links to Databricks's hive_metastore. How can I specify a custom catalog name that I created in Databricks instead of…
How to create a Delta table in Azure Synapse Analytics with an auto-increment identity column?
I have created Delta Lake tables in ADLS using a Synapse notebook, and in one of those tables I want to add an identity column (auto increment 1,1), but I am not able to create it. Below are my CREATE TABLE script and the error I am facing. Table…
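For reference, Delta Lake's identity-column DDL looks like the hedged sketch below; the table name and ABFSS path are hypothetical, and the clause only works on runtimes whose Delta Lake version supports identity columns (Databricks Runtime supports `GENERATED ALWAYS AS IDENTITY`; the older OSS Delta versions shipped with some Synapse Spark pools may reject it):

```sql
-- Hypothetical table name and ABFSS path; adjust to your workspace
CREATE TABLE dim_customer (
  id BIGINT GENERATED ALWAYS AS IDENTITY (START WITH 1 INCREMENT BY 1),
  customer_name STRING
)
USING DELTA
LOCATION 'abfss://container@account.dfs.core.windows.net/delta/dim_customer'
```

If the runtime rejects the IDENTITY clause, a common workaround is to generate surrogate keys at write time with `monotonically_increasing_id()` or a `row_number()` window function instead.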
Building a chatbot via Azure
Hi Azure community! I want to build a chatbot for a company. It will be able to answer questions using the company's data sources, including CSV and PDF files. For example, if I ask the chatbot to summarize some CSV financial reports, it will return an…
I am unable to mount containers using Databricks and Storage Gen2
What is the issue?
Synapse Link is not able to write data to the Azure Data Lake Gen2 storage account
Hi team, Synapse Link is not able to write data to the Azure Data Lake Gen2 storage account. I am getting the following error message: {"code":"AuthorizationFailed","message":"The client…
Parameters to switch between linked services
I have a pipeline in Azure Synapse which has multiple dataflows and 2 linked services. How can I use pipeline parameters to switch between the linked services without changing them in each dataflow manually?
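Switching a dataflow between two linked services via a pipeline parameter isn't directly supported; the usual pattern is to parameterize a single linked service and pass the connection details down from the pipeline. A hedged sketch of a parameterized linked-service definition (the name, server, and parameter are illustrative):

```json
{
  "name": "LS_AzureSqlDb",
  "properties": {
    "type": "AzureSqlDatabase",
    "parameters": {
      "dbName": { "type": "String" }
    },
    "typeProperties": {
      "connectionString": "Server=tcp:myserver.database.windows.net;Database=@{linkedService().dbName};"
    }
  }
}
```

The dataset (and in turn the pipeline) then supplies `dbName` at run time, so one linked service covers both targets without editing each dataflow.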
Azure Data Lake integration with Business Central, API returns authorization error on signature string
Hello, I'm trying to integrate Business Central (SaaS) with Azure Data Lake. The Azure service uses Shared Access Key authorization. I am building my authorization string according to the API documentation: I decode my access key from Base64; I build my…
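The signing steps described (decode the key from Base64, HMAC the string-to-sign, Base64-encode the digest) can be sketched with the standard library. The key and string-to-sign below are illustrative placeholders only; the real string-to-sign must match the Shared Key canonicalization format in the Storage REST documentation character for character:

```python
import base64
import hashlib
import hmac

def sign_request(account_key_b64: str, string_to_sign: str) -> str:
    # 1. Decode the storage account key from Base64
    key = base64.b64decode(account_key_b64)
    # 2. HMAC-SHA256 over the UTF-8 encoded string-to-sign
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    # 3. Base64-encode the digest; this value goes into the Authorization header
    return base64.b64encode(digest).decode("utf-8")

# Illustrative values only - not a real account key or canonicalized request
demo_key = base64.b64encode(b"demo-secret").decode("utf-8")
signature = sign_request(
    demo_key,
    "GET\n\n\n\nx-ms-date:Mon, 01 Jan 2024 00:00:00 GMT\n/account/container",
)
```

An authorization error "on the signature string" is most often a canonicalization mismatch rather than a crypto bug, so compare the string-to-sign you build against the documented format before suspecting the HMAC step.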