Create External Table Query Taking Long Time
Hello there, I downloaded the SQL Server 2019 evaluation version and configured PolyBase to connect to an ADLS Gen 2 storage account. I enabled PolyBase and created the external data source and file format. While trying to create the external table, it is taking…
Azure Data Lake Storage
Developer technologies | Transact-SQL
SQL Server | Other
Ideas to generate a "list of files" from ADLS Gen 2 (CSV files) for the ADF Copy Data activity
The Data Factory/Synapse Copy Data activity source has a feature to point to a text file that lists each file we want to copy to the sink. The functionality works great, but I'm racking my brain over how to generate that text file in the first place…
Azure Data Lake Storage
Azure Data Factory
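One way to approach the question above, sketched with the azure-storage-file-datalake Python SDK (account, container, and folder names below are placeholder assumptions): enumerate the paths under a folder and write them, one per line, into the text file that the Copy activity's file-list setting points at.

    # Sketch: build the "file list" text file for the ADF/Synapse Copy activity.
    # Assumes the azure-storage-file-datalake package; all names/keys are placeholders.
    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(
        account_url="https://<account>.dfs.core.windows.net", credential="<account-key>"
    )
    fs = service.get_file_system_client("mycontainer")

    # Collect the container-relative path of every CSV under a prefix.
    paths = [
        p.name
        for p in fs.get_paths(path="incoming", recursive=True)
        if not p.is_directory and p.name.endswith(".csv")
    ]

    # One path per line; adjust the prefix so the paths are relative to the dataset's folder.
    file_list = "\n".join(paths)
    with open("filelist.txt", "w") as f:
        f.write(file_list)

    # Optionally park the list back in the lake so the pipeline can reference it there.
    fs.get_file_client("config/filelist.txt").upload_data(file_list.encode("utf-8"), overwrite=True)
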
Azure Data Factory (ADF): parameterize the From and To file paths under Data Flow > Source Options > After Completion
Problem: I need help with ADF to parameterize the From and To file paths under Data Flow > Source Options > After Completion. Environment: We are using a Data Flow activity to transform and load data from ADLS Gen 2 to a sink. I am moving a file…
Azure Data Lake Storage
Azure Data Factory

Azure Synapse Studio - Browsing a VNet isolated Data Lake
Hi, I want to use Synapse Studio to access a data lake that is in a VNet. For this I have created a workspace with the Managed VNet option. For the pipelines within Synapse this works well: I can preview files on the data lake. However, the file…
Azure Data Lake Storage
Azure Synapse Analytics
Do you have Python code to load data from an on-premises MySQL database into Azure Data Lake Storage?
Do you have Python code specifically to load data from an on-premises MySQL database into Azure Data Lake Storage? I am not finding a specific example for MySQL in Azure Data Factory, and would prefer a Python example.
Azure Data Lake Storage
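There isn't a single canonical sample for this; one non-ADF route is plain Python with pymysql for the database and azure-storage-file-datalake for the lake. A minimal sketch, in which every host, table, and credential is a placeholder assumption:

    # Sketch: pull rows from an on-premises MySQL table and land them in ADLS Gen2 as CSV.
    # Assumes the pymysql and azure-storage-file-datalake packages; all names are placeholders.
    import csv
    import io

    import pymysql
    from azure.storage.filedatalake import DataLakeServiceClient

    # 1. Read from MySQL.
    conn = pymysql.connect(host="onprem-mysql", user="etl", password="<password>", database="sales")
    with conn.cursor() as cur:
        cur.execute("SELECT id, customer, amount FROM orders")
        rows = cur.fetchall()
        columns = [c[0] for c in cur.description]
    conn.close()

    # 2. Serialize to CSV in memory.
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(columns)
    writer.writerows(rows)

    # 3. Upload to ADLS Gen2.
    service = DataLakeServiceClient(
        account_url="https://<account>.dfs.core.windows.net", credential="<account-key>"
    )
    file_client = service.get_file_system_client("raw").get_file_client("mysql/orders.csv")
    file_client.upload_data(buf.getvalue().encode("utf-8"), overwrite=True)
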
Azure Data Lake Gen2: storing a JSON file results in content encoded as NUL (ASCII 0) characters
We are creating a JSON file in Azure Data Lake Gen2 from a Java application. The file write succeeds, but the content is a single line of NUL (ASCII 0) characters. This is happening intermittently. Is it anything to do with the storage account?
Azure Data Lake Storage
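The app in question is Java, but the create/append/flush contract is the same across the Storage SDKs, and a flush that doesn't line up with what was actually appended is worth ruling out when a file comes back the right size but full of NUL bytes. A hedged Python illustration of the pattern (account, container, and path are placeholders):

    # Sketch of the ADLS Gen2 create/append/flush pattern in Python (the Java SDK mirrors it).
    # Assumes azure-storage-file-datalake; account, container, and path are placeholders.
    import json

    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(
        account_url="https://<account>.dfs.core.windows.net", credential="<account-key>"
    )
    file_client = service.get_file_system_client("landing").get_file_client("events/payload.json")

    payload = json.dumps({"id": 1, "status": "ok"}).encode("utf-8")

    # Create, append the bytes at offset 0, then flush to exactly len(payload).
    # Checking that the flushed length matches what was appended is the first thing
    # to verify when the stored file ends up the expected size but contains only NULs.
    file_client.create_file()
    file_client.append_data(payload, offset=0, length=len(payload))
    file_client.flush_data(len(payload))
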

TIBCO Data Virtualization connectivity to Blob Storage
Hi guys, how do I connect to Azure Blob Storage and Azure Data Lake Store data via Composite or TIBCO Data Virtualization? Do we need to use a specific data source, or is using CData the only option? https://www.cdata.com/kb/tech/azure-tdv-setup.rst …
Azure Data Lake Storage
Azure Blob Storage

How to enable object versioning on files stored in ADLS Gen 2 from Databricks
Hi, I am currently reading an API JSON payload and storing the JSON files in ADLS Gen 2. I am exploring ways to enable object versioning and file retention programmatically in Azure Databricks. Could you please point me in that…
Azure Data Lake Storage
Azure Databricks
Will accessing an SFTP file error out when the file is in transit?
What will happen when you try to copy a file from SFTP to ADLS in Azure Data Factory while a huge file is still in transit to SFTP? Will the pipeline error out with an appropriate error message, or will the file still be partially copied to ADLS?
Azure Data Lake Storage
Azure Data Factory


ADF pipeline to increment the date back six months
Hi, I have folders created for each day [20201114], so there are 12 months of folders created based on date. I want my pipeline to copy the files from each folder, starting from the current date [20201114] back to the folder from 6 months ago [20200514]. I…
Azure Data Lake Storage
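The folder-name generation behind the question above is easy to prototype outside ADF; a small pure-Python sketch that yields the yyyyMMdd folder names (using the two dates from the post) which a ForEach/Copy could then iterate:

    # Sketch: enumerate yyyyMMdd folder names from the current date back six months,
    # matching the 20201114 -> 20200514 example in the question.
    from datetime import date, timedelta

    end = date(2020, 11, 14)    # current date in the example
    start = date(2020, 5, 14)   # six months back in the example

    folders = []
    d = end
    while d >= start:
        folders.append(d.strftime("%Y%m%d"))
        d -= timedelta(days=1)

    # ['20201114', '20201113', '20201112'] ... ['20200516', '20200515', '20200514']
    print(folders[:3], "...", folders[-3:])

Inside ADF itself the same list is usually built with a ForEach over a date range, combining the adddays() and formatDateTime() expression functions.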
Sample C# code to compare checksums of files migrated from Data Lake Gen1 to Gen2
I am in the process of migrating some non-production data from ADLS Gen 1 to ADLS Gen 2. I want to use Azure Batch to run a C# program to compare the files by name, by size, and then by checksum (SHA or MD5). Does anyone have any sample C# code to get…
Azure Data Lake Storage
Azure Batch
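The asker wants C# to run under Azure Batch; purely to illustrate the comparison logic, here is a sketch in Python (System.Security.Cryptography.MD5 with HashAlgorithm.ComputeHash produces the equivalent digests in C#), assuming both sides have already been downloaded to local paths, which are placeholders:

    # Sketch: compare two file trees by name, size, and MD5 checksum.
    # Assumes both sides are reachable as local paths (e.g., already downloaded); paths are placeholders.
    import hashlib
    from pathlib import Path

    def md5_of(path: Path, chunk_size: int = 4 * 1024 * 1024) -> str:
        h = hashlib.md5()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    def snapshot(root: Path) -> dict:
        # Map relative name -> (size, md5) for every file under root.
        return {
            str(p.relative_to(root)): (p.stat().st_size, md5_of(p))
            for p in root.rglob("*") if p.is_file()
        }

    gen1 = snapshot(Path("/mnt/gen1-export"))
    gen2 = snapshot(Path("/mnt/gen2-export"))

    missing = gen1.keys() - gen2.keys()
    mismatched = [name for name in gen1.keys() & gen2.keys() if gen1[name] != gen2[name]]
    print("missing from Gen2:", sorted(missing))
    print("size/checksum mismatches:", sorted(mismatched))
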
Can we mount an ADLS container as an NFS drive in App Service?
Hi all, can we mount an ADLS container as a network drive in App Service?
Azure Data Lake Storage

How to copy a file from Azure Data Lake Storage Gen 1 to a network drive using Python
Our Azure Databricks cluster is restricted from saving a Spark DataFrame to a network drive directly, so we have to save it as a CSV file to our Azure Data Lake Storage Gen1 and then copy it to a network drive. We have successfully saved the Spark data…
Azure Data Lake Storage
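For the Gen1-to-network-drive half of the question above, a sketch using the legacy azure-datalake-store package (store name, paths, and service-principal values are placeholder assumptions); once the file is local, the hop onto a mounted network drive is an ordinary file copy:

    # Sketch: download a CSV from ADLS Gen1 with the azure-datalake-store package,
    # then copy it to a mounted network drive. All names/credentials are placeholders.
    import shutil

    from azure.datalake.store import core, lib, multithread

    token = lib.auth(
        tenant_id="<tenant-id>",
        client_id="<app-id>",
        client_secret="<secret>",
    )
    adls = core.AzureDLFileSystem(token, store_name="<gen1-store-name>")

    local_path = "/tmp/report.csv"
    multithread.ADLDownloader(
        adls,
        rpath="/curated/report.csv",   # path in the Gen1 store
        lpath=local_path,              # local landing spot
        overwrite=True,
    )

    # The network drive is assumed to already be mounted on this machine (e.g., /mnt/share or a UNC path).
    shutil.copy(local_path, "/mnt/share/report.csv")
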

How to mount an ADLS blob container as a network share in Azure Web Apps
Hi, how do I mount an ADLS blob container as a network share in Azure Web Apps?
Azure Data Lake Storage
Azure App Service
AzCopy data security while transferring data from on-premises to Azure ADLS
What kind of data security is ensured while moving data from an on-premises computer to ADLS Gen 2 using AzCopy? Any encryption? TLS? Or something else? I could not find anything in the docs. Please advise.
Azure Data Lake Storage


Access data in an Azure Storage account (Gen2) from Synapse, initiated by Data Factory, with a firewall setup
Hi, I have the following setup: Data Factory starts a stored procedure in Synapse to read data from a Delta table like this: With all networks allowed on the storage account, this works without any problem. But when we only allow selected…
Azure Data Lake Storage
Azure Synapse Analytics
Azure Data Factory
Best way to implement a sheet update?
Today, we have a process that involves the following: JIRA, Google Sheets/TSV files, C# code, Python code, and manual intervention. There is a C# project hosted on Bitbucket that has Azure Functions. Amongst the important functions,…
Azure Data Lake Storage
Azure Functions
Azure Blob Storage
Developer technologies | C#

Azure AD and SSO integration with Global tenant
Hi, is it possible to host some solutions in Azure China and integrate SSO with users in the global Azure AD? If not, what would be the best approach to achieve this? Users are in the global organization's on-prem AD, which is integrated with the global Azure AD. …
Azure Data Lake Storage
Microsoft Security | Microsoft Entra | Microsoft Entra ID
Using Service Principal (OID), Not Able to Access Azure Data Lake Storage from Azure Databricks Notebooks
Hi all, I am mounting a directory of an Azure Data Lake Gen2 instance in a notebook cell using a service principal. I fetched the Object ID (OID) of the service principal using the command "az ad sp show", and using the OID, I provided…
Azure Data Lake Storage
Azure Databricks
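A common gotcha in this scenario is passing the service principal's Object ID where the OAuth config expects the Application (client) ID; the Object ID is only what the Storage Blob Data Contributor role gets assigned to. A hedged sketch of the usual abfss mount in a Databricks Python notebook, with account, container, tenant, and secret-scope names as placeholders:

    # Sketch: mount ADLS Gen2 in a Databricks notebook with a service principal (OAuth).
    # Note: fs.azure.account.oauth2.client.id takes the Application (client) ID, not the Object ID.
    # Account, container, secret scope, and key names are placeholders.
    configs = {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": "<application-client-id>",
        "fs.azure.account.oauth2.client.secret": dbutils.secrets.get("my-scope", "sp-secret"),
        "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    }

    dbutils.fs.mount(
        source="abfss://mycontainer@myaccount.dfs.core.windows.net/",
        mount_point="/mnt/mycontainer",
        extra_configs=configs,
    )

    # Quick check that the mount is readable.
    display(dbutils.fs.ls("/mnt/mycontainer"))
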
Reading a file from Azure Data Lake Storage V2 with Spark 2.4
I am trying to read a simple CSV file from Azure Data Lake Storage V2 with Spark 2.4 in my IntelliJ IDE on Mac. Code below: package com.example import org.apache.spark.SparkConf import org.apache.spark.sql._ object Test extends App { val appName:…
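The code in the post is truncated and in Scala; as a rough sketch of the same pattern in PySpark (account and container names are placeholders), the ABFS account key has to be on the Hadoop configuration and the hadoop-azure connector on the classpath before the read:

    # Sketch: read a CSV from ADLS Gen2 over abfss:// with Spark 2.4.
    # Assumes the hadoop-azure (ABFS) connector jars are on the classpath; names are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("adls-gen2-read").getOrCreate()

    # Account-key auth; OAuth/service-principal configs can be used here instead.
    spark.sparkContext._jsc.hadoopConfiguration().set(
        "fs.azure.account.key.myaccount.dfs.core.windows.net",
        "<storage-account-key>",
    )

    df = (
        spark.read
        .option("header", "true")
        .csv("abfss://mycontainer@myaccount.dfs.core.windows.net/data/sample.csv")
    )
    df.show(5)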