Copy only the newly created file from one folder to another
Using the Copy Data activity in a Synapse pipeline, I want to copy files from one folder to another folder, but I only want to copy the newly generated files.
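In the Copy activity this is typically handled with the source's "Filter by last modified" setting (modifiedDatetimeStart / modifiedDatetimeEnd), driven by a watermark you persist between runs. The watermark logic itself is simple; a minimal sketch in plain Python, with hypothetical file metadata:

```python
from datetime import datetime, timezone

def new_files_since(files, watermark):
    """Return names of files modified strictly after the watermark.

    `files` is an iterable of (name, last_modified) pairs, where
    last_modified is a timezone-aware datetime.
    """
    return [name for name, modified in files if modified > watermark]

# Example: only b.csv was generated after the last pipeline run.
files = [
    ("a.csv", datetime(2024, 1, 1, tzinfo=timezone.utc)),
    ("b.csv", datetime(2024, 1, 3, tzinfo=timezone.utc)),
]
watermark = datetime(2024, 1, 2, tzinfo=timezone.utc)
print(new_files_since(files, watermark))  # ['b.csv']
```

After each successful run you would advance the stored watermark to the run's start time so the next run picks up only files created since.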
Can't query partitioned tables - Azure Synapse Link for Dataverse
Hi, we have a problem accessing partitioned tables in an Azure Synapse workspace that were previously created by ingesting tables from Power Platform via Azure Synapse Link for Dataverse. The snapshot folder and .csv files are created normally but can't be…
How to troubleshoot issues while creating Data Lake storage Gen2 account name for Azure Synapse workspace?
Hello there, I am new to Azure, currently on a trial subscription, and trying to create a new Azure Synapse workspace, but I am getting an error while creating a new account name for Data Lake Storage Gen2. I am seeing the error message "There was an error…
Copy Data activity fails to fetch the latest file from ADLS
Hi All, I am using ADF to get the latest file from ADLS, but my Copy Data activity is failing because it needs a wildcard path, either a file name or *. This is quite confusing. I can't predict which file will be the latest, and I don't want all the file…
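One common workaround is a Get Metadata activity (field list: childItems) followed by a comparison of each child's Last modified value, passing the single winning file name as a parameter to the Copy activity's dataset so no wildcard is needed. The selection step, sketched as plain Python over hypothetical (name, last_modified) pairs:

```python
from datetime import datetime, timezone

def latest_file(files):
    """Return the name of the most recently modified file.

    `files` is an iterable of (name, last_modified) pairs.
    """
    files = list(files)
    if not files:
        raise ValueError("no files to choose from")
    return max(files, key=lambda f: f[1])[0]

files = [
    ("sales_20240101.csv", datetime(2024, 1, 1, tzinfo=timezone.utc)),
    ("sales_20240103.csv", datetime(2024, 1, 3, tzinfo=timezone.utc)),
]
print(latest_file(files))  # sales_20240103.csv
```

In a pipeline the same comparison is usually expressed with ForEach plus variable activities rather than code, but the logic is identical.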
Is there any way to copy an entire folder as-is, with all nested folders and files, from SharePoint to ADLS?
I have a SharePoint site, and my files are inside many nested folders and subfolders; the main folder is Documents. I am unable to copy the files to ADLS because there are so many subfolders and nested folders. Is there any way to copy files from so…
I have a linked service for SQL Server and want to read data via a Synapse Spark pool
I have a linked service for SQL Server and want to read data via a Synapse Spark pool using the linked service, which has all the credentials for the server.
Unstable ConfBasedSASProvider when loading a SAS token in a Synapse notebook
Hi there, I am working on an ETL pipeline where I need to use a SAS token to read/write data in an external ADLS account. The issue is that the code using ConfBasedSASProvider is unstable: sometimes it works, sometimes it doesn't. When it fails, I'm getting this…
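One possible cause of this kind of flakiness is relying on the unscoped `spark.storage.synapse.sas` key; scoping every setting to the storage endpoint tends to be more reliable. The key names below follow the documented ConfBasedSASProvider pattern, but verify them against your Synapse runtime version. A sketch that builds the account-scoped settings to apply with `spark.conf.set`:

```python
def sas_conf(account, sas_token):
    """Build account-scoped Spark settings for ConfBasedSASProvider.

    Apply each pair with spark.conf.set(key, value) in the notebook
    before reading/writing abfss:// paths on that account.
    """
    host = f"{account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{host}": "SAS",
        f"fs.azure.sas.token.provider.type.{host}":
            "com.microsoft.azure.synapse.tokenlibrary.ConfBasedSASProvider",
        f"spark.storage.synapse.{host}.sas": sas_token,
    }

# Hypothetical account name and token, for illustration only.
for key, value in sas_conf("mylake", "sv=2024&sig=abc").items():
    print(key, "=", value)
```

Because the keys name the endpoint explicitly, configuration for one account cannot be clobbered by settings aimed at another, which is a common source of intermittent failures when multiple storage accounts are touched in one session.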
Is exposing SFTP storage to selected internet users' IP addresses secure?
I am looking for a secure SFTP solution in Azure. I understand we can enable SFTP on a Data Lake storage account, and also enable TLS 1.2 and the secure-access feature for HTTPS only. Additionally, I am aware we can limit the IP addresses to a specific range of…
I want to delete a storage account in Azure; what do we need to check before deleting it?
We need to understand whether we are good to clean up the old logs from that storage account.
How to run one Synapse pipeline across tenants from different domains (Entra ID)
Hi, I have a Synapse pipeline with some Spark notebooks that do ETL transformations. I need to run this pipeline and save data to a different ADLS account. All of them are in different subscriptions, and some have a different Entra ID tenant than the workspace where I…
Is a SAS token required to copy files between blob storage accounts?
We want to use the AzCopy tool to copy files between storage accounts' blob containers, using a service principal as the login identity. When we try to run the azcopy copy command without a SAS token for the storage account, the copy job fails. Is a SAS token required…
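A SAS token isn't strictly required: AzCopy can authorize with a service principal via Entra ID, provided the principal has the Storage Blob Data Reader role on the source account and Storage Blob Data Contributor on the destination. Note that only fairly recent AzCopy versions apply the Entra ID token to the source side of a service-to-service copy; older versions still required a SAS on the source URL even when logged in, which matches the failure described. A hedged sketch of the CLI steps (all angle-bracket values are placeholders):

```shell
# Authorize AzCopy as a service principal instead of using a SAS token.
export AZCOPY_SPA_CLIENT_SECRET='<client-secret>'
azcopy login --service-principal \
    --application-id '<app-id>' \
    --tenant-id '<tenant-id>'

# With the RBAC role assignments in place on both accounts,
# the copy works with bare URLs (no SAS query string):
azcopy copy \
    'https://<source-account>.blob.core.windows.net/<container>/<path>' \
    'https://<dest-account>.blob.core.windows.net/<container>/<path>' \
    --recursive
```

If the copy still fails after login, checking the AzCopy version and the role assignments on the source account is a reasonable first step.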
In ADLS, when using the DataLakeServiceClient, does every user have access to view the ACL for each directory?
Hello, I'm working on an application that needs to replicate ADLS access controls, which are configured by access control lists (ACLs). I'm using the Python SDK with the InteractiveBrowserCredential class to authenticate a user through their Azure account. The…
Encountering errors when attempting to write files from Azure Synapse Analytics notebook to Azure Data Lake Storage (ADLS) Gen2, while having a private endpoint configured with the DFS URL (OSError: [Errno 5] Input/output error)
Issue: Encountering errors when attempting to write files from Azure Synapse Analytics notebook to Azure Data Lake Storage (ADLS) Gen2, while having a private endpoint configured with the DFS URL. Requirement: Files need to be downloaded from a…
Copy files from SharePoint into Azure Data Lake Storage Gen2 using ADF linked services?
I have to copy files from SharePoint into Azure Data Lake Storage Gen2 using ADF linked services.
Retirement Announcement - Azure Data Lake Storage Gen1
Hello Azure Customers in the community, Azure Data Lake Storage Gen1 will be retired on Feb 29, 2024. Azure Data Lake Storage Gen2 offers a richer set of data management capabilities, lower cost, and integration with newer analytics offerings such as…
Write a file to an Azure storage account file share
I want to write data from Azure Synapse Analytics to an Azure SMB file share. For example, I have a dataframe df and I want to write this data to a container. This is my…
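Spark has no native SMB writer, so a common approach is to collect the data on the driver (e.g. via `df.collect()`) and push it with the azure-storage-file-share SDK; the package must be installed on the Spark pool, and the connection string, share, and path below are placeholders. A sketch, with the serialization step kept pure so it is easy to verify:

```python
import csv
import io

def rows_to_csv_bytes(rows, fieldnames):
    """Serialize rows (a list of dicts) to CSV bytes for upload."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue().encode("utf-8")

def upload_to_file_share(conn_str, share_name, file_path, data):
    """Upload bytes to an Azure Files (SMB) share. Not run here."""
    # Imported lazily so the pure helper above works without the SDK.
    from azure.storage.fileshare import ShareFileClient
    client = ShareFileClient.from_connection_string(
        conn_str, share_name, file_path)
    client.upload_file(data)

payload = rows_to_csv_bytes([{"id": 1, "name": "a"}], ["id", "name"])
print(payload)  # b'id,name\r\n1,a\r\n'
```

This collects everything to the driver, so it only suits modest data volumes; for large outputs, writing to ADLS and syncing to the share separately is usually safer.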
Storage event trigger is running twice when a single blob is created
Hi All, I created a storage event trigger on a pipeline, but noticed that the trigger runs twice when a single file lands in the container; when two files land, the pipeline is triggered three times. I expect the pipeline to run once when a single file…
I have a dataframe in a Synapse Apache Spark pool and I want to create a table for that dataframe in my dedicated SQL pool
I have a dataframe in a Synapse Apache Spark pool and I want to create a table for that dataframe in my dedicated SQL pool.
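The Synapse Spark runtime ships a dedicated SQL pool connector that adds a `synapsesql` method to the DataFrame writer; it only works inside a Synapse Spark session, and the three-part table name below is a placeholder. A minimal sketch:

```python
def save_to_dedicated_pool(df, table):
    """Write a Spark DataFrame to a dedicated SQL pool table.

    `table` is a three-part name such as "mydb.dbo.mytable". The
    connector stages data through the workspace's primary ADLS
    account, so the session identity needs access to it.
    """
    df.write.mode("overwrite").synapsesql(table)
```

In a notebook cell this would be called as `save_to_dedicated_pool(df, "mydb.dbo.mytable")`; the connector creates the table if it does not exist.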
Which subnet should be added to enable network communication interoperability
I want to use AzCopy to transfer files from storage account A to storage account B, where storage account A is in subnet a and storage account B is in subnet b. AzCopy is deployed on a VM, which is in subnet c. What confuses me is, when adding a network…
Copying Complete SharePoint Library to Azure Data Lake Storage in Azure Data Factory
How can I copy the entire SharePoint library, including its folders, subfolders (nested folders), and all files, to Azure Data Lake Storage (ADLS) in Azure Data Factory? Is this possible?
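ADF's SharePoint Online List connector only reads list data, so files are usually pulled via Microsoft Graph instead: a Web activity obtains a bearer token, a listing call enumerates the library recursively, and a ForEach runs one HTTP-source Copy activity per file, reusing each item's relative path in the ADLS sink to preserve the folder structure. The per-file download URL can be built like this (the site and drive IDs are placeholders; verify the endpoint shape against the Graph documentation):

```python
GRAPH = "https://graph.microsoft.com/v1.0"

def drive_item_content_url(site_id, drive_id, item_path):
    """Graph endpoint that returns a document-library file's content."""
    return f"{GRAPH}/sites/{site_id}/drives/{drive_id}/root:/{item_path}:/content"

# Hypothetical IDs, for illustration only.
print(drive_item_content_url("contoso.sharepoint.com,abc,def", "b!xyz",
                             "Reports/2024/summary.xlsx"))
```

Because the relative path (`Reports/2024/summary.xlsx` here) appears in both the Graph URL and the sink dataset, the nested folder layout carries over to ADLS without extra mapping.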