How can I find help validating a certificate?
I am a cloud computing student in Morocco, and neither I nor my family have the financial means to pay for the AZ-900 certification. Could someone please help me validate this certificate?
Assert error output not writing to blob
Hello. I have built a pipeline which gathers data from a .csv file and then passes it through a couple of Assert activities to check the validity of the data. Valid rows are supposed to be inserted into the table, while assert-failure rows are set up to be…
Possible bug or issue in Synapse dedicated SQL pool when exporting parquet files
I'm not sure if this is really a bug, but it's definitely a frustration for me at least ;-). When trying to write data from Synapse dedicated SQL pools to data lake storage as parquet files (using a CETAS statement), it produces files with non-standard…
Call API with dynamic URL and store JSON results in DataLake Storage
I am trying to implement the following scenario using Data Factory: I am making multiple API calls, relying on a dynamic URL (e.g. "url.com/api/{ID}" for a list of IDs). The resulting JSON from each of the API calls should be stored as a…
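A minimal sketch of the fan-out logic described above: substitute each ID into the URL template from the question and derive a per-ID output path in the lake. The output folder name is an assumption for illustration; in Data Factory itself this maps to a ForEach over the ID list wrapping a Copy activity whose source is a REST dataset with a parameterized relative URL and whose sink is an ADLS Gen2 JSON dataset.

```python
# Sketch: build one (request URL, lake path) pair per ID.
# URL template is from the question; the output folder is a hypothetical name.
from urllib.parse import quote

URL_TEMPLATE = "https://url.com/api/{id}"   # from the question
OUTPUT_FOLDER = "raw/api-results"           # hypothetical ADLS folder

def build_requests(ids):
    """Return (request_url, lake_path) pairs, one per ID."""
    plan = []
    for item_id in ids:
        url = URL_TEMPLATE.format(id=quote(str(item_id)))
        path = f"{OUTPUT_FOLDER}/{item_id}.json"   # one JSON file per call
        plan.append((url, path))
    return plan
```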
Delete the file from SharePoint location
Hi all, I am trying to copy files from SharePoint to ADLS, referring to the pipeline at the URL below to achieve the copy functionality. https://www.syntera.ch/blog/2022/10/10/copy-files-from-sharepoint-to-blob-storage-using-azure-data-factory/ I need…
How to sync Azure Data Lake Storage with a SharePoint drive?
We have copied files from multiple drives of a SharePoint site, including nested folders, into an Azure Data Lake Storage container while maintaining the same folder structure. Now I want to create a pipeline in Azure Data Factory to delete files from the ADLS container which are not…
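The reconciliation step behind this kind of sync can be sketched as a set difference: whatever exists in ADLS but no longer in SharePoint is what should be deleted. The path lists here are illustrative; in a pipeline this would map to listing both sides (e.g. Graph API for SharePoint, Get Metadata for ADLS), a Filter, and a ForEach wrapping a Delete activity.

```python
# Sketch: files to delete = paths present in ADLS but absent from SharePoint.
# Listing and deletion via the Graph API / ADLS SDK are left abstract.

def paths_to_delete(sharepoint_paths, adls_paths):
    """Relative paths that exist in ADLS but no longer exist in SharePoint."""
    return sorted(set(adls_paths) - set(sharepoint_paths))

# Example: 'old/b.csv' was removed from SharePoint, so it should go from ADLS.
stale = paths_to_delete(
    sharepoint_paths=["docs/a.csv", "docs/sub/c.csv"],
    adls_paths=["docs/a.csv", "docs/sub/c.csv", "old/b.csv"],
)
```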
How to copy files from a SharePoint drive that has a huge hierarchy of folders and subfolders
We want to get all files from SharePoint into our ADLS container. The SharePoint drive holds around 60k files, but through our existing pipeline we are getting only 600. We have a Web activity with the below…
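One plausible cause of getting only a few hundred of 60k files is that Microsoft Graph drive listings are paged: a single Web activity returns only the first page, and a client must keep following `@odata.nextLink` until it is absent. A minimal sketch of that loop, with `fetch_json` standing in for the authenticated HTTP GET:

```python
# Sketch: accumulate every item by following @odata.nextLink page links.
# 'fetch_json' is a stand-in for an authenticated GET returning parsed JSON.

def list_all_items(first_url, fetch_json):
    """Follow @odata.nextLink pages and accumulate every item."""
    items, url = [], first_url
    while url:
        page = fetch_json(url)                  # one HTTPS GET per page
        items.extend(page.get("value", []))
        url = page.get("@odata.nextLink")       # absent on the last page
    return items
```

In ADF the same effect can come from an Until loop that updates the request URL variable from the response's `@odata.nextLink`, rather than a single Web activity.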
How to fix "The specified container does not exist"
Hi, when running the upgrade for my storage account to enable hierarchical namespace, I face an issue saying that "The specified container does not exist". Can you help me to resolve this issue? Please find attached a screenshot of the issue I am following…
Changing Synapse Notebook variable values during deployment in Azure DevOps
How can I change the variable values in a Synapse Notebook, such as storage account name, container name, and file location, during deployment from dev to prod in Azure DevOps? The notebook is already being used by other notebooks in my dev environment.
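One common pattern for this (a sketch, not necessarily the only approach) is to keep environment-specific values in a cell toggled as the notebook's Parameters cell. When the notebook is invoked from a pipeline Notebook activity, base parameters override these defaults, so the dev values can be swapped for prod ones per environment without editing the notebook body. All names below are illustrative.

```python
# --- Parameters cell (toggled as "Parameters" in Synapse Studio) ---
# Defaults are dev values; a pipeline's Notebook activity base parameters
# (set per environment in Azure DevOps) override them at run time.
storage_account = "devstorageacct"   # hypothetical dev account name
container = "raw"                    # hypothetical container
file_path = "input/data.csv"         # hypothetical file location

# --- Regular cell: derive the lake path from the parameters ---
abfss_path = f"abfss://{container}@{storage_account}.dfs.core.windows.net/{file_path}"
```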
Error trying to validate storage account name while creating a new Synapse workspace
I am unable to create a new Data Lake Storage (Gen2) account name in the Basics tab when creating an Azure Synapse workspace. The error I get is: "There was an error trying to validate storage account name. Please try again". The error message is…
How to use a user-delegation SAS token to load parquet table from ADLS gen2?
Now I have a parquet table stored in ADLS Gen2: abfss://mycontainer@mystorage.dfs.core.windows.net/folder1/table1. This is a read-only table, and I want to restrict my service principal to read access on this table only. So I use ACLs to grant…
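A detail worth checking with the ACL approach: to read a single file in ADLS Gen2, the principal needs Read (r) on the file itself plus Execute (x) on every ancestor directory up to the container root (and Read+Execute on a directory if it must also list its contents). A small helper sketching the entries needed for one file (the path is illustrative):

```python
# Sketch: enumerate the minimum ACL entries to read one file in ADLS Gen2.
# Rule: 'x' on every ancestor directory, 'r' on the file itself.
from pathlib import PurePosixPath

def acl_plan(file_path):
    p = PurePosixPath(file_path)
    ancestors = list(p.parents)[::-1]            # container root first
    return [(str(d), "--x") for d in ancestors] + [(str(p), "r--")]
```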
Architecture best practices for storing data from web scraping and using it in analytics
I am scraping data from websites and want to export the scraped data as CSV files and store them in Azure Data Lake, then apply ETL; the final output will be used for Power BI reports and for machine learning. Do I need to use Azure Synapse?…
Synapse Link is not able to write data to the Azure Data Lake Gen2 storage account
Hi Team, Synapse Link is not able to write data to the Azure Data Lake Gen2 storage account. I am getting the following error message. {"code":"AuthorizationFailed","message":"The client…
Parameters for switch between linked services
I have a pipeline in Azure Synapse with multiple dataflows, and I have two linked services. How can I use pipeline parameters to switch between the linked services without changing the linked service in each dataflow manually?
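One common pattern (a sketch, assuming ADLS Gen2-style linked services) is to replace the two fixed linked services with a single parameterized one: the linked service exposes a parameter, the dataset passes it through, and the pipeline parameter supplies the environment-specific value at run time. The linked service name and parameter below are illustrative.

```json
{
  "name": "LS_AdlsGen2_Param",
  "properties": {
    "type": "AzureBlobFS",
    "parameters": {
      "accountUrl": { "type": "String" }
    },
    "typeProperties": {
      "url": "@{linkedService().accountUrl}"
    }
  }
}
```

A dataset bound to this linked service declares a matching parameter, so the pipeline can pass either storage endpoint without touching each dataflow.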
Azure to AWS
Hello, we need to transfer files in batches from ADLS to an AWS S3 bucket for a SAS application hosted by a third party. We need to ensure data security and best practices. My understanding is that we can use ADF to create a linked service for AWS S3, but IT DOES…
How to set up a modern architecture for a small/medium business?
Currently we're using the following setup, which is slow to process the data and slow on the Power BI side: an Azure VM for third parties to upload via SFTP; a C# script to ETL data into Azure SQL Server and move files to ADLS Gen2; a Power BI report pulling…
Have previously been able to map a received dataset to a data store, but have been unable to do so today: the OK button does not become enabled after selecting a data store folder. Has something changed?
I received some (Azure Data Lake Storage Gen2 Folder) datasets from an outside (partner) organization through a Data Share invite. Previously I was able to map the dataset to a folder in one of my (Azure Blob Storage) data stores, but today the OK button…
How can I read CSV files which are in nested folders and copy them while preserving the hierarchy?
I have .csv.gz files which are partitioned this way: /2024/01/01/xyz/x.csv, /2024/01/01/yza/y.csv, /2024/01/01/zab/z.csv. There are files for several years, and I want to copy all those files using ADF while maintaining the folder structure and hierarchy…
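The hierarchy-preserving mapping can be sketched as follows: each source path keeps its position relative to the source root when it lands under the sink root. In an ADF Copy activity the same effect comes from a wildcard file path on the source combined with the "Preserve hierarchy" copy behavior on the sink. The roots below are illustrative.

```python
# Sketch: compute sink paths that preserve the source folder hierarchy.
from pathlib import PurePosixPath

def map_to_sink(source_root, sink_root, files):
    """Map each source file under sink_root, keeping its relative path."""
    root = PurePosixPath(source_root)
    return [
        str(PurePosixPath(sink_root) / PurePosixPath(f).relative_to(root))
        for f in files
    ]

mapped = map_to_sink(
    "/2024", "/lake/2024",
    ["/2024/01/01/xyz/x.csv", "/2024/01/01/yza/y.csv"],
)
```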
Azure Synapse Link for Dataverse to bring Dynamics 365 data to ADLS and Synapse for analytical purposes - facing issues and inconsistencies in the feature
We are using the Synapse Link for Dataverse feature to bring Dynamics 365 data to ADLS and Synapse Analytics. We have the below open issues, due to which we are unable to finalize the solution: in the F&O-linked Dataverse environment, we have created a Synapse…
VPN and networking config for secure upload to ADLS Gen2 storage account?
I'm looking into network architectures and security for implementing secure access from user laptops into Azure to upload files to ADLS Gen2 data lake blob containers. We have no on-prem network or AD - just individual user laptops. We do have MS Entra…