Does Azure Data Lake Analytics support only Storage Gen1?
Does an Azure Data Lake Analytics account support only Data Lake Storage Gen1? Can Blob storage be used for a Data Analytics account? Data Analytics account locations are very limited. If the storage account is in a different location than the analytics account, will…
Data Lake site identification
Hi again, I am extracting SQL tables and loading them into the data lake with Data Factory. I am extracting the same tables from multiple locations. When I query the data lake, I need to be able to identify which location each row of data came from. It is about 30…
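One common approach (not necessarily the asker's setup) is to stamp every extracted row with a source-location column at copy time; in ADF this maps to an "Additional columns" entry on the Copy activity. The location names below are hypothetical; the sketch shows the equivalent transformation in plain Python:

```python
# Sketch: tag each extracted row with the location it came from so rows
# stay identifiable after landing in the data lake. In ADF the same idea
# is an "Additional columns" entry on the Copy activity.

def tag_rows(rows, source_location):
    """Return copies of each row dict with a source_location column added."""
    return [{**row, "source_location": source_location} for row in rows]

# Hypothetical extracts of the same table from two locations:
chicago = tag_rows([{"id": 1, "qty": 5}], "chicago")
dallas = tag_rows([{"id": 1, "qty": 7}], "dallas")

combined = chicago + dallas  # every row now says where it originated
```

Queries against the lake can then filter or group on `source_location` instead of guessing from file paths.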
ADF: How to connect to Azure data lake Gen 1 to access csv files and delete the row from the file.
ADF: How to connect to Azure Data Lake Gen1 to access CSV files and delete a row from a file? While I tried to create a Data Flow with source as Azure Data Lake Gen1, the connection itself fails with the below error: Azure datalakestorage…
Azure Blob Encoding issue
Hi Folks, I have a client dropping their CSV file into Azure Blob. When I download the file, everything is good, but when I use the edit view, it has a weird symbol. When I use Power Automate to get the file content, I also get the weird symbol. Any…
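A "weird symbol" at the very start of a CSV is often a UTF-8 byte order mark (BOM), which some editors and Power Automate render as `ï»¿`. Assuming that is the cause here, decoding the bytes with the `utf-8-sig` codec strips it:

```python
# Sketch: a UTF-8 BOM (bytes EF BB BF) at the start of a file shows up
# as stray symbols in editors and in Power Automate's file-content
# output. The "utf-8-sig" codec removes it during decoding.

raw = b"\xef\xbb\xbfname,qty\nwidget,3\n"  # bytes as downloaded from Blob

plain = raw.decode("utf-8")      # BOM survives as the \ufeff character
clean = raw.decode("utf-8-sig")  # BOM stripped
```

If the symbols appear throughout the file rather than just at the start, the file is more likely in a different encoding (e.g. Windows-1252) and needs to be decoded with that codec instead.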
Azure Data Factory Power Query error: "We could not evaluate this query due to invalid or missing credentials"
I'm trying to add a Power Query resource to Azure Data Factory v2 (ADF). The dataset that I'm consuming is Parquet. I can successfully preview the data from within the dataset. As well, I can open and read the .parquet file from its location in Azure…
Data lake Gen2 Linked service for connecting to a container
Hi Team, I would like to create a Data Lake Gen2 linked service for direct connection to a container. The storage account has multiple containers, and I want the linked service connected to one particular container alone. I.e. Attached a screenshot of a…
How to dynamically rename files to data lake while copying from source (File share) with sub folder structure using ADF?
Hi Team, How to dynamically rename files to data lake while copying from source (File share) with sub folder structure using ADF? Folder L01 > Folder L11 > Folder L21 > Folder L31 > Folder L41 > File_A.csv Folder L01 >…
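One way to approach the rename is to derive a unique target filename from the source file's subfolder path; in ADF this is typically an expression over the Get Metadata / ForEach item path in the sink dataset's file name parameter. The folder names follow the question's example; the logic is sketched here in Python:

```python
from pathlib import PurePosixPath

# Sketch: flatten a source file's subfolder path into a unique target
# filename (the ADF equivalent is a string expression in the sink
# dataset's filename parameter).

def flatten_name(source_path: str, sep: str = "_") -> str:
    p = PurePosixPath(source_path)
    # Join every folder level plus the filename: L01_L11_..._File_A.csv
    return sep.join(list(p.parts[:-1]) + [p.name])

target = flatten_name("L01/L11/L21/L31/L41/File_A.csv")
```

This keeps files from different subfolders from colliding when they are all copied into one data lake folder.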
ADLS Gen2 Storage RBAC for a user from a different AAD tenant
Hi, I have a service which is used by users from different AAD tenants, like user1@a.onmicrosoft.com and user2@b.onmicrosoft.com. My blob storage is in my app's AAD tenant. I want to assign permissions to users from a different AAD tenant (to their…
How to connect Business Central to Azure Data Factory
I posted this question on Re: [MicrosoftDocs/azure-docs] Business Central Connector (Issue #88574) But that didn’t help. Is it actually possible to connect Business Central to ADF?
loading each object in an array vs the page
Greetings, I am just now getting started with Data Factory and am a little confused. Below is a sample of the JSON from the REST API that I am pulling. I have the pipeline working to paginate and save blobs, but it only saves each page and not each…
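Since the pipeline already lands one blob per page, the remaining step is to iterate over the array inside each page and persist each object separately. The `"value"` array key below is an assumption (substitute whatever key the REST API actually uses); the split is sketched in Python:

```python
import json

# Sketch: split a paginated REST response into one record per object
# instead of one blob per page. The "value" key is hypothetical; use the
# array property your API returns.

page = json.loads('{"value": [{"id": 1}, {"id": 2}], "nextLink": null}')

# One serialized payload per object in the page's array:
records = [json.dumps(obj) for obj in page["value"]]
```

In ADF itself the comparable moves are setting the Copy activity's collection reference to the array property, or using a Data Flow with a flatten transformation.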
Failure happened on 'Source' side. ErrorCode=UserErrorOdbcOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException
I was trying to load data from an on-prem data lake to Azure Gen2 using a Copy activity through SHIR and am facing this issue. Failure happened on 'Source' side.…
Getting unauthorized response with DataLake Gen 2 link
I'm having trouble with some simple data queries. I'm using an account key to authorize connections to Data Lake Storage Gen2. First I set the account key (I know it's not best practice to store the key in a notebook, but I'm just trying to get something…
Error while using COPY INTO command to load data
Getting this error, 'Not able to validate external location because The remote server returned an error: (403) Forbidden.', while using the COPY INTO command to load data from an ADLS Gen2 Parquet path into a Synapse internal table using a SAS key. I…
ADF Copy Data REST API , putting JSON response to SQL Table
Hi, I am trying to call an API using the REST connector in Copy Data. However, when I use a SQL table as the sink and run the pipeline, the JSON is not getting stored in the table; instead, NULL values are being inserted. I have not defined any…
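NULLs in the sink often mean the source JSON is nested while the SQL table expects flat columns, and no mapping or collection reference was configured on the Copy activity. The field names below are hypothetical, but flattening the response into rows keyed by column name shows the shape the mapping needs to produce:

```python
import json

# Sketch: flatten a nested REST response into flat rows whose keys match
# the sink table's columns. With no mapping, ADF cannot line nested
# properties up with flat columns, so it inserts NULLs.

response = json.loads('{"data": [{"id": 7, "attrs": {"name": "a"}}]}')

rows = [
    {"id": item["id"], "name": item["attrs"]["name"]}
    for item in response["data"]
]
```

In the Copy activity this corresponds to setting the collection reference to the array property (here `data`) and mapping each nested path to a table column.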
Using Azure Data Factory to send TSV files to a Snowflake stage, and it is escaping certain characters incorrectly.
I am using an Azure Data Factory pipeline to move compressed TSV files from an on-prem file share to an ADLS share that acts as a Snowflake stage, so we can move the data into a Snowflake table. The problem is that when the combination of a backslash and a newline…
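Assuming the symptom is Snowflake's loader, not ADF: with Snowflake's default `ESCAPE_UNENCLOSED_FIELD = '\\'`, a literal backslash at the end of a field escapes the tab or newline delimiter that follows it, merging fields or records. Two common workarounds are setting `ESCAPE_UNENCLOSED_FIELD = NONE` on the file format, or doubling backslashes in the data before staging, sketched here:

```python
# Sketch: double literal backslashes so Snowflake's default unenclosed-
# field escape character does not swallow the following tab/newline.

def escape_field(value: str) -> str:
    """Double every literal backslash in a field value."""
    return value.replace("\\", "\\\\")

row = ["path C:\\temp\\", "next field"]  # field ending in a backslash
staged = "\t".join(escape_field(f) for f in row)
```

If the files cannot be rewritten in flight, changing the Snowflake file format is usually the simpler fix.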
data lake and schemas
Since data lakes store data in its raw form from the original data source, how do you manage providing data to users to create reports when they're accustomed to snowflake and star schemas?
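A common pattern is to keep the raw data in the lake and build curated, star-schema-shaped layers on top for report users (e.g. with Synapse, Databricks, or views). The column names below are hypothetical; the sketch shows raw rows being split into a deduplicated dimension table with surrogate keys and a fact table that references it:

```python
# Sketch: derive a star schema (dimension + fact) from raw lake rows.
# Report users query dim_store / fact_sales, never the raw records.

raw = [
    {"store": "North", "city": "Oslo",   "amount": 10},
    {"store": "North", "city": "Oslo",   "amount": 25},
    {"store": "South", "city": "Bergen", "amount": 5},
]

# Dimension: one row per distinct store, keyed by a surrogate id.
dim_store = {}
for r in raw:
    key = (r["store"], r["city"])
    if key not in dim_store:
        dim_store[key] = {"store_id": len(dim_store) + 1,
                          "store": r["store"], "city": r["city"]}

# Fact: the measures plus a foreign key into the dimension.
fact_sales = [
    {"store_id": dim_store[(r["store"], r["city"])]["store_id"],
     "amount": r["amount"]}
    for r in raw
]
```

The raw layer stays untouched for data scientists, while the dimensional layer gives report users the familiar star/snowflake shape.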
Azure data factory pipeline copy activity failure
I am getting "The new name (table name) is already in use as a object name and would cause a duplicate that is not permitted" in an Azure Data Factory pipeline using a Copy activity. This pipeline gets triggered when a code is executed and a file is…
Trying to use power query on ADF error: "We could not evaluate this query due to invalid or missing credentials"
Hi I am trying to use power query on azure data factory and I am receiving the following error when I add a data source from our azure data lake: "We could not evaluate this query due to invalid or missing credentials" (see image below). …
Error managing tables dataverse to data lake gen 2
Hi, I have used Azure Synapse Link for Dataverse to sync some tables over to a data lake storage. For many tables this works fine, but for some of the "Append only" tables, the isDelete column is missing. If I remove some of the tables and…
How to dynamically map and copy each XML data response (via API/HTTP) in Azure Data Factory?
Hello, We have a Web API source that returns responses (data) in XML format. We need to extract data for multiple objects from this source/method and hence are using an HTTP Web API as the datasource/dataset/linked service with XML format. Here's a sample response…
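Since the sample response is truncated, the element names below are hypothetical, but the extraction pattern is the same: pull one row per repeating element out of the XML. In ADF this is typically an XML dataset plus a mapping (or a flatten transformation in a Data Flow); the equivalent parsing is sketched with the standard library:

```python
import xml.etree.ElementTree as ET

# Sketch: extract one row per repeating element from an XML API
# response. <response>/<object>/<id>/<name> are placeholder names;
# substitute the real structure from the API.

xml_payload = """
<response>
  <object><id>1</id><name>alpha</name></object>
  <object><id>2</id><name>beta</name></object>
</response>
"""

root = ET.fromstring(xml_payload)
rows = [
    {"id": obj.findtext("id"), "name": obj.findtext("name")}
    for obj in root.findall("object")
]
```

Once the repeating element is identified, the same path (`response/object` here) becomes the collection reference in the Copy activity's mapping.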