1,332 questions with Azure Data Lake Storage tags

Sort by: Updated
0 answers

Download data from Cosmos via Visual Studio

When I download data from Cosmos using Azure Data Lake Storage in Visual Studio, it shows an error: "Stream URL is invalid" (流 URL 无效)

Azure Data Lake Storage
asked 2024-04-16T12:07:24.16+00:00
Xinlong Wang 0 Reputation points Microsoft Employee
commented 2024-04-16T15:41:05.82+00:00
KarishmaTiwari-MSFT 18,347 Reputation points Microsoft Employee
2 answers

While fetching data from a Cosmos DB container and persisting the JSON file in ADLS Gen2 through a Synapse pipeline, some objects in my JSON file are appearing as blank strings, causing data loss. This is happening in PROD only, not in UAT and DEV.

Hi, the issue I am facing is with the Synapse Analytics service - Pipeline. I have created a dataflow which pulls data from a Cosmos DB container in JSON format and stores that JSON file in ADLS Gen2. When I check the JSON file in ADLS I see that…

Azure Data Lake Storage
Azure Synapse Analytics
Azure Cosmos DB
asked 2024-04-05T13:17:47.2733333+00:00
Vity Pandita 0 Reputation points
commented 2024-04-16T13:40:11.6366667+00:00
Smaran Thoomu 8,965 Reputation points Microsoft Vendor
0 answers

Understanding the Structure of Incremental D365 FO data in Data Lake Gen2

I am a data engineer new to working with Azure, and I have set up an ETL process to read incremental data out of Data Lake Gen2 storage and push it to Azure SQL Database. I am using Azure Synapse Link to expose Dynamics 365 FO tables to the data lake. I'm…
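The exact folder layout depends on how the Synapse Link profile is configured, but a common pattern is one folder per table containing timestamped change folders of CSV files, with the schema described in the accompanying CDM metadata rather than in file headers. A minimal PySpark sketch for reading one table's incremental CSVs follows; the container, account, and table names are assumptions, not confirmed paths.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Assumed layout: <container>/<change-folder>/<table>/*.csv
base_path = "abfss://d365-export@mydatalake.dfs.core.windows.net"  # placeholder account/container
table_glob = f"{base_path}/*/custtable/*.csv"                      # placeholder table name

df = (spark.read
      .option("header", "false")  # Synapse Link CSVs often omit headers; schema lives in the CDM metadata
      .csv(table_glob))

print(df.count())
```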

Azure Data Lake Storage
Azure Synapse Analytics
asked 2024-04-16T12:19:26.2366667+00:00
Arbuckle, Grant 0 Reputation points
0 answers

Synapse Serverless CETAS fails with error "Fatal exception occurred: bad allocation".

Hello, I am trying to create an external table (CETAS) from a large number of fairly small JSON files, so that they can be queried more efficiently. The JSON files are stored on ADLS. Previously this worked fine, when I let the query run for 1 - 1.5…

Azure Data Lake Storage
Azure Synapse Analytics
Transact-SQL
asked 2024-04-08T15:27:45.2433333+00:00
Finn Schmidt 41 Reputation points
commented 2024-04-16T07:43:20.5333333+00:00
PRADEEPCHEEKATLA-MSFT 76,436 Reputation points Microsoft Employee
2 answers One of the answers was accepted by the question author.

In ADF, using an HDFS linked service, my copy activity throws the following error

Hi, I have an issue using ADF with an HDFS linked service. I created an HDFS connection, then a copy activity from HDFS to Azure Data Lake Gen2. The source is a CSV file and the copy format is binary. When I run the pipeline I get the following error: …

Azure Data Lake Storage
Azure Virtual Machines
Azure Synapse Analytics
Azure Data Factory
asked 2024-03-28T10:07:15.1766667+00:00
Ács Dániel 25 Reputation points
accepted 2024-04-16T07:07:13.72+00:00
Ács Dániel 25 Reputation points
0 answers

Azure Data Factory out-of-memory error when reading from Salesforce

I use the Data Factory Copy activity to copy data from Salesforce to ADLS, and I am facing an out-of-memory error. The file is about 129k rows (800 MB). I set the block size to 100 MB and Max rows per file to 100,000, but the error still exists. What can you…

Azure Data Lake Storage
Azure Data Factory
asked 2024-04-04T13:32:58.54+00:00
Olga Valiuk 0 Reputation points
edited a comment 2024-04-15T12:06:37.73+00:00
Harishga 3,255 Reputation points Microsoft Vendor
1 answer One of the answers was accepted by the question author.

ADF pipeline to read data from a UC table into an ADLS Gen2 account

Hello Team, we have a requirement to create an Azure Data Factory pipeline to read data from a UC table (access to the table is granted to the Azure Data Factory managed identity) and copy the data into ADLS Gen2. Is there a way, or an article, to implement this?…
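If a Databricks notebook is an acceptable alternative to an ADF-native connector, a minimal sketch is to read the Unity Catalog table with Spark and land it in ADLS Gen2. The three-part table name and storage path below are placeholders, and `spark` is the session predefined in a Databricks notebook.

```python
# Read the UC table (hypothetical catalog.schema.table) and write it to ADLS Gen2 as Parquet.
df = spark.table("main.sales.orders")

(df.write
   .mode("overwrite")
   .parquet("abfss://landing@mydatalake.dfs.core.windows.net/uc/orders"))
```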

Azure Data Lake Storage
Azure Databricks
Azure Data Factory
asked 2024-04-11T19:05:05.9733333+00:00
Ashwini Gaikwad 65 Reputation points
accepted 2024-04-15T07:35:40.09+00:00
Ashwini Gaikwad 65 Reputation points
2 answers One of the answers was accepted by the question author.

Consistent data in data lake gen2

Hi friends, I need to understand how data consistency works in ADLS. I have found this old…

Azure Data Lake Storage
Azure Data Factory
asked 2024-04-04T12:24:04.8466667+00:00
Anshal 1,826 Reputation points
commented 2024-04-12T10:53:58.17+00:00
1 answer

Can I use a wildcard (*) in the middle of a file path?

Can I use a wildcard (*) in the middle of a file path when I load files from ADLS into a notebook? I have a file path like the one below…
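For reference, Spark's path globbing does allow a wildcard in a middle segment of the path, not just at the end. A minimal sketch follows; the account, container, and folder names are assumptions, and `spark` is the session predefined in a Synapse notebook.

```python
# Wildcard in a middle segment: one folder per source system, same date layout underneath.
path = "abfss://raw@mydatalake.dfs.core.windows.net/sales/*/2024/04/*.csv"

df = (spark.read
      .option("header", "true")
      .csv(path))

df.printSchema()
```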

Azure Data Lake Storage
Azure Synapse Analytics
asked 2024-04-12T02:54:02.6333333+00:00
박상준 0 Reputation points
answered 2024-04-12T07:19:07.9533333+00:00
KarishmaTiwari-MSFT 18,347 Reputation points Microsoft Employee
1 answer

Copy Dataverse data into Azure SQL using Synapse Link - Initial data is not loaded

The intended setup is to link a Dynamics environment to Power Apps and use Synapse Link to copy data to ADLS. From there, an ADF template is used to incrementally load data into Azure SQL. In short: Dynamics -> Synapse Link -> ADLS -> ADF -> ASQL. I…

Azure SQL Database
Azure Data Lake Storage
Azure Synapse Analytics
Azure Data Factory
asked 2024-03-19T15:28:23.0466667+00:00
rn 0 Reputation points
answered 2024-04-10T11:47:10.9666667+00:00
Mauro Gallo 0 Reputation points
0 answers

Copy data for very small file takes way too long

I encountered a timeout problem related to a Copy Data activity from ADLS Gen2 to ADLS Gen2 while trying to copy a 0-byte CSV file. How is it possible that the activity takes 28 minutes to copy a 0-byte file?

Azure Data Lake Storage
Azure Data Factory
asked 2024-04-04T12:01:29.0033333+00:00
Pegoraro, Francesco 40 Reputation points
commented 2024-04-10T07:07:07.1633333+00:00
Pegoraro, Francesco 40 Reputation points
1 answer

ErrorCode=UserErrorUnzipInvalidFile,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The file 'All Weekly 18.02.2024.zip' is not a valid Zip file with Deflate compression method.,Source=Microsoft.DataTransfer.ClientLibrary,''Type

While copying files from SFTP to ADLS using an ADF Copy activity, the pipeline fails when unzipping the files from SFTP to ADLS.

Azure Data Lake Storage
Azure Data Factory
asked 2024-04-09T06:13:16.2066667+00:00
Osadhis Nanda 0 Reputation points
commented 2024-04-10T06:35:23.3566667+00:00
Osadhis Nanda 0 Reputation points
2 answers

Getting this error while copying 3 files from an ADLS Gen2 container to the same ADLS Gen2 container.

Failure happened on 'Sink' side. ErrorCode=AdlsGen2OperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ADLS Gen2 operation failed for: Operation returned an invalid status code 'BadRequest'. Account:…

Azure Data Lake Storage
Azure Data Factory
asked 2021-03-19T08:25:11.743+00:00
Arpita Mishra 16 Reputation points
commented 2024-04-10T02:49:11.6266667+00:00
PRADEEPCHEEKATLA-MSFT 76,436 Reputation points Microsoft Employee
1 answer

How to write to Data Lake Gen2 storage from Databricks in Delta format when connected using a SAS token

I converted the data from Parquet format to Delta format. Now I want to write the data to blob storage (Data Lake Gen2), but I am facing the below error while writing. I used the below command to write my data: output_path =…
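For context, one commonly documented way to write Delta to ADLS Gen2 from Databricks with a SAS token is to set the ABFS SAS properties on the Spark session before writing. A hedged sketch follows; the account, container, path, and token are placeholders, and `df` stands in for the DataFrame from the question.

```python
account = "mydatalake"    # placeholder storage account
sas_token = "<sas-token>" # placeholder SAS token

spark.conf.set(f"fs.azure.account.auth.type.{account}.dfs.core.windows.net", "SAS")
spark.conf.set(
    f"fs.azure.sas.token.provider.type.{account}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set(f"fs.azure.sas.fixed.token.{account}.dfs.core.windows.net", sas_token)

# Write the converted DataFrame as a Delta table to ADLS Gen2.
output_path = f"abfss://curated@{account}.dfs.core.windows.net/delta/mytable"
df.write.format("delta").mode("overwrite").save(output_path)
```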

Azure Data Lake Storage
Azure Storage Accounts
Azure Databricks
asked 2024-04-09T23:29:12.6666667+00:00
Sai Sunny Kothuri 0 Reputation points
answered 2024-04-10T00:20:52.1+00:00
KarishmaTiwari-MSFT 18,347 Reputation points Microsoft Employee
1 answer One of the answers was accepted by the question author.

last modified time in ADLS storage

Our data collector app is set to use UTC time. It writes to ADLS Gen2 storage with a directory structure based on the current UTC time (i.e. year/month/day). The Azure region we selected is East 2. We access the Azure portal from the west coast (i.e. PST). …
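The service itself records last-modified times in UTC; any PST values shown in the portal are a client-side display conversion. A small sketch that prints the raw UTC timestamps with the ADLS Gen2 Python SDK follows; the account, credential, file system, and directory are assumptions.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://mydatalake.dfs.core.windows.net",  # placeholder account
    credential=DefaultAzureCredential())

fs = service.get_file_system_client("telemetry")  # placeholder file system
for p in fs.get_paths(path="2024/04/08"):         # placeholder directory
    print(p.name, p.last_modified)                # last_modified is reported in UTC
```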

Azure Data Lake Storage
asked 2024-04-08T17:08:44.3166667+00:00
Pan, John 120 Reputation points
accepted 2024-04-09T17:00:19.17+00:00
Pan, John 120 Reputation points
1 answer One of the answers was accepted by the question author.

How can I efficiently download files from various subfolders and nested folders within different levels of hierarchy in SharePoint, and then transfer them to Azure Data Lake Storage (ADLS) using Azure Data Factory (ADF)?

How can I efficiently download files from various subfolders and nested folders within different levels of hierarchy in SharePoint, and then transfer them to Azure Data Lake Storage (ADLS) using Azure Data Factory (ADF)? Despite attempting various…

Azure Data Lake Storage
Azure Data Factory
SharePoint
asked 2024-04-04T06:47:14.5+00:00
Chebolu Sai Manasa 100 Reputation points
accepted 2024-04-09T09:04:10.4466667+00:00
Chebolu Sai Manasa 100 Reputation points
1 answer

Access Azure Synapse Link Data Through PowerBI

We are using Synapse Link to export D365 FnO tables to Azure Data Lake. When we create the Synapse Link by configuring the Synapse workspace, Spark pool, and storage, a Lake Database gets created in the Synapse workspace with the tables. Then I have…

Azure Data Lake Storage
Azure Synapse Analytics
Microsoft Dataverse Training
asked 2024-04-05T06:36:52.2933333+00:00
Malshini Nissanka 0 Reputation points
answered 2024-04-08T05:21:21.3133333+00:00
Harishga 3,255 Reputation points Microsoft Vendor
1 answer One of the answers was accepted by the question author.

How can I upload a CSV from a local drive to the Lakehouse programmatically, so I can schedule the pickup?

Hi, we are storing CSVs on a local drive to support reporting and analysis. I would like to pull these CSVs into the Lakehouse programmatically on a scheduled basis. All I can find so far is a manual upload. Can I code a Notebook, or leverage a…
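One possible approach, since OneLake (the storage behind a Fabric Lakehouse) exposes an ADLS Gen2-compatible endpoint, is to upload the CSVs with the azure-storage-file-datalake SDK from a scheduled script. The workspace, lakehouse, and file paths below are placeholders, not confirmed names.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential())

fs = service.get_file_system_client("MyWorkspace")  # placeholder workspace name
file_client = fs.get_file_client("MyLakehouse.Lakehouse/Files/reports/sales.csv")  # placeholder path

# Upload the local CSV into the Lakehouse Files area, overwriting any previous copy.
with open(r"C:\exports\sales.csv", "rb") as data:   # placeholder local file
    file_client.upload_data(data, overwrite=True)
```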

Azure Data Lake Storage
Azure Synapse Analytics
Microsoft Fabric Training
asked 2024-04-04T14:38:45.8433333+00:00
Conner, Kyle 20 Reputation points
commented 2024-04-06T12:29:46.6333333+00:00
Conner, Kyle 20 Reputation points
0 answers

Azure Data Factory self-hosted IR cannot connect to MongoDB: timeout with no exception

I have a self-hosted (on-premises) integration runtime. On this VM I have installed the MongoDB Compass client and I can connect to the target MongoDB cluster. The connection is okay and I can see the DB/collections. I use Azure Data Factory to create a…

Azure Data Lake Storage
Azure Data Factory
asked 2024-03-05T22:15:40.4333333+00:00
Samir Sharma 0 Reputation points
edited a comment 2024-04-02T15:42:04.3466667+00:00
Samir Sharma 0 Reputation points
1 answer

Logic app to retrieve the latest file from a blob folder

How can I create a logic app that retrieves the latest file from a blob folder when an HTTP request is received, where there are multiple files, and sends it as an attachment? Are there any specific steps or configurations required for this process?
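For comparison, the "latest file" selection the Logic App needs to perform (list blobs, sort by last-modified, take the newest) looks like this in a small Python sketch with azure-storage-blob; the account, container, and folder prefix are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://mystorageaccount.blob.core.windows.net",  # placeholder account
    credential=DefaultAzureCredential())

container = service.get_container_client("reports")       # placeholder container
blobs = container.list_blobs(name_starts_with="daily/")   # placeholder folder prefix
latest = max(blobs, key=lambda b: b.last_modified)         # newest blob by last-modified time

content = container.download_blob(latest.name).readall()   # bytes to send as the attachment
print(latest.name, len(content))
```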

Azure Data Lake Storage
Azure Blob Storage
Azure Logic Apps
Azure Synapse Analytics
Azure Data Factory
asked 2024-04-01T14:00:18+00:00
Aditya Singh 105 Reputation points
edited a comment 2024-04-02T06:25:23.9566667+00:00
Nehruji R 1,511 Reputation points Microsoft Vendor