Azure Data Factory - Copy activity from Data Lake to Synapse staying in Queued status

Prashant Singh 1 Reputation point
2021-10-11T15:35:15.233+00:00

Hi,

We have an Azure Data Factory pipeline that copies data from Azure Data Lake to a Synapse table. The Copy activity is continuously getting stuck in the "Queued" status. The pipeline uses the AutoResolve Azure integration runtime in East US. Everything was working fine until Saturday 9th Oct 21, but since Sunday 10th Oct 21 all our production loads are stuck, as the status never moves out of "Queued". We have tried every combination but are not sure of the cause, since no logs are available while the status is "Queued".

Any help will be appreciated to understand the issue.

Regards
Prashant


1 answer

  1. Prashant Singh 1 Reputation point
    2021-10-13T20:16:49.613+00:00

    Hi Himanshu,

    Thanks for your reply. We raised a support ticket and got the response that there was an internal issue with the integration runtime; they suggested creating a new integration runtime, which then worked fine. So it seems there was a problem with the internal integration runtime that was not kicking off the Copy activity. We are now awaiting a root-cause analysis to understand the issue and see how we can avoid it in the future.
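    For anyone hitting the same symptom: as a minimal sketch, a replacement Azure integration runtime pinned to a fixed region (rather than AutoResolve) can be defined in the factory with a JSON resource like the one below. The name and location here are illustrative, not from the support ticket:

    ```json
    {
      "name": "AzureIR-EastUS",
      "properties": {
        "type": "Managed",
        "typeProperties": {
          "computeProperties": {
            "location": "East US"
          }
        }
      }
    }
    ```

    After creating it, point the affected Copy activity (via its linked services or pipeline settings) at the new integration runtime instead of the default AutoResolveIntegrationRuntime and re-run the pipeline.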

    Regards
    Prashant

