Not able to copy data using azcopy from one subscription to another

Sumit Kumar Gupta 1 Reputation point Microsoft Employee

I have a scenario where I need to copy data from one storage account to another.
Subscription A storage account AA to (=>) Subscription B storage account BB.
AA -> BB

This copy needs to be recursive as we have a lot of subfolders and files.
Command used: azcopy copy "https://<storageaccountAA>.blob.core.windows.net/<src_container>/?<sasToken1>" "https://<storageaccountBB>.blob.core.windows.net/<dest_container>/?<sasToken2>" --recursive

sasToken1 generated from subscription A
sasToken2 generated from subscription B
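For reference, a minimal sketch of how the fully qualified command is usually shaped (the account names, container names, and SAS token placeholders below are illustrative, not taken from the actual environment):

```shell
# Sketch only: account/container names are placeholders, and <sasToken1>/<sasToken2>
# stand in for the real SAS query strings.
# Note the full blob endpoint hostname, the "/" before the container name,
# and the "?" that joins the SAS query string to the URL.
azcopy copy \
  "https://storageaccountaa.blob.core.windows.net/src-container/?<sasToken1>" \
  "https://storageaccountbb.blob.core.windows.net/dest-container/?<sasToken2>" \
  --recursive
```

If either URL is missing the "?" before the SAS, or the SAS is not scoped to the right resource types, AzCopy can fail with a 403 before the copy starts.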

I am facing auth issues and it gives the following error:

failed to perform copy command due to error: cannot start job due to error: cannot list files due to reason ->, /home/vsts/go/pkg/mod/!azure/azure-storage-blob-go@v0.15.0/azblob/zc_storage_error.go:42
===== RESPONSE ERROR (ServiceCode=AuthorizationResourceTypeMismatch) =====
Description=403 This request is not authorized to perform this operation using this resource type., Details: (none)

RESPONSE Status: 403 This request is not authorized to perform this operation using this resource type.
Date: [Thu, 22 Sep 2022 12:23:39 GMT]

I am also fine if there is an alternative to copy data from storage account AA -> Databricks -> storage account BB, but I have been unable to find a solution yet.


2 answers

Sort by: Most helpful
  1. Sumarigo-MSFT 44,081 Reputation points Microsoft Employee

    @Sumit Kumar Gupta Firstly, apologies for the delayed response!

    Based on the error message, the issue may be related to the SAS token. Can you please regenerate the SAS token and try again? (Make sure you grant read/list permissions on the source and write permissions on the destination.)
    I would recommend generating the SAS token through the Azure Storage Explorer tool, which is the easiest way.
    Remember that a SAS URL is essentially a time-limited password.
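    As an alternative to Storage Explorer, a sketch of generating an account SAS with the Azure CLI (account name and expiry are placeholders). The AuthorizationResourceTypeMismatch error typically means the SAS was not scoped to the resource types the operation needs, so the token here allows service (s), container (c), and object (o) operations:

    ```shell
    # Sketch only: account name and expiry date are placeholders.
    # Source token: read + list are enough for AzCopy to enumerate and read blobs.
    az storage account generate-sas \
      --account-name storageaccountaa \
      --services b \
      --resource-types sco \
      --permissions rl \
      --expiry 2022-09-30T00:00Z \
      --https-only

    # Destination token: the copy also needs write/create/list on the target account.
    az storage account generate-sas \
      --account-name storageaccountbb \
      --services b \
      --resource-types sco \
      --permissions wcl \
      --expiry 2022-09-30T00:00Z \
      --https-only
    ```

    A container-level SAS without the service/container resource types will fail the initial listing call with exactly this 403 resource-type mismatch.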

    Please refer to this article: How Authorize AzCopy

    If you copy to or from an account that has a hierarchical namespace, use blob.core.windows.net instead of dfs.core.windows.net in the URL syntax. Multi-protocol access on Data Lake Storage enables you to use blob.core.windows.net, and it is the only supported syntax for account-to-account copy scenarios.

    Additional information: You may need the Storage Blob Data Contributor role on the storage account.

    You can also refer to this link for more detailed information on how RBAC works.

    If you have any additional questions or need further clarification, please let me know.


    Please do not forget to "Accept Answer" and "Up-Vote" wherever the information provided helps you, as this can be beneficial to other community members.


  2. Vinodh247-1375 11,476 Reputation points


    Thanks for reaching out to Microsoft Q&A.

    From the error, the issue seems to be related to permissions. A common cause is not having enough permission to upload a new blob into the destination storage account (the Storage Blob Data Contributor or Storage Blob Data Owner role is required), but this error looks like the AzCopy job is unable to start because of the SAS token. Make sure your SAS token has not expired (it has a configurable expiry), and generate a new token before starting the copy job to narrow down the issue.

    Please upvote and accept as answer if the reply was helpful; this will help other community members.
