Copying files to Azure: remaining files no longer transfer

UrBel 240 Reputation points
2025-06-13T07:04:08.76+00:00

Hi Experts,

We have 2.6 million files to transfer from our local server to Azure Storage. About 2.1 million files were transferred, but after that no more files transfer; roughly 500 thousand remain. AzCopy always reports [Number of File Transfers = Number of File Transfers Failed].

I've checked a slice of the log file, and I think this is part of the problem:

"...RESPONSE Status: 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature..."

Has anyone experienced this issue?

Many thanks and appreciation for all replies.

warm regards,

UrBel


1 answer

  1. Nandamuri Pranay Teja 3,615 Reputation points Microsoft External Staff Moderator
    2025-06-13T08:57:55.78+00:00

    Hello UrBel

    Thank you for your question!

    First, check whether your SAS tokens were generated with a defined start time and expiry time.

    If your AzCopy job runs longer than the expiresOn (the se parameter) value specified in the SAS token, the token becomes invalid and all subsequent operations fail with a 403 authentication error. This is by far the most common reason for transfers failing partway through.
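    You can confirm this quickly by inspecting the se parameter in your SAS URL. A minimal sketch (the URL below is a hypothetical placeholder, not your real token):

    ```python
    from datetime import datetime, timezone
    from urllib.parse import urlparse, parse_qs

    def sas_expiry(sas_url: str) -> datetime:
        """Extract the `se` (signed expiry) parameter from a SAS URL."""
        query = parse_qs(urlparse(sas_url).query)
        # `se` is ISO 8601 with a trailing Z, e.g. 2025-06-20T00:00:00Z
        return datetime.fromisoformat(query["se"][0].replace("Z", "+00:00"))

    # Hypothetical SAS URL for illustration only
    url = ("https://myaccount.blob.core.windows.net/mycontainer"
           "?sv=2024-11-04&se=2025-06-20T00%3A00%3A00Z&sig=...")
    expiry = sas_expiry(url)
    status = "expired" if expiry < datetime.now(timezone.utc) else "still valid"
    print(status, expiry.isoformat())
    ```

    If the printed expiry is in the past, regenerate the SAS with a longer window before retrying.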

    If you are using a SAS token, ensure it has not expired: check the se (expiry time) and st (start time) parameters in the SAS URL. For a transfer of this scale, set a long expiry time (e.g., several days or weeks) to avoid interruptions. Similarly, if you were using a storage account access key directly in your AzCopy command, or if the SAS token was generated from a stored access policy that has since been revoked or modified, the existing credentials will have become invalid. If you are using azcopy login with Azure AD, ensure the user, service principal, or managed identity has the Storage Blob Data Contributor or Storage Blob Data Owner role assigned on the storage account or container:

    • Navigate to the storage account → Access Control (IAM).
    • Click Add → Add role assignment.
    • Select Storage Blob Data Contributor or Storage Blob Data Owner and assign it to your identity.
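    The same role assignment can be done from the command line with the Azure CLI. This is a CLI sketch; the subscription, resource group, account, and assignee values are placeholders you must replace:

    ```shell
    # Assign Storage Blob Data Contributor at the storage-account scope
    # (replace <subscription-id>, <resource-group>, <storage-account>, and the assignee)
    az role assignment create \
      --role "Storage Blob Data Contributor" \
      --assignee "user@example.com" \
      --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
    ```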

    Next, go to your Storage Account -> Networking -> Firewalls and virtual networks.

    • Ensure "Public network access" is set to "Enabled from all networks" (less secure but rules out firewall issues for testing).
    • Or, if set to "Enabled from selected virtual networks and IP addresses," ensure your local server's current public IP address is explicitly added to the "Address range" list.
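    If you prefer the CLI, the firewall rule can be added as below; the resource group and account names are placeholders, and 203.0.113.10 stands in for your server's actual public IP:

    ```shell
    # Allow the local server's public IP through the storage-account firewall
    # (find your current public IP with e.g. `curl -s ifconfig.me`)
    az storage account network-rule add \
      --resource-group "<resource-group>" \
      --account-name "<storage-account>" \
      --ip-address "203.0.113.10"
    ```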

    Ensure you’re using the latest version of AzCopy (v10.25.1 as of June 2025). Older versions may have bugs or compatibility issues.
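    Once the credentials are fixed, you do not need to re-copy the 2.1 million files that already succeeded: AzCopy can resume the interrupted job and retry only the failed transfers. A sketch, assuming SAS authentication on the destination (the job ID comes from the list command):

    ```shell
    # Confirm the AzCopy version, then list past jobs and find the interrupted one
    azcopy --version
    azcopy jobs list

    # Resume the failed/incomplete transfers of that job,
    # supplying a fresh, unexpired SAS token for the destination
    azcopy jobs resume <job-id> --destination-sas "<new-sas-token>"
    ```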


    Hope the above answer helps! Please let us know if you have any further queries.


    Please do not forget to "Accept the answer" and "up-vote" wherever the information provided helps you; this can be beneficial to other community members.


