
I want to set up Azure Blob Storage to upload and download artifacts for use in Azure Pipelines through ADO. Also, how do I set up a service principal and get ADO to access the Azure Blob Storage container and the artifacts stored there?

Jyoti Dixit 0 Reputation points
2026-02-09T06:51:14.7833333+00:00

Can anyone give me a complete guide on which permissions to grant, how to store artifacts, and which type of access (Contributor, Reader, or data roles) to assign? And how can I set up service principals and configure security and network IP ranges so the ADO agent can successfully access artifacts from Blob Storage and run the downloaded artifacts in a pipeline?

Azure Blob Storage

An Azure service that stores unstructured data in the cloud as blobs.


1 answer

  1. Venkatesan S 4,735 Reputation points Microsoft External Staff Moderator
    2026-02-09T07:21:21.9466667+00:00

    Hi Jyoti Dixit,

    Thanks for reaching out on the Microsoft Q&A forum.

    I want to setup Azure Blob Storage, to upload and download artifacts to use in Azure pipelines through ADO? also tell me how to set up Service Principal and get the ADO to access azure blob storage container and artifacts stored there?

    First off, create your storage account and a container like "artifacts" in the Azure portal; pick a general-purpose v2 (StorageV2) account in a region close to your ADO agents for speed. Keep the container private, since it will be holding pipeline artifacts.
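    The same setup can be done from the Azure CLI; a quick sketch (the resource group and account names below are example placeholders, pick your own):

    ```shell
    # One-time setup via Azure CLI. "rg-artifacts" and "yourstorageacctname"
    # are example names; storage account names must be globally unique.
    az group create --name rg-artifacts --location southindia
    az storage account create \
      --name yourstorageacctname \
      --resource-group rg-artifacts \
      --sku Standard_LRS \
      --kind StorageV2
    # Private container (no anonymous access) for pipeline artifacts
    az storage container create \
      --name artifacts \
      --account-name yourstorageacctname \
      --auth-mode login \
      --public-access off
    ```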

    • Next, for secure access without fumbling keys everywhere, spin up a service principal in Microsoft Entra ID: go to App registrations, create one, note the client ID and tenant ID, and create a client secret. In ADO, head to Project Settings > Service connections > New > Azure Resource Manager, plug in those details, and name it something like "blob-spn-connection." Test it to confirm.
    • Permissions-wise, keep it least-privilege: assign your SPN the "Storage Blob Data Contributor" role at the storage account level via IAM (Access control) in the portal; this lets it upload, download, list, and delete blobs, which covers what pipelines need. If a later stage just needs reads, swap to "Storage Blob Data Reader" on the container. Skip broad roles like Contributor; they're overkill and risky, and the plain Contributor role doesn't grant Entra-based data-plane access to blobs anyway.
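    The service principal and role assignment can also be created in one CLI call; a sketch, assuming the example resource group and account names above (replace the subscription ID placeholder with yours):

    ```shell
    # Create the SPN and grant it data-plane access, scoped to just this
    # storage account (least privilege). Save the returned appId/password/
    # tenant for the ADO service connection.
    az ad sp create-for-rbac \
      --name "ado-blob-spn" \
      --role "Storage Blob Data Contributor" \
      --scopes "/subscriptions/<subscription-id>/resourceGroups/rg-artifacts/providers/Microsoft.Storage/storageAccounts/yourstorageacctname"
    ```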
    • To upload build output from a pipeline, add a copy step (note the inputs: block, which the task requires):

      - task: AzureFileCopy@4
        inputs:
          SourcePath: '$(Build.ArtifactStagingDirectory)/**'
          azureSubscription: 'blob-spn-connection'
          Destination: 'AzureBlob'
          storage: 'yourstorageacctname'
          ContainerName: 'artifacts'
          BlobPrefix: '$(Build.BuildId)/'
    • To download in later stages or releases, note that AzureFileCopy only copies to Azure; use an Azure CLI step instead (for example, az storage blob download-batch) to pull blobs into $(Pipeline.Workspace), ready to run scripts or tests from.
    • For network lockdown, under the storage account > Networking > Firewalls and virtual networks, allow "Trusted Microsoft services" first. Then grab the latest Azure IP Ranges and Service Tags JSON from the Microsoft Download Center, filter for your agent pool's region (e.g., South India), and add those ranges as firewall rules. Self-hosted agents? Whitelist your VM's IP or VNet instead. Test with a quick az storage blob list in a pipeline debug step.
    • To execute the downloaded files, just add a script step: a pwsh step for .ps1 scripts, or a script step (with chmod +x first) for Linux shell scripts. Watch the logs for 403s; those usually mean a role assignment or firewall IP needs fixing.
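    Putting the download, debug-list, and run steps together, a later stage might look like the sketch below (the connection, account, and container names are the placeholders used above; the script filename is hypothetical):

    ```yaml
    # Assumes the 'blob-spn-connection' service connection and the account/
    # container names from the steps above; adjust to your environment.
    - task: AzureCLI@2
      inputs:
        azureSubscription: 'blob-spn-connection'
        scriptType: 'bash'
        scriptLocation: 'inlineScript'
        inlineScript: |
          # Sanity check: a 403 here means a role or firewall issue
          az storage blob list --account-name yourstorageacctname \
            --container-name artifacts --auth-mode login --output table
          # Pull this build's artifacts into the workspace
          az storage blob download-batch --account-name yourstorageacctname \
            --source artifacts --pattern "$(Build.BuildId)/*" \
            --destination "$(Pipeline.Workspace)" --auth-mode login
    - script: |
        chmod +x "$(Pipeline.Workspace)/$(Build.BuildId)/run.sh"
        "$(Pipeline.Workspace)/$(Build.BuildId)/run.sh"
    ```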


    Kindly let us know if the above helps or you need further assistance on this issue.

    Please do not forget to "Accept Answer" and "up-vote" wherever the information provided helps you; this can be beneficial to other community members.

