How to analyze the cost of each container within the storage account?

Caio Lopes 6 Reputation points
2022-05-06T19:07:47.753+00:00

Hello, I need to calculate the cost of each container within a storage account, and I have been struggling with this for some time. I tried writing a Python script that traverses all the blobs and adds up their sizes, but it turned out to be far too slow, and it still doesn't give me the costs of read and write operations.
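
For context, what I tried looked roughly like the sketch below (simplified; it assumes the azure-storage-blob package and a connection string in an environment variable, and the variable name is just an example):

```python
# Simplified sketch of the blob-walking approach (assumes the azure-storage-blob
# package; AZURE_STORAGE_CONNECTION_STRING is just an example variable name).
import os
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)

totals = {}
for container in service.list_containers():
    container_client = service.get_container_client(container.name)
    # Lists every blob and sums its size -- walking every object is what
    # becomes prohibitively slow on a large account.
    totals[container.name] = sum(blob.size for blob in container_client.list_blobs())

for name, size_bytes in totals.items():
    print(f"{name}: {size_bytes / 1024**3:.2f} GiB")
```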

Is there any way to get the cost values for each container, including operations and storage values?

I would be very grateful if someone could help me.

Tags: Azure Storage, Azure Cost Management, Azure Blob Storage

1 answer

Sort by: Most helpful
  1. Sumarigo-MSFT 47,471 Reputation points Microsoft Employee Moderator
    2022-05-09T09:55:14.42+00:00

    @Caio Lopes Firstly, apologies for the delay in responding here! Welcome to the Microsoft Q&A forum, and thank you for posting your query here!

    You can certainly do it via Python or PowerShell, but you could also consider leveraging Blob Inventory. Blob Inventory periodically scans your storage account (it can be scoped to a container or even a folder) and calculates, among other things, the total object size and the number of objects. The result is delivered to another container as a CSV or Parquet file.
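
    For example, once an inventory run has produced a CSV report, a short script can aggregate it per container. This is only a rough sketch: it assumes the inventory rule includes the Name and Content-Length fields, that blob names in the report are container-qualified (container/path/to/blob), and that the report has been downloaded locally as inventory.csv; adjust those details to match your actual report.

    ```python
    # Rough sketch: aggregate a blob inventory CSV report per container.
    # Assumptions (adjust as needed): the report includes "Name" and
    # "Content-Length" columns, names are container-qualified
    # ("container/path/to/blob"), and the file was downloaded as inventory.csv.
    import csv
    from collections import defaultdict

    totals = defaultdict(int)
    with open("inventory.csv", newline="") as report:
        for row in csv.DictReader(report):
            container = row["Name"].split("/", 1)[0]   # first path segment = container
            totals[container] += int(row["Content-Length"])

    for container, size_bytes in sorted(totals.items()):
        print(f"{container}: {size_bytes / 1024**3:.2f} GiB")
    ```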

    The best option would be to analyze how much data is stored per container (Azure Storage Explorer is one way to do this) and use that as an alternative way to understand usage per container.

    Inventory job fails to complete for hierarchical namespace enabled accounts
    The inventory job might not complete within 2 days for an account with hundreds of millions of blobs and hierarchical namespace enabled. If this happens, no inventory file is created. If a job does not complete successfully, check subsequent jobs to see if they complete before contacting support. The performance of a job can vary, so if a job doesn't complete, it's possible that subsequent jobs will.

    To count the bytes in a storage account you need to iterate over every object in it. Blob Inventory does this for you using some tricks on the backend, but you can also call the List Blobs API to build your own list (i.e., by writing code). Both approaches suffer from the same problem: on larger accounts it is hard to list all the objects. If you write your own code, you can use your knowledge of the account to scale out the listing operation and build robust retry logic, because listing a large account involves a significant number of API calls.
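
    A rough sketch of that do-it-yourself approach is below: it lists each container in its own worker, leans on the SDK's retry settings, and adds a simple backoff loop on top (again assuming azure-storage-blob and a connection string environment variable; the names and worker count are just examples).

    ```python
    # Sketch of scaling out the listing yourself: one worker per container, SDK-level
    # retries plus a simple outer backoff loop. azure-storage-blob is assumed, and
    # AZURE_STORAGE_CONNECTION_STRING / max_workers=8 are just example choices.
    import os
    import time
    from concurrent.futures import ThreadPoolExecutor
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"],
        retry_total=10,  # let the SDK retry throttled or failed requests
    )

    def container_size(name, attempts=3):
        for attempt in range(attempts):
            try:
                client = service.get_container_client(name)
                return name, sum(blob.size for blob in client.list_blobs())
            except Exception:
                if attempt == attempts - 1:
                    raise
                time.sleep(2 ** attempt)  # crude exponential backoff before retrying

    names = [c.name for c in service.list_containers()]
    with ThreadPoolExecutor(max_workers=8) as pool:
        for name, size_bytes in pool.map(container_size, names):
            print(f"{name}: {size_bytes / 1024**3:.2f} GiB")
    ```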

    Azure Blob Storage pricing: https://azure.microsoft.com/en-in/pricing/details/storage/blobs/

    Understand the full billing model for Azure Blob Storage
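
    Once you have capacity per container, plus read/write operation counts if you can attribute them to a container (for example from the account's resource logs), the billing model above boils down to a simple calculation. The rates below are placeholders, not real Azure prices; take the current figures for your region, tier and redundancy from the pricing page.

    ```python
    # Illustrative only: estimate a container's monthly cost from capacity and
    # operation counts. The rates are PLACEHOLDERS, not actual Azure prices --
    # substitute the current values from the Blob Storage pricing page.
    PRICE_PER_GB_MONTH = 0.02        # placeholder $/GB-month
    PRICE_PER_10K_WRITES = 0.05      # placeholder $ per 10,000 write operations
    PRICE_PER_10K_READS = 0.004      # placeholder $ per 10,000 read operations

    def estimate_monthly_cost(size_gb, write_ops, read_ops):
        return (size_gb * PRICE_PER_GB_MONTH
                + write_ops / 10_000 * PRICE_PER_10K_WRITES
                + read_ops / 10_000 * PRICE_PER_10K_READS)

    # Example: 500 GB stored, 2 million writes, 10 million reads in a month.
    print(f"Estimated: ${estimate_monthly_cost(500, 2_000_000, 10_000_000):.2f}")
    ```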

    Public preview: Multitasking in the cost analysis preview

    If the issue is still not resolved and you need more specialized assistance, kindly contact Azure Billing support; it's free, and it's the best option for you: https://azure.microsoft.com/en-in/support/create-ticket/, https://azure.microsoft.com/en-in/support/options/

    Please let us know if you have any further queries. I’m happy to assist you further.

    ----------

    Please do not forget to "Accept Answer" and "up-vote" wherever the information provided helps you, as this can be beneficial to other community members.

    4 people found this answer helpful.
