
Guidance Request for Estimating Costs and Time for Azure Durable Function Cleanup of Azure Table Storage

Parth Talaviya 0 Reputation points
2025-11-24T07:34:24.4366667+00:00

Overview: I recently developed an Azure Durable Function App that runs on the Flex Consumption plan with 512 MB of memory. Its purpose is to clean up a massive Azure Table Storage table holding 45 TB of data and billions of records.

Architecture Overview:

  1. Orchestrator Function: Coordinates the cleanup workflow by triggering and managing activity functions.
  2. Query Activity: Queries Azure Table Storage in batches to identify records for deletion.
  3. Delete Activity: Performs batch deletions of up to 100 records per operation.
  4. Progress Tracking: Updates the status of the cleanup process in the Table Storage.
  5. Monitoring: Logs telemetry to Application Insights for monitoring and diagnostics.

Key Assumptions:

  • Batch Size: 100 records per delete operation
  • Azure Functions Plan: Flex Consumption (On-Demand mode)
  • Region: West US
  • Storage Redundancy: Locally Redundant Storage (LRS) for Table Storage
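Given the 100-record batch-size assumption, the constraint worth planning for up front is that a Table Storage entity-group transaction can delete at most 100 entities, and every entity in one batch must share the same PartitionKey. A minimal Python sketch of that grouping logic (the helper name and dict shape are illustrative, not Azure SDK API):

```python
MAX_BATCH = 100  # Table Storage entity-group transactions allow at most 100 entities

def delete_batches(entities):
    """Yield delete batches of up to MAX_BATCH entities, one PartitionKey per batch.

    Every entity in a single batch transaction must share the same
    PartitionKey, so a new batch starts whenever the key changes. Assumes
    the query returns entities ordered by PartitionKey, which Table Storage
    does by default.
    """
    batch = []
    for entity in entities:
        if batch and (len(batch) == MAX_BATCH
                      or entity["PartitionKey"] != batch[0]["PartitionKey"]):
            yield batch
            batch = []
        batch.append(entity)
    if batch:
        yield batch
```

Each yielded batch maps to one delete activity call and one billed batch transaction, which is what drives the execution and transaction counts in the cost questions above.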

Specific Guidance Needed:

I need a detailed cost breakdown for the following components involved in this cleanup process:

  1. Azure Table Storage Costs
    • Costs associated with read and write operations (for querying and deleting records)
    • Storage costs for 45 TB of data (how storage redundancy and access patterns impact pricing)
  2. Azure Functions (Flex Consumption Plan - On-Demand) Costs
    • Costs for running Durable Functions based on execution time and memory consumption (512 MB)
    • How the number of executions (i.e., batch queries and deletes) and function execution duration impact costs
  3. Durable Functions Storage Costs
    • Costs associated with storing durable task state and history for the orchestrator and activity functions
    • Impact of the function history storage (Azure Storage used for this)
  4. Application Insights Costs
    • Costs related to logging telemetry for monitoring the cleanup process (based on volume of logs, data retention, etc.)

Additional Request:

  • Time Estimation: Given that I’m dealing with billions of records and 45 TB of data, I would also appreciate an estimate of the time required to complete the entire cleanup operation. Please consider:
    • Querying and deleting data in batches of 100 records
    • The potential impact of network throughput, storage performance, and function execution limits on the total time required to clean up the data

Summary Request:

  • Complete cost estimation including Azure Table Storage, Function execution, Durable Functions state storage, and Application Insights
  • Time estimate for cleaning up billions of records and 45 TB of data

Thank you for your assistance! I look forward to receiving a detailed cost breakdown and time estimation to help guide the execution of this cleanup task efficiently.

 

Cost Management

A Microsoft offering that enables tracking of cloud usage and expenditures for Azure and other cloud providers.


1 answer

  1. Anonymous
    2025-11-24T10:35:22.6566667+00:00

    Hello Parth Talaviya,

Thanks for posting your query in the Microsoft Q&A forum.

    Azure Table Storage Costs

Read and Write Operations: You'll incur costs for both querying and deleting records in batches. Each read and write operation counts toward your total transaction costs.

    • Estimated Cost: You’d need to estimate the total number of read/write operations. Azure Table Storage bills transactions in units of 10,000 operations; note that the free monthly grants mentioned below apply to Azure Functions, not to storage transactions. The Azure Storage pricing page can help with specifics.

    Storage Costs: For storing 45 TB of data, costs depend on the redundancy option (Locally Redundant Storage).

    • Estimated Cost: Classic Azure Table Storage capacity is billed per GB‑month, with an LRS price in West US in the low single‑digit cents per GB. A recent reference shows about 0.045 USD per GB‑month for LRS Table Storage capacity in a US region.

    45 TB ≈ 45,000 GB

    • Monthly storage cost ≈ 45,000 × 0.045 ≈ 2,025 USD/month while the data exists.

    Because you are deleting data, this cost will fall proportionally as you reclaim space. If the cleanup completes within, say, one month, use roughly one month of the starting size as a safe upper bound.
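The capacity arithmetic above fits in a couple of lines; the 0.045 USD/GB‑month rate is the assumed LRS figure quoted earlier, so verify it against current pricing:

```python
# Assumed West US LRS Table Storage rate from the estimate above -- check current pricing.
PRICE_PER_GB_MONTH = 0.045  # USD per GB-month (assumption)

def monthly_storage_cost(terabytes):
    """Capacity cost per month, using decimal TB (45 TB ~ 45,000 GB)."""
    return terabytes * 1000 * PRICE_PER_GB_MONTH

full_month = monthly_storage_cost(45)   # ~2,025 USD while all 45 TB exists
# If deletion proceeds roughly linearly over one month, the average billed
# capacity that month is about half the starting size:
cleanup_month = full_month / 2          # ~1,012 USD for the cleanup month
```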

    Refer document: https://azure.microsoft.com/en-us/pricing/details/storage/blobs/

    Azure Functions (Flex Consumption Plan) Costs

    Execution Costs: You’ll be billed based on the number of executions and the execution time (with your function running at 512 MB).

    • Estimated Cost: Each execution may have a different duration, and you're charged for the time taken. The Azure Functions pricing page provides details on how to estimate these costs. Be mindful of the extra orchestration cost caused by orchestrator replays in Durable Functions.

    Official pricing for the Flex Consumption plan includes:

    • Execution time: charged per GB‑second at a higher rate than classic Consumption; recent documentation cites around 0.000026 USD per GB‑second, with 100,000 GB‑s free each month per subscription.
    • Executions: around 0.400 USD per million executions, with 250,000 free executions per month.

    You are using 512 MB, so one second of execution bills 0.5 GB‑s.

    Per‑execution compute cost (512 MB)

    Assume average execution duration per activity (query or delete) is t seconds at 512 MB.

    • GB‑seconds per execution = 0.5 × t
    • Cost per GB‑second ≈ 0.000026 USD
    • Cost per execution ≈ 0.5 × t × 0.000026 = 0.000013t USD

    Example: if each activity runs 0.5 seconds on average

    • GB‑s per exec = 0.5 × 0.5 = 0.25.
    • Cost per exec ≈ 0.25 × 0.000026 ≈ 0.0000065 USD.

    Number of executions:

    For N records and batch size 100:

    • Query activity executions ≈ N/100
    • Delete activity executions ≈ N/100
    • Orchestrator function executions: One orchestration instance per segment of the table, or possibly a single long‑running orchestrator that fans out to activities. Compared to activity counts, orchestrator invocations and replays will be small (likely under a percent of activity calls), but they do consume additional GB‑seconds.

    For simplicity, assume activities dominate cost:

    • Total executions ≈ 2 × N / 100

    Example for N = 1 billion, t = 0.5 s:

    Activity executions ≈ 20,000,000.

    Total GB‑seconds = 20,000,000 × 0.25 = 5,000,000 GB‑s.

    • Free grant = 100,000 GB‑s → billable GB‑seconds = 4,900,000 GB‑s.
    • Compute cost ≈ 4,900,000 × 0.000026 ≈ 127.4 USD.

    Execution count charge:

    20,000,000 executions, 250,000 free → 19,750,000 billable.

    19.75 million ÷ 1 million ≈ 19.75.

    Cost ≈ 19.75 × 0.40 ≈ 7.9 USD.

    Total Flex compute for 1B rows with these assumptions ≈ 135–150 USD, not counting orchestrator overhead.
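As a sanity check, the compute arithmetic above can be wrapped in one function. The rates are the assumed figures quoted earlier (verify against current Flex Consumption pricing), and the function covers activities only:

```python
# Assumed Flex Consumption rates from the estimate above -- verify current pricing.
GBS_RATE = 0.000026      # USD per GB-second
EXEC_RATE = 0.40         # USD per million executions
FREE_GBS = 100_000       # free GB-seconds per month
FREE_EXECS = 250_000     # free executions per month

def flex_compute_cost(records, batch_size=100, seconds_per_activity=0.5,
                      memory_gb=0.5):
    """Activity cost only (one query + one delete activity per batch);
    orchestrator replay overhead is extra and can double or triple this
    in chatty orchestrations."""
    executions = 2 * records // batch_size
    gb_seconds = executions * seconds_per_activity * memory_gb
    gbs_cost = max(gb_seconds - FREE_GBS, 0) * GBS_RATE
    exec_cost = max(executions - FREE_EXECS, 0) / 1_000_000 * EXEC_RATE
    return gbs_cost + exec_cost

# 1 billion records: 20M executions, 5M GB-s -> about 127.4 + 7.9 ~ 135 USD
```

Changing `seconds_per_activity` is the easiest way to see how sensitive the total is to per-batch latency before committing to a design.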

    Orchestrator overhead can add more GB‑seconds because Durable Functions replays orchestrator code on each history update; if your orchestration is large and chatty, double or triple the compute estimates as a safety margin.

    Refer document: https://azure.microsoft.com/en-us/pricing/details/functions/?msockid=0cdc6a48cd50615a01497ce6cc4160b1

    Durable Functions State Storage Costs

    Task State and History Storage: Storing the durable task state also adds to your overall expenses. Since Azure Storage backs this state, the cost depends on the number of orchestration instances and the frequency of operations against the history tables and queues.

    • Estimated Cost: Similar to table storage, costs are outlined in the Azure Storage pricing portal.

    Refer document: https://azure.microsoft.com/en-us/pricing/details/storage/blobs/

    Application Insights Costs

    Application Insights is now fully workspace‑based and bills telemetry through Azure Monitor Logs (Log Analytics). Pay‑as‑you‑go ingestion in 2025 is around 2.30–2.76 USD per GB beyond a ~5 GB/month free allowance.
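To size this before the run, a rough ingestion model helps. Both the per‑execution telemetry size and the 2.50 USD/GB rate below are assumptions (the rate is the mid‑range of the 2.30–2.76 figure above) to replace with your own numbers:

```python
# Assumed numbers: ~2.50 USD/GB ingestion, 5 GB/month free allowance, and an
# average telemetry payload per activity execution -- all placeholders.
def app_insights_cost(executions, kb_per_execution=2.0,
                      price_per_gb=2.50, free_gb=5.0):
    ingested_gb = executions * kb_per_execution / 1_048_576  # KB -> GB (binary)
    return max(ingested_gb - free_gb, 0) * price_per_gb

# 20M executions at ~2 KB each ingest ~38 GB -> roughly 83 USD; sampling and
# trimming per-execution custom logs are the main levers to pull this down.
```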

    Time Estimation:

    Given that you are processing billions of records:

    • Batch Processing: At 100 records per batch, billions of records means tens of millions of batch iterations (e.g., 1 billion records ≈ 10 million delete batches), so sustained throughput and fan‑out concurrency dominate the schedule.
    • Considerations: The total time will depend on factors like network throughput, storage performance, and potential delays from the durable function replays.
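A simple throughput model makes the time estimate concrete. The per‑batch latency and fan‑out parallelism below are placeholder assumptions to calibrate against a small pilot run:

```python
def cleanup_hours(records, batch_size=100, seconds_per_batch=1.0,
                  parallelism=50):
    """Wall-clock hours if `parallelism` batches run concurrently and each
    batch (query + delete + Durable Functions bookkeeping) takes
    `seconds_per_batch` seconds end to end."""
    batches = records / batch_size
    return batches * seconds_per_batch / parallelism / 3600

# 1 billion records, 1 s/batch, 50-way fan-out -> roughly 56 hours (~2.3 days)
```

The same model shows why parallelism matters: at 50‑way fan‑out the run fits in a few days, while a purely sequential orchestration at 1 s/batch would take months.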

    Complete cost estimation

    For a one‑time cleanup of 1 billion rows (~45 TB) over about a 1–3 day period, rough ballpark figures:

    • Table Storage capacity: ≈ 2,000 USD for one month at 45 TB LRS, decreasing as you delete. The cleanup itself does not add capacity cost, and you save this amount monthly after cleanup.

    • Table Storage transactions: likely in the 10–100 USD range depending on the exact per‑operation price and query patterns, assuming ~20M–100M operations.

    • Azure Functions Flex compute: on the order of 100–300 USD for 20M+ activity executions plus orchestrator overhead at 512 MB, assuming sub‑second execution times.

    • Durable Functions state storage: usually tens of dollars for backend storage plus transactions, unless the orchestration is extremely chatty.

    • Application Insights: 50–300 USD depending on how much you log per execution and the sampling configuration (for 20–100 GB ingestion).

    Total incremental cleanup cost (excluding the existing storage you’re trying to reclaim) is typically well under 1,000 USD for 1B rows, though App Insights and orchestration chattiness can push that higher if not tuned.

    I hope the provided answer is helpful. Do let me know if you have any further questions on this. Please accept and upvote the answer if it helped, so that it can help others in the community.


    1 person found this answer helpful.
