High Log Analytics cost even after setting Application Insights daily cap (30 GB)

Swaroop Gowda D 25 Reputation points
2026-03-04T07:31:17.11+00:00

Hello,

I have configured Application Insights for several Azure resources, and it is connected to a Log Analytics Workspace for centralized logging.

To control ingestion, I configured the daily cap for Application Insights to 30 GB. However, even after setting this limit, we are still seeing very high costs in the Azure billing related to Log Analytics/Application Insights.

I would like to understand the following:

  1. Does the Application Insights daily cap limit only telemetry ingestion into Application Insights, while Log Analytics ingestion can still generate additional charges?

  2. Could other Azure resources sending logs to the same Log Analytics workspace be contributing to the high cost?

  3. Are there recommended best practices to reduce Log Analytics ingestion cost when using Application Insights, such as:

     - Sampling configuration
     - Table-level retention changes
     - Moving some tables to **Basic Logs**
     - Filtering telemetry before ingestion

Currently:

Application Insights is connected to Log Analytics Workspace

Daily cap: 30 GB

Retention is set to the default configuration

Could you please suggest best practices or recommended configurations to optimize the logging cost in this scenario?

Thank you.

Azure Analysis Services

1 answer

  1. Q&A Assist
    2026-03-04T07:31:39.27+00:00
    AI generated content. This answer was created with AI from Q&A Assist.

    Yes. The Application Insights daily cap and the Log Analytics workspace ingestion are controlled separately, and other data sources can still drive high Log Analytics cost even if the Application Insights daily cap is set to 30 GB.

    Key points for this scenario:

    1. Relationship between Application Insights daily cap and Log Analytics costs
    • Application Insights and Log Analytics now have separate daily cap implementations.
      • For workspace-based Application Insights, the effective cap is the minimum of:
        • The Application Insights daily cap, and
        • The Log Analytics workspace daily cap.
      • For classic Application Insights (not workspace-based), only the Application Insights daily cap applies because data does not reside in a workspace.
    • If only the Application Insights daily cap is configured and the Log Analytics workspace daily cap is not, then:
      • Application Insights telemetry is limited by its own cap.
      • Other Azure resources sending data to the same Log Analytics workspace (VMs, PaaS services, diagnostics, Activity logs, etc.) can still ingest unlimited data (subject to workspace limits) and generate additional Log Analytics charges.
    2. Other resources contributing to high Log Analytics cost
    • Any resource configured to send logs/metrics to the same Log Analytics workspace can contribute to ingestion volume and cost, independently of the Application Insights daily cap.
    • Also, if Application Insights is using the global ingestion endpoint with only an instrumentation key (ikey), the Application Insights daily cap might not be effective across regions, but the Log Analytics daily cap still applies.
    3. Best practices to reduce Log Analytics ingestion cost when using Application Insights

    Use a combination of caps, sampling, filtering, and retention tuning.

    A. Configure both Application Insights and Log Analytics daily caps

    • Set the Application Insights daily cap on each Application Insights resource:
      • Azure portal → Application Insights resource → Configure → Usage and estimated costs → Daily cap → set Daily volume cap (GB/day).
    • Set the Log Analytics workspace daily cap:
      • Azure portal → Log Analytics workspaces → select workspace → Settings → Usage and estimated costs → Daily cap → turn On and set Daily volume cap (GB/day).
    • For workspace-based Application Insights, the lower of the two caps is the effective limit for Application Insights data. The workspace cap also limits all other data sources to that workspace.
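
    As a sketch, both caps can also be set from the Azure CLI. The resource group, workspace, and Application Insights names below are placeholders, and the `az monitor app-insights` commands require the `application-insights` CLI extension:

    ```shell
    # Cap the Log Analytics workspace at 30 GB/day (placeholder names).
    az monitor log-analytics workspace update \
      --resource-group my-rg \
      --workspace-name my-workspace \
      --set workspaceCapping.dailyQuotaGb=30

    # Cap the Application Insights resource at 30 GB/day.
    # Requires: az extension add --name application-insights
    az monitor app-insights component billing update \
      --resource-group my-rg \
      --app my-app-insights \
      --cap 30
    ```

    For a workspace-based resource, the lower of these two values is the effective cap on Application Insights data.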

    B. Use sampling to reduce Application Insights ingestion

    • Sampling is the primary mechanism to tune Application Insights data volume to the desired level and should be used before relying on caps.
    • Recommendation from the documentation:
      • Use sampling to reduce traffic and storage costs while preserving statistically correct analysis.
      • Use the daily cap as a safety net in case the application suddenly sends much higher volumes of telemetry.
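
    As one illustrative configuration: with the Application Insights Java 3.x agent, fixed-rate sampling can be set in `applicationinsights.json` next to the agent jar (10% is an example value, not a recommendation; other SDKs expose equivalent sampling settings):

    ```json
    {
      "sampling": {
        "percentage": 10
      }
    }
    ```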

    C. Analyze which tables and resources are driving ingestion

    • In the Log Analytics workspace, use queries to identify high-volume tables and sources, for example:
      • Compare record counts per table over a period (for example, 7 days) to find the noisiest tables.
      • Analyze by consumed bytes to see which tables cost the most.
    • Use the Log Analytics Workspace Insights → Usage workbook to see:
      • Percentage of ingestion per table.
      • Per-resource ingestion into the workspace.
    • After identifying the heavy tables (for example, traces, requests, dependencies, custom logs), adjust logging levels or sampling for those sources.
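
    A starting-point query against the workspace's `Usage` table (quantities are reported in MB) to rank tables by billable ingestion over the last 7 days:

    ```kusto
    Usage
    | where TimeGenerated > ago(7d)
    | where IsBillable == true
    | summarize IngestedGB = sum(Quantity) / 1024 by DataType
    | sort by IngestedGB desc
    ```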

    D. Use data collection rules (DCR) to filter before ingestion

    • Data collection rules can reduce the volume of telemetry sent from Application Insights to the Log Analytics workspace by filtering or transforming data.
    • If a DCR is in place, the workspace may receive less data than the workspace daily cap, which helps control cost.
    • DCRs can also be used to filter other data sources (for example, platform logs) before they reach the workspace.
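
    As an illustrative fragment, a workspace transformation can drop low-severity traces before ingestion. The `transformKql` body below assumes the `AppTraces` table's `SeverityLevel` column (0 = Verbose, 1 = Information, 2 = Warning):

    ```kusto
    source
    | where SeverityLevel >= 2  // keep Warning and above; drop Verbose/Information
    ```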

    E. Retention and log types (Basic vs Analytics)

    • Retention beyond the included period in Log Analytics is charged per GB-month. Reducing retention for high-volume tables can lower cost.
    • Moving less frequently queried or long-term data to a cheaper table plan (such as Basic Logs) and/or exporting it to external storage (for example, Blob Storage) is a common strategy.
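
    A sketch of per-table changes with the Azure CLI (table and workspace names are placeholders; note that the Basic plan is supported only for certain tables and limits query capabilities):

    ```shell
    # Switch a high-volume table to the Basic table plan (placeholder names).
    az monitor log-analytics workspace table update \
      --resource-group my-rg \
      --workspace-name my-workspace \
      --name AppTraces \
      --plan Basic

    # Shorten interactive retention on an Analytics-plan table to 30 days,
    # keeping 90 days of total retention.
    az monitor log-analytics workspace table update \
      --resource-group my-rg \
      --workspace-name my-workspace \
      --name AppDependencies \
      --retention-time 30 \
      --total-retention-time 90
    ```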

    F. Cost-optimization strategy summary

    • Set a daily cap on both Application Insights and the Log Analytics workspace to bound maximum daily spend.
    • Set alerts when the daily cap is reached so that teams know data is being dropped and can adjust telemetry.
    • Use sampling and/or DCR-based filtering to proactively reduce ingestion volume.
    • Review Azure Monitor pricing and use the pricing calculator to model costs.
    • Continuously monitor usage (Cost Management + Log Analytics usage workbooks) and refine sampling, filtering, and retention.

    In the described setup (Application Insights → Log Analytics, 30 GB AI daily cap, default retention), high costs are likely due to:

    • Other resources sending large volumes of data to the same workspace, and/or
    • No daily cap configured on the Log Analytics workspace, allowing ingestion beyond what the Application Insights cap limits.

    Configuring a Log Analytics daily cap, enabling sampling, and using DCRs and retention tuning are the recommended next steps to optimize cost.


