Yes. The Application Insights daily cap and Log Analytics workspace ingestion are controlled separately, so other data sources can still drive high Log Analytics costs even when the Application Insights daily cap is set to 30 GB.
Key points for this scenario:
- Relationship between Application Insights daily cap and Log Analytics costs
- Application Insights and Log Analytics now have separate daily cap implementations.
- For workspace-based Application Insights, the effective cap is the minimum of:
- The Application Insights daily cap, and
- The Log Analytics workspace daily cap.
- For classic Application Insights (not workspace-based), only the Application Insights daily cap applies because data does not reside in a workspace.
- If only the Application Insights daily cap is configured and the Log Analytics workspace daily cap is not, then:
- Application Insights telemetry is limited by its own cap.
- Other Azure resources sending data to the same Log Analytics workspace (VMs, PaaS services, diagnostics, Activity logs, etc.) can still ingest unlimited data (subject to workspace limits) and generate additional Log Analytics charges.
- Other resources contributing to high Log Analytics cost
- Any resource configured to send logs/metrics to the same Log Analytics workspace can contribute to ingestion volume and cost, independently of the Application Insights daily cap.
- Also, if Application Insights sends data to the global ingestion endpoint with only an instrumentation key (ikey) rather than a connection string, the Application Insights daily cap might not be enforced consistently across regions, but the Log Analytics daily cap still applies.
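The cap relationship above reduces to simple arithmetic. The following is an illustrative model of that behavior (the GB figures are hypothetical examples, and these functions are not part of any Azure API):

```python
# Illustrative model of how the daily caps interact (not an Azure API).
# For workspace-based Application Insights, the effective cap on AI
# telemetry is the minimum of the two caps; other data sources in the
# same workspace are bounded only by the workspace cap.

def effective_ai_cap_gb(ai_cap_gb, workspace_cap_gb=None):
    """Effective daily cap (GB) for Application Insights telemetry."""
    if workspace_cap_gb is None:        # no workspace cap configured
        return ai_cap_gb
    return min(ai_cap_gb, workspace_cap_gb)

def total_workspace_bound_gb(workspace_cap_gb=None):
    """Upper bound on daily workspace ingestion across ALL sources."""
    return workspace_cap_gb if workspace_cap_gb is not None else float("inf")

# Scenario in this question: AI cap = 30 GB, no workspace cap.
assert effective_ai_cap_gb(30) == 30
assert total_workspace_bound_gb() == float("inf")   # other sources unbounded

# After also setting a hypothetical 50 GB workspace cap:
assert effective_ai_cap_gb(30, 50) == 30
assert total_workspace_bound_gb(50) == 50
```

This makes the failure mode concrete: with only the 30 GB Application Insights cap in place, total workspace ingestion (and therefore cost) remains unbounded.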
- Best practices to reduce Log Analytics ingestion cost when using Application Insights
Use a combination of caps, sampling, filtering, and retention tuning.
A. Configure both Application Insights and Log Analytics daily caps
- Set the Application Insights daily cap on each Application Insights resource:
- Azure portal → Application Insights resource → Configure → Usage and estimated costs → Daily cap → set Daily volume cap (GB/day).
- Set the Log Analytics workspace daily cap:
- Azure portal → Log Analytics workspaces → select workspace → Settings → Usage and estimated costs → Daily cap → turn it on and set the Daily volume cap (GB/day).
- For workspace-based Application Insights, the lower of the two caps is the effective limit for Application Insights data. The workspace cap also limits all other data sources to that workspace.
B. Use sampling to reduce Application Insights ingestion
- Sampling is the primary mechanism to tune Application Insights data volume to the desired level and should be used before relying on caps.
- Recommendation from the documentation:
- Use sampling to reduce traffic and storage costs while preserving statistically correct analysis.
- Use the daily cap as a safety net in case the application suddenly sends much higher volumes of telemetry.
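As a sketch of why sampling preserves statistically correct analysis, fixed-rate sampling makes a deterministic keep/drop decision from a hash of the operation ID, so all telemetry belonging to one operation is kept or dropped together (the same principle Application Insights sampling uses). The function below is a pure-Python illustration of that principle, not the SDK's API:

```python
import hashlib

def keep_telemetry(operation_id: str, sampling_percentage: float) -> bool:
    """Deterministically decide whether to keep a telemetry item based on
    a hash of its operation ID. Items sharing an operation ID get the
    same decision, so end-to-end traces stay intact. Illustrative only."""
    digest = hashlib.sha256(operation_id.encode("utf-8")).digest()
    # Map the first 4 hash bytes to a score uniformly spread over [0, 100).
    score = int.from_bytes(digest[:4], "big") / 2**32 * 100
    return score < sampling_percentage

# The same operation ID always yields the same decision:
assert keep_telemetry("op-123", 10.0) == keep_telemetry("op-123", 10.0)

# Across many operations, roughly the requested fraction is kept:
kept = sum(keep_telemetry(f"op-{i}", 10.0) for i in range(10_000))
# kept is close to 1000 out of 10000, i.e. ~10%
```

Because the decision is a pure function of the operation ID, counts and rates computed from the sampled data can be scaled back up by the sampling factor without bias.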
C. Analyze which tables and resources are driving ingestion
- In the Log Analytics workspace, use queries to identify high-volume tables and sources, for example:
- Compare record counts per table over a period (for example, 7 days) to find the noisiest tables.
- Analyze by consumed bytes to see which tables cost the most.
- Use the Log Analytics Workspace Insights → Usage workbook to see:
- Percentage of ingestion per table.
- Per-resource ingestion into the workspace.
- After identifying the heavy tables (for example, traces, requests, dependencies, custom logs), adjust logging levels or sampling for those sources.
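A convenient starting point for this analysis is the built-in `Usage` table. A KQL query along these lines (run against the workspace; the 7-day window is an example) surfaces the billable tables driving ingestion, largest first:

```kusto
// Billable ingestion per table over the last 7 days, largest first.
// Quantity in the Usage table is reported in MB.
Usage
| where TimeGenerated > ago(7d)
| where IsBillable == true
| summarize IngestedGB = sum(Quantity) / 1000. by DataType
| sort by IngestedGB desc
```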
D. Use data collection rules (DCR) to filter before ingestion
- Data collection rules can reduce the volume of telemetry sent from Application Insights to the Log Analytics workspace by filtering or transforming data.
- If a DCR is in place, the workspace receives less data than it otherwise would, which helps keep ingestion under the daily cap and controls cost.
- DCRs can also be used to filter other data sources (for example, platform logs) before they reach the workspace.
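As an example of the kind of filtering a DCR transformation can apply, a workspace transformation on an Application Insights table could drop low-severity trace rows before ingestion. This KQL snippet is an illustration of a `transformKql` expression; the severity threshold of 2 (Warning) is an arbitrary example:

```kusto
// Example DCR transformation (transformKql) for the AppTraces table:
// keep only traces at Warning (2) or above, dropping
// Verbose/Information rows before they are ingested and billed.
source
| where SeverityLevel >= 2
```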
E. Retention and log types (Basic vs Analytics)
- Retention beyond the included period in Log Analytics is charged per GB-month. Reducing retention for high-volume tables can lower cost.
- Moving less frequently queried or long-term data to cheaper log types (such as Basic Logs) and/or exporting to external storage (for example, Blob Storage) is a common strategy, though specific Basic Logs configuration is not detailed in the provided context.
F. Cost-optimization strategy summary
- Set a daily cap on both Application Insights and the Log Analytics workspace to bound maximum daily spend.
- Set alerts when the daily cap is reached so that teams know data is being dropped and can adjust telemetry.
- Use sampling and/or DCR-based filtering to proactively reduce ingestion volume.
- Review Azure Monitor pricing and use the pricing calculator to model costs.
- Continuously monitor usage (Cost Management + Log Analytics usage workbooks) and refine sampling, filtering, and retention.
In the described setup (Application Insights → Log Analytics, 30 GB AI daily cap, default retention), high costs are likely due to:
- Other resources sending large volumes of data to the same workspace, and/or
- No daily cap configured on the Log Analytics workspace, allowing ingestion beyond what the Application Insights cap limits.
Configuring a Log Analytics daily cap, enabling sampling, and using DCRs and retention tuning are the recommended next steps to optimize cost.
References:
- Daily cap unexpected behaviors in Application Insights
- Set daily cap on Log Analytics workspace
- Create and configure Application Insights resources
- Troubleshoot high data ingestion in Application Insights
- Controlling Telemetry Cost
- Architecture best practices for Application Insights
- Application Insights FAQ - Frequently Asked Questions