FastAPI remove duplicate logs in App Insights

Sibanda, Thabani 0 Reputation points
2025-12-03T14:07:42.56+00:00

code 1:

import logging
import random
import asyncio
from fastapi import FastAPI
from azure.monitor.opentelemetry import configure_azure_monitor

CONNECTION_STRING = "XXX"

configure_azure_monitor(
    logger_name="dice-game-logger",
    connection_string=CONNECTION_STRING
)

logger = logging.getLogger("dice-game-logger")
logger.setLevel(logging.INFO)
logger.propagate = False

app = FastAPI()


print("Handlers on dice-game-logger:", logger.handlers)
print("Handlers on root logger:", logging.getLogger().handlers)


@app.get("/roll")
async def roll_dice():
    """Roll a dice, emit trace/logs/metrics"""
    dice1_value = random.randint(1, 6)
    dice2_value = random.randint(1, 6)

    value = dice1_value + dice2_value

    # Log
    logger.info(f"Rolled a {value}", extra={"dice.value": value})

    # Demonstrate async work under the same trace
    await async_log()

    # Put out error if someone rolls 2 sixes
    if (value == 12):
        logger.error("You rolled 2 sixes!!!")

    # Return response
    return {"dice_value": value}

async def async_log():
    await asyncio.sleep(1)
    logger.warning("Slowing down between rolls")


The output of the prints is:

Handlers on dice-game-logger: [<LoggingHandler (NOTSET)>]
Handlers on root logger: []

code 2:

import logging
import random
import asyncio
import time

from azure.monitor.opentelemetry import configure_azure_monitor

CONNECTION_STRING = "XXX"

configure_azure_monitor(
    logger_name="test-logger",
    connection_string=CONNECTION_STRING
)

logger = logging.getLogger("test-logger")
logger.setLevel(logging.INFO)

print("hello")

def roll_dice():
    """Roll a dice, emit trace/logs/metrics"""
    dice1_value = random.randint(1, 6)
    dice2_value = random.randint(1, 6)

    value = dice1_value + dice2_value

    # Log
    logger.info(f"Rolled a {value}", extra={"dice.value": value})

    # Demonstrate async work under the same trace
    async_log()

    # Put out error if someone rolls 2 sixes
    if (value == 12):
        logger.error("You rolled 2 sixes!!!")

    # Return response
    return {"dice_value": value}

def async_log():
    time.sleep(1)
    logger.warning("Slowing down between rolls")

print(roll_dice())
print(roll_dice())
print(roll_dice())

I'm sending logs to Azure Application Insights. When I run code 2, I can see the logs fine in App Insights. When I run code 1, there are two sets of each log message: every log is duplicated.

I've tried clearing the root logger's handlers, but that didn't work.
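
For reference, the clearing looked roughly like this (a sketch of that attempt; it ran right after configure_azure_monitor):

import logging

# Drop any handlers already attached to the root logger, so records propagated
# up from "dice-game-logger" would not be handled a second time.
root_logger = logging.getLogger()
for handler in list(root_logger.handlers):
    root_logger.removeHandler(handler)

print("Handlers on root logger after clearing:", root_logger.handlers)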

I ran code 1 using uvicorn test:app and also tried fastapi run test.py, and I see the same issue either way.

The duplicate logs come from the same logger namespace, and there is nothing different between the two copies of each message.

Azure Monitor

1 answer

  1. Bharath Y P 2,560 Reputation points Microsoft External Staff Moderator
    2025-12-03T15:21:35.57+00:00

    It looks like you're facing issues with duplicate logs in Azure Application Insights when using FastAPI. Duplicate logs can be a bit tricky, but let's go through some potential reasons and solutions.

    Why You Might Be Seeing Duplicate Logs:

    1. Multiple SDK Activations: If you have both the OpenCensus SDK and OpenTelemetry activated in the same application, each exports its own copy of the telemetry, producing duplicates.
    2. Ingestion Behavior: Azure's ingestion pipeline can receive the same data more than once, for example when the exporter retries after a transient failure, which can introduce duplicates.
    3. Diagnostic Settings Misconfiguration: If you have configured diagnostic settings to export Application Insights data to a Log Analytics workspace while still sending data to Application Insights itself, you can see duplication.
    4. Logging Configuration: Ensure the logger configuration, especially propagate, is set so that each record is handled by only one handler (see the illustration below).
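
    As a minimal illustration of point 4 (plain standard-library logging, no Azure involved), a named logger whose records propagate to a root logger that also has a handler will emit every message twice:

    import logging

    # Handler on the root logger (for example added by basicConfig or another library).
    logging.basicConfig(level=logging.INFO, format="root handler: %(message)s")

    # Handler on the named logger as well (for example added by a telemetry exporter).
    named = logging.getLogger("dice-game-logger")
    named.setLevel(logging.INFO)
    named.addHandler(logging.StreamHandler())

    # propagate defaults to True, so this record is handled twice:
    # once by the named logger's handler and once by the root logger's handler.
    named.info("Rolled a 7")

    # With propagation disabled, the record is handled only once.
    named.propagate = False
    named.info("Rolled a 7 again")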

    Recommended Steps to Resolve Duplicate Logs:

    1. Check for Overlapping SDKs: Ensure you're only using one SDK for monitoring in your FastAPI application.
    2. Review Logging Configuration (a sketch follows after this list):
    • Double-check your logger setup, making sure each logger (and its exporter handler) is initialized only once.
    • If you don't need propagation, set logger.propagate = False to prevent records from also being handled by the root logger.
    3. Debug Diagnostic Settings:
    • Verify the diagnostic settings on your Application Insights resource to ensure you're not inadvertently logging to multiple workspaces.
    • Ensure the Log Analytics workspace you are exporting to is different from the one your Application Insights resource is based on.
    4. Use KQL for Duplicate Analysis: To analyze the logs, you can use Kusto Query Language (KQL) to identify duplicates in Application Insights:
    let data =
    traces
    | where timestamp > ago(2h);
    let _data = materialize(data);
    _data
    | summarize Count=count() by tostring(customDimensions['some_property'])
    | where Count > 1
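
    For point 2, here is a minimal sketch of a defensive setup (assuming the azure-monitor-opentelemetry distro, as in your code; the handler check is just a guard so the exporter isn't attached twice if the module happens to be imported more than once, for example by a reloader):

    import logging
    from azure.monitor.opentelemetry import configure_azure_monitor

    CONNECTION_STRING = "XXX"
    LOGGER_NAME = "dice-game-logger"

    logger = logging.getLogger(LOGGER_NAME)

    # Only attach the Azure Monitor exporter if this logger has no handlers yet,
    # so a re-import of the module does not add a second exporter handler.
    if not logger.handlers:
        configure_azure_monitor(
            logger_name=LOGGER_NAME,
            connection_string=CONNECTION_STRING,
        )

    logger.setLevel(logging.INFO)
    # Keep records from also being handled by any handlers on the root logger.
    logger.propagate = False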
    

    Follow-Up Questions for More Context:

    1. Have you confirmed that no other logging configuration or SDK is running concurrently with your FastAPI application?
    2. Can you provide details about how you are running your application (e.g., the exact command and environment)?
    3. Have you checked the Application Insights diagnostic settings to see whether the duplicates could be originating there?
    4. Are there specific log entries that are being duplicated, or is it all log entries?

    I hope these insights help you resolve your issue with duplicate logs! If the problem persists after trying these steps, it might be useful to gather additional information and analyze the logs in depth. Feel free to reach out with any more questions!

