Use OpenTelemetry with Azure Functions

Important

OpenTelemetry support for Azure Functions is currently in preview, and your app must be hosted in a Flex Consumption plan to use OpenTelemetry.

This article shows you how to configure your function app to export log and trace data in an OpenTelemetry format. Azure Functions generates telemetry data on your function executions from both the Functions host process and the language-specific worker process in which your function code runs. By default, this telemetry data is sent to Application Insights using the Application Insights SDK. However, you can choose to export this data using OpenTelemetry semantics. While you can still use an OpenTelemetry format to send your data to Application Insights, you can now also export the same data to any other OpenTelemetry-compliant endpoint.

Tip

Because this article is targeted at your development language of choice, remember to choose the correct language at the top of the article.

Currently, there's no client-optimized OpenTelemetry support for Java apps.

OpenTelemetry currently isn't supported for C# in-process apps.

You can obtain these benefits by enabling OpenTelemetry in your function app:

  • Correlation across traces and logs generated both by the host and by your application code.
  • Consistent, standards-based generation of exportable telemetry data.
  • Integration with other providers that can consume OpenTelemetry-compliant data.

OpenTelemetry is enabled at the function app level, both in host configuration (host.json) and in your code project. Functions also provides a client-optimized experience for exporting OpenTelemetry data from your function code that runs in a language-specific worker process.

1. Enable OpenTelemetry in the Functions host

When you enable OpenTelemetry output in the function app's host.json file, your host exports OpenTelemetry output regardless of the language stack used by your app.

To enable OpenTelemetry output from the Functions host, update the host.json file in your code project to add a "telemetryMode": "openTelemetry" element to the root collection. With OpenTelemetry enabled, your host.json file might look like this:

{
    "version": "2.0",
    "logging": {
        "applicationInsights": {
            "samplingSettings": {
                "isEnabled": true,
                "excludedTypes": "Request"
            },
            "enableLiveMetricsFilters": true
        }
    },
    "telemetryMode": "openTelemetry"
}

2. Configure application settings

When OpenTelemetry is enabled in the host.json file, the endpoints to which data is sent are determined by which OpenTelemetry-supported application settings are present in your app's environment variables.

Create specific application settings in your function app based on the OpenTelemetry output destination. When connection settings are provided for both Application Insights and an OpenTelemetry protocol (OTLP) exporter, OpenTelemetry data is sent to both endpoints.

APPLICATIONINSIGHTS_CONNECTION_STRING: the connection string for an Application Insights workspace. When this setting exists, OpenTelemetry data is sent to that workspace. This setting is the same one used to connect to Application Insights without OpenTelemetry enabled. If your app doesn't already have this setting, you might need to Enable Application Insights integration.
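When you instead target an OTLP exporter, the endpoint is typically configured through the standard OpenTelemetry exporter settings, such as OTEL_EXPORTER_OTLP_ENDPOINT, added as application settings in the same way.

As a sketch, you could set the Application Insights connection string as an application setting by using the Azure CLI; the app name, resource group, and connection string shown here are placeholders for your own values:

    az functionapp config appsettings set --name <APP_NAME> --resource-group <RESOURCE_GROUP> \
        --settings APPLICATIONINSIGHTS_CONNECTION_STRING="<CONNECTION_STRING>"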

3. Enable OpenTelemetry in your app

With the Functions host configured to use OpenTelemetry, you should also update your application code to output OpenTelemetry data. Enabling OpenTelemetry in both the host and your application code provides better correlation between the traces and logs emitted by the Functions host process and by your language worker process.

The way that you instrument your application to use OpenTelemetry depends on your target OpenTelemetry endpoint:

  1. Run these commands to install the required assemblies in your app:

    dotnet add package Microsoft.Azure.Functions.Worker.OpenTelemetry --version 1.0.0-preview1 
    dotnet add package OpenTelemetry.Extensions.Hosting 
    dotnet add package Azure.Monitor.OpenTelemetry.AspNetCore  
    
  2. In your Program.cs file, add this using statement:

    using Azure.Monitor.OpenTelemetry.AspNetCore; 
    
  3. In the ConfigureServices delegate, add this service configuration:

    services.AddOpenTelemetry()
        .UseFunctionsWorkerDefaults()
        .UseAzureMonitor();
    

    To export to both OpenTelemetry endpoints, call both UseAzureMonitor and UseOtlpExporter.
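
    Putting these steps together, a minimal Program.cs might look like the following sketch. It assumes the standard HostBuilder bootstrap from the isolated worker template; only the AddOpenTelemetry call chain comes from the steps above.

    using Azure.Monitor.OpenTelemetry.AspNetCore;
    using Microsoft.Extensions.DependencyInjection;
    using Microsoft.Extensions.Hosting;

    var host = new HostBuilder()
        .ConfigureFunctionsWorkerDefaults()
        .ConfigureServices(services =>
        {
            // Emit OpenTelemetry data from the worker and send it to Application Insights.
            // To also send data to an OTLP endpoint, chain .UseOtlpExporter() as well
            // (this assumes you reference the OpenTelemetry.Exporter.OpenTelemetryProtocol package).
            services.AddOpenTelemetry()
                .UseFunctionsWorkerDefaults()
                .UseAzureMonitor();
        })
        .Build();

    host.Run();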

Java worker optimizations aren't yet available for OpenTelemetry, so there's nothing to configure in your Java code.

  1. Install these npm packages in your project:

    npm install @opentelemetry/api 
    npm install @opentelemetry/auto-instrumentations-node 
    npm install @azure/monitor-opentelemetry-exporter 
    npm install @azure/functions-opentelemetry-instrumentation
    
  2. Create a code file in your project, copy and paste the following code in this new file, and save the file as src/index.js:

    const { AzureFunctionsInstrumentation } = require('@azure/functions-opentelemetry-instrumentation');
    const { AzureMonitorLogExporter, AzureMonitorTraceExporter } = require('@azure/monitor-opentelemetry-exporter');
    const { getNodeAutoInstrumentations, getResourceDetectors } = require('@opentelemetry/auto-instrumentations-node');
    const { registerInstrumentations } = require('@opentelemetry/instrumentation');
    const { detectResourcesSync } = require('@opentelemetry/resources');
    const { LoggerProvider, SimpleLogRecordProcessor } = require('@opentelemetry/sdk-logs');
    const { NodeTracerProvider, SimpleSpanProcessor } = require('@opentelemetry/sdk-trace-node');
    
    const resource = detectResourcesSync({ detectors: getResourceDetectors() });
    
    const tracerProvider = new NodeTracerProvider({ resource });
    tracerProvider.addSpanProcessor(new SimpleSpanProcessor(new AzureMonitorTraceExporter()));
    tracerProvider.register();
    
    const loggerProvider = new LoggerProvider({ resource });
    loggerProvider.addLogRecordProcessor(new SimpleLogRecordProcessor(new AzureMonitorLogExporter()));
    
    registerInstrumentations({
        tracerProvider,
        loggerProvider,
        instrumentations: [getNodeAutoInstrumentations(), new AzureFunctionsInstrumentation()],
    });
    
  3. Update the main field in your package.json file to include this new src/index.js file, which might look like this:

    "main": "src/{index.js,functions/*.js}"
    
  1. Create a code file in your project, copy and paste the following code in this new file, and save the file as src/index.ts:

    import { AzureFunctionsInstrumentation } from '@azure/functions-opentelemetry-instrumentation';
    import { AzureMonitorLogExporter, AzureMonitorTraceExporter } from '@azure/monitor-opentelemetry-exporter';
    import { getNodeAutoInstrumentations, getResourceDetectors } from '@opentelemetry/auto-instrumentations-node';
    import { registerInstrumentations } from '@opentelemetry/instrumentation';
    import { detectResourcesSync } from '@opentelemetry/resources';
    import { LoggerProvider, SimpleLogRecordProcessor } from '@opentelemetry/sdk-logs';
    import { NodeTracerProvider, SimpleSpanProcessor } from '@opentelemetry/sdk-trace-node';
    
    const resource = detectResourcesSync({ detectors: getResourceDetectors() });
    
    const tracerProvider = new NodeTracerProvider({ resource });
    tracerProvider.addSpanProcessor(new SimpleSpanProcessor(new AzureMonitorTraceExporter()));
    tracerProvider.register();
    
    const loggerProvider = new LoggerProvider({ resource });
    loggerProvider.addLogRecordProcessor(new SimpleLogRecordProcessor(new AzureMonitorLogExporter()));
    
    registerInstrumentations({
        tracerProvider,
        loggerProvider,
        instrumentations: [getNodeAutoInstrumentations(), new AzureFunctionsInstrumentation()],
    });
    
  2. Update the main field in your package.json file to include the compiled output of this new src/index.ts file, which might look like this:

    "main": "dist/src/{index.js,functions/*.js}"
    

Important

OpenTelemetry output to Application Insights from the language worker isn't currently supported for PowerShell apps. You might instead want to use an OTLP exporter endpoint. When your host is configured for OpenTelemetry output to Application Insights, the logs generated by the PowerShell worker process are still forwarded, but distributed tracing isn't supported at this time.

These instructions apply only to an OTLP exporter:

  1. Add an application setting named OTEL_FUNCTIONS_WORKER_ENABLED with a value of True.

  2. Create an app-level Modules folder in the root of your app, and then run the following command from the app root:

    Save-Module -Name AzureFunctions.PowerShell.OpenTelemetry.SDK -Path ./Modules
    

    This command installs the required AzureFunctions.PowerShell.OpenTelemetry.SDK module directly in your app. You can't use the requirements.psd1 file to automatically install this dependency because managed dependencies aren't currently supported in the Flex Consumption plan preview.

  3. Add this code to your profile.ps1 file:

    Import-Module AzureFunctions.PowerShell.OpenTelemetry.SDK -Force -ErrorAction Stop 
    Initialize-FunctionsOpenTelemetry 
    
  1. Add this entry to your requirements.txt file:

    azure.monitor.opentelemetry
    
  2. Add this code to your function_app.py main entry point file:

    from azure.monitor.opentelemetry import configure_azure_monitor 
    configure_azure_monitor() 
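
    For reference, a minimal function_app.py might look like the following sketch; the HTTP-triggered function shown here is a hypothetical example, and the key point is that configure_azure_monitor() runs when the module loads, before any function executions:

    import azure.functions as func
    from azure.monitor.opentelemetry import configure_azure_monitor

    # Configure the Azure Monitor OpenTelemetry exporter when the worker loads this module.
    configure_azure_monitor()

    app = func.FunctionApp()

    # Hypothetical HTTP-triggered function used to illustrate placement.
    @app.route(route="hello", auth_level=func.AuthLevel.ANONYMOUS)
    def hello(req: func.HttpRequest) -> func.HttpResponse:
        return func.HttpResponse("Hello from an OpenTelemetry-enabled function app.")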
    

Considerations for OpenTelemetry

When you export your data using OpenTelemetry, keep these current considerations in mind.

  • When the host is configured to use OpenTelemetry, only logs and traces are exported. Host metrics aren't currently exported.

  • You can't currently run your app project locally using Core Tools when you have OpenTelemetry enabled in the host. You currently need to deploy your code to Azure to validate your OpenTelemetry-related updates.

  • At this time, only HTTP triggers and Azure SDK-based triggers are supported with OpenTelemetry outputs.
