Collect your Apache Spark applications logs and metrics using Azure Event Hubs (preview)

The Fabric Apache Spark diagnostic emitter extension is a library that enables Apache Spark applications to emit logs, event logs, and metrics to various destinations, including Azure Log Analytics, Azure Storage, and Azure Event Hubs.

In this tutorial, you learn how to use the Fabric Apache Spark diagnostic emitter extension to send Apache Spark application logs, event logs, and metrics to your Azure Event Hubs.

Collect logs and metrics to Azure Event Hubs

Step 1: Create an Azure Event Hubs Instance

To collect diagnostic logs and metrics, you can use an existing Azure Event Hubs instance. If you don't have one, you can create an event hub.

Step 2: Create a Fabric Environment Artifact with Apache Spark Configuration

Option 1: Configure with Azure Event Hubs Connection String

  1. Create a Fabric Environment Artifact in Fabric

  2. Add the following Spark properties with the appropriate values to the environment artifact, or select Add from .yml on the ribbon to download the sample YAML file, which already contains these properties.

    spark.synapse.diagnostic.emitters: MyEventHub
    spark.synapse.diagnostic.emitter.MyEventHub.type: "AzureEventHub"
    spark.synapse.diagnostic.emitter.MyEventHub.categories: "Log,EventLog,Metrics"
    spark.synapse.diagnostic.emitter.MyEventHub.secret: <connection-string>
    spark.fabric.pools.skipStarterPools: "true" //Add this Spark property when using the default pool.
    

    Fill in the <connection-string> parameter in the configuration file, as in the example below. For more information, see Azure Event Hubs configurations.
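
    For illustration only, a filled-in secret might look like the following. The namespace, key name, and entity path here are hypothetical placeholders; the value must follow the connection-string pattern described under Available configurations, including the EntityPath segment.

      spark.synapse.diagnostic.emitter.MyEventHub.secret: "Endpoint=sb://contoso-diagnostics.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<key-value>;EntityPath=spark-diagnostics"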

Option 2: Configure with Azure Key Vault

Note

Known issue: Spark sessions currently can't be started when you use Option 2, because storing the secret in Key Vault prevents the session from starting. For now, use the method outlined in Option 1.

Ensure that users who submit Apache Spark applications are granted permission to read the secret. For more information, see Provide access to Key Vault keys, certificates, and secrets with Azure role-based access control.

To configure Azure Key Vault for storing the workspace key:

  1. Create and go to your key vault in the Azure portal.

  2. On the settings page for the key vault, select Secrets, then Generate/Import.

  3. On the Create a secret screen, choose the following values:

    • Name: Enter a name for the secret.
    • Value: Enter the <connection-string> for the secret.
    • Leave the other values at their defaults, and then select Create.
  4. Create a Fabric Environment Artifact in Fabric.

  5. Add the following Spark properties, or select Add from .yml on the ribbon to download the sample YAML file, which includes these Spark properties.

    spark.synapse.diagnostic.emitters: MyEventHub
    spark.synapse.diagnostic.emitter.MyEventHub.type: "AzureEventHub"
    spark.synapse.diagnostic.emitter.MyEventHub.categories: "Log,EventLog,Metrics"
    spark.synapse.diagnostic.emitter.MyEventHub.secret.keyVault: <AZURE_KEY_VAULT_NAME>
    spark.synapse.diagnostic.emitter.MyEventHub.secret.keyVault.secretName: <AZURE_KEY_VAULT_SECRET_KEY_NAME>
    spark.fabric.pools.skipStarterPools: "true" //Add this Spark property when using the default pool.
    

    Fill in the following parameters in the configuration file: <AZURE_KEY_VAULT_NAME>, <AZURE_KEY_VAULT_SECRET_KEY_NAME>. For more details on these parameters, refer to Azure Event Hubs configurations. A filled-in example appears after these steps.

  6. Save and publish changes.
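
For illustration, with a hypothetical key vault named contoso-kv that holds the connection string in a secret named eventhub-connection-string, the two Key Vault properties from step 5 would be filled in like this:

    spark.synapse.diagnostic.emitter.MyEventHub.secret.keyVault: contoso-kv
    spark.synapse.diagnostic.emitter.MyEventHub.secret.keyVault.secretName: eventhub-connection-string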

Step 3: Attach the Environment Artifact to Notebooks or Spark Job Definitions, or Set It as the Workspace Default

To attach the environment to Notebooks or Spark job definitions:

  1. Navigate to the specific notebook or Spark job definition in Fabric.
  2. Select the Environment menu on the Home tab and select the environment with the configured diagnostics Spark properties.
  3. The configuration is applied when you start a Spark session.
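
Once a session starts with the environment attached, messages written through the driver's Log4j logger are forwarded under the Log category. The following notebook sketch is one way to verify the wiring; it assumes the standard PySpark py4j bridge to the JVM Log4j API is available on the runtime, and the logger name com.contoso.diagnostics.check is purely a hypothetical example.

    # Minimal verification sketch for a Fabric notebook with the diagnostics environment attached.
    # Assumes the built-in `spark` session and the py4j bridge to the JVM Log4j API.
    logger = spark._jvm.org.apache.log4j.LogManager.getLogger("com.contoso.diagnostics.check")
    logger.info("Diagnostic emitter check: this message should appear in Azure Event Hubs under the Log category.")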

To set the environment as the workspace default:

  1. Navigate to Workspace Settings in Fabric.
  2. Find the Spark settings in your Workspace settings (Workspace setting -> Data Engineering/Science -> Spark settings).
  3. Select the Environment tab, choose the environment with the diagnostics Spark properties configured, and then select Save.

Note

Only workspace admins can manage workspace configurations. Changes made here will apply to all notebooks and Spark job definitions attached to the workspace settings. For more information, see Fabric Workspace Settings.

Available configurations

Configuration | Description
spark.synapse.diagnostic.emitters | Required. The comma-separated destination names of diagnostic emitters.
spark.synapse.diagnostic.emitter.<destination>.type | Required. Built-in destination type. To enable the Azure Event Hubs destination, the value should be AzureEventHub.
spark.synapse.diagnostic.emitter.<destination>.categories | Optional. The comma-separated log categories to collect. Available values include DriverLog, ExecutorLog, EventLog, Metrics. If not set, the default is all categories.
spark.synapse.diagnostic.emitter.<destination>.secret | Optional. The Azure Event Hubs instance connection string. This field should match the pattern Endpoint=sb://<FQDN>/;SharedAccessKeyName=<KeyName>;SharedAccessKey=<KeyValue>;EntityPath=<PathName>
spark.synapse.diagnostic.emitter.<destination>.secret.keyVault | Required if .secret isn't specified. The Azure Key Vault name where the secret (connection string) is stored.
spark.synapse.diagnostic.emitter.<destination>.secret.keyVault.secretName | Required if .secret.keyVault is specified. The Azure Key Vault secret name where the secret (connection string) is stored.
spark.synapse.diagnostic.emitter.<destination>.filter.eventName.match | Optional. The comma-separated Spark event names that specify which events to collect. For example: SparkListenerApplicationStart,SparkListenerApplicationEnd
spark.synapse.diagnostic.emitter.<destination>.filter.loggerName.match | Optional. The comma-separated Log4j logger names that specify which logs to collect. For example: org.apache.spark.SparkContext,org.example.Logger
spark.synapse.diagnostic.emitter.<destination>.filter.metricName.match | Optional. The comma-separated Spark metric name suffixes that specify which metrics to collect. For example: jvm.heap.used
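
The filter properties can narrow what the MyEventHub destination from the earlier steps emits. For example, a hypothetical configuration that forwards only application start and end events plus JVM heap usage metrics would look like this:

    spark.synapse.diagnostic.emitter.MyEventHub.filter.eventName.match: "SparkListenerApplicationStart,SparkListenerApplicationEnd"
    spark.synapse.diagnostic.emitter.MyEventHub.filter.metricName.match: "jvm.heap.used"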

Note

The Azure Event Hubs instance connection string should always contain the EntityPath, which is the name of the Azure Event Hubs instance.

Log data sample

Here's a sample log record in JSON format:

{
  "timestamp": "2024-09-06T03:09:37.235Z",
  "category": "Log|EventLog|Metrics",
  "fabricLivyId": "<fabric-livy-id>",
  "applicationId": "<application-id>",
  "applicationName": "<application-name>",
  "executorId": "<driver-or-executor-id>",
  "fabricTenantId": "<my-fabric-tenant-id>",
  "capacityId": "<my-fabric-capacity-id>",
  "artifactType": "SynapseNotebook|SparkJobDefinition",
  "artifactId": "<my-fabric-artifact-id>",
  "fabricWorkspaceId": "<my-fabric-workspace-id>",
  "fabricEnvId": "<my-fabric-environment-id>",
  "executorMin": "<executor-min>",
  "executorMax": "<executor-max>",
  "isHighConcurrencyEnabled": "true|false",
  "properties": {
    // The message properties of logs, events and metrics.
    "timestamp": "2024-09-06T03:09:37.235Z",
    "message": "Initialized BlockManager: BlockManagerId(1, vm-04b22223, 34319, None)",
    "logger_name": "org.apache.spark.storage.BlockManager",
    "level": "INFO",
    "thread_name": "dispatcher-Executor"
    //...
  }
}
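
On the receiving side, the records arrive as JSON event bodies in the Event Hubs instance named by EntityPath. The following sketch shows one way to read them with the azure-eventhub Python package; it assumes each event body is a single JSON record shaped like the sample above, and the connection string placeholder and printed fields are illustrative only.

    import json

    from azure.eventhub import EventHubConsumerClient

    # Connection string for the same Event Hubs instance the emitter writes to; it must include EntityPath.
    CONNECTION_STRING = "<connection-string>"

    def on_event(partition_context, event):
        # Parse one diagnostic record and print a few of its fields.
        record = json.loads(event.body_as_str())
        print(record["category"], record["applicationName"], record["properties"].get("message"))

    client = EventHubConsumerClient.from_connection_string(CONNECTION_STRING, consumer_group="$Default")
    with client:
        # "-1" starts reading from the beginning of each partition; adjust as needed.
        client.receive(on_event=on_event, starting_position="-1")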

Fabric workspaces with Managed virtual network

Create a managed private endpoint for the target Azure Event Hubs. For detailed instructions, refer to Create and use managed private endpoints in Microsoft Fabric.

Once the managed private endpoint is approved, users can begin emitting logs and metrics to the target Azure Event Hubs.

Next steps