Azure Monitor Ingestion client library for JS

The Azure Monitor Ingestion client library is used to send custom logs to Azure Monitor using the Logs Ingestion API.

This library allows you to send data from virtually any source to supported built-in tables or to custom tables that you create in your Log Analytics workspace. You can even extend the schema of built-in tables with custom columns.

Resources:

Getting started

Prerequisites

Install the package

Install the Azure Monitor Ingestion client library for JS with npm:

npm install @azure/monitor-ingestion

Authenticate the client

An authenticated client is required to ingest data. To authenticate, create an instance of a TokenCredential class (see @azure/identity for DefaultAzureCredential and other TokenCredential implementations) and pass it to the constructor of the client class.

To authenticate, the following example uses DefaultAzureCredential from the @azure/identity package:

import { DefaultAzureCredential } from "@azure/identity";
import { LogsIngestionClient } from "@azure/monitor-ingestion";

const logsIngestionEndpoint = "https://<my-endpoint>.azure.com";

const credential = new DefaultAzureCredential();
const logsIngestionClient = new LogsIngestionClient(logsIngestionEndpoint, credential);

Configure the client for an Azure sovereign cloud

By default, the client is configured to use the Azure public cloud. To use a sovereign cloud instead, provide the correct endpoint and audience value when instantiating the client. For example:

import { DefaultAzureCredential } from "@azure/identity";
import { LogsIngestionClient } from "@azure/monitor-ingestion";

const logsIngestionEndpoint = "https://<my-endpoint>.azure.cn";

const credential = new DefaultAzureCredential();
const logsIngestionClient = new LogsIngestionClient(logsIngestionEndpoint, credential, {
  audience: "https://api.loganalytics.azure.cn/.default",
});

Key concepts

Data collection endpoint

A data collection endpoint (DCE) allows you to uniquely configure ingestion settings for Azure Monitor. This article provides an overview of data collection endpoints, including their contents and structure and how you can create and work with them.

Data collection rule

Data collection rules (DCRs) define data collected by Azure Monitor and specify how and where that data should be sent or stored. The REST API call must specify a DCR to use. A single DCE can support multiple DCRs, so you can specify a different DCR for different sources and target tables.

The DCR must understand the structure of the input data and the structure of the target table. If the two don't match, it can use a transformation to convert the source data to match the target table. You can also use the transformation to filter source data and perform any other calculations or conversions.

For more details, see Data collection rules in Azure Monitor. For information on how to retrieve a DCR ID, see this tutorial.
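As an illustration, an abridged DCR fragment declares an input stream (the columns the client sends) and a data flow that routes it to a workspace table, optionally through a KQL transformation. The names here (Custom-MyTableRawData, Custom-MyTable_CL, myWorkspace) are hypothetical:

```json
{
  "properties": {
    "streamDeclarations": {
      "Custom-MyTableRawData": {
        "columns": [
          { "name": "Time", "type": "datetime" },
          { "name": "Computer", "type": "string" },
          { "name": "AdditionalContext", "type": "string" }
        ]
      }
    },
    "dataFlows": [
      {
        "streams": ["Custom-MyTableRawData"],
        "destinations": ["myWorkspace"],
        "transformKql": "source",
        "outputStream": "Custom-MyTable_CL"
      }
    ]
  }
}
```

The stream name in the declaration is what you pass as streamName when uploading, and transformKql of "source" passes the data through unchanged.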

Log Analytics workspace tables

Custom logs can send data to any custom table that you create and to certain built-in tables in your Log Analytics workspace. The target table must exist before you can send data to it. The following built-in tables are currently supported:

Examples

You can familiarize yourself with different APIs using Samples.

Upload custom logs

You can create a client and call the client's upload method. Take note of the data ingestion limits.

import { DefaultAzureCredential } from "@azure/identity";
import { LogsIngestionClient, isAggregateLogsUploadError } from "@azure/monitor-ingestion";

const logsIngestionEndpoint = "https://<my-endpoint>.azure.com";
const ruleId = "data_collection_rule_id";
const streamName = "data_stream_name";

const credential = new DefaultAzureCredential();
const logsIngestionClient = new LogsIngestionClient(logsIngestionEndpoint, credential);

const logs = [
  {
    Time: "2021-12-08T23:51:14.1104269Z",
    Computer: "Computer1",
    AdditionalContext: "context-2",
  },
  {
    Time: "2021-12-08T23:51:14.1104269Z",
    Computer: "Computer2",
    AdditionalContext: "context",
  },
];

try {
  await logsIngestionClient.upload(ruleId, streamName, logs);
} catch (e) {
  const aggregateErrors = isAggregateLogsUploadError(e) ? e.errors : [];
  if (aggregateErrors.length > 0) {
    console.log("Some logs have failed to complete ingestion");
    for (const error of aggregateErrors) {
      console.log(`Error - ${JSON.stringify(error.cause)}`);
      console.log(`Log - ${JSON.stringify(error.failedLogs)}`);
    }
  } else {
    console.log(`An error occurred: ${e}`);
  }
}
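Because the DCR rejects logs whose shape doesn't match its stream declaration, a lightweight local check before calling upload can surface mistakes early. The helper below is hypothetical (not part of @azure/monitor-ingestion), assuming the Time/Computer/AdditionalContext columns used above:

```javascript
// Hypothetical pre-upload check (not part of @azure/monitor-ingestion):
// every declared column must be present and no undeclared keys may appear.
const declaredColumns = ["Time", "Computer", "AdditionalContext"];

function conformsToSchema(log, columns) {
  return (
    columns.every((name) => name in log) &&
    Object.keys(log).every((key) => columns.includes(key))
  );
}

// Example: the second entry has a misspelled column and would be rejected.
const candidates = [
  { Time: "2021-12-08T23:51:14.1104269Z", Computer: "Computer1", AdditionalContext: "context-2" },
  { Time: "2021-12-08T23:51:14.1104269Z", Computer: "Computer2", AdditonalContext: "typo" },
];
const validLogs = candidates.filter((log) => conformsToSchema(log, declaredColumns));
```

Filtering locally like this avoids wasting an API call on entries that the service would report back as failed logs.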

Verify logs

You can verify that your data has been uploaded correctly by using the @azure/monitor-query library. Run the Upload custom logs sample first before verifying the logs.

import { DefaultAzureCredential } from "@azure/identity";
import { LogsQueryClient } from "@azure/monitor-query";

const monitorWorkspaceId = "workspace_id";
const tableName = "table_name";

const credential = new DefaultAzureCredential();
const logsQueryClient = new LogsQueryClient(credential);

const queriesBatch = [
  {
    workspaceId: monitorWorkspaceId,
    query: tableName + " | count;",
    timespan: { duration: "P1D" },
  },
];

const result = await logsQueryClient.queryBatch(queriesBatch);
if (result[0].status === "Success") {
  console.log("Table entry count: ", JSON.stringify(result[0].tables));
} else {
  console.log(
    `Some error encountered while retrieving the count. Status = ${result[0].status}`,
    JSON.stringify(result[0]),
  );
}

Upload large batches of logs

When uploading more than 1 MB of logs in a single call to the upload method on LogsIngestionClient, the upload is split into several smaller batches, each no larger than 1 MB. By default, these batches are uploaded in parallel, with a maximum of 5 batches uploaded concurrently. It may be desirable to decrease the maximum concurrency if memory usage is a concern. The maximum number of concurrent uploads can be controlled using the maxConcurrency option, as shown in this example:

import { DefaultAzureCredential } from "@azure/identity";
import { LogsIngestionClient, isAggregateLogsUploadError } from "@azure/monitor-ingestion";

const logsIngestionEndpoint = "https://<my-endpoint>.azure.com";
const ruleId = "data_collection_rule_id";
const streamName = "data_stream_name";

const credential = new DefaultAzureCredential();
const client = new LogsIngestionClient(logsIngestionEndpoint, credential);

// Constructing a large number of logs to ensure batching takes place
const logs = [];
for (let i = 0; i < 100000; ++i) {
  logs.push({
    Time: "2021-12-08T23:51:14.1104269Z",
    Computer: "Computer1",
    AdditionalContext: `context-${i}`,
  });
}

try {
  // Set the maximum concurrency to 1 to prevent concurrent requests entirely
  await client.upload(ruleId, streamName, logs, { maxConcurrency: 1 });
} catch (e) {
  let aggregateErrors = isAggregateLogsUploadError(e) ? e.errors : [];
  if (aggregateErrors.length > 0) {
    console.log("Some logs have failed to complete ingestion");
    for (const error of aggregateErrors) {
      console.log(`Error - ${JSON.stringify(error.cause)}`);
      console.log(`Log - ${JSON.stringify(error.failedLogs)}`);
    }
  } else {
    console.log(e);
  }
}
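The splitting itself can be approximated locally. The sketch below is illustrative only (the library's actual batching logic may differ): it chunks an array of logs so that each batch's serialized JSON stays under a size limit, mirroring the ~1 MB behavior described above.

```javascript
// Illustrative sketch of size-based batching; not the library's actual
// implementation. Splits logs into batches whose serialized JSON payload
// stays under maxBytes.
function splitIntoBatches(logs, maxBytes = 1024 * 1024) {
  const batches = [];
  let current = [];
  let currentSize = 2; // account for the enclosing "[]"
  for (const log of logs) {
    const size = Buffer.byteLength(JSON.stringify(log)) + 1; // +1 for the comma
    if (current.length > 0 && currentSize + size > maxBytes) {
      batches.push(current);
      current = [];
      currentSize = 2;
    }
    current.push(log);
    currentSize += size;
  }
  if (current.length > 0) batches.push(current);
  return batches;
}
```

A batch is never left empty: an oversized single log still goes into its own batch rather than being dropped, which is also why the size check only fires once the current batch has at least one entry.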

Retrieve logs

Logs that were uploaded using the Monitor Ingestion client library can be retrieved using the Monitor Query client library.
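Query results come back as tables of rows plus column descriptors. A small helper can reshape them into plain objects; this is a hypothetical convenience (not part of either library), assuming the LogsTable shape from @azure/monitor-query with columnDescriptors and rows fields:

```javascript
// Reshape a LogsTable-like result ({ columnDescriptors, rows }) into an
// array of objects keyed by column name. The input shape assumes the
// LogsTable type exposed by @azure/monitor-query.
function tableToObjects(table) {
  const names = table.columnDescriptors.map((col) => col.name);
  return table.rows.map((row) =>
    Object.fromEntries(row.map((value, i) => [names[i], value]))
  );
}

// Example with a mock table in that shape:
const mockTable = {
  columnDescriptors: [{ name: "TimeGenerated" }, { name: "Computer" }],
  rows: [
    ["2021-12-08T23:51:14Z", "Computer1"],
    ["2021-12-08T23:51:15Z", "Computer2"],
  ],
};
const records = tableToObjects(mockTable);
```

This keeps downstream code readable (record.Computer rather than row[1]) at the cost of one pass over the rows.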

Troubleshooting

For details on diagnosing various failure scenarios, see our troubleshooting guide.

Next steps

To learn more about Azure Monitor, see the Azure Monitor service documentation. Please take a look at the samples directory for detailed examples on how to use this library.

Contributing

If you'd like to contribute to this library, please read the contributing guide to learn more about how to build and test the code.