Azure Storage Blob Change Feed client library for JavaScript - version 12.0.0-preview.4
Server Version: 2019-12-12 or later.
The change feed provides an ordered, guaranteed, durable, immutable, read-only transaction log of all the changes that occur to blobs and blob metadata in your storage account. Client applications can read these logs at any time. The change feed enables you to build efficient and scalable solutions that process change events that occur in your Blob Storage account at a low cost.
This project provides a client library in JavaScript that makes it easy to consume the change feed.
Use the client libraries in this package to:
- Read change feed events, either all of them or within a time range
- Resume reading events from a saved position
Getting started
Currently supported environments
- LTS versions of Node.js
- Latest versions of Safari, Chrome, Edge, and Firefox.
See our support policy for more details.
Prerequisites
You must have an Azure subscription and a Storage Account to use this package.
Install the package
The preferred way to install the Azure Storage Blob Change Feed client library for JavaScript is to use the npm package manager. Type the following into a terminal window:
npm install @azure/storage-blob-changefeed
Authenticate the client
This library uses an authenticated BlobServiceClient to initialize. Refer to storage-blob for how to authenticate a BlobServiceClient.
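If your application uses Azure Active Directory instead of a shared key, you can pass a TokenCredential from @azure/identity directly to BlobChangeFeedClient. A minimal sketch, assuming DefaultAzureCredential and that the client accepts the same credential types as BlobServiceClient (an assumption; check the API reference and the storage-blob README for the full set of options):

const { DefaultAzureCredential } = require("@azure/identity");
const { BlobChangeFeedClient } = require("@azure/storage-blob-changefeed");

// Enter your storage account name
const account = "<account>";

// DefaultAzureCredential resolves a credential from the environment
// (environment variables, managed identity, Azure CLI login, ...).
// Assumption: the signed-in identity has a data-plane role such as Storage Blob Data Reader.
const credential = new DefaultAzureCredential();

const changeFeedClient = new BlobChangeFeedClient(
  `https://${account}.blob.core.windows.net`,
  credential
);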
Compatibility
For now, this library is only compatible with Node.js.
Key concepts
The change feed is stored as blobs in a special container in your storage account at standard blob pricing cost. You can control the retention period of these files based on your requirements. Change events are appended to the change feed as records in the Apache Avro format specification: a compact, fast, binary format that provides rich data structures with inline schema. This format is widely used in the Hadoop ecosystem, Stream Analytics, and Azure Data Factory.
This library offers a client you can use to fetch the change events.
Examples
- Initialize the change feed client
- Reading all events in the Change Feed
- Resuming reading events with a continuationToken
- Reading events within a time range
Initialize the change feed client
The BlobChangeFeedClient requires almost the same parameters as BlobServiceClient to initialize. Refer to storage-blob for how to create the blob service client. Here is an example using StorageSharedKeyCredential.
const { StorageSharedKeyCredential } = require("@azure/storage-blob");
const { BlobChangeFeedClient } = require("@azure/storage-blob-changefeed");
// Enter your storage account name and shared key
const account = "<account>";
const accountKey = "<accountkey>";
// Use StorageSharedKeyCredential with storage account and account key
// StorageSharedKeyCredential is only available in Node.js runtime, not in browsers
const sharedKeyCredential = new StorageSharedKeyCredential(account, accountKey);
const changeFeedClient = new BlobChangeFeedClient(
  // When using AnonymousCredential, the following URL should include a valid SAS or support public access
  `https://${account}.blob.core.windows.net`,
  sharedKeyCredential
);
Reading all events in the Change Feed
Use BlobChangeFeedClient.listChanges() to get iterators to iterate through the change events.
const { BlobChangeFeedEvent } = require("@azure/storage-blob-changefeed");
let changeFeedEvents = [];
for await (const event of changeFeedClient.listChanges()) {
  changeFeedEvents.push(event);
}
By page.
const { BlobChangeFeedEvent } = require("@azure/storage-blob-changefeed");
let changeFeedEvents = [];
for await (const eventPage of changeFeedClient.listChanges().byPage()) {
  for (const event of eventPage.events) {
    changeFeedEvents.push(event);
  }
}
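Each item produced by the iterator is a BlobChangeFeedEvent. Here is a minimal sketch of inspecting a few commonly used properties; the property names used below (eventType, eventTime, subject) follow the package's typings, so verify them against the API reference for your version:

for await (const event of changeFeedClient.listChanges()) {
  // subject identifies the changed blob, e.g. "/blobServices/default/containers/<container>/blobs/<blob>"
  console.log(`${event.eventType} at ${event.eventTime.toISOString()} for ${event.subject}`);
}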
Resuming reading events with a continuationToken
const { BlobChangeFeedEvent } = require("@azure/storage-blob-changefeed");
let changeFeedEvents = [];
const firstPage = await changeFeedClient
  .listChanges()
  .byPage({ maxPageSize: 10 })
  .next();
for (const event of firstPage.value.events) {
  changeFeedEvents.push(event);
}

// Resume iterating from the previous position with the continuationToken.
for await (const eventPage of changeFeedClient
  .listChanges()
  .byPage({ continuationToken: firstPage.value.continuationToken })) {
  for (const event of eventPage.events) {
    changeFeedEvents.push(event);
  }
}
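In a long-running or scheduled job you will typically persist the continuation token (for example to a file or a database) so that a later run can resume where the previous one stopped. A minimal sketch using the local file system; the file name, page size, and helper function are arbitrary choices for illustration:

const fs = require("fs");

const TOKEN_FILE = "changefeed-continuation-token.txt"; // illustrative path

async function processNewEvents(changeFeedClient) {
  // Load the position saved by a previous run, if any.
  const savedToken = fs.existsSync(TOKEN_FILE)
    ? fs.readFileSync(TOKEN_FILE, "utf8")
    : undefined;

  let lastToken = savedToken;
  for await (const page of changeFeedClient
    .listChanges()
    .byPage({ maxPageSize: 100, continuationToken: savedToken })) {
    for (const event of page.events) {
      // Replace with your own event handling.
      console.log(event.eventType);
    }
    lastToken = page.continuationToken;
  }

  // Persist the position for the next run.
  if (lastToken) {
    fs.writeFileSync(TOKEN_FILE, lastToken);
  }
}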
Reading events within a time range
Pass a start time and an end time to BlobChangeFeedClient.listChanges() to fetch events within that time range.
Note that for now, the change feed client rounds the start time down to the nearest hour and the end time up to the next hour.
const { BlobChangeFeedEvent } = require("@azure/storage-blob-changefeed");
const start = new Date(Date.UTC(2020, 1, 21, 22, 30, 0)); // will be rounded down to 22:00
const end = new Date(Date.UTC(2020, 4, 8, 21, 10, 0)); // will be rounded up to 22:00
let changeFeedEvents = [];
// You can also provide just a start or end time.
for await (const event of changeFeedClient.listChanges({ start, end })) {
  changeFeedEvents.push(event);
}
Troubleshooting
Enabling logging may help uncover useful information about failures. To see a log of HTTP requests and responses, set the AZURE_LOG_LEVEL environment variable to info. Alternatively, logging can be enabled at runtime by calling setLogLevel from the @azure/logger package:
import { setLogLevel } from "@azure/logger";
setLogLevel("info");
Next steps
More code samples:
- Blob Storage Change Feed Samples (JavaScript)
- Blob Storage Change Feed Samples (TypeScript)
- Blob Storage Change Feed Test Cases
Contributing
If you'd like to contribute to this library, please read the contributing guide to learn more about how to build and test the code.
Also refer to the Storage-specific guide for additional information on setting up the test environment for storage libraries.