Updates an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
PATCH https://management.azure.com/subscriptions/{subscriptionId}/resourcegroups/{resourceGroupName}/providers/Microsoft.StreamAnalytics/streamingjobs/{jobName}/outputs/{outputName}?api-version=2020-03-01
URI Parameters

| Name | In | Required | Type | Description |
|------|----|----------|------|-------------|
| jobName | path | True | string | The name of the streaming job. |
| outputName | path | True | string | The name of the output. |
| resourceGroupName | path | True | string | The name of the resource group. The name is case insensitive. Regex pattern: ^[-\w\._\(\)]+$ |
| subscriptionId | path | True | string | The ID of the target subscription. |
| api-version | query | True | string | The API version to use for this operation. |
Request Header

| Name | Required | Type | Description |
|------|----------|------|-------------|
| If-Match | | string | The ETag of the output. Omit this value to always overwrite the current output. Specify the last-seen ETag value to prevent accidentally overwriting concurrent changes. |
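The If-Match header enables optimistic concurrency: read the output first, capture its ETag, and send that value back so the PATCH is rejected if someone else modified the output in the meantime. A minimal JavaScript sketch of that flow, modeled on the samples below and assuming the generated client exposes the ETag as an etag property on the returned output and accepts an ifMatch option on update (both names are assumptions, not confirmed by this page):

const { StreamAnalyticsManagementClient } = require("@azure/arm-streamanalytics");
const { DefaultAzureCredential } = require("@azure/identity");

async function updateWithConcurrencyCheck() {
  const client = new StreamAnalyticsManagementClient(
    new DefaultAzureCredential(),
    "56b5e0a9-b645-407d-99b0-c64f86013e3d"
  );
  // Read the current output and capture its ETag (assumed to be surfaced as `etag`).
  const existing = await client.outputs.get("sjrg5023", "sj900", "output1623");
  // Pass the last-seen ETag; the service rejects the PATCH if the output changed after the read.
  const result = await client.outputs.update(
    "sjrg5023",
    "sj900",
    "output1623",
    { datasource: { type: "Microsoft.Storage/Blob", container: "differentContainer" } },
    { ifMatch: existing.etag }
  );
  console.log(result);
}

updateWithConcurrencyCheck().catch(console.error);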
Request Body

| Name | Type | Description |
|------|------|-------------|
| name | string | Resource name |
| properties.datasource | OutputDataSource | Describes the data source that output will be written to. Required on PUT (CreateOrReplace) requests. |
| properties.serialization | Serialization | Describes how data from an input is serialized or how data is serialized when written to an output. Required on PUT (CreateOrReplace) requests. |
| properties.sizeWindow | integer | The size window to constrain a Stream Analytics output to. |
| properties.timeWindow | string | The time frame for filtering Stream Analytics job outputs. |
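Note that the raw REST payload nests these settings under a top-level properties object, while the JavaScript samples below pass a flattened object and select the concrete datasource and serialization kinds through their type discriminators. A minimal sketch of the flattened shape that corresponds to the blob/CSV PATCH body in the first example (values are illustrative):

// Flattened shape passed to client.outputs.update in the JavaScript samples below.
// The raw REST body wraps the same settings in a top-level "properties" object.
const output = {
  datasource: {
    type: "Microsoft.Storage/Blob", // discriminator selecting the output data source kind
    container: "differentContainer", // only the properties being patched need to be supplied
  },
  serialization: {
    type: "Csv", // discriminator selecting the serialization kind
    fieldDelimiter: "|",
    encoding: "UTF8",
  },
};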
Responses

| Name | Type | Description |
|------|------|-------------|
| 200 OK | Output | The output was successfully updated. Headers: ETag: string |
| Other Status Codes | Error | Error. |
Security

azure_auth

Azure Active Directory OAuth2 Flow

Type: oauth2
Flow: implicit
Authorization URL: https://login.microsoftonline.com/common/oauth2/authorize

Scopes

| Name | Description |
|------|-------------|
| user_impersonation | impersonate your user account |
Examples
Update a blob output with CSV serialization
Sample request
PATCH https://management.azure.com/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourcegroups/sjrg5023/providers/Microsoft.StreamAnalytics/streamingjobs/sj900/outputs/output1623?api-version=2020-03-01
{
"properties": {
"datasource": {
"type": "Microsoft.Storage/Blob",
"properties": {
"container": "differentContainer"
}
},
"serialization": {
"type": "Csv",
"properties": {
"fieldDelimiter": "|",
"encoding": "UTF8"
}
}
}
}
import com.azure.core.util.Context;
import com.azure.resourcemanager.streamanalytics.models.BlobOutputDataSource;
import com.azure.resourcemanager.streamanalytics.models.CsvSerialization;
import com.azure.resourcemanager.streamanalytics.models.Encoding;
import com.azure.resourcemanager.streamanalytics.models.Output;
/** Samples for Outputs Update. */
public final class Main {
/*
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_Blob.json
*/
/**
* Sample code: Update a blob output with CSV serialization.
*
* @param manager Entry point to StreamAnalyticsManager.
*/
public static void updateABlobOutputWithCSVSerialization(
com.azure.resourcemanager.streamanalytics.StreamAnalyticsManager manager) {
Output resource = manager.outputs().getWithResponse("sjrg5023", "sj900", "output1623", Context.NONE).getValue();
resource
.update()
.withDatasource(new BlobOutputDataSource().withContainer("differentContainer"))
.withSerialization(new CsvSerialization().withFieldDelimiter("|").withEncoding(Encoding.UTF8))
.apply();
}
}
package armstreamanalytics_test
import (
"context"
"log"
"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
"github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/streamanalytics/armstreamanalytics"
)
// Generated from example definition: https://github.com/Azure/azure-rest-api-specs/blob/d55b8005f05b040b852c15e74a0f3e36494a15e1/specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_Blob.json
func ExampleOutputsClient_Update_updateABlobOutputWithCsvSerialization() {
cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
log.Fatalf("failed to obtain a credential: %v", err)
}
ctx := context.Background()
clientFactory, err := armstreamanalytics.NewClientFactory("<subscription-id>", cred, nil)
if err != nil {
log.Fatalf("failed to create client: %v", err)
}
res, err := clientFactory.NewOutputsClient().Update(ctx, "sjrg5023", "sj900", "output1623", armstreamanalytics.Output{
Properties: &armstreamanalytics.OutputProperties{
Datasource: &armstreamanalytics.BlobOutputDataSource{
Type: to.Ptr("Microsoft.Storage/Blob"),
Properties: &armstreamanalytics.BlobOutputDataSourceProperties{
Container: to.Ptr("differentContainer"),
},
},
Serialization: &armstreamanalytics.CSVSerialization{
Type: to.Ptr(armstreamanalytics.EventSerializationTypeCSV),
Properties: &armstreamanalytics.CSVSerializationProperties{
Encoding: to.Ptr(armstreamanalytics.EncodingUTF8),
FieldDelimiter: to.Ptr("|"),
},
},
},
}, &armstreamanalytics.OutputsClientUpdateOptions{IfMatch: nil})
if err != nil {
log.Fatalf("failed to finish the request: %v", err)
}
// You could use response here. We use blank identifier for just demo purposes.
_ = res
// If the HTTP response code is 200 as defined in example definition, your response structure would look as follows. Please pay attention that all the values in the output are fake values for just demo purposes.
// res.Output = armstreamanalytics.Output{
// Name: to.Ptr("output1623"),
// Type: to.Ptr("Microsoft.StreamAnalytics/streamingjobs/outputs"),
// ID: to.Ptr("/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg5023/providers/Microsoft.StreamAnalytics/streamingjobs/sj900/outputs/output1623"),
// Properties: &armstreamanalytics.OutputProperties{
// Datasource: &armstreamanalytics.BlobOutputDataSource{
// Type: to.Ptr("Microsoft.Storage/Blob"),
// Properties: &armstreamanalytics.BlobOutputDataSourceProperties{
// Container: to.Ptr("differentContainer"),
// DateFormat: to.Ptr("yyyy/MM/dd"),
// PathPattern: to.Ptr("{date}/{time}"),
// StorageAccounts: []*armstreamanalytics.StorageAccount{
// {
// AccountName: to.Ptr("someAccountName"),
// }},
// TimeFormat: to.Ptr("HH"),
// },
// },
// Serialization: &armstreamanalytics.CSVSerialization{
// Type: to.Ptr(armstreamanalytics.EventSerializationTypeCSV),
// Properties: &armstreamanalytics.CSVSerializationProperties{
// Encoding: to.Ptr(armstreamanalytics.EncodingUTF8),
// FieldDelimiter: to.Ptr("|"),
// },
// },
// },
// }
}
const { StreamAnalyticsManagementClient } = require("@azure/arm-streamanalytics");
const { DefaultAzureCredential } = require("@azure/identity");
/**
 * This sample demonstrates how to update an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
 *
 * @summary Updates an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_Blob.json
*/
async function updateABlobOutputWithCsvSerialization() {
const subscriptionId = "56b5e0a9-b645-407d-99b0-c64f86013e3d";
const resourceGroupName = "sjrg5023";
const jobName = "sj900";
const outputName = "output1623";
const output = {
datasource: {
type: "Microsoft.Storage/Blob",
container: "differentContainer",
},
serialization: { type: "Csv", encoding: "UTF8", fieldDelimiter: "|" },
};
const credential = new DefaultAzureCredential();
const client = new StreamAnalyticsManagementClient(credential, subscriptionId);
const result = await client.outputs.update(resourceGroupName, jobName, outputName, output);
console.log(result);
}
updateABlobOutputWithCsvSerialization().catch(console.error);
Sample response
ETag: 3a1b2023-79a9-4b33-93e8-f49fc3e573fe
{
"id": "/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg5023/providers/Microsoft.StreamAnalytics/streamingjobs/sj900/outputs/output1623",
"name": "output1623",
"type": "Microsoft.StreamAnalytics/streamingjobs/outputs",
"properties": {
"datasource": {
"type": "Microsoft.Storage/Blob",
"properties": {
"storageAccounts": [
{
"accountName": "someAccountName"
}
],
"container": "differentContainer",
"pathPattern": "{date}/{time}",
"dateFormat": "yyyy/MM/dd",
"timeFormat": "HH"
}
},
"serialization": {
"type": "Csv",
"properties": {
"fieldDelimiter": "|",
"encoding": "UTF8"
}
}
}
}
Update a DocumentDB output
Sample request
PATCH https://management.azure.com/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourcegroups/sjrg7983/providers/Microsoft.StreamAnalytics/streamingjobs/sj2331/outputs/output3022?api-version=2020-03-01
{
"properties": {
"datasource": {
"type": "Microsoft.Storage/DocumentDB",
"properties": {
"partitionKey": "differentPartitionKey"
}
}
}
}
import com.azure.core.util.Context;
import com.azure.resourcemanager.streamanalytics.models.DocumentDbOutputDataSource;
import com.azure.resourcemanager.streamanalytics.models.Output;
/** Samples for Outputs Update. */
public final class Main {
/*
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_DocumentDB.json
*/
/**
* Sample code: Update a DocumentDB output.
*
* @param manager Entry point to StreamAnalyticsManager.
*/
public static void updateADocumentDBOutput(
com.azure.resourcemanager.streamanalytics.StreamAnalyticsManager manager) {
Output resource =
manager.outputs().getWithResponse("sjrg7983", "sj2331", "output3022", Context.NONE).getValue();
resource
.update()
.withDatasource(new DocumentDbOutputDataSource().withPartitionKey("differentPartitionKey"))
.apply();
}
}
package armstreamanalytics_test
import (
"context"
"log"
"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
"github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/streamanalytics/armstreamanalytics"
)
// Generated from example definition: https://github.com/Azure/azure-rest-api-specs/blob/d55b8005f05b040b852c15e74a0f3e36494a15e1/specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_DocumentDB.json
func ExampleOutputsClient_Update_updateADocumentDbOutput() {
cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
log.Fatalf("failed to obtain a credential: %v", err)
}
ctx := context.Background()
clientFactory, err := armstreamanalytics.NewClientFactory("<subscription-id>", cred, nil)
if err != nil {
log.Fatalf("failed to create client: %v", err)
}
res, err := clientFactory.NewOutputsClient().Update(ctx, "sjrg7983", "sj2331", "output3022", armstreamanalytics.Output{
Properties: &armstreamanalytics.OutputProperties{
Datasource: &armstreamanalytics.DocumentDbOutputDataSource{
Type: to.Ptr("Microsoft.Storage/DocumentDB"),
Properties: &armstreamanalytics.DocumentDbOutputDataSourceProperties{
PartitionKey: to.Ptr("differentPartitionKey"),
},
},
},
}, &armstreamanalytics.OutputsClientUpdateOptions{IfMatch: nil})
if err != nil {
log.Fatalf("failed to finish the request: %v", err)
}
// You could use response here. We use blank identifier for just demo purposes.
_ = res
// If the HTTP response code is 200 as defined in example definition, your response structure would look as follows. Please pay attention that all the values in the output are fake values for just demo purposes.
// res.Output = armstreamanalytics.Output{
// Name: to.Ptr("output3022"),
// Type: to.Ptr("Microsoft.StreamAnalytics/streamingjobs/outputs"),
// ID: to.Ptr("/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg7983/providers/Microsoft.StreamAnalytics/streamingjobs/sj2331/outputs/output3022"),
// Properties: &armstreamanalytics.OutputProperties{
// Datasource: &armstreamanalytics.DocumentDbOutputDataSource{
// Type: to.Ptr("Microsoft.Storage/DocumentDB"),
// Properties: &armstreamanalytics.DocumentDbOutputDataSourceProperties{
// AccountID: to.Ptr("someAccountId"),
// CollectionNamePattern: to.Ptr("collection"),
// Database: to.Ptr("db01"),
// DocumentID: to.Ptr("documentId"),
// PartitionKey: to.Ptr("differentPartitionKey"),
// },
// },
// },
// }
}
const { StreamAnalyticsManagementClient } = require("@azure/arm-streamanalytics");
const { DefaultAzureCredential } = require("@azure/identity");
/**
 * This sample demonstrates how to update an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
 *
 * @summary Updates an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_DocumentDB.json
*/
async function updateADocumentDbOutput() {
const subscriptionId = "56b5e0a9-b645-407d-99b0-c64f86013e3d";
const resourceGroupName = "sjrg7983";
const jobName = "sj2331";
const outputName = "output3022";
const output = {
datasource: {
type: "Microsoft.Storage/DocumentDB",
partitionKey: "differentPartitionKey",
},
};
const credential = new DefaultAzureCredential();
const client = new StreamAnalyticsManagementClient(credential, subscriptionId);
const result = await client.outputs.update(resourceGroupName, jobName, outputName, output);
console.log(result);
}
updateADocumentDbOutput().catch(console.error);
Sample response
ETag: 7849c132-e995-4631-91c3-931606eec432
{
"id": "/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg7983/providers/Microsoft.StreamAnalytics/streamingjobs/sj2331/outputs/output3022",
"name": "output3022",
"type": "Microsoft.StreamAnalytics/streamingjobs/outputs",
"properties": {
"datasource": {
"type": "Microsoft.Storage/DocumentDB",
"properties": {
"accountId": "someAccountId",
"database": "db01",
"collectionNamePattern": "collection",
"partitionKey": "differentPartitionKey",
"documentId": "documentId"
}
}
}
}
Update a Power BI output
Sample request
PATCH https://management.azure.com/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourcegroups/sjrg7983/providers/Microsoft.StreamAnalytics/streamingjobs/sj2331/outputs/output3022?api-version=2020-03-01
{
"properties": {
"datasource": {
"type": "PowerBI",
"properties": {
"dataset": "differentDataset"
}
}
}
}
import com.azure.core.util.Context;
import com.azure.resourcemanager.streamanalytics.models.Output;
import com.azure.resourcemanager.streamanalytics.models.PowerBIOutputDataSource;
/** Samples for Outputs Update. */
public final class Main {
/*
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_PowerBI.json
*/
/**
* Sample code: Update a Power BI output.
*
* @param manager Entry point to StreamAnalyticsManager.
*/
public static void updateAPowerBIOutput(com.azure.resourcemanager.streamanalytics.StreamAnalyticsManager manager) {
Output resource =
manager.outputs().getWithResponse("sjrg7983", "sj2331", "output3022", Context.NONE).getValue();
resource.update().withDatasource(new PowerBIOutputDataSource().withDataset("differentDataset")).apply();
}
}
package armstreamanalytics_test
import (
"context"
"log"
"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
"github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/streamanalytics/armstreamanalytics"
)
// Generated from example definition: https://github.com/Azure/azure-rest-api-specs/blob/d55b8005f05b040b852c15e74a0f3e36494a15e1/specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_PowerBI.json
func ExampleOutputsClient_Update_updateAPowerBiOutput() {
cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
log.Fatalf("failed to obtain a credential: %v", err)
}
ctx := context.Background()
clientFactory, err := armstreamanalytics.NewClientFactory("<subscription-id>", cred, nil)
if err != nil {
log.Fatalf("failed to create client: %v", err)
}
res, err := clientFactory.NewOutputsClient().Update(ctx, "sjrg7983", "sj2331", "output3022", armstreamanalytics.Output{
Properties: &armstreamanalytics.OutputProperties{
Datasource: &armstreamanalytics.PowerBIOutputDataSource{
Type: to.Ptr("PowerBI"),
Properties: &armstreamanalytics.PowerBIOutputDataSourceProperties{
Dataset: to.Ptr("differentDataset"),
},
},
},
}, &armstreamanalytics.OutputsClientUpdateOptions{IfMatch: nil})
if err != nil {
log.Fatalf("failed to finish the request: %v", err)
}
// You could use response here. We use blank identifier for just demo purposes.
_ = res
// If the HTTP response code is 200 as defined in example definition, your response structure would look as follows. Please pay attention that all the values in the output are fake values for just demo purposes.
// res.Output = armstreamanalytics.Output{
// Name: to.Ptr("output3022"),
// Type: to.Ptr("Microsoft.StreamAnalytics/streamingjobs/outputs"),
// ID: to.Ptr("/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg7983/providers/Microsoft.StreamAnalytics/streamingjobs/sj2331/outputs/output3022"),
// Properties: &armstreamanalytics.OutputProperties{
// Datasource: &armstreamanalytics.PowerBIOutputDataSource{
// Type: to.Ptr("PowerBI"),
// Properties: &armstreamanalytics.PowerBIOutputDataSourceProperties{
// TokenUserDisplayName: to.Ptr("Bob Smith"),
// TokenUserPrincipalName: to.Ptr("bobsmith@contoso.com"),
// Dataset: to.Ptr("differentDataset"),
// GroupID: to.Ptr("ac40305e-3e8d-43ac-8161-c33799f43e95"),
// GroupName: to.Ptr("MyPowerBIGroup"),
// Table: to.Ptr("someTable"),
// },
// },
// },
// }
}
const { StreamAnalyticsManagementClient } = require("@azure/arm-streamanalytics");
const { DefaultAzureCredential } = require("@azure/identity");
/**
 * This sample demonstrates how to update an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
 *
 * @summary Updates an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_PowerBI.json
*/
async function updateAPowerBiOutput() {
const subscriptionId = "56b5e0a9-b645-407d-99b0-c64f86013e3d";
const resourceGroupName = "sjrg7983";
const jobName = "sj2331";
const outputName = "output3022";
const output = {
datasource: { type: "PowerBI", dataset: "differentDataset" },
};
const credential = new DefaultAzureCredential();
const client = new StreamAnalyticsManagementClient(credential, subscriptionId);
const result = await client.outputs.update(resourceGroupName, jobName, outputName, output);
console.log(result);
}
updateAPowerBiOutput().catch(console.error);
Sample response
ETag: 7849c132-e995-4631-91c3-931606eec432
{
"id": "/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg7983/providers/Microsoft.StreamAnalytics/streamingjobs/sj2331/outputs/output3022",
"name": "output3022",
"type": "Microsoft.StreamAnalytics/streamingjobs/outputs",
"properties": {
"datasource": {
"type": "PowerBI",
"properties": {
"dataset": "differentDataset",
"table": "someTable",
"tokenUserPrincipalName": "bobsmith@contoso.com",
"tokenUserDisplayName": "Bob Smith",
"groupId": "ac40305e-3e8d-43ac-8161-c33799f43e95",
"groupName": "MyPowerBIGroup"
}
}
}
}
Update a Service Bus Queue output with Avro serialization
Sample request
PATCH https://management.azure.com/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourcegroups/sjrg3410/providers/Microsoft.StreamAnalytics/streamingjobs/sj5095/outputs/output3456?api-version=2020-03-01
{
"properties": {
"datasource": {
"type": "Microsoft.ServiceBus/Queue",
"properties": {
"queueName": "differentQueueName"
}
},
"serialization": {
"type": "Json",
"properties": {
"encoding": "UTF8",
"format": "LineSeparated"
}
}
}
}
import com.azure.core.util.Context;
import com.azure.resourcemanager.streamanalytics.models.Encoding;
import com.azure.resourcemanager.streamanalytics.models.JsonOutputSerializationFormat;
import com.azure.resourcemanager.streamanalytics.models.JsonSerialization;
import com.azure.resourcemanager.streamanalytics.models.Output;
import com.azure.resourcemanager.streamanalytics.models.ServiceBusQueueOutputDataSource;
/** Samples for Outputs Update. */
public final class Main {
/*
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_ServiceBusQueue.json
*/
/**
* Sample code: Update a Service Bus Queue output with Avro serialization.
*
* @param manager Entry point to StreamAnalyticsManager.
*/
public static void updateAServiceBusQueueOutputWithAvroSerialization(
com.azure.resourcemanager.streamanalytics.StreamAnalyticsManager manager) {
Output resource =
manager.outputs().getWithResponse("sjrg3410", "sj5095", "output3456", Context.NONE).getValue();
resource
.update()
.withDatasource(new ServiceBusQueueOutputDataSource().withQueueName("differentQueueName"))
.withSerialization(
new JsonSerialization()
.withEncoding(Encoding.UTF8)
.withFormat(JsonOutputSerializationFormat.LINE_SEPARATED))
.apply();
}
}
package armstreamanalytics_test
import (
"context"
"log"
"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
"github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/streamanalytics/armstreamanalytics"
)
// Generated from example definition: https://github.com/Azure/azure-rest-api-specs/blob/d55b8005f05b040b852c15e74a0f3e36494a15e1/specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_ServiceBusQueue.json
func ExampleOutputsClient_Update_updateAServiceBusQueueOutputWithAvroSerialization() {
cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
log.Fatalf("failed to obtain a credential: %v", err)
}
ctx := context.Background()
clientFactory, err := armstreamanalytics.NewClientFactory("<subscription-id>", cred, nil)
if err != nil {
log.Fatalf("failed to create client: %v", err)
}
res, err := clientFactory.NewOutputsClient().Update(ctx, "sjrg3410", "sj5095", "output3456", armstreamanalytics.Output{
Properties: &armstreamanalytics.OutputProperties{
Datasource: &armstreamanalytics.ServiceBusQueueOutputDataSource{
Type: to.Ptr("Microsoft.ServiceBus/Queue"),
Properties: &armstreamanalytics.ServiceBusQueueOutputDataSourceProperties{
QueueName: to.Ptr("differentQueueName"),
},
},
Serialization: &armstreamanalytics.JSONSerialization{
Type: to.Ptr(armstreamanalytics.EventSerializationTypeJSON),
Properties: &armstreamanalytics.JSONSerializationProperties{
Format: to.Ptr(armstreamanalytics.JSONOutputSerializationFormatLineSeparated),
Encoding: to.Ptr(armstreamanalytics.EncodingUTF8),
},
},
},
}, &armstreamanalytics.OutputsClientUpdateOptions{IfMatch: nil})
if err != nil {
log.Fatalf("failed to finish the request: %v", err)
}
// You could use response here. We use blank identifier for just demo purposes.
_ = res
// If the HTTP response code is 200 as defined in example definition, your response structure would look as follows. Please pay attention that all the values in the output are fake values for just demo purposes.
// res.Output = armstreamanalytics.Output{
// Name: to.Ptr("output3456"),
// Type: to.Ptr("Microsoft.StreamAnalytics/streamingjobs/outputs"),
// ID: to.Ptr("/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg3410/providers/Microsoft.StreamAnalytics/streamingjobs/sj5095/outputs/output3456"),
// Properties: &armstreamanalytics.OutputProperties{
// Datasource: &armstreamanalytics.ServiceBusQueueOutputDataSource{
// Type: to.Ptr("Microsoft.ServiceBus/Queue"),
// Properties: &armstreamanalytics.ServiceBusQueueOutputDataSourceProperties{
// ServiceBusNamespace: to.Ptr("sdktest"),
// SharedAccessPolicyName: to.Ptr("RootManageSharedAccessKey"),
// PropertyColumns: []*string{
// to.Ptr("column1"),
// to.Ptr("column2")},
// QueueName: to.Ptr("differentQueueName"),
// },
// },
// Serialization: &armstreamanalytics.JSONSerialization{
// Type: to.Ptr(armstreamanalytics.EventSerializationTypeJSON),
// Properties: &armstreamanalytics.JSONSerializationProperties{
// Format: to.Ptr(armstreamanalytics.JSONOutputSerializationFormatLineSeparated),
// Encoding: to.Ptr(armstreamanalytics.EncodingUTF8),
// },
// },
// },
// }
}
const { StreamAnalyticsManagementClient } = require("@azure/arm-streamanalytics");
const { DefaultAzureCredential } = require("@azure/identity");
/**
 * This sample demonstrates how to update an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
 *
 * @summary Updates an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_ServiceBusQueue.json
*/
async function updateAServiceBusQueueOutputWithAvroSerialization() {
const subscriptionId = "56b5e0a9-b645-407d-99b0-c64f86013e3d";
const resourceGroupName = "sjrg3410";
const jobName = "sj5095";
const outputName = "output3456";
const output = {
datasource: {
type: "Microsoft.ServiceBus/Queue",
queueName: "differentQueueName",
},
serialization: { type: "Json", format: "LineSeparated", encoding: "UTF8" },
};
const credential = new DefaultAzureCredential();
const client = new StreamAnalyticsManagementClient(credential, subscriptionId);
const result = await client.outputs.update(resourceGroupName, jobName, outputName, output);
console.log(result);
}
updateAServiceBusQueueOutputWithAvroSerialization().catch(console.error);
Sample response
ETag: 429adaec-a777-4750-8a39-8d0c931d801c
{
"id": "/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg3410/providers/Microsoft.StreamAnalytics/streamingjobs/sj5095/outputs/output3456",
"name": "output3456",
"type": "Microsoft.StreamAnalytics/streamingjobs/outputs",
"properties": {
"datasource": {
"type": "Microsoft.ServiceBus/Queue",
"properties": {
"queueName": "differentQueueName",
"propertyColumns": [
"column1",
"column2"
],
"serviceBusNamespace": "sdktest",
"sharedAccessPolicyName": "RootManageSharedAccessKey"
}
},
"serialization": {
"type": "Json",
"properties": {
"encoding": "UTF8",
"format": "LineSeparated"
}
}
}
}
Update a Service Bus Topic output with CSV serialization
Sample request
PATCH https://management.azure.com/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourcegroups/sjrg6450/providers/Microsoft.StreamAnalytics/streamingjobs/sj7094/outputs/output7886?api-version=2020-03-01
{
"properties": {
"datasource": {
"type": "Microsoft.ServiceBus/Topic",
"properties": {
"topicName": "differentTopicName"
}
},
"serialization": {
"type": "Csv",
"properties": {
"fieldDelimiter": "|",
"encoding": "UTF8"
}
}
}
}
import com.azure.core.util.Context;
import com.azure.resourcemanager.streamanalytics.models.CsvSerialization;
import com.azure.resourcemanager.streamanalytics.models.Encoding;
import com.azure.resourcemanager.streamanalytics.models.Output;
import com.azure.resourcemanager.streamanalytics.models.ServiceBusTopicOutputDataSource;
/** Samples for Outputs Update. */
public final class Main {
/*
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_ServiceBusTopic.json
*/
/**
* Sample code: Update a Service Bus Topic output with CSV serialization.
*
* @param manager Entry point to StreamAnalyticsManager.
*/
public static void updateAServiceBusTopicOutputWithCSVSerialization(
com.azure.resourcemanager.streamanalytics.StreamAnalyticsManager manager) {
Output resource =
manager.outputs().getWithResponse("sjrg6450", "sj7094", "output7886", Context.NONE).getValue();
resource
.update()
.withDatasource(new ServiceBusTopicOutputDataSource().withTopicName("differentTopicName"))
.withSerialization(new CsvSerialization().withFieldDelimiter("|").withEncoding(Encoding.UTF8))
.apply();
}
}
package armstreamanalytics_test
import (
"context"
"log"
"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
"github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/streamanalytics/armstreamanalytics"
)
// Generated from example definition: https://github.com/Azure/azure-rest-api-specs/blob/d55b8005f05b040b852c15e74a0f3e36494a15e1/specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_ServiceBusTopic.json
func ExampleOutputsClient_Update_updateAServiceBusTopicOutputWithCsvSerialization() {
cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
log.Fatalf("failed to obtain a credential: %v", err)
}
ctx := context.Background()
clientFactory, err := armstreamanalytics.NewClientFactory("<subscription-id>", cred, nil)
if err != nil {
log.Fatalf("failed to create client: %v", err)
}
res, err := clientFactory.NewOutputsClient().Update(ctx, "sjrg6450", "sj7094", "output7886", armstreamanalytics.Output{
Properties: &armstreamanalytics.OutputProperties{
Datasource: &armstreamanalytics.ServiceBusTopicOutputDataSource{
Type: to.Ptr("Microsoft.ServiceBus/Topic"),
Properties: &armstreamanalytics.ServiceBusTopicOutputDataSourceProperties{
TopicName: to.Ptr("differentTopicName"),
},
},
Serialization: &armstreamanalytics.CSVSerialization{
Type: to.Ptr(armstreamanalytics.EventSerializationTypeCSV),
Properties: &armstreamanalytics.CSVSerializationProperties{
Encoding: to.Ptr(armstreamanalytics.EncodingUTF8),
FieldDelimiter: to.Ptr("|"),
},
},
},
}, &armstreamanalytics.OutputsClientUpdateOptions{IfMatch: nil})
if err != nil {
log.Fatalf("failed to finish the request: %v", err)
}
// You could use response here. We use blank identifier for just demo purposes.
_ = res
// If the HTTP response code is 200 as defined in example definition, your response structure would look as follows. Please pay attention that all the values in the output are fake values for just demo purposes.
// res.Output = armstreamanalytics.Output{
// Name: to.Ptr("output7886"),
// Type: to.Ptr("Microsoft.StreamAnalytics/streamingjobs/outputs"),
// ID: to.Ptr("/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg6450/providers/Microsoft.StreamAnalytics/streamingjobs/sj7094/outputs/output7886"),
// Properties: &armstreamanalytics.OutputProperties{
// Datasource: &armstreamanalytics.ServiceBusTopicOutputDataSource{
// Type: to.Ptr("Microsoft.ServiceBus/Topic"),
// Properties: &armstreamanalytics.ServiceBusTopicOutputDataSourceProperties{
// ServiceBusNamespace: to.Ptr("sdktest"),
// SharedAccessPolicyName: to.Ptr("RootManageSharedAccessKey"),
// PropertyColumns: []*string{
// to.Ptr("column1"),
// to.Ptr("column2")},
// TopicName: to.Ptr("differentTopicName"),
// },
// },
// Serialization: &armstreamanalytics.CSVSerialization{
// Type: to.Ptr(armstreamanalytics.EventSerializationTypeCSV),
// Properties: &armstreamanalytics.CSVSerializationProperties{
// Encoding: to.Ptr(armstreamanalytics.EncodingUTF8),
// FieldDelimiter: to.Ptr("|"),
// },
// },
// },
// }
}
const { StreamAnalyticsManagementClient } = require("@azure/arm-streamanalytics");
const { DefaultAzureCredential } = require("@azure/identity");
/**
 * This sample demonstrates how to update an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
 *
 * @summary Updates an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_ServiceBusTopic.json
*/
async function updateAServiceBusTopicOutputWithCsvSerialization() {
const subscriptionId = "56b5e0a9-b645-407d-99b0-c64f86013e3d";
const resourceGroupName = "sjrg6450";
const jobName = "sj7094";
const outputName = "output7886";
const output = {
datasource: {
type: "Microsoft.ServiceBus/Topic",
topicName: "differentTopicName",
},
serialization: { type: "Csv", encoding: "UTF8", fieldDelimiter: "|" },
};
const credential = new DefaultAzureCredential();
const client = new StreamAnalyticsManagementClient(credential, subscriptionId);
const result = await client.outputs.update(resourceGroupName, jobName, outputName, output);
console.log(result);
}
updateAServiceBusTopicOutputWithCsvSerialization().catch(console.error);
Sample response
ETag: c1c2007f-45b2-419a-ae7d-4d2148998460
{
"id": "/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg6450/providers/Microsoft.StreamAnalytics/streamingjobs/sj7094/outputs/output7886",
"name": "output7886",
"type": "Microsoft.StreamAnalytics/streamingjobs/outputs",
"properties": {
"datasource": {
"type": "Microsoft.ServiceBus/Topic",
"properties": {
"topicName": "differentTopicName",
"propertyColumns": [
"column1",
"column2"
],
"serviceBusNamespace": "sdktest",
"sharedAccessPolicyName": "RootManageSharedAccessKey"
}
},
"serialization": {
"type": "Csv",
"properties": {
"fieldDelimiter": "|",
"encoding": "UTF8"
}
}
}
}
Update an Azure Data Lake Store output with JSON serialization
Sample request
PATCH https://management.azure.com/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourcegroups/sjrg6912/providers/Microsoft.StreamAnalytics/streamingjobs/sj3310/outputs/output5195?api-version=2020-03-01
{
"properties": {
"datasource": {
"type": "Microsoft.DataLake/Accounts",
"properties": {
"accountName": "differentaccount"
}
},
"serialization": {
"type": "Json",
"properties": {
"encoding": "UTF8",
"format": "LineSeparated"
}
}
}
}
import com.azure.core.util.Context;
import com.azure.resourcemanager.streamanalytics.models.AzureDataLakeStoreOutputDataSource;
import com.azure.resourcemanager.streamanalytics.models.Encoding;
import com.azure.resourcemanager.streamanalytics.models.JsonOutputSerializationFormat;
import com.azure.resourcemanager.streamanalytics.models.JsonSerialization;
import com.azure.resourcemanager.streamanalytics.models.Output;
/** Samples for Outputs Update. */
public final class Main {
/*
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_AzureDataLakeStore.json
*/
/**
* Sample code: Update an Azure Data Lake Store output with JSON serialization.
*
* @param manager Entry point to StreamAnalyticsManager.
*/
public static void updateAnAzureDataLakeStoreOutputWithJSONSerialization(
com.azure.resourcemanager.streamanalytics.StreamAnalyticsManager manager) {
Output resource =
manager.outputs().getWithResponse("sjrg6912", "sj3310", "output5195", Context.NONE).getValue();
resource
.update()
.withDatasource(new AzureDataLakeStoreOutputDataSource().withAccountName("differentaccount"))
.withSerialization(
new JsonSerialization()
.withEncoding(Encoding.UTF8)
.withFormat(JsonOutputSerializationFormat.LINE_SEPARATED))
.apply();
}
}
package armstreamanalytics_test
import (
"context"
"log"
"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
"github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/streamanalytics/armstreamanalytics"
)
// Generated from example definition: https://github.com/Azure/azure-rest-api-specs/blob/d55b8005f05b040b852c15e74a0f3e36494a15e1/specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_AzureDataLakeStore.json
func ExampleOutputsClient_Update_updateAnAzureDataLakeStoreOutputWithJsonSerialization() {
cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
log.Fatalf("failed to obtain a credential: %v", err)
}
ctx := context.Background()
clientFactory, err := armstreamanalytics.NewClientFactory("<subscription-id>", cred, nil)
if err != nil {
log.Fatalf("failed to create client: %v", err)
}
res, err := clientFactory.NewOutputsClient().Update(ctx, "sjrg6912", "sj3310", "output5195", armstreamanalytics.Output{
Properties: &armstreamanalytics.OutputProperties{
Datasource: &armstreamanalytics.AzureDataLakeStoreOutputDataSource{
Type: to.Ptr("Microsoft.DataLake/Accounts"),
Properties: &armstreamanalytics.AzureDataLakeStoreOutputDataSourceProperties{
AccountName: to.Ptr("differentaccount"),
},
},
Serialization: &armstreamanalytics.JSONSerialization{
Type: to.Ptr(armstreamanalytics.EventSerializationTypeJSON),
Properties: &armstreamanalytics.JSONSerializationProperties{
Format: to.Ptr(armstreamanalytics.JSONOutputSerializationFormatLineSeparated),
Encoding: to.Ptr(armstreamanalytics.EncodingUTF8),
},
},
},
}, &armstreamanalytics.OutputsClientUpdateOptions{IfMatch: nil})
if err != nil {
log.Fatalf("failed to finish the request: %v", err)
}
// You could use response here. We use blank identifier for just demo purposes.
_ = res
// If the HTTP response code is 200 as defined in example definition, your response structure would look as follows. Please pay attention that all the values in the output are fake values for just demo purposes.
// res.Output = armstreamanalytics.Output{
// Name: to.Ptr("output5195"),
// Type: to.Ptr("Microsoft.StreamAnalytics/streamingjobs/outputs"),
// ID: to.Ptr("/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg6912/providers/Microsoft.StreamAnalytics/streamingjobs/sj3310/outputs/output5195"),
// Properties: &armstreamanalytics.OutputProperties{
// Datasource: &armstreamanalytics.AzureDataLakeStoreOutputDataSource{
// Type: to.Ptr("Microsoft.DataLake/Accounts"),
// Properties: &armstreamanalytics.AzureDataLakeStoreOutputDataSourceProperties{
// TokenUserDisplayName: to.Ptr("Bob Smith"),
// TokenUserPrincipalName: to.Ptr("bobsmith@contoso.com"),
// AccountName: to.Ptr("differentaccount"),
// DateFormat: to.Ptr("yyyy/MM/dd"),
// FilePathPrefix: to.Ptr("{date}/{time}"),
// TenantID: to.Ptr("cea4e98b-c798-49e7-8c40-4a2b3beb47dd"),
// TimeFormat: to.Ptr("HH"),
// },
// },
// Serialization: &armstreamanalytics.JSONSerialization{
// Type: to.Ptr(armstreamanalytics.EventSerializationTypeJSON),
// Properties: &armstreamanalytics.JSONSerializationProperties{
// Format: to.Ptr(armstreamanalytics.JSONOutputSerializationFormatLineSeparated),
// Encoding: to.Ptr(armstreamanalytics.EncodingUTF8),
// },
// },
// },
// }
}
const { StreamAnalyticsManagementClient } = require("@azure/arm-streamanalytics");
const { DefaultAzureCredential } = require("@azure/identity");
/**
 * This sample demonstrates how to update an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
 *
 * @summary Updates an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_AzureDataLakeStore.json
*/
async function updateAnAzureDataLakeStoreOutputWithJsonSerialization() {
const subscriptionId = "56b5e0a9-b645-407d-99b0-c64f86013e3d";
const resourceGroupName = "sjrg6912";
const jobName = "sj3310";
const outputName = "output5195";
const output = {
datasource: {
type: "Microsoft.DataLake/Accounts",
accountName: "differentaccount",
},
serialization: { type: "Json", format: "LineSeparated", encoding: "UTF8" },
};
const credential = new DefaultAzureCredential();
const client = new StreamAnalyticsManagementClient(credential, subscriptionId);
const result = await client.outputs.update(resourceGroupName, jobName, outputName, output);
console.log(result);
}
updateAnAzureDataLakeStoreOutputWithJsonSerialization().catch(console.error);
Sample response
ETag: 5020de6b-5bb3-4b88-8606-f11fb3c46185
{
"id": "/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg6912/providers/Microsoft.StreamAnalytics/streamingjobs/sj3310/outputs/output5195",
"name": "output5195",
"type": "Microsoft.StreamAnalytics/streamingjobs/outputs",
"properties": {
"datasource": {
"type": "Microsoft.DataLake/Accounts",
"properties": {
"accountName": "differentaccount",
"tenantId": "cea4e98b-c798-49e7-8c40-4a2b3beb47dd",
"tokenUserPrincipalName": "bobsmith@contoso.com",
"tokenUserDisplayName": "Bob Smith",
"filePathPrefix": "{date}/{time}",
"dateFormat": "yyyy/MM/dd",
"timeFormat": "HH"
}
},
"serialization": {
"type": "Json",
"properties": {
"encoding": "UTF8",
"format": "LineSeparated"
}
}
}
}
Update an Azure Data Warehouse output
Sample request
PATCH https://management.azure.com/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourcegroups/sjrg/providers/Microsoft.StreamAnalytics/streamingjobs/sjName/outputs/dwOutput?api-version=2020-03-01
{
"properties": {
"datasource": {
"type": "Microsoft.Sql/Server/Database",
"properties": {
"table": "differentTable"
}
}
}
}
package armstreamanalytics_test
import (
"context"
"log"
"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
"github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/streamanalytics/armstreamanalytics"
)
// Generated from example definition: https://github.com/Azure/azure-rest-api-specs/blob/d55b8005f05b040b852c15e74a0f3e36494a15e1/specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_DataWarehouse.json
func ExampleOutputsClient_Update_updateAnAzureDataWarehouseOutput() {
cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
log.Fatalf("failed to obtain a credential: %v", err)
}
ctx := context.Background()
clientFactory, err := armstreamanalytics.NewClientFactory("<subscription-id>", cred, nil)
if err != nil {
log.Fatalf("failed to create client: %v", err)
}
res, err := clientFactory.NewOutputsClient().Update(ctx, "sjrg", "sjName", "dwOutput", armstreamanalytics.Output{
Properties: &armstreamanalytics.OutputProperties{
Datasource: &armstreamanalytics.AzureSQLDatabaseOutputDataSource{
Type: to.Ptr("Microsoft.Sql/Server/Database"),
Properties: &armstreamanalytics.AzureSQLDatabaseOutputDataSourceProperties{
Table: to.Ptr("differentTable"),
},
},
},
}, &armstreamanalytics.OutputsClientUpdateOptions{IfMatch: nil})
if err != nil {
log.Fatalf("failed to finish the request: %v", err)
}
// You could use response here. We use blank identifier for just demo purposes.
_ = res
// If the HTTP response code is 200 as defined in example definition, your response structure would look as follows. Please pay attention that all the values in the output are fake values for just demo purposes.
// res.Output = armstreamanalytics.Output{
// Name: to.Ptr("dwOutput"),
// Type: to.Ptr("Microsoft.StreamAnalytics/streamingjobs/outputs"),
// ID: to.Ptr("/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg/providers/Microsoft.StreamAnalytics/streamingjobs/sjName/outputs/dwOutput"),
// Properties: &armstreamanalytics.OutputProperties{
// Datasource: &armstreamanalytics.AzureSynapseOutputDataSource{
// Type: to.Ptr("Microsoft.Sql/Server/DataWarehouse"),
// Properties: &armstreamanalytics.AzureSynapseOutputDataSourceProperties{
// Database: to.Ptr("zhayaSQLpool"),
// Server: to.Ptr("asatestserver"),
// Table: to.Ptr("differentTable"),
// User: to.Ptr("tolladmin"),
// },
// },
// },
// }
}
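This example does not ship with a JavaScript sample on this page; the following sketch is modeled on the JavaScript samples above and assumes the same flattened datasource shape, patching only the table name as in the raw request:

const { StreamAnalyticsManagementClient } = require("@azure/arm-streamanalytics");
const { DefaultAzureCredential } = require("@azure/identity");

async function updateAnAzureDataWarehouseOutput() {
  const subscriptionId = "56b5e0a9-b645-407d-99b0-c64f86013e3d";
  const resourceGroupName = "sjrg";
  const jobName = "sjName";
  const outputName = "dwOutput";
  const output = {
    datasource: {
      type: "Microsoft.Sql/Server/Database",
      table: "differentTable",
    },
  };
  const credential = new DefaultAzureCredential();
  const client = new StreamAnalyticsManagementClient(credential, subscriptionId);
  const result = await client.outputs.update(resourceGroupName, jobName, outputName, output);
  console.log(result);
}

updateAnAzureDataWarehouseOutput().catch(console.error);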
Sample response
ETag: f489d6f3-fcd5-4bcb-b642-81e987ee16d6
{
"id": "/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg/providers/Microsoft.StreamAnalytics/streamingjobs/sjName/outputs/dwOutput",
"name": "dwOutput",
"type": "Microsoft.StreamAnalytics/streamingjobs/outputs",
"properties": {
"datasource": {
"type": "Microsoft.Sql/Server/DataWarehouse",
"properties": {
"table": "differentTable",
"server": "asatestserver",
"database": "zhayaSQLpool",
"user": "tolladmin"
}
}
}
}
Update an Azure Function output
Sample request
PATCH https://management.azure.com/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourcegroups/sjrg/providers/Microsoft.StreamAnalytics/streamingjobs/sjName/outputs/azureFunction1?api-version=2020-03-01
{
"properties": {
"datasource": {
"type": "Microsoft.AzureFunction",
"properties": {
"functionName": "differentFunctionName"
}
}
}
}
package armstreamanalytics_test
import (
"context"
"log"
"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
"github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/streamanalytics/armstreamanalytics"
)
// Generated from example definition: https://github.com/Azure/azure-rest-api-specs/blob/d55b8005f05b040b852c15e74a0f3e36494a15e1/specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_AzureFunction.json
func ExampleOutputsClient_Update_updateAnAzureFunctionOutput() {
cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
log.Fatalf("failed to obtain a credential: %v", err)
}
ctx := context.Background()
clientFactory, err := armstreamanalytics.NewClientFactory("<subscription-id>", cred, nil)
if err != nil {
log.Fatalf("failed to create client: %v", err)
}
res, err := clientFactory.NewOutputsClient().Update(ctx, "sjrg", "sjName", "azureFunction1", armstreamanalytics.Output{
Properties: &armstreamanalytics.OutputProperties{
Datasource: &armstreamanalytics.AzureFunctionOutputDataSource{
Type: to.Ptr("Microsoft.AzureFunction"),
Properties: &armstreamanalytics.AzureFunctionOutputDataSourceProperties{
FunctionName: to.Ptr("differentFunctionName"),
},
},
},
}, &armstreamanalytics.OutputsClientUpdateOptions{IfMatch: nil})
if err != nil {
log.Fatalf("failed to finish the request: %v", err)
}
// You could use response here. We use blank identifier for just demo purposes.
_ = res
// If the HTTP response code is 200 as defined in example definition, your response structure would look as follows. Please pay attention that all the values in the output are fake values for just demo purposes.
// res.Output = armstreamanalytics.Output{
// Name: to.Ptr("azureFunction1"),
// Type: to.Ptr("Microsoft.StreamAnalytics/streamingjobs/outputs"),
// ID: to.Ptr("/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg/providers/Microsoft.StreamAnalytics/streamingjobs/sjName/outputs/azureFunction1"),
// Properties: &armstreamanalytics.OutputProperties{
// Datasource: &armstreamanalytics.AzureFunctionOutputDataSource{
// Type: to.Ptr("Microsoft.AzureFunction"),
// Properties: &armstreamanalytics.AzureFunctionOutputDataSourceProperties{
// FunctionAppName: to.Ptr("functionappforasaautomation"),
// FunctionName: to.Ptr("differentFunctionName"),
// MaxBatchCount: to.Ptr[float32](100),
// MaxBatchSize: to.Ptr[float32](256),
// },
// },
// },
// }
}
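As above, no JavaScript sample is provided for this example; a sketch modeled on the JavaScript samples earlier on this page, patching only the function name as in the raw request:

const { StreamAnalyticsManagementClient } = require("@azure/arm-streamanalytics");
const { DefaultAzureCredential } = require("@azure/identity");

async function updateAnAzureFunctionOutput() {
  const subscriptionId = "56b5e0a9-b645-407d-99b0-c64f86013e3d";
  const resourceGroupName = "sjrg";
  const jobName = "sjName";
  const outputName = "azureFunction1";
  const output = {
    datasource: {
      type: "Microsoft.AzureFunction",
      functionName: "differentFunctionName",
    },
  };
  const credential = new DefaultAzureCredential();
  const client = new StreamAnalyticsManagementClient(credential, subscriptionId);
  const result = await client.outputs.update(resourceGroupName, jobName, outputName, output);
  console.log(result);
}

updateAnAzureFunctionOutput().catch(console.error);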
Sample response
ETag: f489d6f3-fcd5-4bcb-b642-81e987ee16d6
{
"id": "/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg/providers/Microsoft.StreamAnalytics/streamingjobs/sjName/outputs/azureFunction1",
"name": "azureFunction1",
"type": "Microsoft.StreamAnalytics/streamingjobs/outputs",
"properties": {
"datasource": {
"type": "Microsoft.AzureFunction",
"properties": {
"functionAppName": "functionappforasaautomation",
"functionName": "differentFunctionName",
"apiKey": null,
"maxBatchSize": 256,
"maxBatchCount": 100
}
}
}
}
Update an Azure SQL database output
Sample request
PATCH https://management.azure.com/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourcegroups/sjrg2157/providers/Microsoft.StreamAnalytics/streamingjobs/sj6458/outputs/output1755?api-version=2020-03-01
{
"properties": {
"datasource": {
"type": "Microsoft.Sql/Server/Database",
"properties": {
"table": "differentTable"
}
}
}
}
import com.azure.core.util.Context;
import com.azure.resourcemanager.streamanalytics.models.AzureSqlDatabaseOutputDataSource;
import com.azure.resourcemanager.streamanalytics.models.Output;
/** Samples for Outputs Update. */
public final class Main {
/*
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_AzureSQL.json
*/
/**
* Sample code: Update an Azure SQL database output.
*
* @param manager Entry point to StreamAnalyticsManager.
*/
public static void updateAnAzureSQLDatabaseOutput(
com.azure.resourcemanager.streamanalytics.StreamAnalyticsManager manager) {
Output resource =
manager.outputs().getWithResponse("sjrg2157", "sj6458", "output1755", Context.NONE).getValue();
resource.update().withDatasource(new AzureSqlDatabaseOutputDataSource().withTable("differentTable")).apply();
}
}
package armstreamanalytics_test
import (
"context"
"log"
"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
"github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/streamanalytics/armstreamanalytics"
)
// Generated from example definition: https://github.com/Azure/azure-rest-api-specs/blob/d55b8005f05b040b852c15e74a0f3e36494a15e1/specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_AzureSQL.json
func ExampleOutputsClient_Update_updateAnAzureSqlDatabaseOutput() {
cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
log.Fatalf("failed to obtain a credential: %v", err)
}
ctx := context.Background()
clientFactory, err := armstreamanalytics.NewClientFactory("<subscription-id>", cred, nil)
if err != nil {
log.Fatalf("failed to create client: %v", err)
}
res, err := clientFactory.NewOutputsClient().Update(ctx, "sjrg2157", "sj6458", "output1755", armstreamanalytics.Output{
Properties: &armstreamanalytics.OutputProperties{
Datasource: &armstreamanalytics.AzureSQLDatabaseOutputDataSource{
Type: to.Ptr("Microsoft.Sql/Server/Database"),
Properties: &armstreamanalytics.AzureSQLDatabaseOutputDataSourceProperties{
Table: to.Ptr("differentTable"),
},
},
},
}, &armstreamanalytics.OutputsClientUpdateOptions{IfMatch: nil})
if err != nil {
log.Fatalf("failed to finish the request: %v", err)
}
// You could use response here. We use blank identifier for just demo purposes.
_ = res
// If the HTTP response code is 200 as defined in example definition, your response structure would look as follows. Please pay attention that all the values in the output are fake values for just demo purposes.
// res.Output = armstreamanalytics.Output{
// Name: to.Ptr("output1755"),
// Type: to.Ptr("Microsoft.StreamAnalytics/streamingjobs/outputs"),
// ID: to.Ptr("/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg2157/providers/Microsoft.StreamAnalytics/streamingjobs/sj6458/outputs/output1755"),
// Properties: &armstreamanalytics.OutputProperties{
// Datasource: &armstreamanalytics.AzureSQLDatabaseOutputDataSource{
// Type: to.Ptr("Microsoft.Sql/Server/Database"),
// Properties: &armstreamanalytics.AzureSQLDatabaseOutputDataSourceProperties{
// Database: to.Ptr("someDatabase"),
// Server: to.Ptr("someServer"),
// Table: to.Ptr("differentTable"),
// User: to.Ptr("someUser"),
// },
// },
// },
// }
}
const { StreamAnalyticsManagementClient } = require("@azure/arm-streamanalytics");
const { DefaultAzureCredential } = require("@azure/identity");
/**
 * This sample demonstrates how to update an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
 *
 * @summary Updates an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_AzureSQL.json
*/
async function updateAnAzureSqlDatabaseOutput() {
const subscriptionId = "56b5e0a9-b645-407d-99b0-c64f86013e3d";
const resourceGroupName = "sjrg2157";
const jobName = "sj6458";
const outputName = "output1755";
const output = {
datasource: {
type: "Microsoft.Sql/Server/Database",
table: "differentTable",
},
};
const credential = new DefaultAzureCredential();
const client = new StreamAnalyticsManagementClient(credential, subscriptionId);
const result = await client.outputs.update(resourceGroupName, jobName, outputName, output);
console.log(result);
}
updateAnAzureSqlDatabaseOutput().catch(console.error);
Sample response
ETag: f489d6f3-fcd5-4bcb-b642-81e987ee16d6
{
"id": "/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg2157/providers/Microsoft.StreamAnalytics/streamingjobs/sj6458/outputs/output1755",
"name": "output1755",
"type": "Microsoft.StreamAnalytics/streamingjobs/outputs",
"properties": {
"datasource": {
"type": "Microsoft.Sql/Server/Database",
"properties": {
"server": "someServer",
"database": "someDatabase",
"table": "differentTable",
"user": "someUser"
}
}
}
}
Update an Azure Table output
Sample request
PATCH https://management.azure.com/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourcegroups/sjrg5176/providers/Microsoft.StreamAnalytics/streamingjobs/sj2790/outputs/output958?api-version=2020-03-01
{
"properties": {
"datasource": {
"type": "Microsoft.Storage/Table",
"properties": {
"partitionKey": "differentPartitionKey"
}
}
}
}
import com.azure.core.util.Context;
import com.azure.resourcemanager.streamanalytics.models.AzureTableOutputDataSource;
import com.azure.resourcemanager.streamanalytics.models.Output;
/** Samples for Outputs Update. */
public final class Main {
/*
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_AzureTable.json
*/
/**
* Sample code: Update an Azure Table output.
*
* @param manager Entry point to StreamAnalyticsManager.
*/
public static void updateAnAzureTableOutput(
com.azure.resourcemanager.streamanalytics.StreamAnalyticsManager manager) {
Output resource = manager.outputs().getWithResponse("sjrg5176", "sj2790", "output958", Context.NONE).getValue();
resource
.update()
.withDatasource(new AzureTableOutputDataSource().withPartitionKey("differentPartitionKey"))
.apply();
}
}
package armstreamanalytics_test
import (
"context"
"log"
"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
"github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/streamanalytics/armstreamanalytics"
)
// Generated from example definition: https://github.com/Azure/azure-rest-api-specs/blob/d55b8005f05b040b852c15e74a0f3e36494a15e1/specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_AzureTable.json
func ExampleOutputsClient_Update_updateAnAzureTableOutput() {
cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
log.Fatalf("failed to obtain a credential: %v", err)
}
ctx := context.Background()
clientFactory, err := armstreamanalytics.NewClientFactory("<subscription-id>", cred, nil)
if err != nil {
log.Fatalf("failed to create client: %v", err)
}
res, err := clientFactory.NewOutputsClient().Update(ctx, "sjrg5176", "sj2790", "output958", armstreamanalytics.Output{
Properties: &armstreamanalytics.OutputProperties{
Datasource: &armstreamanalytics.AzureTableOutputDataSource{
Type: to.Ptr("Microsoft.Storage/Table"),
Properties: &armstreamanalytics.AzureTableOutputDataSourceProperties{
PartitionKey: to.Ptr("differentPartitionKey"),
},
},
},
}, &armstreamanalytics.OutputsClientUpdateOptions{IfMatch: nil})
if err != nil {
log.Fatalf("failed to finish the request: %v", err)
}
// The response could be used here; the blank identifier is used purely for demonstration.
_ = res
// If the HTTP response code is 200 as defined in the example definition, the response structure looks as follows. Note that all values in the output are placeholder values for demonstration purposes.
// res.Output = armstreamanalytics.Output{
// Name: to.Ptr("output958"),
// Type: to.Ptr("Microsoft.StreamAnalytics/streamingjobs/outputs"),
// ID: to.Ptr("/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg5176/providers/Microsoft.StreamAnalytics/streamingjobs/sj2790/outputs/output958"),
// Properties: &armstreamanalytics.OutputProperties{
// Datasource: &armstreamanalytics.AzureTableOutputDataSource{
// Type: to.Ptr("Microsoft.Storage/Table"),
// Properties: &armstreamanalytics.AzureTableOutputDataSourceProperties{
// AccountName: to.Ptr("someAccountName"),
// BatchSize: to.Ptr[int32](25),
// ColumnsToRemove: []*string{
// to.Ptr("column1"),
// to.Ptr("column2")},
// PartitionKey: to.Ptr("differentPartitionKey"),
// RowKey: to.Ptr("rowKey"),
// Table: to.Ptr("samples"),
// },
// },
// },
// }
}
const { StreamAnalyticsManagementClient } = require("@azure/arm-streamanalytics");
const { DefaultAzureCredential } = require("@azure/identity");
/**
* This sample demonstrates how to update an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
*
* @summary Updates an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_AzureTable.json
*/
async function updateAnAzureTableOutput() {
const subscriptionId = "56b5e0a9-b645-407d-99b0-c64f86013e3d";
const resourceGroupName = "sjrg5176";
const jobName = "sj2790";
const outputName = "output958";
const output = {
datasource: {
type: "Microsoft.Storage/Table",
partitionKey: "differentPartitionKey",
},
};
const credential = new DefaultAzureCredential();
const client = new StreamAnalyticsManagementClient(credential, subscriptionId);
const result = await client.outputs.update(resourceGroupName, jobName, outputName, output);
console.log(result);
}
updateAnAzureTableOutput().catch(console.error);
Sample response
ETag: ea1d20bf-6cb3-40bc-bc7b-ec3a7fd5977e
{
"id": "/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg5176/providers/Microsoft.StreamAnalytics/streamingjobs/sj2790/outputs/output958",
"name": "output958",
"type": "Microsoft.StreamAnalytics/streamingjobs/outputs",
"properties": {
"datasource": {
"type": "Microsoft.Storage/Table",
"properties": {
"accountName": "someAccountName",
"table": "samples",
"partitionKey": "differentPartitionKey",
"rowKey": "rowKey",
"columnsToRemove": [
"column1",
"column2"
],
"batchSize": 25
}
}
}
}
Update an Event Hub output with JSON serialization
Sample request
PATCH https://management.azure.com/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourcegroups/sjrg6912/providers/Microsoft.StreamAnalytics/streamingjobs/sj3310/outputs/output5195?api-version=2020-03-01
{
"properties": {
"datasource": {
"type": "Microsoft.ServiceBus/EventHub",
"properties": {
"partitionKey": "differentPartitionKey"
}
},
"serialization": {
"type": "Json",
"properties": {
"encoding": "UTF8",
"format": "LineSeparated"
}
}
}
}
import com.azure.core.util.Context;
import com.azure.resourcemanager.streamanalytics.models.Encoding;
import com.azure.resourcemanager.streamanalytics.models.EventHubOutputDataSource;
import com.azure.resourcemanager.streamanalytics.models.JsonOutputSerializationFormat;
import com.azure.resourcemanager.streamanalytics.models.JsonSerialization;
import com.azure.resourcemanager.streamanalytics.models.Output;
/** Samples for Outputs Update. */
public final class Main {
/*
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_EventHub.json
*/
/**
* Sample code: Update an Event Hub output with JSON serialization.
*
* @param manager Entry point to StreamAnalyticsManager.
*/
public static void updateAnEventHubOutputWithJSONSerialization(
com.azure.resourcemanager.streamanalytics.StreamAnalyticsManager manager) {
Output resource =
manager.outputs().getWithResponse("sjrg6912", "sj3310", "output5195", Context.NONE).getValue();
resource
.update()
.withDatasource(new EventHubOutputDataSource().withPartitionKey("differentPartitionKey"))
.withSerialization(
new JsonSerialization()
.withEncoding(Encoding.UTF8)
.withFormat(JsonOutputSerializationFormat.LINE_SEPARATED))
.apply();
}
}
package armstreamanalytics_test
import (
"context"
"log"
"github.com/Azure/azure-sdk-for-go/sdk/azcore/to"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
"github.com/Azure/azure-sdk-for-go/sdk/resourcemanager/streamanalytics/armstreamanalytics"
)
// Generated from example definition: https://github.com/Azure/azure-rest-api-specs/blob/d55b8005f05b040b852c15e74a0f3e36494a15e1/specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_EventHub.json
func ExampleOutputsClient_Update_updateAnEventHubOutputWithJsonSerialization() {
cred, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
log.Fatalf("failed to obtain a credential: %v", err)
}
ctx := context.Background()
clientFactory, err := armstreamanalytics.NewClientFactory("<subscription-id>", cred, nil)
if err != nil {
log.Fatalf("failed to create client: %v", err)
}
res, err := clientFactory.NewOutputsClient().Update(ctx, "sjrg6912", "sj3310", "output5195", armstreamanalytics.Output{
Properties: &armstreamanalytics.OutputProperties{
Datasource: &armstreamanalytics.EventHubOutputDataSource{
Type: to.Ptr("Microsoft.ServiceBus/EventHub"),
Properties: &armstreamanalytics.EventHubOutputDataSourceProperties{
PartitionKey: to.Ptr("differentPartitionKey"),
},
},
Serialization: &armstreamanalytics.JSONSerialization{
Type: to.Ptr(armstreamanalytics.EventSerializationTypeJSON),
Properties: &armstreamanalytics.JSONSerializationProperties{
Format: to.Ptr(armstreamanalytics.JSONOutputSerializationFormatLineSeparated),
Encoding: to.Ptr(armstreamanalytics.EncodingUTF8),
},
},
},
}, &armstreamanalytics.OutputsClientUpdateOptions{IfMatch: nil})
if err != nil {
log.Fatalf("failed to finish the request: %v", err)
}
// The response could be used here; the blank identifier is used purely for demonstration.
_ = res
// If the HTTP response code is 200 as defined in the example definition, the response structure looks as follows. Note that all values in the output are placeholder values for demonstration purposes.
// res.Output = armstreamanalytics.Output{
// Name: to.Ptr("output5195"),
// Type: to.Ptr("Microsoft.StreamAnalytics/streamingjobs/outputs"),
// ID: to.Ptr("/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg6912/providers/Microsoft.StreamAnalytics/streamingjobs/sj3310/outputs/output5195"),
// Properties: &armstreamanalytics.OutputProperties{
// Datasource: &armstreamanalytics.EventHubOutputDataSource{
// Type: to.Ptr("Microsoft.ServiceBus/EventHub"),
// Properties: &armstreamanalytics.EventHubOutputDataSourceProperties{
// ServiceBusNamespace: to.Ptr("sdktest"),
// SharedAccessPolicyName: to.Ptr("RootManageSharedAccessKey"),
// EventHubName: to.Ptr("sdkeventhub"),
// PartitionKey: to.Ptr("differentPartitionKey"),
// },
// },
// Serialization: &armstreamanalytics.JSONSerialization{
// Type: to.Ptr(armstreamanalytics.EventSerializationTypeJSON),
// Properties: &armstreamanalytics.JSONSerializationProperties{
// Format: to.Ptr(armstreamanalytics.JSONOutputSerializationFormatLineSeparated),
// Encoding: to.Ptr(armstreamanalytics.EncodingUTF8),
// },
// },
// },
// }
}
const { StreamAnalyticsManagementClient } = require("@azure/arm-streamanalytics");
const { DefaultAzureCredential } = require("@azure/identity");
/**
* This sample demonstrates how to update an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
*
* @summary Updates an existing output under an existing streaming job. This can be used to partially update (i.e. update one or two properties) an output without affecting the rest of the job or output definition.
* x-ms-original-file: specification/streamanalytics/resource-manager/Microsoft.StreamAnalytics/stable/2020-03-01/examples/Output_Update_EventHub.json
*/
async function updateAnEventHubOutputWithJsonSerialization() {
const subscriptionId = "56b5e0a9-b645-407d-99b0-c64f86013e3d";
const resourceGroupName = "sjrg6912";
const jobName = "sj3310";
const outputName = "output5195";
const output = {
datasource: {
type: "Microsoft.ServiceBus/EventHub",
partitionKey: "differentPartitionKey",
},
serialization: { type: "Json", format: "LineSeparated", encoding: "UTF8" },
};
const credential = new DefaultAzureCredential();
const client = new StreamAnalyticsManagementClient(credential, subscriptionId);
const result = await client.outputs.update(resourceGroupName, jobName, outputName, output);
console.log(result);
}
updateAnEventHubOutputWithJsonSerialization().catch(console.error);
Sample response
ETag: 5020de6b-5bb3-4b88-8606-f11fb3c46185
{
"id": "/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourceGroups/sjrg6912/providers/Microsoft.StreamAnalytics/streamingjobs/sj3310/outputs/output5195",
"name": "output5195",
"type": "Microsoft.StreamAnalytics/streamingjobs/outputs",
"properties": {
"datasource": {
"type": "Microsoft.ServiceBus/EventHub",
"properties": {
"eventHubName": "sdkeventhub",
"partitionKey": "differentPartitionKey",
"serviceBusNamespace": "sdktest",
"sharedAccessPolicyName": "RootManageSharedAccessKey"
}
},
"serialization": {
"type": "Json",
"properties": {
"encoding": "UTF8",
"format": "LineSeparated"
}
}
}
}
Definitions
AuthenticationMode
Authentication Mode. Valid modes are ConnectionString, Msi, and UserToken.
Name | Type | Description
ConnectionString | string |
Msi | string |
UserToken | string |
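For example, switching an existing output's data source to managed identity authentication only requires patching authenticationMode. The following is a minimal, illustrative PATCH request body (not an official sample) for a blob output, assuming the rest of the data source definition is left untouched:
{
  "properties": {
    "datasource": {
      "type": "Microsoft.Storage/Blob",
      "properties": {
        "authenticationMode": "Msi"
      }
    }
  }
}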
AvroSerialization
Describes how data from an input is serialized or how data is serialized when written to an output in Avro format.
Name | Type | Description
type | string: Avro | Indicates the type of serialization that the input or output uses. Required on PUT (CreateOrReplace) requests.
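Because AvroSerialization only carries a type discriminator, switching an output's serialization to Avro can be expressed with a request body like the following sketch. This is an illustrative assumption rather than an official sample:
{
  "properties": {
    "serialization": {
      "type": "Avro"
    }
  }
}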
AzureDataLakeStoreOutputDataSource
Describes an Azure Data Lake Store output data source.
Name | Type | Default value | Description
properties.accountName | string | | The name of the Azure Data Lake Store account. Required on PUT (CreateOrReplace) requests.
properties.authenticationMode | AuthenticationMode | ConnectionString | Authentication Mode.
properties.dateFormat | string | | The date format. Wherever {date} appears in filePathPrefix, the value of this property is used as the date format instead.
properties.filePathPrefix | string | | The location of the file to which the output should be written. Required on PUT (CreateOrReplace) requests.
properties.refreshToken | string | | A refresh token that can be used to obtain a valid access token that can then be used to authenticate with the data source. A valid refresh token is currently only obtainable via the Azure Portal. It is recommended to put a dummy string value here when creating the data source and then go to the Azure Portal to authenticate the data source, which will update this property with a valid refresh token. Required on PUT (CreateOrReplace) requests.
properties.tenantId | string | | The tenant id of the user used to obtain the refresh token. Required on PUT (CreateOrReplace) requests.
properties.timeFormat | string | | The time format. Wherever {time} appears in filePathPrefix, the value of this property is used as the time format instead.
properties.tokenUserDisplayName | string | | The user display name of the user that was used to obtain the refresh token. Use this property to help remember which user was used to obtain the refresh token.
properties.tokenUserPrincipalName | string | | The user principal name (UPN) of the user that was used to obtain the refresh token. Use this property to help remember which user was used to obtain the refresh token.
type | string: Microsoft.DataLake/Accounts | | Indicates the type of data source output will be written to. Required on PUT (CreateOrReplace) requests.
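As a hedged illustration of the properties above, a partial update that changes only the file path prefix of an existing Azure Data Lake Store output might look like the following; the prefix value and the {date}/{time} tokens are placeholders:
{
  "properties": {
    "datasource": {
      "type": "Microsoft.DataLake/Accounts",
      "properties": {
        "filePathPrefix": "cluster1/logs/{date}/{time}"
      }
    }
  }
}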
AzureFunctionOutputDataSource
Defines the metadata of AzureFunctionOutputDataSource
Name | Type | Description
properties.apiKey | string | If you want to use an Azure Function from another subscription, you can do so by providing the key to access your function.
properties.functionAppName | string | The name of your Azure Functions app.
properties.functionName | string | The name of the function in your Azure Functions app.
properties.maxBatchCount | number | A property that lets you specify the maximum number of events in each batch that's sent to Azure Functions. The default value is 100.
properties.maxBatchSize | number | A property that lets you set the maximum size for each output batch that's sent to your Azure function. The input unit is in bytes. By default, this value is 262,144 bytes (256 KB).
type | string: Microsoft.AzureFunction | Indicates the type of data source output will be written to. Required on PUT (CreateOrReplace) requests.
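For instance, tuning only the batching behavior of an existing Azure Function output could be done with a request body like this sketch (the count and size values are illustrative, not recommendations):
{
  "properties": {
    "datasource": {
      "type": "Microsoft.AzureFunction",
      "properties": {
        "maxBatchCount": 50,
        "maxBatchSize": 131072
      }
    }
  }
}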
AzureSqlDatabaseOutputDataSource
Describes an Azure SQL database output data source.
Name | Type | Default value | Description
properties.authenticationMode | AuthenticationMode | ConnectionString | Authentication Mode.
properties.database | string | | The name of the Azure SQL database. Required on PUT (CreateOrReplace) requests.
properties.maxBatchCount | number | | Max batch count for writes to the SQL database; the default value is 10,000. Optional on PUT requests.
properties.maxWriterCount | number | | Max writer count; currently only 1 (single writer) and 0 (based on query partition) are available. Optional on PUT requests.
properties.password | string | | The password that will be used to connect to the Azure SQL database. Required on PUT (CreateOrReplace) requests.
properties.server | string | | The name of the SQL server containing the Azure SQL database. Required on PUT (CreateOrReplace) requests.
properties.table | string | | The name of the table in the Azure SQL database. Required on PUT (CreateOrReplace) requests.
properties.user | string | | The user name that will be used to connect to the Azure SQL database. Required on PUT (CreateOrReplace) requests.
type | string: Microsoft.Sql/Server/Database | | Indicates the type of data source output will be written to. Required on PUT (CreateOrReplace) requests.
AzureSynapseOutputDataSource
Describes an Azure Synapse output data source.
Name | Type | Description
properties.database | string | The name of the Azure SQL database. Required on PUT (CreateOrReplace) requests.
properties.password | string | The password that will be used to connect to the Azure SQL database. Required on PUT (CreateOrReplace) requests.
properties.server | string | The name of the SQL server containing the Azure SQL database. Required on PUT (CreateOrReplace) requests.
properties.table | string | The name of the table in the Azure SQL database. Required on PUT (CreateOrReplace) requests.
properties.user | string | The user name that will be used to connect to the Azure SQL database. Required on PUT (CreateOrReplace) requests.
type | string: Microsoft.Sql/Server/DataWarehouse | Indicates the type of data source output will be written to. Required on PUT (CreateOrReplace) requests.
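A partial update that points an existing Azure Synapse output at a different table might look like the following sketch, mirroring the Azure SQL example earlier on this page; "differentTable" is a placeholder:
{
  "properties": {
    "datasource": {
      "type": "Microsoft.Sql/Server/DataWarehouse",
      "properties": {
        "table": "differentTable"
      }
    }
  }
}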
AzureTableOutputDataSource
Describes an Azure Table output data source.
Name | Type | Description
properties.accountKey | string | The account key for the Azure Storage account. Required on PUT (CreateOrReplace) requests.
properties.accountName | string | The name of the Azure Storage account. Required on PUT (CreateOrReplace) requests.
properties.batchSize | integer | The number of rows to write to the Azure Table at a time.
properties.columnsToRemove | string[] | If specified, each item in the array is the name of a column to remove (if present) from output event entities.
properties.partitionKey | string | This element indicates the name of a column from the SELECT statement in the query that will be used as the partition key for the Azure Table. Required on PUT (CreateOrReplace) requests.
properties.rowKey | string | This element indicates the name of a column from the SELECT statement in the query that will be used as the row key for the Azure Table. Required on PUT (CreateOrReplace) requests.
properties.table | string | The name of the Azure Table. Required on PUT (CreateOrReplace) requests.
type | string: Microsoft.Storage/Table | Indicates the type of data source output will be written to. Required on PUT (CreateOrReplace) requests.
BlobOutputDataSource
Describes a blob output data source.
Name | Type | Default value | Description
properties.authenticationMode | AuthenticationMode | ConnectionString | Authentication Mode.
properties.blobPathPrefix | string | | Blob path prefix.
properties.container | string | | The name of a container within the associated Storage account. This container contains either the blob(s) to be read from or written to. Required on PUT (CreateOrReplace) requests.
properties.dateFormat | string | | The date format. Wherever {date} appears in pathPattern, the value of this property is used as the date format instead.
properties.pathPattern | string | | The blob path pattern. Not a regular expression. It represents a pattern against which blob names will be matched to determine whether or not they should be included as input or output to the job. See https://docs.microsoft.com/en-us/rest/api/streamanalytics/stream-analytics-input or https://docs.microsoft.com/en-us/rest/api/streamanalytics/stream-analytics-output for a more detailed explanation and example.
properties.storageAccounts | StorageAccount[] | | A list of one or more Azure Storage accounts. Required on PUT (CreateOrReplace) requests.
properties.timeFormat | string | | The time format. Wherever {time} appears in pathPattern, the value of this property is used as the time format instead.
type | string: Microsoft.Storage/Blob | | Indicates the type of data source output will be written to. Required on PUT (CreateOrReplace) requests.
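Because {date} and {time} in pathPattern are expanded using dateFormat and timeFormat, a partial update of a blob output's path layout could look like the following sketch; the path and format values are illustrative assumptions:
{
  "properties": {
    "datasource": {
      "type": "Microsoft.Storage/Blob",
      "properties": {
        "pathPattern": "output/{date}/{time}",
        "dateFormat": "yyyy/MM/dd",
        "timeFormat": "HH"
      }
    }
  }
}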
CsvSerialization
Describes how data from an input is serialized or how data is serialized when written to an output in CSV format.
DiagnosticCondition
Condition applicable to the resource, or to the job overall, that warrants customer attention.
Name | Type | Description
code | string | The opaque diagnostic code.
message | string | The human-readable message describing the condition in detail. Localized in the Accept-Language of the client request.
since | string | The UTC timestamp of when the condition started. Customers should be able to find a corresponding event in the ops log around this time.
Diagnostics
Describes conditions applicable to the Input, Output, or the job overall, that warrant customer attention.
Name | Type | Description
conditions | DiagnosticCondition[] | A collection of zero or more conditions applicable to the resource, or to the job overall, that warrant customer attention.
DocumentDbOutputDataSource
Describes a DocumentDB output data source.
Name | Type | Description
properties.accountId | string | The DocumentDB account name or ID. Required on PUT (CreateOrReplace) requests.
properties.accountKey | string | The account key for the DocumentDB account. Required on PUT (CreateOrReplace) requests.
properties.collectionNamePattern | string | The collection name pattern for the collections to be used. The collection name format can be constructed using the optional {partition} token, where partitions start from 0. See the DocumentDB section of https://docs.microsoft.com/en-us/rest/api/streamanalytics/stream-analytics-output for more information. Required on PUT (CreateOrReplace) requests.
properties.database | string | The name of the DocumentDB database. Required on PUT (CreateOrReplace) requests.
properties.documentId | string | The name of the field in output events used to specify the primary key which insert or update operations are based on.
properties.partitionKey | string | The name of the field in output events used to specify the key for partitioning output across collections. If 'collectionNamePattern' contains the {partition} token, this property is required to be specified.
type | string: Microsoft.Storage/DocumentDB | Indicates the type of data source output will be written to. Required on PUT (CreateOrReplace) requests.
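Since partitionKey is required whenever collectionNamePattern contains the {partition} token, a partial update that introduces partitioned collections would typically patch both properties together, as in this sketch (the pattern and field name are placeholders):
{
  "properties": {
    "datasource": {
      "type": "Microsoft.Storage/DocumentDB",
      "properties": {
        "collectionNamePattern": "collection{partition}",
        "partitionKey": "deviceId"
      }
    }
  }
}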
Encoding
Specifies the encoding of the incoming data in the case of input and the encoding of outgoing data in the case of output.
Name | Type | Description
UTF8 | string |
Error
Common error representation.
Name | Type | Description
error | Error | Error definition properties.
EventHubOutputDataSource
Describes an Event Hub output data source.
Name | Type | Default value | Description
authenticationMode | AuthenticationMode | ConnectionString | Authentication Mode.
properties.eventHubName | string | | The name of the Event Hub. Required on PUT (CreateOrReplace) requests.
properties.partitionKey | string | | The key/column that is used to determine to which partition to send event data.
properties.propertyColumns | string[] | | The properties associated with this Event Hub output.
serviceBusNamespace | string | | The namespace that is associated with the desired Event Hub, Service Bus Queue, Service Bus Topic, etc. Required on PUT (CreateOrReplace) requests.
sharedAccessPolicyKey | string | | The shared access policy key for the specified shared access policy. Required on PUT (CreateOrReplace) requests.
sharedAccessPolicyName | string | | The shared access policy name for the Event Hub, Service Bus Queue, Service Bus Topic, etc. Required on PUT (CreateOrReplace) requests.
type | string: Microsoft.ServiceBus/EventHub | | Indicates the type of data source output will be written to. Required on PUT (CreateOrReplace) requests.
EventHubV2OutputDataSource
Describes an Event Hub output data source.
Name | Type | Default value | Description
authenticationMode | AuthenticationMode | ConnectionString | Authentication Mode.
properties.eventHubName | string | | The name of the Event Hub. Required on PUT (CreateOrReplace) requests.
properties.partitionKey | string | | The key/column that is used to determine to which partition to send event data.
properties.propertyColumns | string[] | | The properties associated with this Event Hub output.
serviceBusNamespace | string | | The namespace that is associated with the desired Event Hub, Service Bus Queue, Service Bus Topic, etc. Required on PUT (CreateOrReplace) requests.
sharedAccessPolicyKey | string | | The shared access policy key for the specified shared access policy. Required on PUT (CreateOrReplace) requests.
sharedAccessPolicyName | string | | The shared access policy name for the Event Hub, Service Bus Queue, Service Bus Topic, etc. Required on PUT (CreateOrReplace) requests.
type | string: Microsoft.EventHub/EventHub | | Indicates the type of data source output will be written to. Required on PUT (CreateOrReplace) requests.
EventSerializationType
Indicates the type of serialization that the input or output uses. Required on PUT (CreateOrReplace) requests.
Name | Type | Description
Avro | string |
Csv | string |
Json | string |
Parquet | string |
GatewayMessageBusOutputDataSource
Describes a Gateway Message Bus output data source.
Name | Type | Description
properties.topic | string | The name of the Service Bus topic.
type | string: GatewayMessageBus | Indicates the type of data source output will be written to. Required on PUT (CreateOrReplace) requests.
JsonOutputSerializationFormat
Specifies the format of the JSON the output will be written in. The currently supported values are 'lineSeparated', indicating the output will be formatted by having each JSON object separated by a new line, and 'array', indicating the output will be formatted as an array of JSON objects.
Name | Type | Description
Array | string |
LineSeparated | string |
JsonSerialization
Describes how data from an input is serialized or how data is serialized when written to an output in JSON format.
Name | Type | Description
properties.encoding | Encoding | Specifies the encoding of the incoming data in the case of input and the encoding of outgoing data in the case of output. Required on PUT (CreateOrReplace) requests.
properties.format | JsonOutputSerializationFormat | This property applies only to JSON serialization of outputs; it is not applicable to inputs. It specifies the format of the JSON the output will be written in. The currently supported values are 'lineSeparated', indicating the output will be formatted by having each JSON object separated by a new line, and 'array', indicating the output will be formatted as an array of JSON objects. The default value is 'lineSeparated' if left null.
type | string: Json | Indicates the type of serialization that the input or output uses. Required on PUT (CreateOrReplace) requests.
Output
An output object, containing all information associated with the named output. All outputs are contained under a streaming job.
Name | Type | Description
id | string | Resource Id
name | string | Resource name
properties.datasource | OutputDataSource: | Describes the data source that output will be written to. Required on PUT (CreateOrReplace) requests.
properties.diagnostics | Diagnostics | Describes conditions applicable to the Input, Output, or the job overall, that warrant customer attention.
properties.etag | string | The current entity tag for the output. This is an opaque string. You can use it to detect whether the resource has changed between requests. You can also use it in the If-Match or If-None-Match headers for write operations for optimistic concurrency.
properties.serialization | Serialization: | Describes how data from an input is serialized or how data is serialized when written to an output. Required on PUT (CreateOrReplace) requests.
properties.sizeWindow | integer | The size window to constrain a Stream Analytics output to.
properties.timeWindow | string | The time frame for filtering Stream Analytics job outputs.
type | string | Resource type
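The etag returned in properties.etag (and in the ETag response header shown in the samples above) supports optimistic concurrency: sending it back in the If-Match header causes the PATCH to fail if the output has changed in the meantime. A hedged sketch, reusing the resource IDs and ETag value from the Azure SQL example on this page; the table name is a placeholder:
PATCH https://management.azure.com/subscriptions/56b5e0a9-b645-407d-99b0-c64f86013e3d/resourcegroups/sjrg2157/providers/Microsoft.StreamAnalytics/streamingjobs/sj6458/outputs/output1755?api-version=2020-03-01
If-Match: f489d6f3-fcd5-4bcb-b642-81e987ee16d6
{
  "properties": {
    "datasource": {
      "type": "Microsoft.Sql/Server/Database",
      "properties": {
        "table": "anotherTable"
      }
    }
  }
}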
ParquetSerialization
Describes how data from an input is serialized or how data is serialized when written to an output in Parquet format.
Name | Type | Description
type | string: Parquet | Indicates the type of serialization that the input or output uses. Required on PUT (CreateOrReplace) requests.
PowerBIOutputDataSource
Describes a Power BI output data source.
Name | Type | Default value | Description
properties.authenticationMode | AuthenticationMode | ConnectionString | Authentication Mode.
properties.dataset | string | | The name of the Power BI dataset. Required on PUT (CreateOrReplace) requests.
properties.groupId | string | | The ID of the Power BI group.
properties.groupName | string | | The name of the Power BI group. Use this property to help remember which specific Power BI group id was used.
properties.refreshToken | string | | A refresh token that can be used to obtain a valid access token that can then be used to authenticate with the data source. A valid refresh token is currently only obtainable via the Azure Portal. It is recommended to put a dummy string value here when creating the data source and then go to the Azure Portal to authenticate the data source, which will update this property with a valid refresh token. Required on PUT (CreateOrReplace) requests.
properties.table | string | | The name of the Power BI table under the specified dataset. Required on PUT (CreateOrReplace) requests.
properties.tokenUserDisplayName | string | | The user display name of the user that was used to obtain the refresh token. Use this property to help remember which user was used to obtain the refresh token.
properties.tokenUserPrincipalName | string | | The user principal name (UPN) of the user that was used to obtain the refresh token. Use this property to help remember which user was used to obtain the refresh token.
type | string: PowerBI | | Indicates the type of data source output will be written to. Required on PUT (CreateOrReplace) requests.
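A partial update that redirects an existing Power BI output to a different dataset and table might look like the following sketch; the dataset and table names are placeholders:
{
  "properties": {
    "datasource": {
      "type": "PowerBI",
      "properties": {
        "dataset": "differentDataset",
        "table": "differentTable"
      }
    }
  }
}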
ServiceBusQueueOutputDataSource
Describes a Service Bus Queue output data source.
Name | Type | Default value | Description
properties.authenticationMode | AuthenticationMode | ConnectionString | Authentication Mode.
properties.propertyColumns | string[] | | A string array of the names of output columns to be attached to Service Bus messages as custom properties.
properties.queueName | string | | The name of the Service Bus Queue. Required on PUT (CreateOrReplace) requests.
properties.serviceBusNamespace | string | | The namespace that is associated with the desired Event Hub, Service Bus Queue, Service Bus Topic, etc. Required on PUT (CreateOrReplace) requests.
properties.sharedAccessPolicyKey | string | | The shared access policy key for the specified shared access policy. Required on PUT (CreateOrReplace) requests.
properties.sharedAccessPolicyName | string | | The shared access policy name for the Event Hub, Service Bus Queue, Service Bus Topic, etc. Required on PUT (CreateOrReplace) requests.
properties.systemPropertyColumns | object | | The system properties associated with the Service Bus Queue. The following system properties are supported: ReplyToSessionId, ContentType, To, Subject, CorrelationId, TimeToLive, PartitionKey, SessionId, ScheduledEnqueueTime, MessageId, ReplyTo, Label, ScheduledEnqueueTimeUtc.
type | string: Microsoft.ServiceBus/Queue | | Indicates the type of data source output will be written to. Required on PUT (CreateOrReplace) requests.
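As an illustration, attaching custom and system properties to the messages of an existing Service Bus Queue output could be done with a body like the sketch below. The column names are placeholders, and the shape of systemPropertyColumns as a map from system property name to output column is an assumption, not confirmed by this page:
{
  "properties": {
    "datasource": {
      "type": "Microsoft.ServiceBus/Queue",
      "properties": {
        "propertyColumns": ["column1", "column2"],
        "systemPropertyColumns": {
          "MessageId": "column1"
        }
      }
    }
  }
}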
ServiceBusTopicOutputDataSource
Describes a Service Bus Topic output data source.
Name | Type | Default value | Description
properties.authenticationMode | AuthenticationMode | ConnectionString | Authentication Mode.
properties.propertyColumns | string[] | | A string array of the names of output columns to be attached to Service Bus messages as custom properties.
properties.serviceBusNamespace | string | | The namespace that is associated with the desired Event Hub, Service Bus Queue, Service Bus Topic, etc. Required on PUT (CreateOrReplace) requests.
properties.sharedAccessPolicyKey | string | | The shared access policy key for the specified shared access policy. Required on PUT (CreateOrReplace) requests.
properties.sharedAccessPolicyName | string | | The shared access policy name for the Event Hub, Service Bus Queue, Service Bus Topic, etc. Required on PUT (CreateOrReplace) requests.
properties.systemPropertyColumns | object | | The system properties associated with the Service Bus Topic Output. The following system properties are supported: ReplyToSessionId, ContentType, To, Subject, CorrelationId, TimeToLive, PartitionKey, SessionId, ScheduledEnqueueTime, MessageId, ReplyTo, Label, ScheduledEnqueueTimeUtc.
properties.topicName | string | | The name of the Service Bus Topic. Required on PUT (CreateOrReplace) requests.
type | string: Microsoft.ServiceBus/Topic | | Indicates the type of data source output will be written to. Required on PUT (CreateOrReplace) requests.
StorageAccount
The properties that are associated with an Azure Storage account.
Name | Type | Description
accountKey | string | The account key for the Azure Storage account. Required on PUT (CreateOrReplace) requests.
accountName | string | The name of the Azure Storage account. Required on PUT (CreateOrReplace) requests.