Azure Functions Cannot Authenticate to Storage Account

Konstantinos Passadis 17,286 Reputation points
2023-03-05T15:28:40.79+00:00

Hello all!

I am at breaking point! I have an Azure Functions JavaScript deployment which reads changes from a Cosmos DB input trigger binding and then moves the data as a CSV file to Azure Storage via an output binding. I have tried everything to make the Function authenticate, with Managed Identity and a Service Principal, but in the end I get the error:

"Exception while executing function: Functions.CosmosTrigger1 Result: Failure

Exception: Server failed to authenticate the request. Please refer to the information in the www-authenticate header. "

The Cosmos part is working fine, since I can see the trigger firing, but I always hit this error when it tries to authenticate and write the data. I have also tried many variations, such as the SAS connection string with and without the container, key2, HTML-encoding the key, and so on.

I have seen in the Azure AD logs that the sign-ins, whether with Managed Identity or with the SP, show success! So I would really appreciate your help! BTW, I am not a coder, so it was a real struggle to reach this point, but I am confident that someone will point me in the right direction!

Also, networking is fine (otherwise I would not see success in the sign-in logs in Azure AD).

The code is here:

```javascript
const { DefaultAzureCredential } = require("@azure/identity");
const { BlobServiceClient } = require("@azure/storage-blob");
const csv = require('csv-parser');
const stream = require('stream');
const moment = require('moment');
require('dotenv').config();

const accountName = process.env.BlobAccountName;
const containerName = process.env.BlobContainerName;

const defaultAzureCredential = new DefaultAzureCredential({
    additionallyAllowedTenants: ["*"]
});

module.exports = async function (context, documents) {
    context.log(`Cosmos DB trigger function processed ${documents.length} documents`);

    // Create a BlobServiceClient object which will be used to create a container client
    const blobServiceClient = new BlobServiceClient(
        `https://${accountName}.blob.core.windows.net`,
        defaultAzureCredential
    );

    // Get a reference to a container
    const containerClient = blobServiceClient.getContainerClient(containerName);

    // Create a new blob name
    const blobName = moment().utc().format('YYYY/MM/DD/HH/mm/ss') + '.csv';

    // Create an output stream to write CSV data
    const outputStream = new stream.Writable();
    const csvData = [];
    outputStream._write = (chunk, encoding, done) => {
        csvData.push(chunk);
        done();
    };
    outputStream.getReadableStream = () => {
        const bufferStream = new stream.PassThrough();
        bufferStream.write(csvData.join(''));
        bufferStream.end();
        return bufferStream;
    };

    // Parse the input documents and write them to the output stream
    for (const document of documents) {
        const csvRow = {
            'id': document.id,
            'firstName': document.firstName,
            'lastName': document.lastName,
            'nickname': document.nickname
        };
        outputStream.write(JSON.stringify(csvRow));
    }
    outputStream.end();

    // Upload the CSV data to a Storage Blob
    const blobClient = containerClient.getBlockBlobClient(blobName);
    const uploadOptions = { blobHTTPHeaders: { blobContentType: 'text/csv' } };
    await blobClient.uploadStream(outputStream.getReadableStream(), undefined, undefined, uploadOptions);

    context.log(`Uploaded CSV data to blob: ${blobName}`);
};
```
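As a side note, the loop above writes `JSON.stringify(csvRow)` into the stream, so the uploaded blob actually contains concatenated JSON objects rather than real CSV. A minimal sketch of a proper CSV formatter for these documents (a hypothetical helper, not part of the original post; the field names are taken from the loop above):

```javascript
// Hypothetical helper: build a real CSV string from the trigger documents,
// instead of the JSON.stringify output the loop above produces.
const FIELDS = ["id", "firstName", "lastName", "nickname"];

function escapeCsvField(value) {
  const s = value === undefined || value === null ? "" : String(value);
  // Quote the field and double any embedded quotes when it contains
  // a comma, quote, or newline
  return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
}

function documentsToCsv(documents) {
  const header = FIELDS.join(",");
  const rows = documents.map((doc) =>
    FIELDS.map((f) => escapeCsvField(doc[f])).join(",")
  );
  return [header, ...rows].join("\n") + "\n";
}
```

The resulting string could then be uploaded in one call with `blockBlobClient.upload(csvText, Buffer.byteLength(csvText), uploadOptions)`, which avoids the custom writable/readable stream plumbing entirely.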

And the function.json is this:

```json
{
  "bindings": [
    {
      "type": "cosmosDBTrigger",
      "name": "documents",
      "direction": "in",
      "leaseCollectionName": "leases",
      "connectionStringSetting": "xxxxx_XXXXXX",
      "databaseName": "Input",
      "collectionName": "Signup",
      "createLeaseCollectionIfNotExists": true
    },
    {
      "type": "blob",
      "name": "outputBlob",
      "connection": "AzureWebJobsStorage_accountname",
      "path": "csvfiles/Users.csv",
      "direction": "out",
      "sasToken": {
        "name": "BlobSasUri",
        "type": "custom"
      }
    }
  ],
  "disabled": false,
  "environment": {
    "AZURE_CLIENT_ID": "XXXXXXXXXXX",
    "AZURE_TENANT_ID": "XXXXX-XXXXX-XXXX-XXXX-XXXXXXXX",
    "AZURE_CLIENT_SECRET": "XXXX~XXXXXXXXXXXXXXXX",
    "ContainerName": "csvfiles",
    "BlobAccountName": "XXXXXXXX"
  }
}
```
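For comparison, when a blob binding uses an identity-based connection (no key or SAS at all), the `connection` property names a prefix of app settings rather than a connection string. A sketch, assuming a hypothetical app setting prefix `BlobStorage`:

```json
{
  "type": "blob",
  "name": "outputBlob",
  "path": "csvfiles/Users.csv",
  "direction": "out",
  "connection": "BlobStorage"
}
```

With this shape, an app setting such as `BlobStorage__serviceUri` (or `BlobStorage__blobServiceUri`) would point at `https://<account>.blob.core.windows.net`, and the function's identity would need a data-plane role on the account, such as Storage Blob Data Contributor. This requires a Functions host/extension version that supports identity-based connections.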


1 answer

  1. Konstantinos Passadis 17,286 Reputation points
    2023-03-06T05:19:51.35+00:00

    I found the problem: you must add the Storage Account `user_impersonation` API permission to the Service Principal!

    Cheers!