Compress and write a file in another container with an Azure Blob Storage trigger in Node.js

Giampaolo Spagoni
2021-08-03T20:42:17.01+00:00

I have to make an API call passing a compressed file as input. I have a working on-premises example, but I would like to move the solution to the cloud. I was thinking of using Azure Blob Storage and an Azure Functions blob trigger. The code below works with local files, but I don't know how to do the same with Azure Blob Storage and an Azure Function in Node.js:

const zlib = require('zlib');
const fs = require('fs');

// Deflate transform stream
const def = zlib.createDeflate();

const input = fs.createReadStream('claudio.json');
const output = fs.createWriteStream('claudio-def.json');

input.pipe(def).pipe(output);

This code reads a file as a stream, compresses it, and writes the compressed output to another file as a stream.
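
For completeness, the round trip can be verified by inflating the compressed file back; a minimal sketch, assuming the file names above (the round-trip output name is just illustrative):

const zlib = require('zlib');
const fs = require('fs');

// Inflate the deflated file back to verify the round trip.
fs.createReadStream('claudio-def.json')
  .pipe(zlib.createInflate())
  .pipe(fs.createWriteStream('claudio-roundtrip.json'));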

What I would like to do is read the file every time I upload it to a container in Azure Blob Storage, compress it, save it to a different container under a different name, and then make an API call passing the compressed file saved in the other container as input.

I tried this code for compressing the incoming file:

const fs = require("fs");
const zlib = require('zlib');
const { Readable, Writable } = require('stream');

module.exports = async function (context, myBlob) {
    context.log("JavaScript blob trigger function processed blob \n Blob:", context.bindingData.blobTrigger, "\n Blob Size:", myBlob.length, "Bytes");
    // const fin = fs.createReadStream(context.bindingData.blobTrigger);
    const def = zlib.createDeflate();
    const s = Readable.from(myBlob.toString());
    context.log(myBlob);
    context.bindings.outputBlob = s.pipe(def);
};
The problem with this approach is that in the last line,

context.bindings.outputBlob = s.pipe(def)

I don't get the compressed file, whereas if I use

s.pipe(def).pipe(process.stdout)

I can see the compressed output.

As you can see above, I also tried fs.createReadStream(context.bindingData.blobTrigger), where context.bindingData.blobTrigger contains the name of the uploaded file prefixed with the container name, but it doesn't work.

Any ideas? Thank you.


Accepted answer
  AnuragSingh-MSFT
    2021-08-09T14:49:29.413+00:00

    @Giampaolo Spagoni

    In JavaScript and Java, the blob trigger loads the entire blob into memory, and its content can be accessed using context.bindings.<name>, where <name> is the binding name specified in the function.json file.

    For more details, please check Azure Blob storage trigger for Azure Functions

    Therefore, it does not give you an input stream as you would get from fs.createReadStream, or an output stream as you would get from fs.createWriteStream.
    A simple example that reads the content of a blob and writes it to a different blob would look like the below (you can add logic to process the input content so that the processed data gets written to the new blob in the destination container):

    module.exports = async function (context, myBlob) {
        context.log("Function Triggered. ", context.bindingData.blobTrigger);

        const input = context.bindings.myBlob;
        /*
        Process the whole content stored in "input" here (encode/compress etc.).
        In this example, a .txt file with some data is uploaded, so "input"
        holds the data contained in that .txt file.
        */

        // This simply writes the content of "input" to the output blob.
        context.bindings.myOutputBlob = input;
    };
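
    For the compression step asked about in the question, since the blob content arrives as an in-memory Buffer, a minimal sketch (assuming the same myBlob/myOutputBlob binding names as below) could deflate it synchronously with zlib:

    const zlib = require('zlib');

    module.exports = async function (context, myBlob) {
        context.log("Compressing blob: ", context.bindingData.blobTrigger);

        // myBlob is a Buffer holding the whole blob; deflate it in memory.
        // Note: pass the Buffer directly, not myBlob.toString(), which would
        // corrupt binary content.
        const compressed = zlib.deflateSync(myBlob);

        // Assigning a Buffer to the output binding writes it to the destination blob.
        context.bindings.myOutputBlob = compressed;
    };

    If a stream is needed instead, the Buffer can be wrapped with Readable.from(myBlob) and piped through zlib.createDeflate(), but the result still has to be collected back into a Buffer before assigning it to the output binding.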
    

    The corresponding bindings in the function.json file would look like the below:

    {  
      "bindings": [  
        {  
          "name": "myBlob",  
          "type": "blobTrigger",  
          "direction": "in",  
          "path": "source/{name}",  
          "connection": "storageaccountazfuna20e_STORAGE"  
        },  
        {  
          "name": "myOutputBlob",  
          "type": "blob",  
          "path": "dest/{name}-processed.txt",  
          "connection": "storageaccountazfuna20e_STORAGE",  
          "direction": "out"  
        }  
      ]  
    }  
    

    Here, {name} is the name of the blob that triggered the function.
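
    For the compressed-file scenario in the question, the output path can give the new blob a different name or extension; a sketch of just the output binding (the .zz extension here is only an illustrative choice) could be:

    {
      "name": "myOutputBlob",
      "type": "blob",
      "path": "dest/{name}.zz",
      "connection": "storageaccountazfuna20e_STORAGE",
      "direction": "out"
    }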
    Please 'Accept as answer' and 'Upvote' if it helped, so that it can help others in the community looking for help on similar topics.

