How to Capture Incoming Webhook JSON Payload using Az Functions

PS 376 Reputation points
2023-02-01T21:53:17.3133333+00:00

Hi All,

I have configured a basic Azure Function and I'm using its URL as the endpoint for a webhook on a vendor's website. I have turned on Application Insights and can see the responses coming in through the logs. I would like to capture the JSON payload as an individual file for every event and store them in ADLS. Can someone please point me to resources for achieving this?

TIA.

Azure Functions

2 answers

  1. PS 376 Reputation points
    2023-02-13T21:01:02.54+00:00

    Thank you @MughundhanRaveendran-MSFT .

    I was able to get it to work by modifying some of the code. Here is the final version that is working for me. It might not be the perfect way to achieve this, but it serves my purpose. Hope this helps someone with a similar need.

    using System;
    using System.IO;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;
    using Microsoft.AspNetCore.Http;
    using Microsoft.Extensions.Logging;
    using Newtonsoft.Json;
    using Azure.Storage.Blobs;
    
    namespace Webhook
    {
        public static class ThirdPartyWebhook
        {
            [FunctionName("ThirdPartyWebhook")]
            public static async Task<IActionResult> Run(
                [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
                ILogger log)
            {
                log.LogInformation("ThirdParty payload has been captured...");
    
                string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    
                // Deserializing confirms the body is valid JSON before it is stored.
                dynamic data = JsonConvert.DeserializeObject(requestBody);
    
                string env_connstring = Environment.GetEnvironmentVariable("connectionstring");
                string env_container = Environment.GetEnvironmentVariable("container");
                string messageContent = $"{data}";
    
                await WriteToBlobAsync(env_connstring, env_container, messageContent);
    
                return new OkObjectResult(messageContent);
            }
    
            public static async Task WriteToBlobAsync(string env_connstring, string env_container, string content)
            {
                // One file per event. Millisecond precision reduces the risk of two
                // near-simultaneous events producing the same name (with overwrite: true
                // below, a collision would silently replace the earlier file).
                string fileName = "Events_" + DateTime.UtcNow.ToString("yyyyMMddHHmmssfff") + ".json";
                string filePath = $"ThirdParty/WebhookPayload/{fileName}";
    
                BlobContainerClient containerClient = new(env_connstring, env_container);
                BlobClient blobClient = containerClient.GetBlobClient(filePath);
    
                // UploadAsync avoids blocking a thread-pool thread inside the async function.
                await blobClient.UploadAsync(BinaryData.FromString(content), overwrite: true);
            }
        }
    }
    

    Open for optimization suggestions.
    Thank you!
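    For anyone reusing this: the function reads two app settings, `connectionstring` and `container`. A minimal `local.settings.json` for local testing might look like the sketch below — the setting names match the code above, but the values are placeholders you need to fill in with your own storage account details:

    ```json
    {
      "IsEncrypted": false,
      "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet",
        "connectionstring": "<storage-account-connection-string>",
        "container": "<container-name>"
      }
    }
    ```

    In Azure, the same two settings go into the Function App's application settings instead.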

    1 person found this answer helpful.

  2. MughundhanRaveendran-MSFT 12,421 Reputation points
    2023-02-07T06:08:15.51+00:00

    @PS ,

    Thanks for reaching out to Q&A forum.

    You can read the JSON data from the request and write it to a MemoryStream. The stream can then be written to Azure Data Lake Storage using the Data Lake Storage SDK.

    Sample code:

    
    [FunctionName("FunctionName")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");
    
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    
        // Deserializing confirms the body is valid JSON.
        dynamic data = JsonConvert.DeserializeObject(requestBody);
    
        using (var stream = new MemoryStream())
        using (var writer = new StreamWriter(stream))
        {
            writer.Write(requestBody);
            writer.Flush();
            stream.Position = 0;
    
            // Store the stream data in Data Lake Storage.
            // One file per event, named with the current timestamp:
            var fileName = "data/" + DateTime.Now.ToString("yyyyMMddHHmmss") + ".json";
    
            // Upload the stream here with the Data Lake Storage SDK,
            // e.g. AdlsClient.CreateFile from Microsoft.Azure.DataLake.Store.
        }
    
        return new OkObjectResult("Data stored in Data Lake Storage");
    }
    
    

    In this example, the JSON payload is read from the request body and deserialized into a dynamic object to confirm it is valid JSON. The request body is written into a MemoryStream, and the stream position is reset to 0 so the data can be read back when uploading. The stream can then be written to Azure Data Lake Storage, for example with the AdlsClient.CreateFile method from the Microsoft.Azure.DataLake.Store library. The file name is built from the current date and time, and the file is stored in a directory named "data". The connection string for the Data Lake Storage account needs to be added to the code before the upload will work.
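    If you target ADLS Gen2, the upload step left open above could also be done with the newer Azure.Storage.Files.DataLake package instead of Microsoft.Azure.DataLake.Store. A minimal sketch, assuming a connection string and a file system name that are not part of this thread (it needs a live storage account, so treat it as illustration only):

    ```csharp
    using System;
    using System.IO;
    using System.Text;
    using System.Threading.Tasks;
    using Azure.Storage.Files.DataLake;

    public static class DataLakeWriter
    {
        // connectionString and fileSystem are assumptions; supply your own values.
        public static async Task WriteEventAsync(string connectionString, string fileSystem, string json)
        {
            var serviceClient = new DataLakeServiceClient(connectionString);
            DataLakeFileSystemClient fsClient = serviceClient.GetFileSystemClient(fileSystem);

            // One file per event, timestamp-named as in the answers above.
            string path = "data/" + DateTime.UtcNow.ToString("yyyyMMddHHmmssfff") + ".json";
            DataLakeFileClient fileClient = fsClient.GetFileClient(path);

            using var stream = new MemoryStream(Encoding.UTF8.GetBytes(json));
            await fileClient.UploadAsync(stream, overwrite: true);
        }
    }
    ```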