Is it possible to use Azure Function Apps as a proxy for LLM streams?

Giel Oomen 36 Reputation points
2024-05-17T14:25:18.8366667+00:00

Hi all,

Is there a way to stream textual data from an Azure Functions App (in Python)?

Case: we have a web app built completely on Azure, with an API built in Azure Functions. We want to integrate LLM access with custom tool_calls so the LLM can access a user's data and answer specific questions based on it. However, to give the LLM data access without calling it directly from the front end, we need some sort of proxy.

Azure Functions keeps complaining when I try to use the mimetype text/event-stream.

Thank you in advance. I think this will become a question for many app builders.

Azure Functions

1 answer

  1. Gowtham CP 2,920 Reputation points
    2024-05-17T15:28:47.0466667+00:00

    Hello Giel Oomen ,

    Thank you for reaching out on Microsoft Q&A.

    When it comes to streaming textual data from an Azure Functions App (Python), there are three options worth considering. The simplest is a chunked/streamed HTTP response, which fits token-by-token LLM output delivered as server-sent events (a minimal sketch follows below). If you need to move larger volumes of data or process it asynchronously, Azure Event Hubs is a better fit. For real-time, bidirectional communication with the browser, look at Azure SignalR Service. Which option is right depends on your data volume and real-time requirements.
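    For your case (an LLM token stream proxied through the API), the chunked HTTP option maps to the HTTP streams support in the Python v2 programming model, where the handler returns a FastAPI-style StreamingResponse instead of func.HttpResponse. The sketch below is only an outline under assumptions: it presumes the azurefunctions-extensions-http-fastapi extension package is installed, and the route name chat plus the token_stream generator are placeholders for your actual LLM call with tool_calls.

```python
# function_app.py -- minimal sketch of a streaming HTTP endpoint (one possible approach)
# Assumes azure-functions and azurefunctions-extensions-http-fastapi are in requirements.txt.
import azure.functions as func
from azurefunctions.extensions.http.fastapi import Request, StreamingResponse

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

async def token_stream():
    # Placeholder: swap in the async iterator from your LLM client, e.g. the chunks
    # of an Azure OpenAI chat completion created with stream=True.
    for token in ["Hello", ", ", "world", "!"]:
        yield f"data: {token}\n\n"  # server-sent events framing

@app.route(route="chat", methods=["POST"])
async def chat(req: Request) -> StreamingResponse:
    # Each yielded chunk is flushed to the client as it is produced, so the front end
    # can render tokens while the model is still generating.
    return StreamingResponse(token_stream(), media_type="text/event-stream")
```

    The front end can then consume this with EventSource or fetch plus a ReadableStream, so the Functions app remains the proxy that holds the credentials and runs the tool_calls server-side. Note that HTTP streaming for Python Functions was in preview around this time and, if I recall correctly, needed an app setting (PYTHON_ENABLE_INIT_INDEXING) to be enabled; please check the current Azure Functions documentation for the exact requirements.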

    If you found this solution helpful, consider accepting it.