Possibility to use prompt flow with private endpoint?

Kushal 20 Reputation points

My case: I want to develop my LLM application in a secure environment, so I want to deploy it in a private network. Is a private endpoint needed, and how do I accomplish this? Any suggestions or guidance are appreciated.

Azure Machine Learning
An Azure machine learning service for building and deploying models.

Accepted answer
  1. YutongTie-MSFT 46,406 Reputation points


    Thanks for reaching out to us. Yes, you can secure prompt flow using private networks.

    When you develop your LLM application with prompt flow, you want a secured environment. You can make the following services private via network settings:

    • Workspace: you can make the Azure Machine Learning workspace private and limit its inbound and outbound traffic.
    • Compute resource: you can also limit the inbound and outbound rules of compute resources in the workspace.
    • Storage account: you can limit access to the storage account to a specific virtual network.
    • Container registry: you also want to secure your container registry with a virtual network.
    • Endpoint: you want to limit which Azure services or IP addresses can access your endpoint.
    • Related Azure Cognitive Services such as Azure OpenAI, Azure AI Content Safety, and Azure AI Search: you can use their network configuration to make them private, then use private endpoints to let Azure Machine Learning services communicate with them.
    • Other non-Azure resources such as SerpAPI: if you have strict outbound rules, you need to add FQDN rules to access them.
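    As a starting point for the workspace and outbound items above, here is a minimal sketch of a workspace definition for the Azure ML CLI v2, assuming the managed virtual network feature; the workspace name, location, and FQDN rule name are placeholders you would replace with your own values:

    ```yaml
    # workspace.yaml — sketch of a network-isolated workspace (CLI v2)
    $schema: https://azuremlschemas.azureedge.net/latest/workspace.schema.json
    name: my-secure-workspace        # placeholder name
    location: eastus                 # placeholder region
    # Block inbound access from the public internet
    public_network_access: Disabled
    # Managed virtual network: only approved outbound traffic is allowed
    managed_network:
      isolation_mode: allow_only_approved_outbound
      outbound_rules:
        # FQDN rule for a non-Azure dependency such as SerpAPI
        - name: allow-serpapi        # placeholder rule name
          type: fqdn
          destination: serpapi.com
    ```

    You would create the workspace with `az ml workspace create --file workspace.yaml --resource-group <your-rg>`. With `allow_only_approved_outbound`, each external dependency needs its own outbound rule like the FQDN rule shown.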

    For next steps on how to build your prompt flow in a private network, please follow the guidance below -


    I hope this helps; please let us know if you have any other questions.
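    For the endpoint item specifically, a managed online endpoint can also be locked down so it is reachable only over a private endpoint. A minimal sketch (CLI v2 schema; the endpoint name is a placeholder):

    ```yaml
    # endpoint.yaml — sketch of a managed online endpoint with no public access
    $schema: https://azuremlschemas.azureedge.net/latest/managedOnlineEndpoint.schema.json
    name: my-private-endpoint        # placeholder name
    auth_mode: key
    # Scoring requests must come through a private endpoint, not the public internet
    public_network_access: disabled
    ```

    You would deploy it with `az ml online-endpoint create --file endpoint.yaml`, then create a private endpoint from your virtual network to the workspace so clients inside the network can reach it.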



    1 person found this answer helpful.

0 additional answers
