Thanks for reaching out to us. Yes, you can secure prompt flow using private networks.
When you're developing your LLM application with prompt flow, you'll want a secured environment. You can make the following services private via network settings:
- Workspace: you can make the Azure Machine Learning workspace private and restrict its inbound and outbound traffic.
- Compute resource: you can also set inbound and outbound rules for compute resources in the workspace.
- Storage account: you can restrict access to the storage account to a specific virtual network.
- Container registry: you can also secure your container registry behind a virtual network.
- Endpoint: you can restrict which Azure services or IP addresses can access your endpoint.
- Related Azure Cognitive Services such as Azure OpenAI, Azure AI Content Safety, and Azure AI Search: you can use their network settings to make them private, then use private endpoints so that Azure Machine Learning services can communicate with them.
- Other non-Azure resources such as SerpAPI: if you have strict outbound rules, you need to add FQDN rules to reach them.
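As a rough sketch, the workspace isolation, private-endpoint, and FQDN settings above could be applied with the Azure CLI `ml` extension along the following lines. The resource names (`my-rg`, `my-workspace`, the Azure OpenAI resource ID) are placeholders, and flag names can vary between extension versions, so please verify them with `az ml workspace outbound-rule set --help` before running:

```azurecli
# Put the workspace behind a managed virtual network and only allow
# approved outbound traffic (placeholder names, verify flags first).
az ml workspace update \
  --resource-group my-rg \
  --name my-workspace \
  --managed-network allow_only_approved_outbound

# Approve outbound to an Azure OpenAI resource via a private endpoint.
az ml workspace outbound-rule set \
  --resource-group my-rg \
  --workspace-name my-workspace \
  --rule my-aoai-rule \
  --type private_endpoint \
  --service-resource-id "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.CognitiveServices/accounts/my-aoai" \
  --sub-resource-target account

# Allow outbound to a non-Azure service such as SerpAPI by FQDN.
az ml workspace outbound-rule set \
  --resource-group my-rg \
  --workspace-name my-workspace \
  --rule serpapi-rule \
  --type fqdn \
  --destination "serpapi.com"
```

Note that FQDN outbound rules require the allow-only-approved-outbound isolation mode shown above.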
For next steps on how to build your prompt flow in a private network, please follow the guidance below -
I hope this helps. Please let us know if you have any other questions.
Regards,
Yutong