Hi Ahmed Saber Elkhouly,
Thanks for posting your question on Microsoft Q&A.
To deploy a DeepSeek model on Azure AI Foundry as a serverless service and access it privately via a private endpoint, follow these steps:
- Set up a private endpoint and DNS:
  - Create a private endpoint so the serverless deployment is reachable only from within your virtual network.
  - Configure a private DNS zone and link it to your virtual network so the endpoint hostname resolves to its private IP.
- Install the inference package:

  ```
  pip install azure-ai-inference
  ```
- Use the endpoint URL in the format `https://your-host-name.your-azure-region.inference.ai.azure.com`.
- Authenticate using either an API key or Microsoft Entra ID credentials.
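The private endpoint and DNS steps above can be sketched with the Azure CLI. This is only a sketch under assumptions: all resource names are placeholders, `<foundry-resource-id>` must be replaced with your actual resource ID, and the `--group-id` value (`account`) and private DNS zone name (`privatelink.services.ai.azure.com`) are assumptions you should verify against the Azure AI Foundry private-link documentation for your setup:

```shell
# Assumes an existing VNet/subnet and an Azure AI Foundry resource.
# Names below (my-rg, my-vnet, etc.) are placeholders.

# 1. Create the private endpoint targeting the Foundry resource.
az network private-endpoint create \
  --name my-foundry-pe \
  --resource-group my-rg \
  --vnet-name my-vnet \
  --subnet my-subnet \
  --private-connection-resource-id "<foundry-resource-id>" \
  --group-id account \
  --connection-name my-foundry-pe-conn

# 2. Create the private DNS zone (zone name is an assumption; check the docs).
az network private-dns zone create \
  --resource-group my-rg \
  --name "privatelink.services.ai.azure.com"

# 3. Link the zone to the VNet so lookups from inside the network
#    resolve the endpoint hostname to its private IP.
az network private-dns link vnet create \
  --resource-group my-rg \
  --zone-name "privatelink.services.ai.azure.com" \
  --name my-vnet-link \
  --virtual-network my-vnet \
  --registration-enabled false

# 4. Associate the zone with the private endpoint (creates the A record).
az network private-endpoint dns-zone-group create \
  --resource-group my-rg \
  --endpoint-name my-foundry-pe \
  --name my-zone-group \
  --private-dns-zone "privatelink.services.ai.azure.com" \
  --zone-name default
```

After this, DNS queries for the endpoint hostname from inside the VNet should return the private IP, which you can confirm with `nslookup` from a VM in the network.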
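Putting the steps together, here is a minimal client sketch using only the standard library. The endpoint, key, `/chat/completions` route, and `api-version` value are placeholders and assumptions, so verify them against the Azure AI model inference REST API docs for your deployment; the `azure-ai-inference` package installed above wraps this same call at a higher level:

```python
import json
import urllib.request

def build_chat_request(endpoint: str, api_key: str, messages: list) -> urllib.request.Request:
    """Build (but do not send) a chat-completions request for a serverless endpoint.

    Assumptions: the route is ``/chat/completions`` and the key is passed as a
    bearer token -- verify both against the Azure AI inference docs.
    """
    url = f"{endpoint.rstrip('/')}/chat/completions?api-version=2024-05-01-preview"
    body = json.dumps({"messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Placeholder values -- substitute your real endpoint and key.
req = build_chat_request(
    "https://your-host-name.your-azure-region.inference.ai.azure.com",
    "your-api-key",
    [{"role": "user", "content": "Hello!"}],
)

# From inside the virtual network, the hostname resolves to the private IP:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

With the SDK instead, `ChatCompletionsClient` from `azure.ai.inference` performs the equivalent call via `client.complete(messages=...)`, using `AzureKeyCredential` or a Microsoft Entra ID token credential.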
For more details, refer to the Microsoft documentation on serverless API deployments and private endpoints in Azure AI Foundry.
If the reply was helpful, please don't forget to upvote and/or accept it as the answer, or let me know if you have any other questions.
Thank you