Hi @Autoize,
We have noticed that you rated an answer as not helpful. We appreciate your feedback and are committed to improving your experience with the Q&A.
Thanks for sharing your findings! It looks like the "Deploy models to Azure AI model inference service" toggle, which is enabled by default, was causing the deployment to use the services.ai.azure.com format. Disabling it ensured that the endpoint follows the correct models.ai.azure.com structure.
Additionally, another way to achieve the models.ai.azure.com format is to deploy the model through Azure ML Studio under the Model catalog section. This also ensures that the endpoint is provisioned correctly under the serverless model inference framework.
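Since the two endpoint formats differ only in their hostname suffix, a quick programmatic check can confirm which kind of endpoint a deployment produced. The sketch below is illustrative only: the example URLs (my-deployment.eastus2.models.ai.azure.com, my-resource.services.ai.azure.com) are hypothetical placeholders, not endpoints from your deployment.

```python
from urllib.parse import urlparse

def is_serverless_endpoint(url: str) -> bool:
    """Return True if the endpoint host follows the serverless
    models.ai.azure.com format rather than services.ai.azure.com."""
    host = urlparse(url).hostname or ""
    return host.endswith(".models.ai.azure.com")

# Hypothetical endpoint URLs for illustration only
print(is_serverless_endpoint("https://my-deployment.eastus2.models.ai.azure.com"))  # True
print(is_serverless_endpoint("https://my-resource.services.ai.azure.com"))          # False
```

Running a check like this against the endpoint shown in the portal after deployment is an easy way to verify the toggle change took effect.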
Let us know if you have any further questions!
If this answer is helpful, please click "Accept Answer" and "Yes". Thank you for your support.