Llama 3.1 405B Instruct as a serverless API not working with stream response
wong2
I have deployed Llama 3.1 models with Azure AI Studio by following this document: https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-llama?tabs=llama-three
When calling it through the API, it works if `stream` is set to `false`. But if I set `stream` to `true`, the response is empty.
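For context, a streaming call against a serverless chat-completions endpoint returns server-sent events (`data: {...}` lines), so an "empty" response can simply mean the client is reading the body as regular JSON instead of iterating the event stream. Below is a minimal sketch of how such a stream can be consumed; the endpoint URL, header names, and payload shape are assumptions based on the OpenAI-compatible chat-completions format, not taken from the deployment in question.

```python
import json
import requests


def parse_sse_chunks(raw_lines):
    """Collect assistant text from 'data: ...' lines of a
    chat-completions SSE stream (OpenAI-compatible format)."""
    parts = []
    for line in raw_lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip keep-alives and blank separator lines
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # sentinel marking the end of the stream
        chunk = json.loads(payload)
        for choice in chunk.get("choices", []):
            delta = choice.get("delta", {})
            if delta.get("content"):
                parts.append(delta["content"])
    return "".join(parts)


def stream_chat(endpoint, api_key, messages):
    """Call a chat-completions endpoint with stream=True and
    return the concatenated streamed text.

    `endpoint` is a hypothetical placeholder for the serverless
    deployment's /chat/completions URL."""
    resp = requests.post(
        endpoint,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        json={"messages": messages, "stream": True},
        stream=True,  # keep the connection open and iterate chunks
    )
    resp.raise_for_status()
    return parse_sse_chunks(resp.iter_lines(decode_unicode=True))
```

If `parse_sse_chunks` returns an empty string for a stream that works with `stream=false`, the chunks are likely arriving in a different shape than the `choices[].delta.content` layout assumed here, which is worth inspecting with the raw lines printed.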