@Anuj Agarwal I tried the sample below and it worked with my Azure OpenAI deployment with streaming enabled. Please see the screenshot below.
import asyncio
from openai import AsyncAzureOpenAI

# Client configured with the endpoint, key, and API version of your Azure OpenAI resource
client = AsyncAzureOpenAI(
    azure_endpoint="your_endpoint",
    api_key="your_key",
    api_version="2023-09-01-preview",
)

async def main() -> None:
    # Request a streaming chat completion from your deployment
    stream = await client.chat.completions.create(
        model="your_deployment_name",
        messages=[{"role": "user", "content": "What is chatgpt?"}],
        stream=True,
    )
    # Print each streamed chunk as it arrives
    async for chunk in stream:
        print(chunk.model_dump_json(indent=2))
        print()

asyncio.run(main())
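If you would rather assemble the streamed tokens into a single reply instead of printing the raw JSON chunks, here is a minimal sketch, assuming the same client and deployment name as in the snippet above (the function name stream_reply is just for illustration):

async def stream_reply() -> None:
    stream = await client.chat.completions.create(
        model="your_deployment_name",
        messages=[{"role": "user", "content": "What is chatgpt?"}],
        stream=True,
    )
    reply = ""
    async for chunk in stream:
        # Some chunks (e.g. content-filter results) can arrive with empty choices or no delta content, so guard first
        if chunk.choices and chunk.choices[0].delta.content:
            reply += chunk.choices[0].delta.content
            print(chunk.choices[0].delta.content, end="", flush=True)
    print()

asyncio.run(stream_reply())

This prints tokens to the console as they stream in and also collects the full text in the reply variable for later use.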
For reference, the same issue is tracked in the SDK repo.
If this answers your query, do click Accept Answer and Yes for "Was this answer helpful". And if you have any further questions, do let us know.