Hi @현욱 김,
Here are my findings:
When the stream option is enabled in Azure OpenAI, the response is not returned in a single payload; instead it is sent back as a sequence of small chunks so that text can be displayed as it is generated. Each chunk typically carries only a small delta of the output (often a single token), so the size and number of chunks depend mainly on the length of the generated response and on the specific AI model being used, not on anything the caller configures.
Unfortunately, there is no way to designate a specific chunk size when using the stream option in Azure OpenAI: the chunk boundaries are determined by the service and the model, and cannot be modified by the user.
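To make this concrete, here is a minimal sketch of how a client consumes a stream: each chunk contributes a small text delta, and the full response is reassembled by concatenation. The list of simulated deltas below is my own illustration standing in for the `chunk.choices[0].delta.content` values a real streamed response would yield.

```python
def collect_stream(deltas):
    """Concatenate streamed text deltas into the full response text."""
    parts = []
    for delta in deltas:
        if delta:  # some chunks (e.g. the final one) carry no content
            parts.append(delta)
    return "".join(parts)

# Simulated deltas; a real stream would yield them one API chunk at a time.
simulated_deltas = ["Azure", " OpenAI", " streams", " small", " deltas", ".", None]
print(collect_stream(simulated_deltas))
```

The point is that the chunk size only affects how the text trickles in; once concatenated, the result is identical to a non-streamed response.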
If you are experiencing issues with the chunk size when using Azure OpenAI, you can try adjusting the input text to see whether that changes the size and number of chunks generated, or try a different AI model to see whether it produces more desirable results.
Also, please take a look at this sample to see whether you can leverage it for your solution. It uses a text chunker in which you can configure a minimum chunk size to exclude chunks that are too small.
I hope this helps!
Please let me know if you have any further questions.
Thanks
Saurabh
Please 'Accept as answer' and Upvote if it helped so that it can help others in the community looking for help on similar topics.