I have been using the new Azure OpenAI 2.0.0 beta, and I noticed that in the examples shown, the entire message history is sent back with each subsequent call, so the request payload grows with every round of prompts.
Wouldn't it be more efficient to pass back a conversation ID with each prompt, and have the rest of the conversation loaded from a temporary cache on the server?
I am considering building a site as a demo, but if each user sends the entire conversation back on every request, that approach doesn't seem to scale well.
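To illustrate the idea: since the Chat Completions API itself is stateless, a conversation-ID scheme would have to live in your own backend. Here is a minimal sketch of such a cache; the class and method names are hypothetical, not part of any Azure SDK, and the server would still send the full reconstructed history to the model on each call.

```python
# Hypothetical sketch of a server-side conversation cache, keyed by
# conversation ID. None of these names come from the Azure OpenAI SDK;
# the model API remains stateless, so the full history is rebuilt here
# before each model call instead of being sent by the client.
from typing import Dict, List


class ConversationCache:
    """Maps a conversation ID to its accumulated message history."""

    def __init__(self) -> None:
        self._store: Dict[str, List[dict]] = {}

    def append(self, conversation_id: str, message: dict) -> None:
        # Create the history list on first use, then append in order.
        self._store.setdefault(conversation_id, []).append(message)

    def history(self, conversation_id: str) -> List[dict]:
        # Return the full message list for this conversation (empty if new).
        return self._store.get(conversation_id, [])


cache = ConversationCache()
cache.append("conv-1", {"role": "user", "content": "Hello"})
cache.append("conv-1", {"role": "assistant", "content": "Hi there!"})

# The client would only send the new prompt plus "conv-1"; the server
# reconstructs the full history and forwards it to the model.
full_history = cache.history("conv-1")
```

In a real deployment this cache would need an eviction policy (e.g. a TTL per conversation) so abandoned sessions don't accumulate, and it would sit behind whatever web framework serves the demo site.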
@Data Juggler: You can share your feedback by creating a new issue on the repo and/or on this portal.
Do let me know if you have any further queries.