Azure OpenAI uses tokens for prompts even when the AI doesn't respond. How can I fix this?

Vishav Singh | 0 Reputation points
2024-07-14T10:38:38.28+00:00

Hi,

I am using the Azure OpenAI API for chat completion. However, when I send a prompt, the AI sometimes does not respond. In those cases, tokens are still deducted even though Azure OpenAI didn't return a response. The failure occurs whenever I send prompts of around 4,000 tokens to process; the response is never more than 800 tokens.

How can I:

1- Avoid token loss when Azure OpenAI doesn't respond?

2- Is there another way to handle this?

Vishav Deep Singh

Azure AI services

1 answer

  1. Amira Bedhiafi 19,221 Reputation points
    2024-07-14T15:04:17.8366667+00:00

    You can implement a retry mechanism with error handling in your code. That way, if a request fails, it can be retried rather than being written off immediately as lost tokens.
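    Here is a minimal sketch of that pattern, assuming the openai Python SDK (v1.x) with an AzureOpenAI client. The endpoint, key, API version, and deployment name are placeholders you would replace with your own values:

    ```python
    import os
    import time

    from openai import AzureOpenAI, APIConnectionError, APITimeoutError, RateLimitError

    # Placeholder configuration -- replace with your own resource values.
    client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-02-01",
    )

    DEPLOYMENT = "my-gpt-4o-deployment"  # hypothetical deployment name


    def chat_with_retry(messages, max_retries=3, base_delay=2.0):
        """Call chat completions, retrying transient failures with exponential backoff."""
        for attempt in range(max_retries):
            try:
                return client.chat.completions.create(
                    model=DEPLOYMENT,
                    messages=messages,
                    max_tokens=800,
                    timeout=60,  # fail fast instead of hanging on a stalled request
                )
            except (APITimeoutError, APIConnectionError, RateLimitError):
                # Transient error: tokens spent on the failed attempt are not
                # refunded, so keep retries bounded and back off between them.
                if attempt == max_retries - 1:
                    raise
                time.sleep(base_delay * (2 ** attempt))


    response = chat_with_retry([{"role": "user", "content": "Summarize this document..."}])
    print(response.choices[0].message.content)
    ```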

    Or, when dealing with large prompts, you can:

    • If possible, break down the prompt into smaller chunks and process them individually.
    • Use the stream parameter to receive the response as it is generated, which lowers the chance of losing the whole answer to a timeout (see the sketch after this list).
    • Simplify or condense your prompts to stay within token limits while conveying the necessary information.
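
    As a rough illustration of the streaming approach, the sketch below reuses the hypothetical client and DEPLOYMENT from the retry example above; with stream=True the completion arrives incrementally, so any output generated before a failure is not entirely lost:

    ```python
    def chat_streamed(messages):
        """Stream the completion and accumulate the partial output as it arrives."""
        stream = client.chat.completions.create(
            model=DEPLOYMENT,
            messages=messages,
            max_tokens=800,
            stream=True,
        )
        parts = []
        for chunk in stream:
            # Azure OpenAI can send chunks with an empty choices list
            # (e.g. content-filter metadata), so guard before indexing.
            if chunk.choices and chunk.choices[0].delta.content:
                parts.append(chunk.choices[0].delta.content)
        return "".join(parts)
    ```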