How do I get logprobs with the Chat Completion model gpt-4-turbo-2024-04-09 and the 2024-06-01 API version?

Yash Runwal 0 Reputation points
2024-11-19T17:39:23.5266667+00:00

I have deployed a gpt-4-turbo-2024-04-09 model in Azure OpenAI Service and am constructing the payload with the logprobs parameter set to true, but the logprobs are absent from the response. According to the following documentation, logprobs should be available with the 2024-06-01 API version:

https://learn.microsoft.com/en-us/azure/ai-services/openai/whats-new
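
For reference, the request is constructed roughly along these lines (a minimal sketch; the resource name, deployment name, and API key are placeholders):

    import requests

    # Rough sketch of the request being sent; resource name, deployment name, and key are placeholders.
    url = (
        "https://<resource-name>.openai.azure.com/openai/deployments/"
        "<deployment-name>/chat/completions?api-version=2024-06-01"
    )
    payload = {
        "messages": [{"role": "user", "content": "Your prompt here"}],
        "max_tokens": 100,
        "temperature": 0,
        "logprobs": True,    # request per-token log probabilities
        "top_logprobs": 5,   # also return the 5 most likely alternatives per token
    }
    headers = {"Content-Type": "application/json", "api-key": "<API KEY>"}

    print(requests.post(url, headers=headers, json=payload).json())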

So what am I missing here?


2 answers

  1. Sina Salam 22,031 Reputation points Volunteer Moderator
    2024-11-19T19:10:41.97+00:00

    Hello Yash Runwal ,

    Welcome to the Microsoft Q&A and thank you for posting your questions here.

    I understand that you would like to know how to get logprobs with the Chat Completion model gpt-4-turbo-2024-04-09 and the 2024-06-01 API version.

    No, you cannot get logprobs with the Chat Completion Model gpt-4-turbo-2024-04-09 even with the 2024-06-01 API version. The logprobs parameter is supported for the completions API but not for the chat completions API. This is likely why you are not seeing logprobs in your response.

    If you need logprobs, you should use the completions API instead of the chat completions API. This is an example of how to construct your payload for the completions API:

    import openai

    # Legacy (pre-1.0) openai SDK configured for Azure OpenAI; the endpoint,
    # deployment name, and key below are placeholders.
    openai.api_type = "azure"
    openai.api_base = "https://<resource-name>.openai.azure.com"
    openai.api_version = "2024-06-01"
    openai.api_key = "<API KEY>"

    response = openai.Completion.create(
        engine="<deployment-name>",  # name of your gpt-4-turbo-2024-04-09 deployment
        prompt="Your prompt here",
        max_tokens=100,
        logprobs=5,  # number of most likely tokens to return log probabilities for
    )
    print(response)
    

    I hope this is helpful! Do not hesitate to let me know if you have any other questions.


    Please don't forget to close the thread by upvoting and accepting this as an answer if it is helpful.


  2. Pavankumar Purilla 8,570 Reputation points Microsoft External Staff Moderator
    2024-11-19T20:29:13.06+00:00

    Hi Yash Runwal,

    Greetings & Welcome to the Microsoft Q&A forum! Thank you for posting your query.

    I understand that you are facing an issue with logprobs not appearing in the response from the Azure OpenAI service using the GPT-4-Turbo model (gpt-4-turbo-2024-04-09) with the 2024-06-01 API version.

    I have tested this scenario using the same model and API version (2024-06-01), and I was able to successfully retrieve logprobs in the response.
    Here's a working example of a correct API request:

    curl -X POST https://<resourcename>.openai.azure.com/openai/deployments/<deploymentname>/chat/completions?api-version=2024-06-01 ^
    -H "Content-Type: application/json" ^
    -H "api-key: <API KEY>" ^
    -d "{\"messages\":[{\"role\":\"user\",\"content\":\"What is the capital of France?\"}],\"max_tokens\":10,\"temperature\":0,\"logprobs\":true,\"top_logprobs\":5}"
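
    For reference, roughly the same call with the Python openai SDK might look like this (a sketch assuming openai>=1.0 and its AzureOpenAI client; the endpoint, deployment name, and key are placeholders):

    import os
    from openai import AzureOpenAI

    # Sketch assuming the openai>=1.0 SDK; endpoint, deployment name, and key are placeholders.
    client = AzureOpenAI(
        azure_endpoint="https://<resourcename>.openai.azure.com",
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-06-01",
    )
    response = client.chat.completions.create(
        model="<deploymentname>",  # the deployment name, not the base model name
        messages=[{"role": "user", "content": "What is the capital of France?"}],
        max_tokens=10,
        temperature=0,
        logprobs=True,
        top_logprobs=5,
    )
    print(response.choices[0].logprobs)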
    
    

    The logprobs appear in the response as expected (screenshot of the output omitted). Here are some troubleshooting steps:

    1. Ensure the deployed model is gpt-4-turbo-2024-04-09.
    2. Confirm the API version is set to 2024-06-01.
    3. Use "logprobs": true in the request payload (a sketch for reading the returned logprobs follows this list).
    4. Double-check that the deployment name in the API URL matches your deployment.
    5. Ensure your subscription has sufficient quota and necessary permissions.
    6. Test the same payload with another deployment or region to see if results vary.
    7. If still unresolved, contact Azure support with resource details, payload, response, and error logs.
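
    As mentioned in step 3, here is a minimal sketch (the function name is illustrative) for reading the per-token log probabilities out of a chat completions response body that has already been parsed into a dict, for example the JSON printed by the curl call above:

    def print_token_logprobs(body: dict) -> None:
        # Walk choices[*].logprobs.content and print each token, its log probability,
        # and the alternative tokens returned because of "top_logprobs".
        for choice in body.get("choices", []):
            content = (choice.get("logprobs") or {}).get("content") or []
            for entry in content:
                alternatives = [(alt["token"], alt["logprob"]) for alt in entry.get("top_logprobs", [])]
                print(entry["token"], entry["logprob"], alternatives)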

    Please refer to the following documentation: createChatCompletionRequest.

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, please click Accept Answer and Yes for "Was this answer helpful?".

