Need Python example for gpt-4o model authentication using client_id and client_secret.
Hello Team,
I am trying to write some Python examples for the gpt-4o model. I need to use a client_id, client_secret, and API endpoint to call gpt-4o chat completions. Can you share some code references?
Also, I am not able to install azure-ai-openai, so I cannot use the import below either. Please help with this. Thanks!!
Note: the main thing is that instead of an API key I want to use client_id and client_secret for authentication.
from azure.ai.openai import OpenAI
Azure OpenAI Service
-
YutongTie-MSFT 51,696 Reputation points
2024-09-17T17:27:40.69+00:00 Hello @Vijayakumar Elumalai
Thanks for reaching out to us. I think you are saying that you would like to implement DefaultAzureCredential in your application, as described in the document here - https://learn.microsoft.com/en-us/azure/developer/python/sdk/authentication/azure-hosted-apps?tabs=azure-cli%2Cazure-app-service#3---implement-defaultazurecredential-in-your-application
You need to ensure that your service principal has the "Cognitive Services OpenAI User" role assigned to it on the Azure OpenAI resource for this to work.
DefaultAzureCredential checks the environment for a service principal defined by the environment variables AZURE_CLIENT_ID, AZURE_TENANT_ID, and either AZURE_CLIENT_SECRET or AZURE_CLIENT_CERTIFICATE_PATH, plus (optionally) AZURE_CLIENT_CERTIFICATE_PASSWORD. For a code reference that uses the secret, you can refer to the document above, or there is a repo that leverages Azure OpenAI as an example; check scenario 3 and scenario 4 to see which works better for you.
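As a minimal sketch (assuming those environment variables are set and the role above is assigned - the endpoint below is a placeholder), DefaultAzureCredential can feed a bearer token to the AzureOpenAI client instead of an API key:
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

# DefaultAzureCredential reads AZURE_CLIENT_ID, AZURE_TENANT_ID and AZURE_CLIENT_SECRET
# from the environment and exchanges them for an Entra ID token.
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default"
)

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",  # placeholder - use your resource endpoint
    azure_ad_token_provider=token_provider,
    api_version="2024-02-01"
)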
I hope this helps!
Regards,
Yutong
-
Vijayakumar Elumalai 105 Reputation points
2024-09-17T17:38:15.3566667+00:00 Hello @YutongTie-MSFT. Thank you so much for the response. Actually, in the example below, instead of the Azure OpenAI endpoint and Azure OpenAI API key I want to use my actual endpoint, and for authentication I need to use the client_id and client_secret that I already have. Is that possible?
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_version="2024-02-01"
)

response = client.chat.completions.create(
    model="gpt-35-turbo",  # model = "deployment_name"
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Does Azure OpenAI support customer managed keys?"},
        {"role": "assistant", "content": "Yes, customer managed keys are supported by Azure OpenAI."},
        {"role": "user", "content": "Do other Azure AI services support this too?"}
    ]
)

print(response.choices[0].message.content)
-
YutongTie-MSFT 51,696 Reputation points
2024-09-17T17:45:16.34+00:00 Got it. I think below is an example of creating a client with the endpoint and client_id -
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default"
)

client = AzureOpenAI(
    api_version="2024-02-15-preview",
    azure_endpoint="https://{your-custom-endpoint}.openai.azure.com/",
    azure_ad_token_provider=token_provider
)

response = client.chat.completions.create(
    model="gpt-35-turbo-0125",  # model = "deployment_name"
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Does Azure OpenAI support customer managed keys?"},
        {"role": "assistant", "content": "Yes, customer managed keys are supported by Azure OpenAI."},
        {"role": "user", "content": "Do other Azure AI services support this too?"}
    ]
)

print(response.choices[0].message.content)
It should work! I attached the document here - https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/managed-identity
But please be aware that you need to assign the role. Assign yourself either the Cognitive Services OpenAI User or Cognitive Services OpenAI Contributor role so that your account can make Azure OpenAI inference API calls rather than having to use key-based auth. After you make this change, it can take up to 5 minutes before it takes effect.
You also need to set your client ID (and the related variables) for the credential to use.
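For example (a minimal sketch - these are the variable names DefaultAzureCredential looks for, and the values are placeholders), the service principal can be supplied through environment variables before the credential is created; in practice you would set them outside the code:
import os

# Placeholders only - in a real app set these in the shell, app settings, or a key vault
# rather than in source code.
os.environ["AZURE_TENANT_ID"] = "<your-tenant-id>"
os.environ["AZURE_CLIENT_ID"] = "<your-client-id>"
os.environ["AZURE_CLIENT_SECRET"] = "<your-client-secret>"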
Let me know how this works.
Regards,
Yutong
-
Vijayakumar Elumalai 105 Reputation points
2024-09-17T18:18:36.9533333+00:00 We have a MuleSoft layer in between, so we need to use the client_id and client_secret combination for authentication. Also, can we use the full endpoint with a direct POST call instead of the Azure endpoint?
Endpoint -
https://xxxxx/api/xxxxx/1/genai/Azure/GPT-4o/completions?api-version=2024-02-01
Below is the OpenAI example, but I want to use client_id and client_secret for authentication.
from openai import OpenAI
import os

# Set the API key and model name
MODEL = "gpt-4o-mini"
client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY", "<your OpenAI API key if not set as an env var>"))

completion = client.chat.completions.create(
    model=MODEL,
    messages=[
        {"role": "system", "content": "You are a helpful assistant. Help me with my math homework!"},  # system message that provides context to the model
        {"role": "user", "content": "Hello! Could you solve 2+2?"}  # user message for which the model will generate a response
    ]
)

print("Assistant: " + completion.choices[0].message.content)
-
YutongTie-MSFT 51,696 Reputation points
2024-09-17T19:05:41.8833333+00:00 Thanks for your response. The example I shared does leverage the client_id and client_secret, but they are only read from the environment. Do you mean you want to put your client_id and client_secret directly in the code rather than saving them in the environment?
Regards,
Yutong
-
Vijayakumar Elumalai 105 Reputation points
2024-09-18T01:53:24.9633333+00:00 Yes, just for understanding purposes; later I will handle it with environment variables.
-
Vijayakumar Elumalai 105 Reputation points
2024-09-18T08:18:22.3466667+00:00 Also, in the code below I want to use my API endpoint, and for authentication I need to use client_id and client_secret. Can we do that?
from openai import OpenAI
import os

# Set the API key and model name
MODEL = "gpt-4o-mini"
client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY", "<your OpenAI API key if not set as an env var>"))

completion = client.chat.completions.create(
    model=MODEL,
    messages=[
        {"role": "system", "content": "You are a helpful assistant. Help me with my math homework!"},  # system message that provides context to the model
        {"role": "user", "content": "Hello! Could you solve 2+2?"}  # user message for which the model will generate a response
    ]
)

print("Assistant: " + completion.choices[0].message.content)
-
YutongTie-MSFT 51,696 Reputation points
2024-09-19T01:14:13.7533333+00:00 Hello @Vijayakumar Elumalai
Thanks for your response. Yes, you can leverage your client_id and client_secret directly. Since you mention that you want to use the gpt-4o model, please refer to the sample below -
import os
import requests
import base64
from azure.identity import ClientSecretCredential

# Configuration
TENANT_ID = "YOUR_TENANT_ID"
CLIENT_ID = "YOUR_CLIENT_ID"
CLIENT_SECRET = "YOUR_CLIENT_SECRET"
RESOURCE = "https://cognitiveservices.azure.com/.default"  # token scope for Azure OpenAI inference calls
IMAGE_PATH = "YOUR_IMAGE_PATH"

# Authenticate with the service principal and get a bearer token
credential = ClientSecretCredential(tenant_id=TENANT_ID, client_id=CLIENT_ID, client_secret=CLIENT_SECRET)
token = credential.get_token(RESOURCE).token

# Image encoding from the playground sample (not used in the payload below)
encoded_image = base64.b64encode(open(IMAGE_PATH, 'rb').read()).decode('ascii')

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {token}"
}

# Payload for the request
payload = {
    "messages": [
        {
            "role": "system",
            "content": [
                {
                    "type": "text",
                    "text": "You are an AI assistant that helps people find information."
                }
            ]
        }
    ],
    "temperature": 0.7,
    "top_p": 0.95,
    "max_tokens": 800
}

ENDPOINT = "https://yutongopenai2.openai.azure.com/openai/deployments/gpt-4o-mini/chat/completions?api-version=2024-02-15-preview"

# Send request
try:
    response = requests.post(ENDPOINT, headers=headers, json=payload)
    response.raise_for_status()  # Raises an HTTPError if the request returned an unsuccessful status code
except requests.RequestException as e:
    raise SystemExit(f"Failed to make the request. Error: {e}")

# Handle the response as needed (e.g., print or process)
print(response.json())
You can always get a code sample for a different model from the Azure OpenAI Studio playground -> Chat -> View code.
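Regarding the earlier question about posting directly to the MuleSoft-fronted endpoint: below is a hypothetical sketch only. The header names (client_id / client_secret) and the request body shape are assumptions about the gateway, not something from Azure documentation, so please check your MuleSoft API specification for the exact contract.
import requests

# Endpoint from the earlier comment (placeholder segments left as-is)
ENDPOINT = "https://xxxxx/api/xxxxx/1/genai/Azure/GPT-4o/completions?api-version=2024-02-01"

# Assumed header names - many MuleSoft client ID enforcement policies expect these,
# but your gateway may use different names or an OAuth token instead.
headers = {
    "Content-Type": "application/json",
    "client_id": "<your-client-id>",
    "client_secret": "<your-client-secret>"
}

payload = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello! Could you solve 2+2?"}
    ]
}

response = requests.post(ENDPOINT, headers=headers, json=payload)
response.raise_for_status()
print(response.json())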
I hope this helps!
Regards,
Yutong
-
YutongTie-MSFT 51,696 Reputation points
2024-09-19T15:20:30.3+00:00 Hello Vijayakumar,
Have you tried above solution which leverage the client_id, client_secret, endpoint. I hope it works for you! We haven’t heard from you on the last response and was just checking back to see if you have a resolution yet. In case if you have any resolution please do share that same with the community as it can be helpful to others. Otherwise, will respond with more details and we will try to help.
Regards, Yutong
-
YutongTie-MSFT 51,696 Reputation points
2024-09-20T15:28:45.0166667+00:00 Hello Vijayakumar,
Thanks for reaching out to us again. I hope your issue has been resolved; feel free to open a new thread if you have any other questions.
Let us know if you still need help on this topic. If the comment above helps, please let us know!
Regards,
Yutong