Azure OpenAI: error on inference request

Sarah Kuhn 10 Reputation points
2023-09-08T11:21:47.6866667+00:00

I'm trying to make an inference request through my proxy. That is, I deployed a model on Azure OpenAI Service and now want to test it and communicate with it.
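For context, a minimal chat-completions request against an Azure OpenAI deployment (sketched here with the legacy openai Python package 0.28, which matches the api-version 2023-05-15 and URL shape in the error below; the key, resource, and deployment names are placeholders, not my exact code) looks roughly like this:

```python
import openai

# Endpoint and key come from the Azure portal ("Keys and Endpoint" on the resource).
openai.api_type = "azure"
openai.api_base = "https://agiopenai.openai.azure.com"  # resource endpoint, no trailing slash
openai.api_version = "2023-05-15"
openai.api_key = "<your-api-key>"

response = openai.ChatCompletion.create(
    engine="ai-for-proxy",  # deployment name as created, without angle brackets
    messages=[{"role": "user", "content": "Hello"}],
)
print(response["choices"][0]["message"]["content"])
```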

But I get the following error when trying to make an inference request:

Can someone maybe help me figure out what I'm doing wrong? It's my first project using Azure OpenAI Service...

Inference result: {'error': 'Error communicating with OpenAI: HTTPSConnectionPool(host=\'agiopenai.openai.azure.com\', port=443): Max retries exceeded with url: //openai/deployments/%3Cai-for-proxy%3E/chat/completions?api-version=2023-05-15 (Caused by NameResolutionError("<urllib3.connection.HTTPSConnection object at 0x7f7010033ca0>: Failed to resolve \'agiopenai.openai.azure.com\' ([Errno -3] Temporary failure in name resolution)"))'}
{'_content': b'{"error":"Error communicating with OpenAI: HTTPSConnectionPool(h'
             b"ost='agiopenai.openai.azure.com', port=443): Max retries exceede"
             b'd with url: //openai/deployments/%3Cai-for-proxy%3E/chat/complet'
             b'ions?api-version=2023-05-15 (Caused by NameResolutionError(\\'
             b'"<urllib3.connection.HTTPSConnection object at 0x7f7010033ca0>: '
             b"Failed to resolve 'agiopenai.openai.azure.com' ([Errno -3] Tempo"
             b'rary failure in name resolution)\\"))"}\n',