Error "Backend returned unexpected response" with DeepSeek in AI Foundry

Wenjun Che 140 Reputation points
2025-03-17T22:25:29.8966667+00:00

Trying to call the chat completions API for LLMs deployed in AI Foundry using the 'openai' JavaScript SDK. The code works correctly for the gpt-4o-mini model, but with DeepSeek-V3 it fails with an "InternalServerError" and the message "Backend returned unexpected response. Please contact Microsoft for help."

Here is the code:

import { AzureOpenAI } from "openai";
import type { ChatCompletionMessageParam } from "openai/resources";

const endpoint = '...';
const apiKey = '...';
const client = new AzureOpenAI({
    apiKey,
    apiVersion: '2024-05-01-preview', // '2025-01-01-preview' for gpt-4o-mini
    endpoint,
});

const messages: ChatCompletionMessageParam[] = [
    { role: 'user', content: 'why is sky blue' }
];

const response = await client.chat.completions.create({
    model: 'DeepSeek-V3', // or 'gpt-4o-mini'
    messages
});

Accepted answer
  Pavankumar Purilla 8,335 Reputation points · Microsoft External Staff Moderator
    2025-03-18T00:04:50.23+00:00

    Hi Wenjun Che,

    The OpenAI SDK (the openai package) does not support DeepSeek-V3 in AI Foundry, which is why you're seeing the "Backend returned unexpected response" error.

    To resolve this, use the Azure AI Inference SDK (@azure-rest/ai-inference) instead. This SDK is designed for AI Foundry serverless models such as DeepSeek-V3, and it does not require specifying an apiVersion.

    For more information: Azure AI Inference SDK Documentation
    I hope this information helps. Thank you!
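
    As a minimal sketch, the same request via @azure-rest/ai-inference could look like the following (the endpoint and key placeholders are kept from the question; the exact endpoint URL depends on your AI Foundry deployment):

```typescript
import ModelClient, { isUnexpected } from "@azure-rest/ai-inference";
import { AzureKeyCredential } from "@azure/core-auth";

const endpoint = '...'; // your AI Foundry inference endpoint
const apiKey = '...';

// No apiVersion is required when constructing the client.
const client = ModelClient(endpoint, new AzureKeyCredential(apiKey));

const response = await client.path("/chat/completions").post({
    body: {
        model: 'DeepSeek-V3',
        messages: [
            { role: 'user', content: 'why is sky blue' }
        ]
    }
});

// The REST client returns both success and error shapes; check before reading.
if (isUnexpected(response)) {
    throw response.body.error;
}

console.log(response.body.choices[0].message.content);
```

    The client returns a raw REST response, so unlike the openai SDK you check isUnexpected() before reading the body.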

