Azure OpenAI Semantic Search Input Binding for Azure Functions

Important

The Azure OpenAI extension for Azure Functions is currently in preview.

The Azure OpenAI semantic search input binding allows you to use semantic search on your embeddings.

For information on setup and configuration details of the Azure OpenAI extension, see Azure OpenAI extensions for Azure Functions. To learn more about semantic ranking in Azure AI Search, see Semantic ranking in Azure AI Search.

Note

For JavaScript, references and examples are provided only for the Node.js v4 programming model. For Python, references and examples are provided only for the v2 programming model. For C#, while both process models are supported, only isolated worker model examples are provided.

Example

This example shows how to perform a semantic search on a file.


// Response type for the ingestion function: the EmbeddingsStoreOutput binding generates
// embeddings for the content at {url} and writes them to the openai-index collection.
public class EmbeddingsStoreOutputResponse
{
    [EmbeddingsStoreOutput("{url}", InputType.Url, "AISearchEndpoint", "openai-index", Model = "%EMBEDDING_MODEL_DEPLOYMENT_NAME%")]
    public required SearchableDocument SearchableDocument { get; init; }

    public IActionResult? HttpResponse { get; set; }
}
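
The EmbeddingsStoreOutputResponse class above belongs to the function that ingests the file into the search index. The prompting function itself uses the SemanticSearchInput attribute; the following is a minimal sketch based on the parameters listed in the Attributes section, assuming a SemanticSearchContext binding type whose Response property holds the chat model's answer (exact type and member names may differ in the released extension):

public class SemanticSearchRequest
{
    [JsonPropertyName("prompt")]
    public string? Prompt { get; set; }
}

[Function("PromptFile")]
public static IActionResult PromptFile(
    [HttpTrigger(AuthorizationLevel.Function, "post")] SemanticSearchRequest requestBody,
    [SemanticSearchInput("AISearchEndpoint", "openai-index", Query = "{prompt}", ChatModel = "%CHAT_MODEL_DEPLOYMENT_NAME%", EmbeddingsModel = "%EMBEDDING_MODEL_DEPLOYMENT_NAME%")] SemanticSearchContext result)
{
    // The binding has already run the semantic query and prompted the chat model;
    // return the generated answer to the caller.
    return new ContentResult { Content = result.Response, ContentType = "text/plain" };
}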

This example shows how to perform a semantic search on a file.

@FunctionName("PromptFile")
public HttpResponseMessage promptFile(
    @HttpTrigger(
        name = "req",
        methods = {HttpMethod.POST},
        authLevel = AuthorizationLevel.ANONYMOUS)
        HttpRequestMessage<SemanticSearchRequest> request,
    @SemanticSearch(name = "search", connectionName = "AISearchEndpoint", collection = "openai-index", query = "{prompt}", chatModel = "%CHAT_MODEL_DEPLOYMENT_NAME%", embeddingsModel = "%EMBEDDING_MODEL_DEPLOYMENT_NAME%") String semanticSearchContext,
    final ExecutionContext context) {
        // The binding delivers a JSON payload; the chat model's answer is in its "Response" property.
        String response = new JSONObject(semanticSearchContext).getString("Response");

        return request.createResponseBuilder(HttpStatus.OK)
            .body(response)
            .build();
}

Examples aren't yet available.

This example shows how to perform a semantic search on a file. The snippet below covers the first part of the sample, which generates embeddings for the referenced file so that its content can later be searched.

import { app, input } from '@azure/functions';

interface EmbeddingsFilePath {
    FilePath: string;
}

// Embeddings input that chunks the referenced file and generates embeddings for it.
// (The properties that point the binding at the request's file path are omitted in this excerpt.)
const embeddingsFilePathInput = input.generic({
    type: 'embeddings',
    maxChunkLength: 512,
    model: '%EMBEDDING_MODEL_DEPLOYMENT_NAME%'
});

app.http('getEmbeddingsFilePath', {
    methods: ['POST'],
    route: 'embeddings-from-file',
    authLevel: 'function',
    extraInputs: [embeddingsFilePathInput],
    handler: async (request, context) => {
        const requestBody = (await request.json()) as EmbeddingsFilePath;
        const response: any = context.extraInputs.get(embeddingsFilePathInput);

        context.log(
            `Received ${response.count} embedding(s) for input file ${requestBody.FilePath}.`
        );

        // TODO: Store the embeddings into a database or other storage.

        return { status: 202 };
    }
});

This example shows how to perform a semantic search on a file.

Here's the function.json file for prompting a file:

{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "Request",
      "methods": [
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "Response"
    },
    {
      "name": "SemanticSearchInput",
      "type": "semanticSearch",
      "direction": "in",
      "connectionName": "AISearchEndpoint",
      "collection": "openai-index",
      "query": "{prompt}",
      "chatModel": "%CHAT_MODEL_DEPLOYMENT_NAME%",
      "embeddingsModel": "%EMBEDDING_MODEL_DEPLOYMENT_NAME%"
    }
  ]
}

For more information about function.json file properties, see the Configuration section.

using namespace System.Net

param($Request, $TriggerMetadata, $SemanticSearchInput)

Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
        StatusCode = [HttpStatusCode]::OK
        Body       = $SemanticSearchInput.Response
    })

Examples aren't yet available.

Attributes

Apply the SemanticSearchInput attribute to define a semantic search input binding, which supports these parameters:

Parameter Description
ConnectionName The name of an app setting or environment variable that contains the connection string value. This property supports binding expressions.
Collection The name of the collection, table, or index to search. This property supports binding expressions.
Query The semantic query text to use for searching. This property supports binding expressions.
EmbeddingsModel The ID of the model to use for embeddings. The default value is text-embedding-3-small. This property supports binding expressions.
ChatModel The name of the large language model to invoke for chat responses. The default value is gpt-3.5-turbo. This property supports binding expressions.
SystemPrompt Optional. The system prompt to use for prompting the large language model. The system prompt is appended with the knowledge fetched as a result of the Query, and the combined prompt is then sent to the OpenAI Chat API. This property supports binding expressions.
MaxKnowledgeCount Optional. The number of knowledge items to inject into the SystemPrompt.
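
For instance, the optional SystemPrompt and MaxKnowledgeCount parameters can be combined with the required ones. The following fragment is only a sketch: the values shown are illustrative rather than defaults, and the SemanticSearchContext binding type is assumed as in the earlier sketch:

[SemanticSearchInput("AISearchEndpoint", "openai-index",
    Query = "{prompt}",
    EmbeddingsModel = "%EMBEDDING_MODEL_DEPLOYMENT_NAME%",
    ChatModel = "%CHAT_MODEL_DEPLOYMENT_NAME%",
    // Optional: steer the chat model and cap the number of knowledge items injected into the prompt.
    SystemPrompt = "Answer using only the retrieved documents.",
    MaxKnowledgeCount = 3)]
SemanticSearchContext searchContext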

Annotations

The SemanticSearchInput annotation enables you to define a semantic search input binding, which supports these parameters:

Element Description
name The name of the input binding.
connectionName The name of an app setting or environment variable that contains the connection string value. This property supports binding expressions.
collection The name of the collection, table, or index to search. This property supports binding expressions.
query The semantic query text to use for searching. This property supports binding expressions.
embeddingsModel The ID of the model to use for embeddings. The default value is text-embedding-3-small. This property supports binding expressions.
chatModel The name of the large language model to invoke for chat responses. The default value is gpt-3.5-turbo. This property supports binding expressions.
systemPrompt Optional. The system prompt to use for prompting the large language model. The system prompt is appended with the knowledge fetched as a result of the query, and the combined prompt is then sent to the OpenAI Chat API. This property supports binding expressions.
maxKnowledgeCount Optional. The number of knowledge items to inject into the system prompt.

Decorators

During the preview, define the input binding as a generic_input_binding binding of type semanticSearch, which supports these parameters:

Parameter Description
arg_name The name of the variable that represents the binding parameter.
connection_name The name of an app setting or environment variable that contains the connection string value. This property supports binding expressions.
collection The name of the collection, table, or index to search. This property supports binding expressions.
query The semantic query text to use for searching. This property supports binding expressions.
embeddings_model The ID of the model to use for embeddings. The default value is text-embedding-3-small. This property supports binding expressions.
chat_model The name of the large language model to invoke for chat responses. The default value is gpt-3.5-turbo. This property supports binding expressions.
system_prompt Optional. The system prompt to use for prompting the large language model. The system prompt is appended with the knowledge fetched as a result of the query, and the combined prompt is then sent to the OpenAI Chat API. This property supports binding expressions.
max_knowledge_count Optional. The number of knowledge items to inject into the system prompt.

Configuration

The binding supports these configuration properties that you set in the function.json file.

Property Description
type Must be semanticSearch.
direction Must be in.
name The name of the input binding.
connectionName The name of an app setting or environment variable that contains the connection string value. This property supports binding expressions.
collection The name of the collection, table, or index to search. This property supports binding expressions.
query The semantic query text to use for searching. This property supports binding expressions.
embeddingsModel The ID of the model to use for embeddings. The default value is text-embedding-3-small. This property supports binding expressions.
chatModel The name of the large language model to invoke for chat responses. The default value is gpt-3.5-turbo. This property supports binding expressions.
systemPrompt Optional. The system prompt to use for prompting the large language model. The system prompt is appended with the knowledge fetched as a result of the query, and the combined prompt is then sent to the OpenAI Chat API. This property supports binding expressions.
maxKnowledgeCount Optional. The number of knowledge items to inject into the system prompt.

Configuration

The binding supports these properties, which are defined in your code:

Property Description
connectionName The name of an app setting or environment variable that contains the connection string value. This property supports binding expressions.
collection The name of the collection, table, or index to search. This property supports binding expressions.
query The semantic query text to use for searching. This property supports binding expressions.
embeddingsModel The ID of the model to use for embeddings. The default value is text-embedding-3-small. This property supports binding expressions.
chatModel The name of the large language model to invoke for chat responses. The default value is gpt-3.5-turbo. This property supports binding expressions.
systemPrompt Optional. The system prompt to use for prompting the large language model. The system prompt is appended with the knowledge fetched as a result of the query, and the combined prompt is then sent to the OpenAI Chat API. This property supports binding expressions.
maxKnowledgeCount Optional. The number of knowledge items to inject into the system prompt.

Usage

See the Example section for complete examples.
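
As the Java and PowerShell examples show, the binding delivers a JSON payload whose Response property contains the answer generated by the chat model. The following is a minimal sketch of reading that value when the binding is received as a raw string (the variable name is illustrative):

using System.Text.Json;

// semanticSearchContext is the string payload provided by the binding.
string answer = JsonDocument.Parse(semanticSearchContext)
    .RootElement
    .GetProperty("Response")
    .GetString()!;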