Quickstart: Use Azure AI Content Understanding REST API

Prerequisites

To get started, you need an active Azure subscription. If you don't have an Azure account, create one for free.

  • Once you have your Azure subscription, create an Azure AI Foundry resource in the Azure portal.

    • This resource is listed under AI Foundry > AI Foundry in the portal.

      Screenshot of the AI Foundry resource page in the Azure portal.

In this guide, we use the cURL command line tool. If it isn't installed, you can download the appropriate version for your dev environment.

Get started with a prebuilt analyzer

Analyzers define how your content is processed and the insights that are extracted. We offer prebuilt analyzers for common use cases. You can customize prebuilt analyzers to better fit your specific needs and use cases. This quickstart uses prebuilt document, image, audio, and video analyzers to help you get started.

Send file for analysis

Before running the following cURL command, make the following changes to the HTTP request:

  1. Replace {endpoint} and {key} with the corresponding values from your Azure AI Foundry instance in the Azure portal.
  2. Replace {analyzerId} with prebuilt-documentAnalyzer. This analyzer extracts text and layout elements such as paragraphs, sections, and tables from a document.
  3. Replace {fileUrl} with a publicly accessible URL of the file to analyze—such as a path to an Azure Storage Blob with a shared access signature (SAS), or use the sample URL: https://github.com/Azure-Samples/azure-ai-content-understanding-python/raw/refs/heads/main/data/invoice.pdf.

POST request

curl -i -X POST "{endpoint}/contentunderstanding/analyzers/{analyzerId}:analyze?api-version=2025-05-01-preview" \
  -H "Ocp-Apim-Subscription-Key: {key}" \
  -H "Content-Type: application/json" \
  -d "{\"url\":\"{fileUrl}\"}"

POST response

The response includes a JSON body containing the resultId, which you use to retrieve the results of the asynchronous analysis operation. Additionally, the Operation-Location header provides the direct URL to access the analysis result.

202 Accepted
Operation-Location: {endpoint}/contentunderstanding/analyzerResults/{resultId}?api-version=2025-05-01-preview
{
  "id": {resultId},
  "status": "Running",
  "result": {
    "analyzerId": {analyzerId},
    "apiVersion": "2025-05-01-preview",
    "createdAt": "YYYY-MM-DDTHH:MM:SSZ",
    "warnings": [],
    "contents": []
  }
}
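
Instead of copying the resultId manually, you can capture it from the response body, since the id field of the JSON response carries the resultId. The following sketch assumes the jq tool is installed and reuses the CU_ENDPOINT and CU_KEY variables from the earlier example.

# Send the request quietly and extract the result ID from the JSON body (assumes jq is installed).
RESULT_ID=$(curl -s -X POST "$CU_ENDPOINT/contentunderstanding/analyzers/prebuilt-documentAnalyzer:analyze?api-version=2025-05-01-preview" \
  -H "Ocp-Apim-Subscription-Key: $CU_KEY" \
  -H "Content-Type: application/json" \
  -d '{"url":"https://github.com/Azure-Samples/azure-ai-content-understanding-python/raw/refs/heads/main/data/invoice.pdf"}' \
  | jq -r '.id')
echo "resultId: $RESULT_ID"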

Get analyze result

Use the resultId from the POST response to retrieve the result of the analysis. You can also call the URL from the Operation-Location header directly.

  1. Replace {endpoint} and {key} with the corresponding values from your Azure AI Foundry instance in the Azure portal.
  2. Replace {resultId} with the resultId from the POST response.

GET request

curl -i -X GET "GET {endpoint}/contentunderstanding/analyzers/{analyzerId}?api-version=2025-05-01-preview" \
  -H "Ocp-Apim-Subscription-Key: {key}"

GET response

The 200 (OK) JSON response includes a status field indicating the status of the operation. If the operation isn't complete, the value of status is Running or NotStarted. In such cases, you should send the GET request again, either manually or through a script. Wait an interval of one second or more between calls.
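
One way to script that polling is a simple loop that keeps calling the result endpoint until the status is no longer Running or NotStarted, sleeping one second between calls. This is a sketch, not part of the API; it assumes the CU_ENDPOINT, CU_KEY, and RESULT_ID variables from the earlier examples and the jq tool.

# Poll until the status is no longer Running or NotStarted (assumes jq is installed).
while true; do
  STATUS=$(curl -s "$CU_ENDPOINT/contentunderstanding/analyzerResults/$RESULT_ID?api-version=2025-05-01-preview" \
    -H "Ocp-Apim-Subscription-Key: $CU_KEY" | jq -r '.status')
  echo "status: $STATUS"
  if [ "$STATUS" != "Running" ] && [ "$STATUS" != "NotStarted" ]; then
    break
  fi
  sleep 1   # wait at least one second between calls
done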

{
  "id": {resultId},
  "status": "Succeeded",
  "result": {
    "analyzerId": "prebuilt-documentAnalyzer",
    "apiVersion": "2025-05-01-preview",
    "createdAt": "YYYY-MM-DDTHH:MM:SSZ",
    "warnings": [],
    "contents": [
      {
        "markdown": "CONTOSO LTD.\n\n\n# INVOICE\n\nContoso Headquarters\n123 456th St...",
        "fields": {
          "Summary": {
            "type": "string",
            "valueString": "This document is an invoice issued by Contoso Ltd. to Microsoft Corporation for services rendered during the period of 10/14/2019 to 11/14/2019..."
          }
        },
        "kind": "document",
        "startPageNumber": 1,
        "endPageNumber": 1,
        "unit": "inch",
        "pages": [
          {
            "pageNumber": 1,
            "angle": -0.0039,
            "width": 8.5,
            "height": 11,
            "spans": [ { "offset": 0, "length": 1650 } ],
            "words": [
              {
                "content": "CONTOSO",
                "span": { "offset": 0, "length": 7 },
                "confidence": 0.998,
                "source": "D(1,0.5739,0.6582,1.7446,0.6595,1.7434,0.8952,0.5729,0.8915)"
              }, ...
            ],
            "lines": [
              {
                "content": "CONTOSO LTD.",
                "source": "D(1,0.5734,0.6563,2.335,0.6601,2.3345,0.8933,0.5729,0.8895)",
                "span": { "offset": 0, "length": 12 }
              }, ...
            ]
          }
        ],
        "paragraphs": [
          {
            "content": "CONTOSO LTD.",
            "source": "D(1,0.5734,0.6563,2.335,0.6601,2.3345,0.8933,0.5729,0.8895)",
            "span": { "offset": 0, "length": 12 }
          },
          {
            "role": "title",
            "content": "INVOICE",
            "source": "D(1,7.0515,0.5614,8.0064,0.5628,8.006,0.791,7.0512,0.7897)",
            "span": { "offset": 15, "length": 9 }
          }, ...
        ],
        "sections": [
          {
            "span": { "offset": 0, "length": 1649 },
            "elements": [ "/sections/1", "/sections/2" ]
          }, ...
        ],
        "tables": [
          {
            "rowCount": 2,
            "columnCount": 6,
            "cells": [
              {
                "kind": "columnHeader",
                "rowIndex": 0,
                "columnIndex": 0,
                "rowSpan": 1,
                "columnSpan": 1,
                "content": "SALESPERSON",
                "source": "D(1,0.5389,4.5514,1.7505,4.5514,1.7505,4.8364,0.5389,4.8364)",
                "span": { "offset": 512, "length": 11 },
                "elements": [ "/paragraphs/19" ]
              }, ...
            ],
            "source": "D(1,0.4885,4.5543,8.0163,4.5539,8.015,5.1207,0.4879,5.1209)",
            "span": { "offset": 495, "length": 228 }
          }, ...
        ]
      }
    ]
  }
}
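
Once the status is Succeeded, the extracted content is in the contents array of the result. As an illustration, and again assuming jq and the variables from the earlier sketches, the following saves the response and reads back the Markdown rendering and the generated Summary field shown in the sample above.

# Save the final result and read a couple of fields from it (assumes jq is installed).
curl -s "$CU_ENDPOINT/contentunderstanding/analyzerResults/$RESULT_ID?api-version=2025-05-01-preview" \
  -H "Ocp-Apim-Subscription-Key: $CU_KEY" -o result.json

jq -r '.result.contents[0].markdown' result.json
jq -r '.result.contents[0].fields.Summary.valueString' result.json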

Next steps

Learn more about creating custom analyzers for your use case.