Tutorial: Detect liveness in faces

Learn how to integrate face liveness detection into your workflow by using server-side logic and companion frontend client applications.

Tip

For general information about face liveness detection, see the conceptual guide.

In this tutorial, you learn how to run a frontend application with an app server to perform liveness detection. You can also add face verification across various platforms and languages.

Important

The Face client SDKs for liveness are a gated feature. You must request access to the liveness feature by filling out the Face Recognition intake form. When your Azure subscription is granted access, you can download the Face liveness SDK.

Prerequisites

  • Azure subscription - Create one for free
  • Your Azure account must have a Cognitive Services Contributor role assigned so you can agree to the responsible AI terms and create a resource. To get this role assigned to your account, follow the steps in the Assign roles documentation, or contact your administrator.
  • Once you have your Azure subscription, create a Face resource in the Azure portal to get your key and endpoint. After it deploys, select Go to resource.
    • You need the key and endpoint from the resource you create to connect your application to the Face service.
  • Access to the gated artifacts required by the Face client SDK for mobile (iOS and Android) and web.
  • Familiarity with the Face liveness detection feature. See the conceptual guide.

Tip

After completing the prerequisites, you can try the liveness experience on the following platforms:

  • iOS: iOS App Store — tap the app screen 10 times after installation to enable developer mode.
  • Android: Google Play Store — tap the app screen 10 times after installation to enable developer mode.
  • Web: Try it directly in Vision Studio.

You can also build and run a complete frontend sample (iOS, Android, or Web) from the Samples section.

Prepare the frontend application

We provide SDKs in multiple languages to simplify integration with your frontend application. Refer to the README for your chosen SDK in the following sections to integrate both the UI and required code.

Important

Each frontend SDK requires access to a gated asset to compile successfully. Follow the access setup instructions in the README of the SDK repository for your platform:

  • Swift (iOS)
  • Kotlin/Java (Android)
  • JavaScript (Web)

Once integrated into your frontend application, the SDK starts the camera, guides the user to adjust their position, composes the liveness payload, and sends it to the Azure AI Face service for processing.

Monitor the repository's Releases section for new SDK versions, and enable automated dependency update alerts such as GitHub Dependabot (for GitHub repos) or Renovate (GitHub, GitLab, Bitbucket, Azure Repos).

Perform liveness detection

The following steps describe the liveness orchestration process:

Diagram of the liveness workflow in Azure AI Face.

  1. The frontend application starts the liveness check and notifies the app server.

  2. The app server creates a liveness session with the Azure AI Face service. The service responds with a session authorization token. For more information about each request parameter involved in creating a liveness session, see Liveness Create Session Operation.

    // Usings assumed: System, System.Net.Http, System.Text, System.Text.Json.
    var endpoint = new Uri(Environment.GetEnvironmentVariable("FACE_ENDPOINT"));
    var key = Environment.GetEnvironmentVariable("FACE_APIKEY");
    
    var body = JsonSerializer.Serialize(new
    {
        livenessOperationMode = "PassiveActive",
        deviceCorrelationId = "723d6d03-ef33-40a8-9682-23a1feb7bccd",
        enableSessionImage = true
    });
    
    using var client = new HttpClient();
    // The subscription key header expects a plain string value.
    client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", key);
    
    var response = await client.PostAsync(
        $"{endpoint}/face/v1.2/detectLiveness-sessions",
        new StringContent(body, Encoding.UTF8, "application/json"));
    
    response.EnsureSuccessStatusCode();
    
    using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
    var root = doc.RootElement;
    
    Console.WriteLine("Session created");
    Console.WriteLine($"sessionId : {root.GetProperty("sessionId").GetString()}");
    Console.WriteLine($"authToken : {root.GetProperty("authToken").GetString()}");
    

    An example of the response body:

    {
        "sessionId": "a6e7193e-b638-42e9-903f-eaf60d2b40a5",
        "authToken": "<session-authorization-token>",
        "status": "NotStarted",
        "modelVersion": "2025-05-20",
        "results": {
            "attempts": []
        }
    }
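
    If you prefer typed access over JsonDocument, you can deserialize the fields used in this tutorial into a small record. This is a convenience sketch, not an SDK type; the property names simply mirror the example response above, and HttpContent buffers the body, so reading it a second time is fine.

    // Typed view of the create-session response (illustrative record, not an SDK type).
    var session = JsonSerializer.Deserialize<CreateSessionResponse>(
        await response.Content.ReadAsStringAsync(),
        new JsonSerializerOptions { PropertyNameCaseInsensitive = true });
    Console.WriteLine(session!.SessionId);
    
    public sealed record CreateSessionResponse(
        string SessionId,
        string AuthToken,
        string Status,
        string ModelVersion);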
    
  3. The app server provides the session authorization token back to the frontend application.
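
    In practice, this step is a small relay endpoint on the app server. Here's a minimal sketch using ASP.NET Core minimal APIs; the route name and the CreateLivenessSessionAsync helper (assumed to wrap the step 2 code) are illustrative, not part of the Face API.

    // Hypothetical relay endpoint: the frontend calls this to obtain a session.
    // CreateLivenessSessionAsync is an assumed helper that runs the step 2 request
    // and returns the sessionId and authToken parsed from the service response.
    var app = WebApplication.CreateBuilder(args).Build();
    
    app.MapPost("/api/liveness-session", async () =>
    {
        var (sessionId, authToken) = await CreateLivenessSessionAsync();
        // Return only the short-lived token; never expose the Face API key to clients.
        return Results.Ok(new { sessionId, authToken });
    });
    
    app.Run();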

  4. The frontend application uses the session authorization token to start the face liveness detector, which kicks off the liveness flow.

        FaceLivenessDetector(
            sessionAuthorizationToken = FaceSessionToken.sessionToken,
            verifyImageFileContent = FaceSessionToken.sessionSetInClientVerifyImage,
            deviceCorrelationId = "null",
            onSuccess = viewModel::onSuccess,
            onError = viewModel::onError
        )
    
  5. The SDK starts the camera, guides the user to position correctly, and then prepares the payload to call the liveness detection service endpoint.

  6. The SDK calls the Azure AI Face service to perform the liveness detection. Once the service responds, the SDK notifies the frontend application that the liveness check is complete. Note: The service response doesn't contain the liveness decision; you need to query it from the app server.

  7. The frontend application relays the liveness check completion to the app server.

  8. The app server queries the Azure AI Face service for the liveness detection result.

    // Usings assumed: System, System.Linq, System.Net.Http, System.Text.Json.
    using var client = new HttpClient();
    client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", key);
    
    var response = await client.GetAsync(
        $"{endpoint}/face/v1.2/detectLiveness-sessions/{sessionId}");
    
    response.EnsureSuccessStatusCode();
    
    using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
    var root = doc.RootElement;
    var attempts = root.GetProperty("results").GetProperty("attempts");
    // Select the newest attempt by attemptId; the example response below lists
    // attempts newest first, so don't rely on array position.
    var latestAttempt = attempts.EnumerateArray()
        .OrderByDescending(a => a.GetProperty("attemptId").GetInt32())
        .First();
    var attemptStatus = latestAttempt.GetProperty("attemptStatus").GetString();
    
    Console.WriteLine($"Session id: {root.GetProperty("sessionId").GetString()}");
    Console.WriteLine($"Session status: {root.GetProperty("status").GetString()}");
    Console.WriteLine($"Latest attempt status: {attemptStatus}");
    
    if (attemptStatus == "Succeeded")
    {
        Console.WriteLine($"Liveness detection decision: {latestAttempt.GetProperty("result").GetProperty("livenessDecision").GetString()}");
    }
    else
    {
        var error = latestAttempt.GetProperty("error");
        Console.WriteLine($"Error: {error.GetProperty("code").GetString()} - {error.GetProperty("message").GetString()}");
    }
    

    An example of the response body:

    {
        "sessionId": "b12e033e-bda7-4b83-a211-e721c661f30e",
        "authToken": "eyJhbGciOiJFUzI1NiIsIm",
        "status": "NotStarted",
        "modelVersion": "2024-11-15",
        "results": {
            "attempts": [
                {
                    "attemptId": 2,
                    "attemptStatus": "Succeeded",
                    "result": {
                        "livenessDecision": "realface",
                        "targets": {
                            "color": {
                                "faceRectangle": {
                                    "top": 669,
                                    "left": 203,
                                    "width": 646,
                                    "height": 724
                                }
                            }
                        },
                        "digest": "B0A803BB7B26F3C8F29CD36030F8E63ED3FAF955FEEF8E01C88AB8FD89CCF761",
                        "sessionImageId": "Ae3PVWlXAmVAnXgkAFt1QSjGUWONKzWiSr2iPh9p9G4I"
                    }
                },
                {
                    "attemptId": 1,
                    "attemptStatus": "Failed",
                    "error": {
                        "code": "FaceWithMaskDetected",
                        "message": "Mask detected on face image.",
                        "targets": {
                            "color": {
                                "faceRectangle": {
                                    "top": 669,
                                    "left": 203,
                                    "width": 646,
                                    "height": 724
                                }
                            }
                        }
                    }
                }
            ]
        }
    }
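
    Your app server's acceptance policy typically keys off livenessDecision. The following is a minimal sketch that assumes the attempt succeeded (see the code above); the decision values shown ("realface", "spoofface", "uncertain") follow the livenessDecision enum, and failing closed on anything else is an application choice, not a service requirement.

    // Map the liveness decision to an application-level outcome.
    // Assumes attemptStatus == "Succeeded", so "result" is present.
    var livenessDecision = latestAttempt.GetProperty("result")
        .GetProperty("livenessDecision").GetString();
    
    var accepted = livenessDecision switch
    {
        "realface"  => true,   // live person detected
        "spoofface" => false,  // presentation attack suspected
        _           => false   // "uncertain" or unknown: fail closed
    };
    Console.WriteLine($"Accepted: {accepted}");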
    
  9. The app server deletes the session after it queries all session results.

    using var client = new HttpClient();
    client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", key);
    
    await client.DeleteAsync($"{endpoint}/face/v1.2/detectLiveness-sessions/{sessionId}");
    Console.WriteLine($"Session deleted: {sessionId}");
    

Perform liveness detection with face verification

Combining face verification with liveness detection enables biometric verification of a particular person of interest, with the added guarantee that the person is physically present.

Diagram of the liveness-with-face-verification workflow of Azure AI Face.

Integrating liveness detection with verification involves two parts:

Step 1 - Select a reference image

To get the most accurate recognition results, follow the tips listed in the composition requirements for ID verification scenarios.
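
You can also pre-screen a candidate reference image by calling the standard Face detect operation with the qualityForRecognition attribute; "high" quality is recommended for verification scenarios. The following is a minimal sketch, reusing the endpoint and key variables from the earlier snippets; the classic detect path and model parameters are standard detect options, not liveness-specific.

    // Pre-screen a reference image: detect the face and request qualityForRecognition.
    // Usings assumed: System.IO, System.Net.Http, System.Net.Http.Headers.
    using var client = new HttpClient();
    client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", key);
    
    using var imageStream = File.OpenRead("reference.png");
    var content = new StreamContent(imageStream);
    content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
    
    var response = await client.PostAsync(
        $"{endpoint}/face/v1.0/detect?detectionModel=detection_03&recognitionModel=recognition_04" +
        "&returnFaceId=false&returnFaceAttributes=qualityForRecognition",
        content);
    response.EnsureSuccessStatusCode();
    
    // Each detected face includes faceAttributes.qualityForRecognition: low, medium, or high.
    Console.WriteLine(await response.Content.ReadAsStringAsync());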

Step 2 - Set up the orchestration of liveness with verification

The following high-level steps show how to orchestrate liveness with verification:

  1. Provide the verification reference image by using one of the following two methods:

    • The app server provides the reference image when creating the liveness session. For more information about each request parameter involved in creating a liveness session with verification, see Liveness With Verify Create Session Operation.

      // Usings assumed: System, System.IO, System.Net.Http, System.Net.Http.Headers,
      // System.Text, System.Text.Json.
      var endpoint = new Uri(Environment.GetEnvironmentVariable("FACE_ENDPOINT"));
      var key = Environment.GetEnvironmentVariable("FACE_APIKEY");
      
      // Create the JSON part
      var jsonPart = new StringContent(
          JsonSerializer.Serialize(new
          {
              livenessOperationMode = "PassiveActive",
              deviceCorrelationId = "723d6d03-ef33-40a8-9682-23a1feb7bccd",
              enableSessionImage = true
          }),
          Encoding.UTF8,
          "application/json"
      );
      jsonPart.Headers.ContentDisposition = new ContentDispositionHeaderValue("form-data")
      {
          Name = "CreateLivenessWithVerifySessionRequest"
      };
      
      // Create the file part
      using var fileStream = File.OpenRead("test.png");
      var filePart = new StreamContent(fileStream);
      filePart.Headers.ContentType = new MediaTypeHeaderValue("image/png");
      filePart.Headers.ContentDisposition = new ContentDispositionHeaderValue("form-data")
      {
          Name = "VerifyImage",
          FileName = "test.png"
      };
      
      // Build multipart form data
      using var formData = new MultipartFormDataContent();
      formData.Add(jsonPart);
      formData.Add(filePart);
      
      using var client = new HttpClient();
      client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", key);
      
      var response = await client.PostAsync($"{endpoint}/face/v1.2/detectLivenessWithVerify-sessions", formData);
      response.EnsureSuccessStatusCode();
      
      using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
      var root = doc.RootElement;
      
      Console.WriteLine("Session created.");
      Console.WriteLine($"Session id: {root.GetProperty("sessionId").GetString()}");
      Console.WriteLine($"Auth token: {root.GetProperty("authToken").GetString()}");
      

      An example of the response body:

      {
          "sessionId": "3847ffd3-4657-4e6c-870c-8e20de52f567",
          "authToken": "<session-authorization-token>",
          "status": "NotStarted",
          "modelVersion": "2024-11-15",
          "results": {
              "attempts": [],
              "verifyReferences": [
                  {
                      "referenceType": "image",
                      "faceRectangle": {
                          "top": 98,
                          "left": 131,
                          "width": 233,
                          "height": 300
                      },
                      "qualityForRecognition": "high"
                  }
              ]
          }
      }
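
      The create response includes verifyReferences with a qualityForRecognition rating for the reference image. The app server can gate on it before starting the client flow; a short sketch, reusing the root element parsed in the snippet above:

      // Check that the reference image is rated well enough for recognition.
      foreach (var reference in root.GetProperty("results").GetProperty("verifyReferences").EnumerateArray())
      {
          var quality = reference.GetProperty("qualityForRecognition").GetString();
          Console.WriteLine($"Reference quality: {quality}");
          if (quality != "high")
          {
              // Application choice: request a better reference image before proceeding.
              Console.WriteLine("Consider supplying a higher-quality reference image.");
          }
      }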
      
    • The frontend application provides the reference image when initializing the mobile SDKs. This scenario isn't supported in the web solution.

          FaceLivenessDetector(
              sessionAuthorizationToken = FaceSessionToken.sessionToken,
              verifyImageFileContent = FaceSessionToken.sessionSetInClientVerifyImage,
              deviceCorrelationId = "null",
              onSuccess = viewModel::onSuccess,
              onError = viewModel::onError
          )
      
  2. The app server can now query for the verification result in addition to the liveness result.

    // Usings assumed: System, System.Linq, System.Net.Http, System.Text.Json.
    using var client = new HttpClient();
    client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", key);
    
    var response = await client.GetAsync($"{endpoint}/face/v1.2/detectLivenessWithVerify-sessions/{sessionId}");
    response.EnsureSuccessStatusCode();
    
    using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
    var root = doc.RootElement;
    
    var attempts = root.GetProperty("results").GetProperty("attempts");
    // Select the newest attempt by attemptId; the example response below lists
    // attempts newest first, so don't rely on array position.
    var latestAttempt = attempts.EnumerateArray()
        .OrderByDescending(a => a.GetProperty("attemptId").GetInt32())
        .First();
    var attemptStatus = latestAttempt.GetProperty("attemptStatus").GetString();
    
    Console.WriteLine($"Session id: {root.GetProperty("sessionId").GetString()}");
    Console.WriteLine($"Session status: {root.GetProperty("status").GetString()}");
    Console.WriteLine($"Latest attempt status: {attemptStatus}");
    
    if (attemptStatus == "Succeeded")
    {
        // verifyResult is nested under "result" (see the example response below).
        var result = latestAttempt.GetProperty("result");
        var decision = result.GetProperty("livenessDecision").GetString();
        var verify = result.GetProperty("verifyResult");
        Console.WriteLine($"Liveness detection decision: {decision}");
        Console.WriteLine($"Verify isIdentical: {verify.GetProperty("isIdentical").GetBoolean()}");
        Console.WriteLine($"Verify matchConfidence: {verify.GetProperty("matchConfidence").GetDouble()}");
    }
    else
    {
        var err = latestAttempt.GetProperty("error");
        Console.WriteLine($"Error: {err.GetProperty("code").GetString()} - {err.GetProperty("message").GetString()}");
    }
    

    An example of the response body:

    {
        "sessionId": "b12e033e-bda7-4b83-a211-e721c661f30e",
        "authToken": "eyJhbGciOiJFUzI1NiIsIm",
        "status": "NotStarted",
        "modelVersion": "2024-11-15",
        "results": {
            "attempts": [
                {
                    "attemptId": 2,
                    "attemptStatus": "Succeeded",
                    "result": {
                        "livenessDecision": "realface",
                        "targets": {
                            "color": {
                                "faceRectangle": {
                                    "top": 669,
                                    "left": 203,
                                    "width": 646,
                                    "height": 724
                                }
                            }
                        },
                        "verifyResult": {
                            "matchConfidence": 0.08871888,
                            "isIdentical": false
                        },
                        "digest": "B0A803BB7B26F3C8F29CD36030F8E63ED3FAF955FEEF8E01C88AB8FD89CCF761",
                        "sessionImageId": "Ae3PVWlXAmVAnXgkAFt1QSjGUWONKzWiSr2iPh9p9G4I",
                        "verifyImageHash": "43B7D8E8769533C3290DBD37A84D821B2C28CB4381DF9C6784DBC4AAF7E45018"
                    }
                },
                {
                    "attemptId": 1,
                    "attemptStatus": "Failed",
                    "error": {
                        "code": "FaceWithMaskDetected",
                        "message": "Mask detected on face image.",
                        "targets": {
                            "color": {
                                "faceRectangle": {
                                    "top": 669,
                                    "left": 203,
                                    "width": 646,
                                    "height": 724
                                }
                            }
                        }
                    }
                }
            ],
            "verifyReferences": [
                {
                    "referenceType": "image",
                    "faceRectangle": {
                        "top": 316,
                        "left": 131,
                        "width": 498,
                        "height": 677
                    },
                    "qualityForRecognition": "high"
                }
            ]
        }
    }
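
    A typical acceptance gate requires both a live face and an identity match. Here's a minimal sketch that assumes the attempt succeeded, as in the code above; the pass/fail policy is an application choice:

    // Accept only when the face is live AND matches the reference image.
    var succeededResult = latestAttempt.GetProperty("result");
    var isLive = succeededResult.GetProperty("livenessDecision").GetString() == "realface";
    var isMatch = succeededResult.GetProperty("verifyResult").GetProperty("isIdentical").GetBoolean();
    
    Console.WriteLine(isLive && isMatch
        ? "Verification passed: live and identical to the reference."
        : "Verification failed.");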
    
  3. The app server can delete the session when it no longer needs the result.

    using var client = new HttpClient();
    client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", key);
    
    await client.DeleteAsync($"{endpoint}/face/v1.2/detectLivenessWithVerify-sessions/{sessionId}");
    Console.WriteLine($"Liveness-with-verify session deleted: {sessionId}");
    

Perform other face operations after liveness detection

Optionally, you can perform additional face operations after the liveness check, such as face analysis (to get face attributes) and face identity operations.

  1. Set the enableSessionImage parameter to true during the Session-Creation step.
  2. Extract the sessionImageId from the Session-Get-Result step.
  3. Download the session image (see the Liveness Get Session Image Operation API), or provide the sessionImageId to the Detect from Session Image ID API operation to continue with other face analysis or face identity operations, as sketched after this list. For more information on these operations, see Face detection concepts and Face recognition concepts.
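
The following sketch shows the second pattern: running detection against the retained session image instead of re-uploading a photo. The exact route and parameters are defined in the Detect from Session Image ID reference; the path, query string, and request body below are assumptions modeled on the standard detect operation, so confirm them against the reference before use.

    // Hypothetical call: detect faces in the stored session image by sessionImageId.
    // Assumes endpoint, key, and sessionImageId (from the session result) are in scope.
    using var client = new HttpClient();
    client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", key);
    
    var body = JsonSerializer.Serialize(new { sessionImageId });
    var response = await client.PostAsync(
        $"{endpoint}/face/v1.2/detect?detectionModel=detection_03&recognitionModel=recognition_04" +
        "&returnFaceId=false",
        new StringContent(body, Encoding.UTF8, "application/json"));
    response.EnsureSuccessStatusCode();
    
    Console.WriteLine(await response.Content.ReadAsStringAsync());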

Support options

In addition to using the main Foundry Tools support options, you can also post your questions in the issues section of the SDK repo.

To learn how to integrate the liveness solution into your existing application, see the Azure Vision SDK reference.

To learn more about the features available to orchestrate the liveness solution, see the Session REST API reference.