How to lower the confidence threshold value

Joseph Wan 20 Reputation points
2025-05-17T08:05:17.6733333+00:00

The facial recognition works perfectly right after registration, but it may not recognize the same person a week later.

I tried lowering the confidenceThreshold value to 70% for the Identify call, but it doesn't seem to work; the returned confidence is still over 80%.


Accepted answer
  1. Sina Salam 22,031 Reputation points Volunteer Moderator
    2025-05-17T18:32:23.14+00:00

    Hello Joseph Wan,

    Welcome to the Microsoft Q&A and thank you for posting your questions here.

    I understand that you would like to know how you can lower the confidence threshold value.

    This issue is not just about adjusting the confidence threshold parameter, but about understanding how facial recognition systems interpret and apply it. Most platforms, including AWS Rekognition, Azure Face API, and Face++, treat the confidenceThreshold as a filter: it determines the minimum score required for a match to be considered valid, but it does not influence or reduce the confidence score the model returns. So if you are still seeing confidence scores above 80%, the system is simply confident in the match, and the threshold is working as intended. If no match is returned, it means no candidate met the threshold, not that the system failed entirely.
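    To illustrate the filtering behavior, here is a minimal sketch in Python (not any platform's actual implementation; the candidate names and scores are made up):

```python
# Minimal sketch of how identify-style APIs apply a confidence threshold.
# The candidate scores below are hypothetical model outputs; the threshold
# only filters the candidate list, it never rescales the scores.

def apply_threshold(candidates, threshold):
    """Return only the candidates whose confidence meets the threshold."""
    return [c for c in candidates if c["confidence"] >= threshold]

candidates = [
    {"personId": "alice", "confidence": 0.83},
    {"personId": "bob", "confidence": 0.55},
]

# Lowering the threshold from 0.8 to 0.7 does not lower the returned score:
matches = apply_threshold(candidates, threshold=0.7)
# "alice" is still returned with confidence 0.83, unchanged.
```

    Raising the threshold only shrinks the list of returned candidates; the surviving scores are identical either way.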

    The inconsistency you're experiencing over time is therefore more likely caused by changes in facial appearance, lighting conditions, or pose than by a misconfigured threshold. Facial recognition models are sensitive to such variations, and even subtle changes can significantly affect match accuracy; this is a common challenge across platforms, not something unique to your implementation. To mitigate it, the best practice is to register multiple images of each person under different lighting, angles, and expressions, which helps the model build a more robust representation of the individual.

    Additionally, make sure you're using the API correctly. For example, in AWS Rekognition the FaceMatchThreshold parameter in the SearchFacesByImage call should be set like this:

    {
      "CollectionId": "my-face-collection",
      "Image": {
        "S3Object": {
          "Bucket": "my-bucket",
          "Name": "input.jpg"
        }
      },
      "FaceMatchThreshold": 70,
      "MaxFaces": 1
    }
    

    If you're using the Azure Face API, the confidenceThreshold in the Identify call works similarly: it filters out candidates below the threshold but does not alter the confidence score itself. For comparison, the AWS Rekognition documentation on searching faces is here: https://docs.aws.amazon.com/rekognition/latest/dg/faces.html
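    For reference, an Azure Face Identify request body looks roughly like this (field names per the Face REST API; the group name and face ID are placeholders). Note that Azure expresses confidenceThreshold on a 0 to 1 scale, so 70% should be passed as 0.7, not 70:

    {
      "personGroupId": "my-person-group",
      "faceIds": [
        "c5c24a82-6845-4031-9d5d-978df9175426"
      ],
      "maxNumOfCandidatesReturned": 1,
      "confidenceThreshold": 0.7
    }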

    Now, to improve long-term recognition reliability, consider periodically updating the registered face data, especially if users' appearances change. Also, log and analyze failed matches to identify patterns; this can help you refine your registration and matching strategies.
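    As a minimal sketch of that logging idea (the record fields here are hypothetical, not part of the Azure or AWS SDKs):

```python
# Minimal sketch: record each identify attempt so failures can be analyzed
# later. The record fields are hypothetical, not part of any SDK.
from collections import Counter

match_log = []

def log_attempt(person_id, confidence, threshold, condition):
    """Record one identify attempt and whether it cleared the threshold."""
    match_log.append({
        "personId": person_id,
        "confidence": confidence,
        "matched": confidence >= threshold,
        "condition": condition,  # e.g. lighting/pose noted at capture time
    })

# Synthetic attempts: two failures in dim lighting, one success in daylight.
log_attempt("alice", 0.83, 0.7, "daylight")
log_attempt("alice", 0.52, 0.7, "dim")
log_attempt("alice", 0.61, 0.7, "dim")

# Count the conditions under which matches failed, to spot patterns.
failure_conditions = Counter(
    r["condition"] for r in match_log if not r["matched"]
)
```

    In this synthetic run the failures cluster under dim lighting, which would suggest re-registering with images captured in low light.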

    I hope this is helpful! Do not hesitate to let me know if you have any other questions or clarifications.


    Please don't forget to close the thread by upvoting and accepting this as an answer if it was helpful.

    1 person found this answer helpful.

0 additional answers
