@Eddy Angel Welcome to the Microsoft Q&A Forum, and thank you for posting your query here!

The Face client SDKs for liveness are a gated feature. You must request access to the liveness feature by filling out the Face Recognition intake form. Please note that once your Azure subscription is granted access, you can download the Face liveness SDK.
Once you have access to the SDK, follow the instructions and samples in the azure-ai-vision-sdk GitHub repository to integrate the required UI and code into your native mobile application. The SDK supports both Java/Kotlin for Android and Swift for iOS.

Once integrated into your application, the SDK handles starting the camera, guides the end user to adjust their position, composes the liveness payload, and then calls the Face API to process it.
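Since the SDK drives the device camera itself, one prerequisite your app still owns is the CAMERA permission. Below is a minimal Kotlin sketch using only standard AndroidX APIs; the activity name and the startLivenessFlow()/showPermissionRationale() hooks are illustrative placeholders rather than SDK classes (the gated SDK's actual entry points are documented in its Kotlin_README):

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import android.os.Bundle
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import androidx.core.content.ContextCompat

class LivenessEntryActivity : AppCompatActivity() {

    // Standard AndroidX permission launcher; the liveness SDK drives the camera,
    // but the host app must hold the CAMERA permission before launching the flow.
    private val cameraPermission =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) startLivenessFlow() else showPermissionRationale()
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            == PackageManager.PERMISSION_GRANTED
        ) {
            startLivenessFlow()
        } else {
            cameraPermission.launch(Manifest.permission.CAMERA)
        }
    }

    private fun startLivenessFlow() {
        // Placeholder: hand off to the liveness UI from the gated SDK here, passing
        // the session-authorization token your app server obtained from the Face API.
        // The real entry point is documented in the SDK's Kotlin_README.
    }

    private fun showPermissionRationale() {
        // Placeholder: explain why the camera is required for the liveness check.
    }
}
```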
Once your access is approved, you also need authorization for the artifact repositories. The exact steps depend on your Git setup and your security preferences; you will need to obtain an authentication token as described in the steps here:
https://github.com/Azure-Samples/azure-ai-vision-sdk/blob/main/GET_FACE_ARTIFACTS_ACCESS.md
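For a rough picture of where such a token typically ends up in an Android project, here is a Gradle (Kotlin DSL) sketch of wiring a token-protected Maven feed into settings.gradle.kts. The feed URL and the faceSdkUser/faceSdkToken property names are placeholders made up for illustration; follow GET_FACE_ARTIFACTS_ACCESS.md for the actual values and procedure:

```kotlin
// settings.gradle.kts — illustrative sketch only.
// The repository URL and property names are PLACEHOLDERS; use the actual feed
// and token procedure from GET_FACE_ARTIFACTS_ACCESS.md once access is granted.
dependencyResolutionManagement {
    repositories {
        google()
        mavenCentral()
        maven {
            url = uri("https://example.com/face-sdk-artifacts") // placeholder URL
            credentials {
                // Keep credentials out of source control, e.g. in
                // ~/.gradle/gradle.properties: faceSdkUser=..., faceSdkToken=...
                username = providers.gradleProperty("faceSdkUser").getOrElse("")
                password = providers.gradleProperty("faceSdkToken").getOrElse("")
            }
        }
    }
}
```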
Useful Resources:

SDK for Microsoft's Azure AI Vision: https://github.com/Azure-Samples/azure-ai-vision-sdk
GET_FACE_ARTIFACTS_ACCESS.md (linked above) to get access to the SDK artifacts, plus the Swift_README and Kotlin_README to integrate the SDK into the sample application.
Face Limited Access page.
Sample | Platform | Description |
---|---|---|
Kotlin sample app for Android | Android | App with source code that demonstrates face analysis on Android |
Swift sample app for iOS | iOS | App with source code that demonstrates face analysis on iOS |
NextJS sample app for Web | Web | App with source code that demonstrates face analysis on Web |
JavaScript sample app for Web | Web | App with source code that demonstrates face analysis on Web |
Hope this helps.