Tutorial: Detect liveness in faces
Face liveness detection is used to determine whether a face in an input video stream is real (live) or fake (spoofed). It's an important building block in a biometric authentication system to prevent imposters from gaining access to the system by using a photograph, video, mask, or other means to impersonate another person.
The goal of liveness detection is to ensure that the system is interacting with a physically present, live person at the time of authentication. These systems are increasingly important with the rise of digital finance, remote access control, and online identity verification processes.
The Azure AI Face liveness detection solution successfully defends against a variety of spoof types, ranging from paper printouts and 2D/3D masks to spoof presentations on phones and laptops. Liveness detection is an active area of research, with continuous improvements being made to counteract increasingly sophisticated spoofing attacks. As the overall solution hardens against new types of attacks, continuous improvements are rolled out to the client and the service components over time.
Important
The Face client SDK for liveness is a gated feature. You must request access to the liveness feature by filling out the Face Recognition intake form. When your Azure subscription is granted access, you can download the Face liveness SDK.
Introduction
The liveness solution integration involves two distinct components: a frontend mobile/web application and an app server/orchestrator.
Frontend application: The frontend application receives authorization from the app server to initiate liveness detection. Its primary objective is to activate the camera and guide end users accurately through the liveness detection process.
App server: The app server serves as a backend server that creates liveness detection sessions and obtains a session-specific authorization token from the Face service. This token authorizes the frontend application to perform liveness detection. The app server's objectives are to manage sessions, grant authorization to the frontend application, and view the results of the liveness detection process.
Additionally, we combine face verification with liveness detection to verify whether the person is the specific individual you designated. The following table describes the liveness detection features:
Feature: Liveness detection
Description: Determine whether an input is real or fake. Only the app server has the authority to start the liveness check and query the result.
Feature: Liveness detection with face verification
Description: Determine whether an input is real or fake, and verify the person's identity based on a reference image you provide. Either the app server or the frontend application can provide the reference image. Only the app server has the authority to start the liveness check and query the result.
This tutorial demonstrates how to operate a frontend application and an app server to perform liveness detection and liveness detection with face verification across various language SDKs.
Prerequisites
An Azure subscription - Create one for free
Your Azure account must have a Cognitive Services Contributor role assigned in order for you to agree to the responsible AI terms and create a resource. To get this role assigned to your account, follow the steps in the Assign roles documentation, or contact your administrator.
Once you have your Azure subscription, create a Face resource in the Azure portal to get your key and endpoint. After it deploys, select Go to resource.
You need the key and endpoint from the resource you create to connect your application to the Face service.
You can use the free pricing tier (F0) to try the service, and upgrade later to a paid tier for production.
Access to the Azure AI Vision Face Client SDK for mobile (iOS and Android) and web. To get started, you need to apply for the Face Recognition Limited Access features to get access to the SDK. For more information, see the Face Limited Access page.
We provide SDKs in different languages for the frontend application and the app server. See the following instructions to set up both.
Download the SDK for the frontend application
Once you have access to the SDK, follow the instructions in the azure-ai-vision-sdk GitHub repository to integrate the UI and the code into your native mobile application. The liveness SDK supports Java/Kotlin for Android mobile applications, Swift for iOS mobile applications, and JavaScript for web applications:
Once you've added the code into your application, the SDK handles starting the camera, guiding the end user to adjust their position, composing the liveness payload, and calling the Azure AI Face cloud service to process the liveness payload.
Download the Azure AI Face client library for the app server
The app server/orchestrator is responsible for controlling the lifecycle of a liveness session. The app server has to create a session before performing liveness detection, and then it can query the result and delete the session when the liveness check is finished. We offer a library in various languages for easily implementing your app server. Follow these steps to install the package you want:
Create environment variables
In this example, write your credentials to environment variables on the local machine that runs the application.
Go to the Azure portal. If the resource you created in the Prerequisites section deployed successfully, select Go to resource under Next Steps. You can find your key and endpoint on the Keys and Endpoint page, under Resource Management. Your resource key isn't the same as your Azure subscription ID.
To set the environment variables for your key and endpoint, open a console window and follow the instructions for your operating system and development environment.
To set the FACE_APIKEY environment variable, replace <your_key> with one of the keys for your resource.
To set the FACE_ENDPOINT environment variable, replace <your_endpoint> with the endpoint for your resource.
setx FACE_APIKEY <your_key>
setx FACE_ENDPOINT <your_endpoint>
After you add the environment variables, you might need to restart any running programs that will read the environment variables, including the console window.
export FACE_APIKEY=<your_key>
export FACE_ENDPOINT=<your_endpoint>
After you add the environment variables, run source ~/.bashrc from your console window to make the changes effective.
The high-level steps involved in liveness orchestration are illustrated below:
The frontend application starts the liveness check and notifies the app server.
The app server creates a new liveness session with the Azure AI Face service. The service creates a liveness session and responds with a session authorization token. For more information on each request parameter involved in creating a liveness session, see the Liveness Create Session Operation.
var endpoint = new Uri(System.Environment.GetEnvironmentVariable("FACE_ENDPOINT"));
var credential = new AzureKeyCredential(System.Environment.GetEnvironmentVariable("FACE_APIKEY"));
var sessionClient = new FaceSessionClient(endpoint, credential);
var createContent = new CreateLivenessSessionContent(LivenessOperationMode.Passive)
{
DeviceCorrelationId = "723d6d03-ef33-40a8-9682-23a1feb7bccd",
SendResultsToClient = false,
};
var createResponse = await sessionClient.CreateLivenessSessionAsync(createContent);
var sessionId = createResponse.Value.SessionId;
Console.WriteLine($"Session created.");
Console.WriteLine($"Session id: {sessionId}");
Console.WriteLine($"Auth token: {createResponse.Value.AuthToken}");
String endpoint = System.getenv("FACE_ENDPOINT");
String accountKey = System.getenv("FACE_APIKEY");
FaceSessionClient sessionClient = new FaceSessionClientBuilder()
.endpoint(endpoint)
.credential(new AzureKeyCredential(accountKey))
.buildClient();
CreateLivenessSessionContent parameters = new CreateLivenessSessionContent(LivenessOperationMode.PASSIVE)
.setDeviceCorrelationId("723d6d03-ef33-40a8-9682-23a1feb7bccd")
.setSendResultsToClient(false);
CreateLivenessSessionResult creationResult = sessionClient.createLivenessSession(parameters);
System.out.println("Session created.");
System.out.println("Session id: " + creationResult.getSessionId());
System.out.println("Auth token: " + creationResult.getAuthToken());
endpoint = os.environ["FACE_ENDPOINT"]
key = os.environ["FACE_APIKEY"]
face_session_client = FaceSessionClient(endpoint=endpoint, credential=AzureKeyCredential(key))
created_session = await face_session_client.create_liveness_session(
CreateLivenessSessionContent(
liveness_operation_mode=LivenessOperationMode.PASSIVE,
device_correlation_id="723d6d03-ef33-40a8-9682-23a1feb7bccd",
send_results_to_client=False,
)
)
print("Session created.")
print(f"Session id: {created_session.session_id}")
print(f"Auth token: {created_session.auth_token}")
const endpoint = process.env['FACE_ENDPOINT'];
const apikey = process.env['FACE_APIKEY'];
const credential = new AzureKeyCredential(apikey);
const client = createFaceClient(endpoint, credential);
const createLivenessSessionResponse = await client.path('/detectLiveness/singleModal/sessions').post({
body: {
livenessOperationMode: 'Passive',
deviceCorrelationId: '723d6d03-ef33-40a8-9682-23a1feb7bccd',
sendResultsToClient: false,
},
});
if (isUnexpected(createLivenessSessionResponse)) {
throw new Error(createLivenessSessionResponse.body.error.message);
}
console.log('Session created.');
console.log(`Session ID: ${createLivenessSessionResponse.body.sessionId}`);
console.log(`Auth token: ${createLivenessSessionResponse.body.authToken}`);
curl --request POST --location "%FACE_ENDPOINT%/face/v1.1-preview.1/detectliveness/singlemodal/sessions" ^
--header "Ocp-Apim-Subscription-Key: %FACE_APIKEY%" ^
--header "Content-Type: application/json" ^
--data ^
"{ ^
""livenessOperationMode"": ""passive"", ^
""deviceCorrelationId"": ""723d6d03-ef33-40a8-9682-23a1feb7bccd"", ^
""sendResultsToClient"": false ^
}"
curl --request POST --location "${FACE_ENDPOINT}/face/v1.1-preview.1/detectliveness/singlemodal/sessions" \
--header "Ocp-Apim-Subscription-Key: ${FACE_APIKEY}" \
--header "Content-Type: application/json" \
--data \
'{
"livenessOperationMode": "passive",
"deviceCorrelationId": "723d6d03-ef33-40a8-9682-23a1feb7bccd",
"sendResultsToClient": false
}'
An example of the response body:
{
"sessionId": "a6e7193e-b638-42e9-903f-eaf60d2b40a5",
"authToken": "<session-authorization-token>"
}
The app server returns the session authorization token to the frontend application.
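This hand-off can be sketched as a small helper. The function name and the payload shape below are illustrative assumptions, not part of the Face SDK; the key point, reflected in the feature table earlier, is that only the auth token goes to the frontend, while the session ID stays on the app server, which alone is allowed to query or delete the session.

```python
# Illustrative sketch (not official SDK code): build the payload the app
# server hands to the frontend after creating a liveness session.

def build_frontend_payload(create_session_response: dict) -> dict:
    """Return only what the frontend needs to run the liveness check.

    The sessionId stays on the app server: only the app server has the
    authority to query or delete the session, so it isn't exposed.
    """
    return {"authToken": create_session_response["authToken"]}

# Example with the create-session response shape shown in this tutorial:
response = {
    "sessionId": "a6e7193e-b638-42e9-903f-eaf60d2b40a5",
    "authToken": "<session-authorization-token>",
}
payload = build_frontend_payload(response)
print(payload)  # only the auth token is sent to the frontend
```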
The frontend application provides the session authorization token during the Azure AI Vision SDK's initialization.
mServiceOptions?.setTokenCredential(com.azure.android.core.credential.TokenCredential { _, callback ->
callback.onSuccess(com.azure.android.core.credential.AccessToken("<INSERT_TOKEN_HERE>", org.threeten.bp.OffsetDateTime.MAX))
})
serviceOptions?.authorizationToken = "<INSERT_TOKEN_HERE>"
azureAIVisionFaceAnalyzer.token = "<INSERT_TOKEN_HERE>"
The SDK then starts the camera, guides the user to position correctly, and then prepares the payload to call the liveness detection service endpoint.
The SDK calls the Azure AI Vision Face service to perform the liveness detection. Once the service responds, the SDK notifies the frontend application that the liveness check has been completed.
The frontend application relays the liveness check completion to the app server.
The app server can now query for the liveness detection result from the Azure AI Vision Face service.
var getResultResponse = await sessionClient.GetLivenessSessionResultAsync(sessionId);
var sessionResult = getResultResponse.Value;
Console.WriteLine($"Session id: {sessionResult.Id}");
Console.WriteLine($"Session status: {sessionResult.Status}");
Console.WriteLine($"Liveness detection request id: {sessionResult.Result?.RequestId}");
Console.WriteLine($"Liveness detection received datetime: {sessionResult.Result?.ReceivedDateTime}");
Console.WriteLine($"Liveness detection decision: {sessionResult.Result?.Response.Body.LivenessDecision}");
Console.WriteLine($"Session created datetime: {sessionResult.CreatedDateTime}");
Console.WriteLine($"Auth token TTL (seconds): {sessionResult.AuthTokenTimeToLiveInSeconds}");
Console.WriteLine($"Session expired: {sessionResult.SessionExpired}");
Console.WriteLine($"Device correlation id: {sessionResult.DeviceCorrelationId}");
LivenessSession sessionResult = sessionClient.getLivenessSessionResult(creationResult.getSessionId());
System.out.println("Session id: " + sessionResult.getId());
System.out.println("Session status: " + sessionResult.getStatus());
System.out.println("Liveness detection request id: " + sessionResult.getResult().getRequestId());
System.out.println("Liveness detection received datetime: " + sessionResult.getResult().getReceivedDateTime());
System.out.println("Liveness detection decision: " + sessionResult.getResult().getResponse().getBody().getLivenessDecision());
System.out.println("Session created datetime: " + sessionResult.getCreatedDateTime());
System.out.println("Auth token TTL (seconds): " + sessionResult.getAuthTokenTimeToLiveInSeconds());
System.out.println("Session expired: " + sessionResult.isSessionExpired());
System.out.println("Device correlation id: " + sessionResult.getDeviceCorrelationId());
liveness_result = await face_session_client.get_liveness_session_result(
created_session.session_id
)
print(f"Session id: {liveness_result.id}")
print(f"Session status: {liveness_result.status}")
print(f"Liveness detection request id: {liveness_result.result.request_id}")
print(f"Liveness detection received datetime: {liveness_result.result.received_date_time}")
print(f"Liveness detection decision: {liveness_result.result.response.body.liveness_decision}")
print(f"Session created datetime: {liveness_result.created_date_time}")
print(f"Auth token TTL (seconds): {liveness_result.auth_token_time_to_live_in_seconds}")
print(f"Session expired: {liveness_result.session_expired}")
print(f"Device correlation id: {liveness_result.device_correlation_id}")
const getLivenessSessionResultResponse = await client.path('/detectLiveness/singleModal/sessions/{sessionId}', createLivenessSessionResponse.body.sessionId).get();
if (isUnexpected(getLivenessSessionResultResponse)) {
throw new Error(getLivenessSessionResultResponse.body.error.message);
}
console.log(`Session id: ${getLivenessSessionResultResponse.body.id}`);
console.log(`Session status: ${getLivenessSessionResultResponse.body.status}`);
console.log(`Liveness detection request id: ${getLivenessSessionResultResponse.body.result?.requestId}`);
console.log(`Liveness detection received datetime: ${getLivenessSessionResultResponse.body.result?.receivedDateTime}`);
console.log(`Liveness detection decision: ${getLivenessSessionResultResponse.body.result?.response.body.livenessDecision}`);
console.log(`Session created datetime: ${getLivenessSessionResultResponse.body.createdDateTime}`);
console.log(`Auth token TTL (seconds): ${getLivenessSessionResultResponse.body.authTokenTimeToLiveInSeconds}`);
console.log(`Session expired: ${getLivenessSessionResultResponse.body.sessionExpired}`);
console.log(`Device correlation id: ${getLivenessSessionResultResponse.body.deviceCorrelationId}`);
curl --request GET --location "%FACE_ENDPOINT%/face/v1.1-preview.1/detectliveness/singlemodal/sessions/<session-id>" ^
--header "Ocp-Apim-Subscription-Key: %FACE_APIKEY%"
curl --request GET --location "${FACE_ENDPOINT}/face/v1.1-preview.1/detectliveness/singlemodal/sessions/<session-id>" \
--header "Ocp-Apim-Subscription-Key: ${FACE_APIKEY}"
An example of the response body:
{
"status": "ResultAvailable",
"result": {
"id": 1,
"sessionId": "a3dc62a3-49d5-45a1-886c-36e7df97499a",
"requestId": "cb2b47dc-b2dd-49e8-bdf9-9b854c7ba843",
"receivedDateTime": "2023-10-31T16:50:15.6311565+00:00",
"request": {
"url": "/face/v1.1-preview.1/detectliveness/singlemodal",
"method": "POST",
"contentLength": 352568,
"contentType": "multipart/form-data; boundary=--------------------------482763481579020783621915",
"userAgent": ""
},
"response": {
"body": {
"livenessDecision": "realface",
"target": {
"faceRectangle": {
"top": 59,
"left": 121,
"width": 409,
"height": 395
},
"fileName": "content.bin",
"timeOffsetWithinFile": 0,
"imageType": "Color"
},
"modelVersionUsed": "2022-10-15-preview.04"
},
"statusCode": 200,
"latencyInMilliseconds": 1098
},
"digest": "537F5CFCD8D0A7C7C909C1E0F0906BF27375C8E1B5B58A6914991C101E0B6BFC"
},
"id": "a3dc62a3-49d5-45a1-886c-36e7df97499a",
"createdDateTime": "2023-10-31T16:49:33.6534925+00:00",
"authTokenTimeToLiveInSeconds": 600,
"deviceCorrelationId": "723d6d03-ef33-40a8-9682-23a1feb7bccd",
"sessionExpired": false
}
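On the app server, the session-result JSON above typically gets reduced to a status and a decision before any business logic runs. The helper below is an illustrative sketch, not SDK code; the field names match the response body shown above.

```python
# Illustrative helper (not part of the Face SDK): extract the fields an app
# server usually acts on from a liveness session result.

def summarize_liveness_result(session_result: dict) -> dict:
    """Return the session status and, when available, the liveness decision."""
    summary = {"status": session_result.get("status"), "decision": None}
    result = session_result.get("result")
    if summary["status"] == "ResultAvailable" and result:
        # Mirrors the nesting of the response body shown in this tutorial.
        summary["decision"] = result["response"]["body"]["livenessDecision"]
    return summary

sample = {
    "status": "ResultAvailable",
    "result": {"response": {"body": {"livenessDecision": "realface"}}},
}
print(summarize_liveness_result(sample))  # {'status': 'ResultAvailable', 'decision': 'realface'}
```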
The app server can delete the session if you don't query its result anymore.
await sessionClient.DeleteLivenessSessionAsync(sessionId);
Console.WriteLine($"The session {sessionId} is deleted.");
sessionClient.deleteLivenessSession(creationResult.getSessionId());
System.out.println("The session " + creationResult.getSessionId() + " is deleted.");
await face_session_client.delete_liveness_session(
created_session.session_id
)
print(f"The session {created_session.session_id} is deleted.")
await face_session_client.close()
const deleteLivenessSessionResponse = await client.path('/detectLiveness/singleModal/sessions/{sessionId}', createLivenessSessionResponse.body.sessionId).delete();
if (isUnexpected(deleteLivenessSessionResponse)) {
throw new Error(deleteLivenessSessionResponse.body.error.message);
}
console.log(`The session ${createLivenessSessionResponse.body.sessionId} is deleted.`);
curl --request DELETE --location "%FACE_ENDPOINT%/face/v1.1-preview.1/detectliveness/singlemodal/sessions/<session-id>" ^
--header "Ocp-Apim-Subscription-Key: %FACE_APIKEY%"
curl --request DELETE --location "${FACE_ENDPOINT}/face/v1.1-preview.1/detectliveness/singlemodal/sessions/<session-id>" \
--header "Ocp-Apim-Subscription-Key: ${FACE_APIKEY}"
Combining face verification with liveness detection enables biometric verification of a particular person of interest, with an added guarantee that the person is physically present in the system.
There are two parts to integrating liveness with verification:
Select a good reference image.
Set up the orchestration of liveness with verification.
Select a reference image
Use the following tips to ensure that your input images give the most accurate recognition results.
Technical requirements
The supported input image formats are JPEG, PNG, GIF (the first frame), and BMP.
The image file size should be no larger than 6 MB.
You can use the qualityForRecognition attribute in the face detection operation when using applicable detection models as a general guideline of whether the image is likely of sufficient quality to attempt face recognition. Only "high" quality images are recommended for person enrollment, and quality at or above "medium" is recommended for identification scenarios.
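The quality guidance above can be encoded as a simple gate on the app server. This is an illustrative sketch: the quality value itself would come from the qualityForRecognition attribute returned by the face detection operation, and the function name and ranking table are assumptions for the example.

```python
# Illustrative gate on the qualityForRecognition attribute: only "high"
# passes for enrollment; "medium" or better passes for identification.

QUALITY_RANK = {"low": 0, "medium": 1, "high": 2}

def quality_is_sufficient(quality_for_recognition: str, *, enrollment: bool) -> bool:
    """Apply the recommended quality thresholds from this tutorial."""
    rank = QUALITY_RANK[quality_for_recognition.lower()]
    return rank >= (2 if enrollment else 1)

print(quality_is_sufficient("high", enrollment=True))     # True
print(quality_is_sufficient("medium", enrollment=True))   # False
print(quality_is_sufficient("medium", enrollment=False))  # True
```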
Composition requirements
The photo is clear and sharp, not blurry, pixelated, distorted, or damaged.
The photo hasn't been altered to remove face blemishes or change the face's appearance.
The photo must be in an RGB color supported format (JPEG, PNG, WEBP, BMP). The recommended face size is 200 pixels x 200 pixels. Face sizes larger than 200 x 200 pixels don't result in better AI quality, and the file must be no larger than 6 MB in size.
The user isn't wearing glasses, masks, hats, headphones, head coverings, or face coverings. The face should be free of any obstructions.
Facial jewelry is allowed provided it doesn't hide the face.
Only one face should be visible in the photo.
The face should be in a neutral front-facing pose with both eyes open, mouth closed, and no extreme facial expressions or head tilt.
The face should be free of any shadows or red-eye. Retake the photo if either of these occurs.
The background should be uniform and plain, free of any shadows.
The face should be centered within the image and fill at least 50% of the image.
Set up the orchestration of liveness with verification
The high-level steps involved in liveness with verification orchestration are illustrated below:
Provide the verification reference image by either of the following two methods:
The app server provides the reference image when creating the liveness session. For more information on each request parameter involved in creating a liveness session with verification, see the Liveness With Verify Create Session Operation.
var endpoint = new Uri(System.Environment.GetEnvironmentVariable("FACE_ENDPOINT"));
var credential = new AzureKeyCredential(System.Environment.GetEnvironmentVariable("FACE_APIKEY"));
var sessionClient = new FaceSessionClient(endpoint, credential);
var createContent = new CreateLivenessWithVerifySessionContent(LivenessOperationMode.Passive)
{
DeviceCorrelationId = "723d6d03-ef33-40a8-9682-23a1feb7bccd"
};
using var fileStream = new FileStream("test.png", FileMode.Open, FileAccess.Read);
var createResponse = await sessionClient.CreateLivenessWithVerifySessionAsync(createContent, fileStream);
var sessionId = createResponse.Value.SessionId;
Console.WriteLine("Session created.");
Console.WriteLine($"Session id: {sessionId}");
Console.WriteLine($"Auth token: {createResponse.Value.AuthToken}");
Console.WriteLine("The reference image:");
Console.WriteLine($" Face rectangle: {createResponse.Value.VerifyImage.FaceRectangle.Top}, {createResponse.Value.VerifyImage.FaceRectangle.Left}, {createResponse.Value.VerifyImage.FaceRectangle.Width}, {createResponse.Value.VerifyImage.FaceRectangle.Height}");
Console.WriteLine($" The quality for recognition: {createResponse.Value.VerifyImage.QualityForRecognition}");
String endpoint = System.getenv("FACE_ENDPOINT");
String accountKey = System.getenv("FACE_APIKEY");
FaceSessionClient sessionClient = new FaceSessionClientBuilder()
.endpoint(endpoint)
.credential(new AzureKeyCredential(accountKey))
.buildClient();
CreateLivenessWithVerifySessionContent parameters = new CreateLivenessWithVerifySessionContent(LivenessOperationMode.PASSIVE)
.setDeviceCorrelationId("723d6d03-ef33-40a8-9682-23a1feb7bccd")
.setSendResultsToClient(false);
Path path = Paths.get("test.png");
BinaryData data = BinaryData.fromFile(path);
CreateLivenessWithVerifySessionResult creationResult = sessionClient.createLivenessWithVerifySession(parameters, data);
System.out.println("Session created.");
System.out.println("Session id: " + creationResult.getSessionId());
System.out.println("Auth token: " + creationResult.getAuthToken());
System.out.println("The reference image:");
System.out.println(" Face rectangle: " + creationResult.getVerifyImage().getFaceRectangle().getTop() + " " + creationResult.getVerifyImage().getFaceRectangle().getLeft() + " " + creationResult.getVerifyImage().getFaceRectangle().getWidth() + " " + creationResult.getVerifyImage().getFaceRectangle().getHeight());
System.out.println(" The quality for recognition: " + creationResult.getVerifyImage().getQualityForRecognition());
endpoint = os.environ["FACE_ENDPOINT"]
key = os.environ["FACE_APIKEY"]
face_session_client = FaceSessionClient(endpoint=endpoint, credential=AzureKeyCredential(key))
reference_image_path = "test.png"
with open(reference_image_path, "rb") as fd:
reference_image_content = fd.read()
created_session = await face_session_client.create_liveness_with_verify_session(
CreateLivenessWithVerifySessionContent(
liveness_operation_mode=LivenessOperationMode.PASSIVE,
device_correlation_id="723d6d03-ef33-40a8-9682-23a1feb7bccd",
),
verify_image=reference_image_content,
)
print("Session created.")
print(f"Session id: {created_session.session_id}")
print(f"Auth token: {created_session.auth_token}")
print("The reference image:")
print(f" Face rectangle: {created_session.verify_image.face_rectangle}")
print(f" The quality for recognition: {created_session.verify_image.quality_for_recognition}")
const endpoint = process.env['FACE_ENDPOINT'];
const apikey = process.env['FACE_APIKEY'];
const credential = new AzureKeyCredential(apikey);
const client = createFaceClient(endpoint, credential);
const createLivenessSessionResponse = await client.path('/detectLivenessWithVerify/singleModal/sessions').post({
contentType: 'multipart/form-data',
body: [
{
name: 'VerifyImage',
// Note that this utilizes Node.js API.
// In browser environment, please use file input or drag and drop to read files.
body: readFileSync('test.png'),
},
{
name: 'Parameters',
body: {
livenessOperationMode: 'Passive',
deviceCorrelationId: '723d6d03-ef33-40a8-9682-23a1feb7bccd',
},
},
],
});
if (isUnexpected(createLivenessSessionResponse)) {
throw new Error(createLivenessSessionResponse.body.error.message);
}
console.log('Session created:');
console.log(`Session ID: ${createLivenessSessionResponse.body.sessionId}`);
console.log(`Auth token: ${createLivenessSessionResponse.body.authToken}`);
console.log('The reference image:');
console.log(` Face rectangle: ${createLivenessSessionResponse.body.verifyImage.faceRectangle}`);
console.log(` The quality for recognition: ${createLivenessSessionResponse.body.verifyImage.qualityForRecognition}`)
curl --request POST --location "%FACE_ENDPOINT%/face/v1.1-preview.1/detectlivenesswithverify/singlemodal/sessions" ^
--header "Ocp-Apim-Subscription-Key: %FACE_APIKEY%" ^
--form "Parameters=""{\\\""livenessOperationMode\\\"": \\\""passive\\\"", \\\""deviceCorrelationId\\\"": \\\""723d6d03-ef33-40a8-9682-23a1feb7bccd\\\""}""" ^
--form "VerifyImage=@""test.png"""
curl --request POST --location "${FACE_ENDPOINT}/face/v1.1-preview.1/detectlivenesswithverify/singlemodal/sessions" \
--header "Ocp-Apim-Subscription-Key: ${FACE_APIKEY}" \
--form 'Parameters="{
\"livenessOperationMode\": \"passive\",
\"deviceCorrelationId\": \"723d6d03-ef33-40a8-9682-23a1feb7bccd\"
}"' \
--form 'VerifyImage=@"test.png"'
An example of the response body:
{
"verifyImage": {
"faceRectangle": {
"top": 506,
"left": 51,
"width": 680,
"height": 475
},
"qualityForRecognition": "high"
},
"sessionId": "3847ffd3-4657-4e6c-870c-8e20de52f567",
"authToken": "<session-authorization-token>"
}
The frontend application provides the reference image when initializing the SDK. This scenario is not supported in the web solution.
val singleFaceImageSource = VisionSource.fromFile("/path/to/image.jpg")
mFaceAnalysisOptions?.setRecognitionMode(RecognitionMode.valueOfVerifyingMatchToFaceInSingleFaceImage(singleFaceImageSource))
if let path = Bundle.main.path(forResource: "<IMAGE_RESOURCE_NAME>", ofType: "<IMAGE_RESOURCE_TYPE>"),
let image = UIImage(contentsOfFile: path),
let singleFaceImageSource = try? VisionSource(uiImage: image) {
try methodOptions.setRecognitionMode(.verifyMatchToFaceIn(singleFaceImage: singleFaceImageSource))
}
The app server can now query for the verification result in addition to the liveness result.
var getResultResponse = await sessionClient.GetLivenessWithVerifySessionResultAsync(sessionId);
var sessionResult = getResultResponse.Value;
Console.WriteLine($"Session id: {sessionResult.Id}");
Console.WriteLine($"Session status: {sessionResult.Status}");
Console.WriteLine($"Liveness detection request id: {sessionResult.Result?.RequestId}");
Console.WriteLine($"Liveness detection received datetime: {sessionResult.Result?.ReceivedDateTime}");
Console.WriteLine($"Liveness detection decision: {sessionResult.Result?.Response.Body.LivenessDecision}");
Console.WriteLine($"Verification result: {sessionResult.Result?.Response.Body.VerifyResult.IsIdentical}");
Console.WriteLine($"Verification confidence: {sessionResult.Result?.Response.Body.VerifyResult.MatchConfidence}");
Console.WriteLine($"Session created datetime: {sessionResult.CreatedDateTime}");
Console.WriteLine($"Auth token TTL (seconds): {sessionResult.AuthTokenTimeToLiveInSeconds}");
Console.WriteLine($"Session expired: {sessionResult.SessionExpired}");
Console.WriteLine($"Device correlation id: {sessionResult.DeviceCorrelationId}");
LivenessWithVerifySession sessionResult = sessionClient.getLivenessWithVerifySessionResult(creationResult.getSessionId());
System.out.println("Session id: " + sessionResult.getId());
System.out.println("Session status: " + sessionResult.getStatus());
System.out.println("Liveness detection request id: " + sessionResult.getResult().getRequestId());
System.out.println("Liveness detection received datetime: " + sessionResult.getResult().getReceivedDateTime());
System.out.println("Liveness detection decision: " + sessionResult.getResult().getResponse().getBody().getLivenessDecision());
System.out.println("Verification result: " + sessionResult.getResult().getResponse().getBody().getVerifyResult().isIdentical());
System.out.println("Verification confidence: " + sessionResult.getResult().getResponse().getBody().getVerifyResult().getMatchConfidence());
System.out.println("Session created datetime: " + sessionResult.getCreatedDateTime());
System.out.println("Auth token TTL (seconds): " + sessionResult.getAuthTokenTimeToLiveInSeconds());
System.out.println("Session expired: " + sessionResult.isSessionExpired());
System.out.println("Device correlation id: " + sessionResult.getDeviceCorrelationId());
liveness_result = await face_session_client.get_liveness_with_verify_session_result(
created_session.session_id
)
print(f"Session id: {liveness_result.id}")
print(f"Session status: {liveness_result.status}")
print(f"Liveness detection request id: {liveness_result.result.request_id}")
print(f"Liveness detection received datetime: {liveness_result.result.received_date_time}")
print(f"Liveness detection decision: {liveness_result.result.response.body.liveness_decision}")
print(f"Verification result: {liveness_result.result.response.body.verify_result.is_identical}")
print(f"Verification confidence: {liveness_result.result.response.body.verify_result.match_confidence}")
print(f"Session created datetime: {liveness_result.created_date_time}")
print(f"Auth token TTL (seconds): {liveness_result.auth_token_time_to_live_in_seconds}")
print(f"Session expired: {liveness_result.session_expired}")
print(f"Device correlation id: {liveness_result.device_correlation_id}")
const getLivenessSessionResultResponse = await client.path('/detectLivenessWithVerify/singleModal/sessions/{sessionId}', createLivenessSessionResponse.body.sessionId).get();
if (isUnexpected(getLivenessSessionResultResponse)) {
throw new Error(getLivenessSessionResultResponse.body.error.message);
}
console.log(`Session id: ${getLivenessSessionResultResponse.body.id}`);
console.log(`Session status: ${getLivenessSessionResultResponse.body.status}`);
console.log(`Liveness detection request id: ${getLivenessSessionResultResponse.body.result?.requestId}`);
console.log(`Liveness detection received datetime: ${getLivenessSessionResultResponse.body.result?.receivedDateTime}`);
console.log(`Liveness detection decision: ${getLivenessSessionResultResponse.body.result?.response.body.livenessDecision}`);
console.log(`Verification result: ${getLivenessSessionResultResponse.body.result?.response.body.verifyResult.isIdentical}`);
console.log(`Verification confidence: ${getLivenessSessionResultResponse.body.result?.response.body.verifyResult.matchConfidence}`);
console.log(`Session created datetime: ${getLivenessSessionResultResponse.body.createdDateTime}`);
console.log(`Auth token TTL (seconds): ${getLivenessSessionResultResponse.body.authTokenTimeToLiveInSeconds}`);
console.log(`Session expired: ${getLivenessSessionResultResponse.body.sessionExpired}`);
console.log(`Device correlation id: ${getLivenessSessionResultResponse.body.deviceCorrelationId}`);
curl --request GET --location "%FACE_ENDPOINT%/face/v1.1-preview.1/detectlivenesswithverify/singlemodal/sessions/<session-id>" ^
--header "Ocp-Apim-Subscription-Key: %FACE_APIKEY%"
curl --request GET --location "${FACE_ENDPOINT}/face/v1.1-preview.1/detectlivenesswithverify/singlemodal/sessions/<session-id>" \
--header "Ocp-Apim-Subscription-Key: ${FACE_APIKEY}"
An example of the response body:
{
"status": "ResultAvailable",
"result": {
"id": 1,
"sessionId": "3847ffd3-4657-4e6c-870c-8e20de52f567",
"requestId": "f71b855f-5bba-48f3-a441-5dbce35df291",
"receivedDateTime": "2023-10-31T17:03:51.5859307+00:00",
"request": {
"url": "/face/v1.1-preview.1/detectlivenesswithverify/singlemodal",
"method": "POST",
"contentLength": 352568,
"contentType": "multipart/form-data; boundary=--------------------------590588908656854647226496",
"userAgent": ""
},
"response": {
"body": {
"livenessDecision": "realface",
"target": {
"faceRectangle": {
"top": 59,
"left": 121,
"width": 409,
"height": 395
},
"fileName": "content.bin",
"timeOffsetWithinFile": 0,
"imageType": "Color"
},
"modelVersionUsed": "2022-10-15-preview.04",
"verifyResult": {
"matchConfidence": 0.9304124,
"isIdentical": true
}
},
"statusCode": 200,
"latencyInMilliseconds": 1306
},
"digest": "2B39F2E0EFDFDBFB9B079908498A583545EBED38D8ACA800FF0B8E770799F3BF"
},
"id": "3847ffd3-4657-4e6c-870c-8e20de52f567",
"createdDateTime": "2023-10-31T16:58:19.8942961+00:00",
"authTokenTimeToLiveInSeconds": 600,
"deviceCorrelationId": "723d6d03-ef33-40a8-9682-23a1feb7bccd",
"sessionExpired": true
}
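An app server usually accepts the user only when both checks in the JSON above pass: the face is real and it matches the reference image. The helper below is an illustrative sketch, not SDK code; the minimum-confidence threshold is an assumption you would tune for your scenario, and the field names match the response body shown above.

```python
# Illustrative combined check (assumed helper, not SDK code): require a real
# face AND a verified identity match before accepting the user.

def passed_liveness_with_verify(session_result: dict, min_confidence: float = 0.5) -> bool:
    """True only if livenessDecision is 'realface' and the verify result
    reports an identical face at or above the chosen confidence."""
    body = session_result["result"]["response"]["body"]
    verify = body.get("verifyResult") or {}
    return (
        body.get("livenessDecision") == "realface"
        and verify.get("isIdentical", False)
        and verify.get("matchConfidence", 0.0) >= min_confidence
    )

sample = {
    "result": {"response": {"body": {
        "livenessDecision": "realface",
        "verifyResult": {"matchConfidence": 0.9304124, "isIdentical": True},
    }}}
}
print(passed_liveness_with_verify(sample))  # True
```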
The app server can delete the session if you don't query its result anymore.
await sessionClient.DeleteLivenessWithVerifySessionAsync(sessionId);
Console.WriteLine($"The session {sessionId} is deleted.");
sessionClient.deleteLivenessWithVerifySession(creationResult.getSessionId());
System.out.println("The session " + creationResult.getSessionId() + " is deleted.");
await face_session_client.delete_liveness_with_verify_session(
created_session.session_id
)
print(f"The session {created_session.session_id} is deleted.")
await face_session_client.close()
const deleteLivenessSessionResponse = await client.path('/detectLivenessWithVerify/singleModal/sessions/{sessionId}', createLivenessSessionResponse.body.sessionId).delete();
if (isUnexpected(deleteLivenessSessionResponse)) {
throw new Error(deleteLivenessSessionResponse.body.error.message);
}
console.log(`The session ${createLivenessSessionResponse.body.sessionId} is deleted.`);
curl --request DELETE --location "%FACE_ENDPOINT%/face/v1.1-preview.1/detectlivenesswithverify/singlemodal/sessions/<session-id>" ^
--header "Ocp-Apim-Subscription-Key: %FACE_APIKEY%"
curl --request DELETE --location "${FACE_ENDPOINT}/face/v1.1-preview.1/detectlivenesswithverify/singlemodal/sessions/<session-id>" \
--header "Ocp-Apim-Subscription-Key: ${FACE_APIKEY}"
Clean up resources
If you want to clean up and remove an Azure AI services subscription, you can delete the resource or resource group. Deleting the resource group also deletes any other resources associated with it.
Related content
To learn about other options in the liveness APIs, see the Azure AI Vision SDK reference.
To learn more about the features available to orchestrate the liveness solution, see the Session REST API reference.