Facematch and Liveness
Accura Scan's face biometrics solution matches the selfie image with the photo on the ID card and also confirms whether the user was live during the process.
Facematch
The method used to start Facematch is:
AccuraFacematch.startFaceMatch(accuraConfs)
In the above method, accuraConfs is an array of objects, each containing a face URI as a string. The value is provided by the MRZ or OCR response.
var accuraConfs = [
{"face_uri":faceMatchURL}
];
The methods used to set Facematch configs are as follows:
AccuraFacematch.setFaceMatchFeedbackTextSize(18);
AccuraFacematch.setFaceMatchFeedBackframeMessage("Frame Your Face");
AccuraFacematch.setFaceMatchFeedBackAwayMessage("Move Phone Away");
AccuraFacematch.setFaceMatchFeedBackOpenEyesMessage("Keep Your Eyes Open");
AccuraFacematch.setFaceMatchFeedBackCloserMessage("Move Phone Closer");
AccuraFacematch.setFaceMatchFeedBackCenterMessage("Move Phone Center");
AccuraFacematch.setFaceMatchFeedbackMultipleFaceMessage("Multiple Face Detected");
AccuraFacematch.setFaceMatchFeedBackFaceSteadymessage("Keep Your Head Straight");
AccuraFacematch.setFaceMatchFeedBackLowLightMessage("Low light detected");
AccuraFacematch.setFaceMatchFeedBackBlurFaceMessage("Blur Detected Over Face");
AccuraFacematch.setFaceMatchFeedBackGlareFaceMessage("Glare Detected");
AccuraFacematch.setFaceMatchBlurPercentage(80);
AccuraFacematch.setFaceMatchGlarePercentage_0(-1);
AccuraFacematch.setFaceMatchGlarePercentage_1(-1);
An example function is shown below:
Future<void> startFaceMatch() async {
  SystemChrome.setPreferredOrientations([DeviceOrientation.portraitUp]);
  try {
    // Face URI provided by the MRZ or OCR response.
    var accuraConfs = {
      "face_uri": this.faceMatchURL
    };
    await AccuraFacematch.setFaceMatchFeedbackTextSize(18);
    await AccuraFacematch.setFaceMatchFeedBackframeMessage("Frame Your Face");
    await AccuraFacematch.setFaceMatchFeedBackAwayMessage("Move Phone Away");
    await AccuraFacematch.setFaceMatchFeedBackOpenEyesMessage("Keep Your Eyes Open");
    await AccuraFacematch.setFaceMatchFeedBackCloserMessage("Move Phone Closer");
    await AccuraFacematch.setFaceMatchFeedBackCenterMessage("Move Phone Center");
    await AccuraFacematch.setFaceMatchFeedbackMultipleFaceMessage("Multiple Face Detected");
    await AccuraFacematch.setFaceMatchFeedBackFaceSteadymessage("Keep Your Head Straight");
    await AccuraFacematch.setFaceMatchFeedBackLowLightMessage("Low light detected");
    await AccuraFacematch.setFaceMatchFeedBackBlurFaceMessage("Blur Detected Over Face");
    await AccuraFacematch.setFaceMatchFeedBackGlareFaceMessage("Glare Detected");
    await AccuraFacematch.setFaceMatchBlurPercentage(80);
    await AccuraFacematch.setFaceMatchGlarePercentage_0(-1);
    await AccuraFacematch.setFaceMatchGlarePercentage_1(-1);
    await AccuraFacematch.startFaceMatch([accuraConfs]).then((value) {
      setState(() {
        dynamic result = json.decode(value);
      });
    }).onError((error, stackTrace) {
      // Handle the error string returned by the SDK.
    });
  } on PlatformException {
    // The platform channel call failed.
  }
}
Response:
On Success: JSON response { detect: (URI), score: (Float) }
On Error: String
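For reference, the success value can be decoded into its two fields. This is a minimal sketch, assuming value is the JSON string returned by startFaceMatch; handleFaceMatchResult is a hypothetical helper name, and the keys detect and score come from the response format above.

```dart
import 'dart:convert';

// Minimal sketch: decode the Facematch success response.
// `handleFaceMatchResult` is a hypothetical helper name.
void handleFaceMatchResult(String value) {
  final Map<String, dynamic> result = json.decode(value);
  final String detectUri = result['detect']; // URI of the detected face image
  final double score = (result['score'] as num).toDouble(); // match score
  print('detect: $detectUri, score: $score');
}

void main() {
  // Example payload shaped like the documented success response.
  handleFaceMatchResult('{"detect": "file:///face.jpg", "score": 0.92}');
}
```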
Liveness
The method used to start Liveness is:
AccuraLiveness.startLiveness(accuraConfs)
In the above method, accuraConfs is an array of objects, each containing a face URI as a string. The value is provided by the MRZ or OCR response.
var accuraConfs = [
{"face_uri":faceMatchURL}
];
The methods used to set Liveness configs are as follows:
AccuraLiveness.setLivenessFeedbackTextSize(18);
AccuraLiveness.setLivenessFeedBackframeMessage("Frame Your Face");
AccuraLiveness.setLivenessFeedBackAwayMessage("Move Phone Away");
AccuraLiveness.setLivenessFeedBackOpenEyesMessage("Keep Your Eyes Open");
AccuraLiveness.setLivenessFeedBackCloserMessage("Move Phone Closer");
AccuraLiveness.setLivenessFeedBackCenterMessage("Move Phone Center");
AccuraLiveness.setLivenessFeedbackMultipleFaceMessage("Multiple Face Detected");
AccuraLiveness.setLivenessFeedBackFaceSteadymessage("Keep Your Head Straight");
AccuraLiveness.setLivenessFeedBackBlurFaceMessage("Blur Detected Over Face");
AccuraLiveness.setLivenessFeedBackGlareFaceMessage("Glare Detected");
AccuraLiveness.setLivenessBlurPercentage(80);
AccuraLiveness.setLivenessGlarePercentage_0(-1);
AccuraLiveness.setLivenessGlarePercentage_1(-1);
AccuraLiveness.setLivenessFeedBackLowLightMessage("Low light detected");
AccuraLiveness.setLivenessfeedbackLowLightTolerence(39);
AccuraLiveness.setLivenessURL("Your Liveness URL");
An example function is shown below:
Future<void> startLiveness() async {
  SystemChrome.setPreferredOrientations([DeviceOrientation.portraitUp]);
  try {
    // Face URI provided by the MRZ or OCR response.
    var accuraConfs = {
      "face_uri": this.faceMatchURL
    };
    await AccuraLiveness.setLivenessFeedbackTextSize(18);
    await AccuraLiveness.setLivenessFeedBackframeMessage("Frame Your Face");
    await AccuraLiveness.setLivenessFeedBackAwayMessage("Move Phone Away");
    await AccuraLiveness.setLivenessFeedBackOpenEyesMessage("Keep Your Eyes Open");
    await AccuraLiveness.setLivenessFeedBackCloserMessage("Move Phone Closer");
    await AccuraLiveness.setLivenessFeedBackCenterMessage("Move Phone Center");
    await AccuraLiveness.setLivenessFeedbackMultipleFaceMessage("Multiple Face Detected");
    await AccuraLiveness.setLivenessFeedBackFaceSteadymessage("Keep Your Head Straight");
    await AccuraLiveness.setLivenessFeedBackBlurFaceMessage("Blur Detected Over Face");
    await AccuraLiveness.setLivenessFeedBackGlareFaceMessage("Glare Detected");
    await AccuraLiveness.setLivenessBlurPercentage(80);
    await AccuraLiveness.setLivenessGlarePercentage_0(-1);
    await AccuraLiveness.setLivenessGlarePercentage_1(-1);
    await AccuraLiveness.setLivenessFeedBackLowLightMessage("Low light detected");
    await AccuraLiveness.setLivenessfeedbackLowLightTolerence(39);
    await AccuraLiveness.setLivenessURL("Your Liveness URL");
    await AccuraLiveness.startLiveness([accuraConfs]).then((value) {
      setState(() {
        dynamic result = json.decode(value);
      });
    }).onError((error, stackTrace) {
      // Handle the error string returned by the SDK.
    });
  } on PlatformException {
    // The platform channel call failed.
  }
}
Response:
On Success: JSON response
On Error: String