Implementation Guide

Authentication

A unique access token must be generated each time the SDK is launched. To generate an access token, refer to Valify's Authentication Documentation.
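
Token generation happens against Valify's authentication API, typically from your own backend rather than inside the app. The sketch below is illustrative only: the endpoint path, request fields, and response field names are assumptions, so substitute the values given in Valify's Authentication Documentation.

// Hypothetical sketch: the endpoint and payload below are placeholders, not Valify's documented API.
// Keep client credentials on your backend; pass only the resulting token to the app.
async function fetchAccessToken(): Promise<string> {
  const response = await fetch("https://<valify_environment_base_url>/api/o/token/", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      username: "<username>",
      password: "<password>",
      client_id: "<client_id>",
      client_secret: "<client_secret>",
    }),
  });
  const data = await response.json();
  return data.access_token; // response field name assumed
}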

Importing

Import the module's function in your App.js/App.tsx file using the following line

import { startLiveness } from '@valifysolutions/react-native-vidvliveness';

Configurations

The plugin builder configurations are separated into two parts: required and optional.

Required Configurations

Initialize the required configuration parameters as follows

const token = "token";
const baseUrl = "ValifyEnvironment_base_url";
const bundleKey = "bundle_key";

Optional Configurations

Initialize any desired optional configuration parameters as follows

const language = "<insert_language>"; // "en" is set as default
const enableSmile = <boolean>; // true is set as default
const enableLookLeft = <boolean>; // true is set as default
const enableLookRight = <boolean>; // true is set as default
const enableCloseEyes = <boolean>; // true is set as default
const trials = <string>; // default is "3"
const instructions = <string>; // default is "4"
const timer = <string>; // default is "10"
const primaryColor = "<hex_color_code>";
const enableVoiceover = <boolean>; // true is set as default

// One of the following lines is required to enable face match
const image = <byte[]>;
const ocrTransactionID = "<ocr_transaction_id>"; // string

const showErrorMessage = true; // true is set as default
const headers = {}; // default is empty
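
The image value above is a byte array of the face photo to compare against. The plugin's exact expected representation is not specified here; as a non-authoritative sketch, one common approach in React Native is to decode a base64-encoded capture into a plain array of byte values:

// Sketch only: assumes the face image is available as a base64 string
// (for example from a camera library) and that a plain number array is accepted.
// Verify the expected format against Valify's documentation.
import { Buffer } from 'buffer'; // the 'buffer' npm polyfill

function base64ToBytes(base64Image: string): number[] {
  return Array.from(Buffer.from(base64Image, 'base64'));
}

const image = base64ToBytes("<base64_face_image>");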

Configurations Breakdown

This section shows the breakdown of all optional builder configurations.

  1. The following line is where the user interface language is set.

const language = "<insert_language>"; // ["ar" or "en"]

The currently supported languages are Arabic and English.

  2. If the following line is set to true, the SDK requires the user to smile during the experience.

const enableSmile = <boolean>;

  3. If the following line is set to true, the SDK requires the user to look to the left during the experience.

const enableLookLeft = <boolean>;

  4. If the following line is set to true, the SDK requires the user to look to the right during the experience.

const enableLookRight = <boolean>;

  5. If the following line is set to true, the SDK requires the user to close their eyes during the experience.

const enableCloseEyes = <boolean>;

  6. The integer set in the following line (passed as a string) determines how many failed attempts the user is allowed during a single SDK experience.

const trials = <string>;

  7. The integer set in the following line (passed as a string) determines how many instructions the user needs to follow to successfully complete a single SDK experience.

const instructions = <string>;

  8. The integer set in the following line (passed as a string) determines how much time, in seconds, the user is given to complete each task.

const timer = <string>;

  9. The following line is optional and can be used to apply your company's branding color to the SDK's user interface.

const primaryColor = "<hex_color_code>";

  10. If the following line is set to true, a voice-over is added to the SDK experience dictating the actions expected from the user.

const enableVoiceover = <boolean>;

  11. If the following line is filled with an OCR transaction ID, the face match service is enabled and the image captured of the user's face during the SDK experience is compared against that transaction.

const ocrTransactionID = "<ocr_transaction_id>"; // string

  12. If the following line is set to true, an error message appears in the user interface when the user does not pass the face match or the liveness service.

const showErrorMessage = true;

  13. The following line is optional and can be used to set any headers that may be required for purposes specific to your application. Any headers set are sent along with each API call made by the SDK; see the example after this list.

const headers = {};
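
For instance, if your infrastructure expects a correlation identifier on every request, it could be passed as shown below. The header name is only an illustrative placeholder, not one the SDK requires:

// Example only: any key/value pairs placed here are forwarded with every SDK API call.
const headers = {
  "X-Correlation-Id": "<your_request_id>", // hypothetical header name
};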

Parameter Declaration

Declare the SDK parameters with the configuration variables previously created

const liveness_params = {
    access_token: token,
    base_url: baseUrl,
    bundle_key: bundleKey,
    language: language,
    enable_smile: enableSmile,
    enable_look_left: enableLookLeft,
    enable_look_right: enableLookRight,
    enable_close_eyes: enableCloseEyes,
    Liveness_number_of_failed_trials: trials,
    liveness_number_of_instructions: instructions,
    liveness_time_per_action: timer,
    facematch_ocr_transactionId: ocrTransactionID,
    enable_voiceover: enableVoiceover,
    facematch_image: image,
    show_error_message: showErrorMessage,
    primary_color: primaryColor,
    headers: headers
};

Start the SDK

Use the following code snippet to run the plugin

export default function App() {

    startLiveness(liveness_params).then(
        function (value) {
            console.log(value);
            const s = value.toString();
            const json = JSON.parse(s);
            if (json.nameValuePairs.state == "SUCCESS") {
                // Example of using the SDK response: base64 holds the base64 string
                // of the image captured during the successful liveness process.
                const base64 = json.nameValuePairs.livenessResult.livenessResult.capturedImage;
                alert("success");
            } else if (json.nameValuePairs.state == "ERROR") {
                alert("error");
            } else {
                alert("exit");
            }
        },
        function (error) {
            alert(error);
        }
    );

    return null; // render nothing here; replace with your app's UI
}
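
If you prefer async/await, the call can also be wrapped in a small helper. This is a sketch only: the response fields mirror the snippet above, and any structure beyond them is an assumption.

// Sketch: a thin async wrapper around startLiveness.
// The parsed response shape is assumed from the example above.
import { startLiveness } from '@valifysolutions/react-native-vidvliveness';

type LivenessOutcome = { state: string; capturedImage?: string };

async function runLiveness(params: object): Promise<LivenessOutcome> {
    const value = await startLiveness(params);
    const json = JSON.parse(String(value));
    const state: string = json.nameValuePairs.state;
    if (state === "SUCCESS") {
        return {
            state,
            capturedImage: json.nameValuePairs.livenessResult.livenessResult.capturedImage,
        };
    }
    return { state }; // "ERROR" or any other value (for example, a user exit)
}

// Usage: runLiveness(liveness_params).then(result => console.log(result.state));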
