# Build a Basic Face Tracking Experience

## Introduction

This guide walks you through building a basic scene using the face tracking and facial effects features of the WebAR SDK. Get creative and explore: use any face as a digital canvas for expressing AR creativity while driving deeper user engagement.

This guide covers face tracking and facial effects with Blippar WebAR SDK v2.0.3.

## 3D Face Tracking

**WebAR SDK v2.0.3** is an industry-leading facial detection system that enables face tracking and augmentation. Key features include:

* The end user has been a key consideration in the design and optimisation of our computer vision technology. Increase immersion with effortless face tracking and realistic AR features that adapt to different device capabilities.
* The robust facial tracking technology identifies various face morphs, including the user's facial expressions and position in the video frame.
* It precisely detects and tracks your face in a video stream in real time, enabling you to create realistic, feature-rich facial animation.
* Overlay 3D objects on the face, apply effects, and trigger the experience to enjoy Face AR.
* Our facial recognition software is mobile, web, and desktop compatible. It offers excellent Face AR performance and adjusts to the capabilities of the device.

## Landmark Face Tracking: 3D Facial Effects

**WebAR SDK v1.7.0** and above supports facial landmark tracking to apply various facial effects. Features include:

* Blippar WebAR SDK creates a facial mesh and detects morphs, enabling increased efficiency and stability in applying facial effects.
* Landmark points such as the nose and eyes can be tracked, along with facial expressions and spatial position.
* Easily enable various facial effects and virtual try-on of products. The SDK creates a virtual mask that attaches to your face, and the effects adjust to any angle applied over it, for example makeup, 3D effects, and hats.

## Types of Face Tracking

There are two types of face tracking supported by **WebAR SDK v2.0.3**, as listed below:

1. **WebAR Face:** The SDK applies 3D models onto the entire face.
2. **WebAR Face Mesh:** The SDK creates a mesh of the entire face, detects landmark points (eyes, nose, etc.) on the face, and creates a 3D model of the face on which facial effects can be applied.

## Procedure

Follow the steps to build a basic experience with face tracking and facial effects:

* Download the latest SDK v2.0.3 and open it in your favourite editor, for example Visual Studio Code.
* Navigate to .vscode and update the settings.json file with the applicable SSL certificate and key.

{% hint style="info" %}
For more information on obtaining an SSL certificate and key, refer to the article [Develop Locally](https://docs.blippar.com/webar-sdk/v2.0.3/publish-your-creation/develop-locally).
{% endhint %}

* Navigate to the A-frame folder and open the face.html file as shown below:

<figure><img src="https://content.gitbook.com/content/x7Ftek2jJ9rdxTo9JirP/blobs/zqYdrZcAAK1qHgSqaHL0/Screenshot%202023-10-11%20at%207.22.43%20AM.png" alt=""><figcaption><p> A-frame Face.html File</p></figcaption></figure>

* The latest A-Frame script tag (v1.4.2) is available with the SDK kit. Ensure your file matches the following:

```html
<title>Blippar WebAR SDK</title>
<script src="https://aframe.io/releases/1.4.2/aframe.min.js"></script>
```

## STEP 1

Include the latest WebAR SDK v2.0.3 in the script tag and ensure that <mark style="color:blue;">webar-mode</mark>=<mark style="color:red;">"face-tracking"</mark> is set.

{% hint style="info" %}
Include any API customization. For more information, refer to the article [API Customization](https://docs.blippar.com/webar-sdk/v2.0.3/api/api-customization).
{% endhint %}
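A minimal script section for Step 1 might then look like the following sketch. The SDK file path below is an assumption for illustration; use the path of the webar-sdk script that ships with your downloaded SDK kit:

```html
<!-- A-Frame library (v1.4.2 ships with the SDK kit) -->
<script src="https://aframe.io/releases/1.4.2/aframe.min.js"></script>

<!-- WebAR SDK script; the src path below is illustrative -->
<script
  src="webar-sdk/webar-sdk-v2.0.3.min.js"
  webar-mode="face-tracking">
</script>
```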

## STEP 2

* Add the webar-scene attribute to A-Frame's <mark style="color:red;">\<a-scene></mark> tag.
* Provide a valid Blippar license-key value in the key: property, <mark style="color:blue;">webar-scene</mark>=<mark style="color:red;">"key: xxxxxxxx-1111-2222-3333-yyyyyyyyyyyy"</mark>, as shown below:

```html
<a-scene
  webar-scene="key: xxxxxxxx-1111-2222-3333-yyyyyyyyyyyy"
  vr-mode-ui="enabled: false"
  device-orientation-permission-ui="enabled: false"
  loading-screen="enabled: false"
  renderer="colorManagement: false; antialias: true; physicallyCorrectLights: false;">
```

{% hint style="info" %}
Setting the <<mark style="color:red;">a-scene</mark>> renderer's colorManagement: <mark style="color:blue;">true</mark>; above will reduce the face tracking fps significantly. Set it to <mark style="color:blue;">true</mark> ONLY when complex 3D models need the best rendering.
{% endhint %}

## STEP 3

Add the default camera by adding the <mark style="color:red;">webar-camera</mark> attribute to A-Frame's \<a-camera> tag as shown below:

```html
<a-camera webar-camera></a-camera>
```

## ASSET MANAGER

* Download the required models and images from the asset manager, for example 'Fuel Glass' and 'Lipstick' as shown below. You can include multiple models, images, and effects.

{% hint style="info" %}
In this example, both **WebAR Face** and **WebAR Face Mesh** are enabled: face tracking places the 'Fuel Glass' on the user's face, and the face mesh applies the 'Lipstick' to the user's lips.
{% endhint %}

* Download the asset item (glb model) and update the <mark style="color:red;">\<a-asset-item id></mark> along with the source path. Also download an image and update the <mark style="color:red;">\<img id></mark> along with the source path. To access the models, navigate to the A-frame folder and open the models folder as shown below:

<figure><img src="https://content.gitbook.com/content/x7Ftek2jJ9rdxTo9JirP/blobs/GcnkGFzlgxtk3JZT677q/Screenshot%202023-10-11%20at%207.24.15%20AM.png" alt=""><figcaption><p>Asset Manager- Download Models and Images</p></figcaption></figure>

A-frame will automatically download the image or the model after the id and path (or file name) are specified.

```html
<a-assets timeout="60000">
  <a-asset-item id="fuel_glass" src="models/Fuel-glass.glb"></a-asset-item>
  <img id="liptexture" src="images/lipmask.png" alt=""/>
</a-assets>
```

{% hint style="success" %}
Replace the model name and the image name based on the glb models or images selected from the Asset Library. In the above example, 'Fuel Glass' and 'Lipstick' are used as the model and image. You can choose any model or image of your choice.
{% endhint %}

## STEP 4 (OPTIONAL)

Display the webcam video in the background by setting <mark style="color:blue;">visible</mark>=<mark style="color:red;">true</mark>. If you want to display a white screen instead, set <mark style="color:blue;">visible</mark>=<mark style="color:red;">false</mark> on this entity:

```html
<a-plane id="webar-video-plane" color="tomato" visible="true"></a-plane>
```

### WebAR Video Pane

* In general, the webcam displays the background video of the user when the experience is triggered. To customize the 'background' video pane, update the colour and set <mark style="color:blue;">visible</mark>=<mark style="color:red;">false</mark>, and the result will appear as below:

<figure><img src="https://content.gitbook.com/content/x7Ftek2jJ9rdxTo9JirP/blobs/GF2mxM5BuXBGTBXLAMky/Screenshot%202023-10-11%20at%207.25.54%20AM.png" alt=""><figcaption><p>Visibility is set to False </p></figcaption></figure>

* If <mark style="color:blue;">colour</mark>=<mark style="color:red;">tomato</mark> and <mark style="color:blue;">visible</mark>=<mark style="color:red;">true</mark>, the background colour is modified from the original display, but the original background remains visible since visibility is set to true.
* Create an <mark style="color:red;">\<a-plane></mark> and ensure the id is updated as mentioned below:

{% hint style="success" %}
For modification, ensure <mark style="color:blue;">id</mark>=<mark style="color:red;">"webar-video-plane"</mark> and update the colour and visibility. Note that even without this update the SDK will create the mesh, but it will not allow customization of the colour and visibility.
{% endhint %}
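For example, reusing the plane from Step 4, hiding the webcam video while customizing the background can be sketched as below; the colour value here is illustrative:

```html
<a-plane id="webar-video-plane" color="white" visible="false"></a-plane>
```

The id must remain "webar-video-plane" so the SDK picks up the customized plane instead of creating its own.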

## STEP 5

Add the webar-facemesh attribute to an \<a-plane> tag. This creates a mesh of the entire face, detects landmark points (eyes, nose, etc.) on the face, and creates a 3D model of the face on which facial effects can be applied. For more information, refer to [Landmark Face Tracking](#landmark-face-tracking-3d-facial-effects).

* Create an <mark style="color:red;">\<a-plane></mark> and add the <mark style="color:red;">webar-facemesh</mark> attribute together with a <mark style="color:red;">material</mark>; a geometry of the face's landmark points will be created, to which any material can be applied.

To apply the mesh and colour, use the code below, ensuring the colour and transparency are adjusted to make the mesh and the facial effect visible.

{% hint style="info" %}
Material: to modify the appearance of any 3D model or mesh, provide a surface or pattern such as a wooden texture or shiny glass texture. The default A-Frame material component is applied here.
{% endhint %}

```html
<a-plane webar-facemesh material="src: #liptexture; side: front; alphaTest: 0; color: #E32531; shader: flat; transparent: true; opacity: 1;"></a-plane>
```

## STEP 6

* Add the webar-face attribute to a parent \<a-entity> tag in A-Frame.
* The child elements of the webar-face entity display the 3D model on the tracked face.
* The origin of the face is at the centre of the line connecting the two eyes.
* Add the webar-loadmonitor attribute to the entities to display a loading progress screen before face tracking starts.

{% hint style="info" %}
For more information, refer to the article [API Reference](https://docs.blippar.com/webar-sdk/v2.0.3/api/api-ref-1.5.3).
{% endhint %}

* Create an empty \<a-entity> node with the webar-face attribute and include a glTF model.
* Define or adjust the position, rotation, scale, etc. based on where you want to place the model on the face.
* Modify these parameters to adjust the position of the chosen model.
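The steps above can be sketched as follows. The position, rotation, and scale values are placeholders to be tuned for your own model, and #fuel_glass refers to the asset defined in \<a-assets> earlier:

```html
<a-entity webar-face>
  <!-- Child entity: the 3D model rendered on the tracked face -->
  <a-entity
    webar-loadmonitor
    gltf-model="#fuel_glass"
    position="0 0.1 0"
    rotation="0 0 0"
    scale="1 1 1">
  </a-entity>
</a-entity>
```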

## Example Face Tracking and Facial Effects Video

{% embed url="https://drive.google.com/file/d/13uq8KtSzMs6sDPnNYcX2NMwffG7dh-cK/view?usp=sharing" %}
Example Face Tracking and Facial Effects
{% endembed %}
