Build a Basic Face Tracking Experience

Learn how to apply facial effects and face tracking


This guide walks you through building a basic scene using the face tracking and facial effects features of the WebAR SDK. Get creative and explore: use any face as a digital canvas for expressing AR creativity while driving deeper user engagement.

Face tracking and facial effects with Blippar WebAR SDK v1.7.0.

3D Face Tracking

WebAR SDK v1.7.0 includes an industry-leading facial detection system that enables face tracking and augmentation. Key features:

  • The end user has been a key consideration in the design and optimisation of our computer vision technologies. Increase immersion with effortless face tracking and realistic AR features that adjust to different device capabilities.

  • The robust facial tracking technology identifies various face morphs, including the user's facial expressions and position in the video frame.

  • It precisely detects and tracks your face in a video stream in real time, enabling you to create realistic, feature-rich facial animation.

  • Overlay 3D objects on the face, apply effects and trigger the experience to enjoy Face AR.

  • Our facial recognition software is mobile, web, and desktop compatible. It offers excellent Face AR performance and adjusts to the capabilities of the device.

Landmark Face Tracking: 3D Facial Effects

WebAR SDK v1.7.0 and above supports facial landmark tracking to apply various facial effects. Features include:

  • Blippar WebAR SDK creates a facial mesh and detects the morphs, enabling increased efficiency and stability in applying facial effects.

  • Landmark points such as the nose and eyes can be tracked, along with facial expressions and spatial position.

  • Easily enable various facial effects and virtual try-on of various products as a use case. The SDK creates its own virtual mask that attaches to your face, and the effects adjust to any angle applied over it. For example: makeup, 3D effects, hats, etc.

Types of Face Tracking

There are two types of face tracking supported with WebAR SDK v1.7.0, as listed below:

  1. WebAR Face: The SDK applies 3D models onto the entire face.

  2. WebAR Face Mesh: The SDK creates a mesh of the entire face, detects landmark points (eyes, nose, etc.) on the face and creates a 3D model of the face upon which the facial effects can be applied.
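The two modes can be sketched side by side as follows. This is a minimal, hypothetical fragment: the asset ids #fuel_glass and #liptexture are placeholders that must match entries in your <a-assets> block, and both attributes are covered step by step later in this guide.

<!-- WebAR Face: attach a 3D model to the whole tracked face -->
<a-entity webar-face>
  <a-gltf-model src="#fuel_glass"></a-gltf-model>
</a-entity>

<!-- WebAR Face Mesh: apply a material to the landmark mesh -->
<a-plane webar-facemesh material="src: #liptexture; transparent: true;"></a-plane>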


Follow the steps to build a basic experience with face tracking and facial effects:

  • Download the latest SDK v1.7.0 and open it in your favourite editor, e.g. Visual Studio Code.

  • Navigate to .vscode and replace the settings.json file with the applicable SSL certificate and key.

For more information on obtaining an SSL certificate and key, refer to the article Develop Locally.
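If you serve the project with the VS Code Live Server extension (an assumption; the certificate and key paths below are placeholders), settings.json might look like this:

{
  "liveServer.settings.https": {
    "enable": true,
    "cert": "/absolute/path/to/cert.pem",
    "key": "/absolute/path/to/key.pem",
    "passphrase": ""
  }
}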

  • Navigate to the A-frame folder and open the face.html file.

  • The latest updated script tag (v1.4.2) is available with the SDK kit. Ensure yours matches the one shown below:

<title>Blippar WebAR SDK</title>
<script src=""></script>


Include the latest WebAR SDK v1.7.0 in the script tag and ensure webar-mode="face-tracking" is set.
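A sketch of the resulting script tag (the src path is a placeholder; use the file or CDN URL shipped with your SDK kit):

<script async src="../webar-sdk/webar-sdk-v1.7.0.min.js" webar-mode="face-tracking"></script>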

Include any API customization. For more information, refer to the article API Customization.


  • Add the webar-scene attribute to A-Frame's <a-scene> tag.

  • Provide a valid Blippar license key in the key: property, i.e. webar-scene="key: xxxxxxxx-1111-2222-3333-yyyyyyyyyyyy", as shown below:

<a-scene webar-scene="key: xxxxxxxx-1111-2222-3333-yyyyyyyyyyyy"
         vr-mode-ui="enabled: false"
         device-orientation-permission-ui="enabled: false"
         loading-screen="enabled: false"
         renderer="colorManagement: false; antialias: true; physicallyCorrectLights: false;">

Setting the <a-scene> renderer's colorManagement: true above will reduce the face tracking fps significantly. Set it to true ONLY when complex 3D models need the best rendering.


Add the default camera by adding the webar-camera attribute to A-Frame's <a-camera> tag, as shown below:

<a-camera webar-camera></a-camera>


  • Download the required models and images from the asset manager, e.g. 'Fuel Glass' and 'Lipstick'. You can include multiple models, images and effects.

In this example, both WebAR Face and WebAR Face Mesh are enabled: face tracking places 'Fuel Glasses' on the user's face, and the face mesh applies 'Lipstick' on the user's lips.

  • Download the asset item (glb model) and update the <a-asset-item> id along with the source path. Also download an image and update the <img> id along with the source path. To access the models, navigate to the A-frame folder and open the models file.

A-Frame will automatically download the image or the model once the id and path (or file name) are specified.

<a-assets timeout="60000">
  <a-asset-item id="fuel_glass" src="models/Fuel-glass.glb"></a-asset-item>
  <img id="liptexture" src="images/lipmask.png" alt=""/>
</a-assets>

Replace the model and image names based on your selection of glb models or images from the Asset Library. In the above example, 'Fuel Glass' and 'Lipstick' are used as the model and image; you can choose any model or image of your choice.


Display the webcam video in the background by setting visible=true. If you want to display a white screen instead, set visible=false on this entity.

<a-plane id="webar-video-plane" color="tomato" visible="true"></a-plane>

WebAR Video Pane

  • In general, the webcam displays the background video of the user when the experience is triggered. To customize the background video pane, update the colour and set visible=false.

  • If colour=tomato and visible=true, this modifies the background colour from the original display but retains the visibility of the original background, since visible is set to true.

  • Create an <a-plane> and ensure the id is set as mentioned below:

For modification, ensure id="webar-video-plane" and update the colour and visibility. Note that even without this update the SDK will create the mesh, but it will not allow customization of the colour and the visibility.
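For example, to replace the camera feed with a plain white screen (the colour value is illustrative):

<a-plane id="webar-video-plane" color="white" visible="false"></a-plane>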


Add the webar-facemesh attribute to an <a-plane> tag. This creates a mesh of the entire face, detects landmark points (eyes, nose, etc.) on the face and creates a 3D model of the face upon which the facial effects can be applied. For more information, refer to Landmark Face Tracking.

  • Create an <a-plane> and add the webar-facemesh attribute along with a material; a geometry of the facial landmark points will be created, to which the material is applied.

To apply the mesh and colour, use the code below; ensure the colour and transparency are adjusted so that the mesh and the facial effect are visible.

Material: use this when you want to give any 3D model or mesh a surface or pattern, such as a wooden texture or a shiny glass texture. Apply the default A-Frame material.

<a-plane webar-facemesh material="src: #liptexture; side: front; alphaTest: 0; color: #E32531; shader: flat; transparent: true; opacity: 1;"></a-plane>


  • Add the webar-face attribute to an A-Frame parent <a-entity> tag.

  • The child elements of the webar-face entity display the 3D model on the tracked face.

  • The origin of the face is at the centre of the line connecting the two eyes.

  • Add the webar-loadmonitor attribute to the entities to display a loading progress screen before face tracking starts.

For more information, refer to the article API Reference.

  • Create an <a-entity>, an empty node with the webar-face attribute, and include a glTF model.

  • Define or adjust the position, rotation, scale, etc. based on where you want to place the model on the face.

  • Modify the above parameters to adjust the position of the chosen model.
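Putting the steps above together, a minimal sketch might look like this (the position, rotation and scale values are illustrative; adjust them to fit your model):

<a-entity webar-face webar-loadmonitor>
  <a-gltf-model src="#fuel_glass" position="0 0.1 0" rotation="0 0 0" scale="1 1 1"></a-gltf-model>
</a-entity>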

Example Face Tracking and Facial Effects Video
