Release Notes

Release 1.7.4

What's New?

  • WebAR SDK: Added a new API, SetFaceScaleFactor(), for face tracking. It configures the scale at which AR objects are displayed relative to the user’s face in a web-based AR experience.
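A minimal usage sketch, assuming the function lives on the global WEBARSDK namespace like the other SDK functions in these notes (e.g. WEBARSDK.EnableTrackingOnDesktop()); the exact call site and argument semantics may differ in your project:

```html
<script>
  // Hypothetical sketch: 1.0 is assumed to render AR objects at their
  // default size relative to the tracked face; larger values are assumed
  // to enlarge them proportionally.
  WEBARSDK.SetFaceScaleFactor(1.2);
</script>
```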
  • Unity: As part of Release 1.7.4, an enhancement with peel-away mode has been deployed to Unity build release v.1.7.4-1. Enable this feature to keep the AR object displayed on screen even when the user moves the camera away from the marker. For more information, refer to 'Build a marker tracking experience'.

Release 1.7.3

What’s New?

  • Face tracking released for all rendering engines: A-Frame, Babylon.js, and PlayCanvas.
  • Enhanced UX for PlayCanvas Surface Tracking: Introduced a new and improved user experience for PlayCanvas surface tracking, streamlining and enhancing user interactions. For more information, refer to the article 'Build a Surface tracking experience using A-frame - Step 6'.
  • Lazy-Mode Support for Face Tracking: This update brings lazy-mode support, optimizing face tracking for efficiency and performance.
  • WEBARSDK.EnableTrackingOnDesktop() Function: A new function to enable desktop-only tracking specifically for face tracking. This is particularly useful in scenarios like BB, where the app initially starts in lazy mode and subsequently determines the appropriate tracking mode.
  • webar-face-pivot - A-Frame Attribute: A new attribute that allows an entity to move in sync with the user’s head movements without rotating with the head. This enhances the realism and interactivity of AR experiences. For more information, refer to the article 'Build a Face Tracking experience using A-frame'.
  • webar-raycaster - A-Frame Attribute: A specialized attribute for improved AR interaction in face-tracking mode. Developers can now use el.addEventListener('click', (evt) => {}) to receive raycast events. This attribute replaces A-Frame’s native raycaster and cursor attributes in face-tracking mode, addressing compatibility issues with A-Frame’s built-in raycaster component in face-tracking scenarios. Our custom raycaster component ensures better performance and reliability. For example:
<a-scene webar-scene="key: <%= htmlWebpackPlugin.options.licenseKey %>"
         webar-raycaster="objects: .clickable; enabled: true;"
         vr-mode-ui="enabled: false"
         device-orientation-permission-ui="enabled: false"
         loading-screen="enabled: false"
         renderer="colorManagement: false; antialias: true; physicallyCorrectLights: false;">
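With webar-raycaster configured on the scene as above (objects: .clickable), a hedged sketch of receiving click events on an entity; the entity id and model reference are illustrative, not part of the SDK:

```html
<a-entity id="hat" class="clickable" gltf-model="#hat-model"></a-entity>
<script>
  // Illustrative sketch: the webar-raycaster component dispatches 'click'
  // events on entities matched by its 'objects' selector (.clickable here).
  var el = document.querySelector('#hat');
  el.addEventListener('click', function (evt) {
    console.log('Entity clicked', evt);
  });
</script>
```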

Release 1.7.1

What’s New?

Enhanced User Experience for Surface Tracking

  • New UX for Surface Tracking: Added a ‘safe zone’ that appears once SLAM tracking anchors, allowing users to place objects anywhere using the cursor. Just tap on the screen to place an object.
  • Interactive Animations & Cues: Introduced animations and visual cues to further enhance the user experience during interaction.

Model Interaction:
  • Pinch to Zoom: Implemented pinch-to-zoom functionality to allow users to zoom in and out for a better model viewing experience.
  • Swipe to Rotate: Added the ability to rotate the model by swiping, making it easier to adjust the model’s orientation.

Enabling New UX Experience

New APIs

  • Assigns a callback to execute after an AR model is placed within the scene, enabling custom post-placement interactions.
  • Links a custom callback to the reset button’s action for customized reset logic. Requires webar-ux-control with stageCursorUX: true.
  • Controls the reset button’s visibility within the AR interface, allowing for UI customization.

SDK Configuration Properties

  • Controls the sensitivity of the AR object’s rotation in response to user gestures. Accepts values greater than 0 and less than 1.
  • Sets the maximum scale factor for enlarging an AR object using pinch gestures.
  • Sets the minimum scale factor for reducing an AR object’s size in a single pinch gesture.
  • Optionally hides the reset button from the UI, active only when webar-ux-control with stageCursorUX: true is configured.

Dependency on webar-ux-control: All new functionalities, including callbacks, visibility controls, and interactive attributes, require a properly configured webar-ux-control. This is essential for activating the full range of UX and interactive features in both A-Frame and Babylon.js environments.
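As a hedged illustration of the dependency described above: the attribute spelling stageCursorUX is taken from these notes, while the scene markup, license-key placeholder, and entity are assumptions for illustration only:

```html
<a-scene webar-scene="key: YOUR-LICENSE-KEY"
         webar-ux-control="stageCursorUX: true;">
  <!-- The new UX (safe zone, pinch to zoom, swipe to rotate) and the
       reset-button callbacks and visibility controls described above
       only activate when webar-ux-control is configured like this. -->
  <a-entity gltf-model="#model"></a-entity>
</a-scene>
```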

Release 1.7.0

What's New?

Face Tracking for A-frame

Face tracking has been released for the A-frame rendering engine, with support for other rendering engines to follow. Blippar is excited to announce that Face Tracking is now available within our WebAR SDK. The all-new Blippar WebAR SDK is an industry-leading facial detection system that allows you to use any face as a digital canvas to build exciting AR experiences and increase user engagement for your brand.
SDK Attributes

A-Frame Attributes

  • webar-mode="face-tracking" sets the scene to face tracking mode. For more information, refer to the article 'Build a face tracking experience using A-frame'.
  • <a-entity webar-face> to display 3D models on the tracked face.
  • GetMouthOpenedMagnitude() - Returns the magnitude of the distance between the upper and lower lips, as a value in the range 0 to 1.5.
  • <a-plane webar-facemesh> to display an image/video texture on the tracked facemesh.
  • <a-plane webar-plane> to display the webcam video background (optional). Developers can hide this to display a face model on a white background.
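Putting the attributes above together, a minimal hedged scene sketch; the license key, model, and texture references are placeholders, and any attribute parameters beyond those named in these notes are omitted:

```html
<a-scene webar-scene="key: YOUR-LICENSE-KEY"
         webar-mode="face-tracking">
  <!-- Optional webcam video background; hide it to show the face model
       on a white background -->
  <a-plane webar-plane></a-plane>
  <!-- 3D model anchored to the tracked face -->
  <a-entity webar-face gltf-model="#glasses"></a-entity>
  <!-- Image/video texture mapped onto the tracked facemesh -->
  <a-plane webar-facemesh src="#face-texture"></a-plane>
</a-scene>
```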

Try Now

  • The webar-sdk zip file includes face.html, which contains a simple 3D face model and a facemesh filter.
  • The default face.html example in the zip is simplified to make the face tracking tutorial documentation easy to follow.
  • For more complex demo testing, you may try the Newsletter Try-now examples.

The new face.html example includes:
  • Try-ons - coolers and lipstick try-on.
  • A scene to test face and facemesh tracking accuracy.