Release Notes

Release 2.0.0

Introduction

We are excited to introduce the latest update to our SLAM surface tracking technology, designed to offer enhanced performance and reliability for AR experiences. This release supports A-Frame, Babylon.js, and PlayCanvas.

Key Features and Improvements

  1. Enhanced Accuracy:

  • Improved tracking precision for more stable and accurate surface detection.

  • Reduced drift and increased robustness in diverse environments.

  2. Faster Initialization:

  • Quicker setup times allowing for near-instantaneous surface recognition and interaction.

  3. Optimized Performance:

  • Smoother and more responsive tracking with reduced computational overhead.

  • Enhanced performance on a wide range of devices, ensuring a seamless user experience.

  4. Improved Stability:

  • Enhanced algorithms for maintaining stable tracking even in challenging conditions such as low-light or dynamic environments.

  5. Increased Range:

  • Extended tracking range enabling larger and more complex AR interactions.

  • Improved handling of occlusions and dynamic changes in the environment.

  6. User-Friendly Integration:

  • Simplified API and SDK integration process, making it easier for developers to incorporate advanced surface tracking into their applications.

  7. Support for Multiple Engines:

  • A-Frame: Full compatibility with A-Frame, allowing developers to create immersive WebAR experiences with ease.

  • Babylon.js: Seamless integration with Babylon.js, enabling high-quality 3D graphics and interactive AR applications.

  • PlayCanvas: Support for PlayCanvas, providing a powerful platform for developing interactive 3D WebAR content.

Enhanced User Interaction Control using New APIs

  1. SetUserGestureRotation(enabled):

  • This API allows you to enable or disable user rotation.

  • enabled = true: Rotation is enabled.

  • enabled = false: Rotation is disabled.

  2. SetUserGestureScale(enabled):

  • This API enables or disables user scaling (a usage sketch covering both gesture calls follows this list).

  • enabled = true: Scale is enabled.

  • enabled = false: Scale is disabled.
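
A minimal usage sketch of the two gesture calls above. Note that exposing them on the global WEBARSDK object is an assumption here, made by analogy with other SDK functions such as WEBARSDK.EnableTrackingOnDesktop(); adjust the call site to match your integration if they are exposed elsewhere.

// Assumption: the gesture APIs are exposed on the global WEBARSDK object,
// like other SDK functions; adjust if your integration exposes them elsewhere.
WEBARSDK.SetUserGestureRotation(true);  // allow the user to rotate the placed model
WEBARSDK.SetUserGestureScale(false);    // lock scale: disable pinch-to-zoom on the model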

Release 1.7.4

What's New?

  • WebAR SDK: Added a new API, SetFaceScaleFactor(), for face tracking. It configures the scale at which AR objects are displayed relative to the user’s face in a web-based AR experience (a usage sketch follows this list).

  • Unity: As part of Release 1.7.4, an enhancement has been deployed to Unity build release v.1.7.4-1 with peel-away mode. Enable this feature to keep the AR object on screen even when the user moves the camera away from the marker. For more information, refer to 'Build a marker tracking experience'.
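
A minimal sketch of the new face-scale call. The WEBARSDK namespace and the single numeric argument (a scale factor relative to the detected face) are assumptions based on the description above; consult the SDK reference for the exact signature.

// Assumption: SetFaceScaleFactor() lives on the global WEBARSDK object and
// takes a single numeric scale factor relative to the user's face.
WEBARSDK.SetFaceScaleFactor(1.2); // display face-attached AR objects 20% larger than default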

Release 1.7.3

What’s New?

  • Face tracking released for all rendering engines: A-Frame, Babylon.js, and PlayCanvas.

  • Enhanced UX for PlayCanvas Surface Tracking: Introduced a new and improved user experience for PlayCanvas surface tracking, streamlining and enhancing user interactions. For more information, refer to the article 'Build a Surface tracking experience using A-frame- Step 6'.

  • Lazy-Mode Support for Face Tracking: This update brings lazy-mode support, optimizing face tracking for efficiency and performance.

  • WEBARSDK.EnableTrackingOnDesktop() function: A new function that enables desktop-only tracking specifically for face tracking. This is particularly useful in scenarios like BB, where the app initially starts in lazy-mode and subsequently determines the appropriate tracking mode.

  • webar-face-pivot (A-Frame attribute): A new attribute that allows an entity to move in sync with the user’s head movements without rotating with the head. This enhances the realism and interactivity of AR experiences. For more information, refer to the article 'Build a Face Tracking experience using A-frame'.

  • webar-raycaster (A-Frame attribute): A specialized attribute for improved AR interaction in face-tracking mode. Developers can now use el.addEventListener('click', (evt) => {}) to receive raycast events. This attribute replaces A-Frame’s native raycaster and cursor attributes in face-tracking mode, addressing compatibility issues with A-Frame’s built-in raycaster component in face-tracking scenarios. Our custom raycaster component ensures better performance and reliability.

Example

<a-scene
      webar-scene="key: <%= htmlWebpackPlugin.options.licenseKey %>"
      webar-raycaster="objects: .clickable; enabled: true;"
      vr-mode-ui="enabled: false"
      device-orientation-permission-ui="enabled: false"
      loading-screen="enabled: false"
      renderer="colorManagement: false; antialias: true; physicallyCorrectLights: false;">
  <!-- Illustrative entity: webar-raycaster targets elements matching the .clickable selector -->
  <a-box class="clickable" position="0 0 -1"></a-box>
</a-scene>
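
For reference, a minimal sketch of receiving raycast events in script, following the el.addEventListener('click', ...) usage described above. The .clickable selector matches the scene snippet; the handler body is illustrative only.

// Grab an entity matched by the webar-raycaster "objects: .clickable" selector
const clickableEl = document.querySelector('.clickable');

// Receive raycast click events dispatched by webar-raycaster in face-tracking mode
clickableEl.addEventListener('click', (evt) => {
  // Illustrative handler body: react to the tap on the entity
  console.log('clickable entity hit by webar-raycaster', evt);
});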

Release 1.7.1

What’s New?

Enhanced User Experience for Surface Tracking

  • New UX for Surface Tracking: Added a ‘safe zone’ that appears once SLAM anchoring is established, allowing users to place objects anywhere using the cursor. Just tap on the screen to place an object.

  • Interactive Animations & Cues: Introduced animations and visual cues to further enhance the user experience during interaction.

Model Interaction:

  • Pinch to Zoom: Implemented pinch-to-zoom functionality to allow users to zoom in and out for a better model viewing experience.

  • Swipe to Rotate: Added the ability to rotate the model by swiping, making it easier to adjust the model’s orientation.

Enabling New UX Experience

New APIs

SDK Configuration Properties

Dependency on webar-ux-control: All new functionalities—including callbacks, visibility controls, and interactive attributes—require a properly configured webar-ux-control. This is essential for activating the full range of UX and interactive features in both A-Frame and Babylon.js environments.
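
A minimal sketch of wiring webar-ux-control into an A-Frame scene alongside webar-scene. Placing it on the scene element, leaving it with default (unspecified) properties, and the YOUR-LICENSE-KEY placeholder are all assumptions here; the full set of configuration properties is in the SDK reference and is not reproduced in this sketch.

<!-- Assumption: webar-ux-control attaches to the scene element with default properties -->
<a-scene
      webar-scene="key: YOUR-LICENSE-KEY"
      webar-ux-control
      vr-mode-ui="enabled: false"
      loading-screen="enabled: false">
  <!-- scene content: the tracked model, lights, etc. -->
</a-scene>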

Release 1.7.0

What's New?

Face Tracking for A-frame

Face tracking has been released for the A-Frame rendering engine and will be followed by other rendering engines. Blippar is excited to announce that Face Tracking is now available within our WebAR SDK. The all-new Blippar WebAR SDK is an industry-leading facial detection system that allows you to use any face as a digital canvas to build exciting AR experiences and increase user engagement for your brand.

Try Now

  • The webar-sdk zip file includes face.html, which contains a simple 3D face model and a facemesh filter.

  • The default face.html example in the zip is simplified to serve as an easy-to-follow face tracking tutorial.

  • For more complex demo testing, you may try the Newsletter Try-now examples.

The new face.html example includes:

  • Try-ons: coolers and lipstick try-on.

  • Content to test face and facemesh tracking accuracy.
