Release Notes
Release 2.0.0
Introduction
We are excited to introduce the latest update to our SLAM surface tracking technology, designed to offer enhanced performance and reliability for AR experiences. This release supports A-Frame, Babylon.js, and PlayCanvas.
Key Features and Improvements
Enhanced Accuracy:
Improved tracking precision for more stable and accurate surface detection.
Reduced drift and increased robustness in diverse environments.
Faster Initialization:
Quicker setup times allowing for near-instantaneous surface recognition and interaction.
Optimized Performance:
Smoother and more responsive tracking with reduced computational overhead.
Enhanced performance on a wide range of devices, ensuring a seamless user experience.
Improved Stability:
Enhanced algorithms for maintaining stable tracking even in challenging conditions such as low-light or dynamic environments.
Increased Range:
Extended tracking range enabling larger and more complex AR interactions.
Improved handling of occlusions and dynamic changes in the environment.
User-Friendly Integration:
Simplified API and SDK integration process, making it easier for developers to incorporate advanced surface tracking into their applications.
Support for Multiple Engines:
A-Frame: Full compatibility with A-Frame, allowing developers to create immersive WebAR experiences with ease.
Babylon.js: Seamless integration with Babylon.js, enabling high-quality 3D graphics and interactive AR applications.
PlayCanvas: Support for PlayCanvas, providing a powerful platform for developing interactive 3D WebAR content.
Enhanced User Interaction Control using New APIs
SetUserGestureRotation(enabled):
This API allows you to enable or disable user rotation.
enabled = true: rotation is enabled.
enabled = false: rotation is disabled.
SetUserGestureScale(enabled):
This API enables or disables user scaling.
enabled = true: scaling is enabled.
enabled = false: scaling is disabled.
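A minimal usage sketch for both gesture controls. The calls are shown on the WEBARSDK global, which the SDK exposes for other functions; whether these two setters live on that object is an assumption, so adjust to match your integration.

    <script>
      // Assumption: the gesture setters are exposed on the WEBARSDK global,
      // like other SDK calls. Lock rotation but keep pinch-to-scale enabled.
      WEBARSDK.SetUserGestureRotation(false); // disable swipe-to-rotate
      WEBARSDK.SetUserGestureScale(true);     // keep pinch-to-zoom enabled
    </script>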
Release 1.7.4
What's New?
WebAR SDK
Added a new face-tracking API, SetFaceScaleFactor(), which configures the scale at which AR objects are displayed relative to the user's face in a web-based AR experience. A usage sketch appears at the end of this section.
Unity
As part of Release 1.7.4, an enhancement has been deployed to Unity build release v.1.7.4-1: peel-away mode. Enable this feature to keep the AR object on screen even when the user moves the camera away from the marker. For more information, refer to 'Build a marker tracking experience'.
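A hedged sketch of the new face-tracking API. The numeric argument and the WEBARSDK namespace are illustrative assumptions, not confirmed signatures.

    <script>
      // Assumption: SetFaceScaleFactor() takes a numeric scale factor and is
      // exposed on the WEBARSDK global. A value above 1 would display attached
      // AR objects larger relative to the tracked face.
      WEBARSDK.SetFaceScaleFactor(1.2);
    </script>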
Release 1.7.3
What’s New?
Face tracking released for all rendering engines: A-Frame, Babylon.js, and PlayCanvas.
Enhanced UX for PlayCanvas Surface Tracking
Introduced a new and improved user experience for PlayCanvas surface tracking, streamlining and enhancing user interactions. For more information, refer to the article 'Build a Surface tracking experience using A-frame - Step 6'.
Lazy-Mode Support for Face Tracking
This update brings lazy-mode support, optimizing face tracking for efficiency and performance.
WEBARSDK.EnableTrackingOnDesktop() Function
A new function to enable desktop-only tracking, specifically for face tracking. This is particularly useful in scenarios such as BB, where the app initially starts in lazy-mode and subsequently determines the appropriate tracking mode.
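A short sketch of enabling desktop-only tracking after starting in lazy-mode. The desktop check and the point at which your app decides the tracking mode are illustrative; only WEBARSDK.EnableTrackingOnDesktop() comes from this release.

    <script>
      // The app starts in lazy-mode; once it determines that the experience is
      // running on a desktop browser, it enables desktop-only face tracking.
      const runOnDesktop = !/Mobi|Android/i.test(navigator.userAgent); // illustrative check
      if (runOnDesktop) {
        WEBARSDK.EnableTrackingOnDesktop();
      }
    </script>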
webar-face-pivot - A-Frame Attribute
A new attribute that allows an entity to move in sync with the user's head movements without rotating with the head. This enhances the realism and interactivity of AR experiences. For more information, refer to the article 'Build a Face Tracking experience using A-frame'.
webar-raycaster - A-Frame Attribute
A specialized attribute for improved AR interaction in face-tracking mode. Developers can now use el.addEventListener('click', (evt) => {}) to receive raycast events. This attribute replaces A-Frame's native raycaster and cursor attributes in face-tracking mode, addressing compatibility issues with A-Frame's built-in raycaster component in face-tracking scenarios. Our custom raycaster component ensures better performance and reliability.
Example
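A minimal, hedged sketch of receiving raycast click events via webar-raycaster in face-tracking mode; the entity id and model path are illustrative placeholders.

    <!-- Entity carrying the webar-raycaster attribute (illustrative model path) -->
    <a-entity id="hat" webar-raycaster gltf-model="url(assets/hat.glb)"></a-entity>

    <script>
      // Listen for raycast click events delivered by webar-raycaster,
      // which replaces A-Frame's native raycaster/cursor in face-tracking mode.
      const hat = document.querySelector('#hat');
      hat.addEventListener('click', (evt) => {
        console.log('Hat clicked in face-tracking mode', evt);
      });
    </script>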
Release 1.7.1
What’s New?
Enhanced User Experience for Surface Tracking
New UX for Surface Tracking: Added a ‘safe zone’ that appears once SLAM anchors, allowing users to place objects anywhere using the cursor. Just tap on the screen to place an object.
Interactive Animations & Cues: Introduced animations and visual cues to further enhance the user experience during interaction.
Model Interaction:
Pinch to Zoom: Implemented pinch-to-zoom functionality to allow users to zoom in and out for a better model viewing experience.
Swipe to Rotate: Added the ability to rotate the model by swiping, making it easier to adjust the model’s orientation.
Enabling New UX Experience
New APIs
SetARModelPlaceCallback(callback)
Assigns a callback to execute after an AR model is placed within the scene, enabling custom post-placement interactions.
SetResetButtonCallback(callback)
Links a custom callback to the reset button’s action for customized reset logic. Requires webar-ux-control with stageCursorUX: true.
SetResetButtonVisibility(isVisible)
Controls the reset button’s visibility within the AR interface, allowing for UI customization.
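A hedged sketch of wiring up these new APIs. The WEBARSDK namespace is an assumption, and the callback bodies are placeholders for your own logic.

    <script>
      // Assumption: the 1.7.1 APIs are exposed on the WEBARSDK global.
      WEBARSDK.SetARModelPlaceCallback(() => {
        // Runs after the AR model has been placed in the scene.
        console.log('Model placed; enable post-placement interactions here');
      });

      // Requires webar-ux-control with stageCursorUX: true.
      WEBARSDK.SetResetButtonCallback(() => {
        console.log('Reset pressed; run custom reset logic here');
      });

      // Hide the built-in reset button in favour of a custom UI.
      WEBARSDK.SetResetButtonVisibility(false);
    </script>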
SDK Configuration Properties
rotation-speed
Controls the sensitivity of the AR object’s rotation in response to user gestures. Accepts values greater than 0 and less than 1.
gesture-scale-max
Sets the maximum scale factor for enlarging an AR object using pinch gestures.
gesture-scale-min
Sets the minimum scale factor for reducing an AR object’s size in a single pinch gesture.
hide-reset-button
Optionally hides the reset button from the UI, active only when webar-ux-control with stageCursorUX: true is configured.
Dependency on webar-ux-control: All new functionalities, including callbacks, visibility controls, and interactive attributes, require a properly configured webar-ux-control. This is essential for activating the full range of UX and interactive features in both A-Frame and Babylon.js environments.
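A hedged configuration sketch. It assumes the properties listed above are set alongside webar-ux-control on the A-Frame scene element; the exact placement may differ in your integration, so treat the markup and values below as illustrative only.

    <!-- Assumption: configuration properties sit next to webar-ux-control on the scene. -->
    <a-scene
      webar-ux-control="stageCursorUX: true"
      rotation-speed="0.5"
      gesture-scale-max="3"
      gesture-scale-min="0.5"
      hide-reset-button="false">
      <!-- scene content -->
    </a-scene>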
Release 1.7.0
What's New?
Face Tracking for A-Frame
Face tracking is released for the A-Frame rendering engine; support for other rendering engines will follow. Blippar is excited to announce that face tracking is now available within our WebAR SDK. The all-new Blippar WebAR SDK is an industry-leading facial detection system which allows you to use any face as a digital canvas to build exciting AR experiences and increase user engagement for your brand.
webar-mode="face-tracking" sets the SDK to face-tracking mode. For more information, refer to the article 'Build a face tracking experience using A-frame'.
<a-entity webar-face> to display 3D models on the tracked face.
GetMouthOpenedMagnitude() to get the magnitude of the distance between the upper and lower lips; it returns a value in the range 0 to 1.5.
<a-plane webar-facemesh> to display an image/video texture on the tracked face mesh.
<a-plane webar-plane> to display the webcam video background (optional). Developers can hide this to display a face model on a white background.
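A minimal, hedged A-Frame scene sketch combining these face-tracking attributes. Asset paths are placeholders, placing webar-mode on the scene element is an assumption, and the WEBARSDK namespace and polling approach for GetMouthOpenedMagnitude() are illustrative.

    <!-- Assumption: webar-mode is set on the scene element; asset paths are placeholders. -->
    <a-scene webar-mode="face-tracking">
      <!-- Optional webcam video background; hide it to show the model on white. -->
      <a-plane webar-plane></a-plane>

      <!-- Image/video texture mapped onto the tracked face mesh. -->
      <a-plane webar-facemesh src="assets/facepaint.png"></a-plane>

      <!-- 3D model attached to the tracked face, e.g. glasses. -->
      <a-entity webar-face gltf-model="url(assets/glasses.glb)"></a-entity>
    </a-scene>

    <script>
      // GetMouthOpenedMagnitude() returns a value between 0 and 1.5; the
      // WEBARSDK namespace and this simple polling loop are assumptions.
      setInterval(() => {
        const mouth = WEBARSDK.GetMouthOpenedMagnitude();
        if (mouth > 0.8) {
          console.log('Mouth open; trigger an effect here');
        }
      }, 100);
    </script>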
Try Now
The webar-sdk zip file includes face.html, which contains a simple 3D face model and a facemesh filter. The default face.html example in the zip is kept simple so that it can serve as an easy-to-follow face tracking tutorial.
For more complex demo testing, you may try the Newsletter Try-now examples.
The new face.html example includes:
Try-ons: coolers (sunglasses) and lipstick try-on.
A way to test face and facemesh tracking accuracy.