Release Notes
WebAR SDK
Added new API SetFaceScaleFactor() for face tracking. It configures the scale at which AR objects are displayed relative to the user's face in a web-based AR experience.
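A minimal sketch of how this call might be used, assuming SetFaceScaleFactor() is exposed on the WEBARSDK global (like the other SDK functions on this page) and accepts a single numeric scale factor:

```html
<!-- Illustrative only: the script path and the assumption that
     SetFaceScaleFactor() lives on the WEBARSDK global are placeholders. -->
<script src="webar-sdk.min.js"></script>
<script>
  // Render face-anchored AR objects 20% larger relative to the tracked face.
  WEBARSDK.SetFaceScaleFactor(1.2);
</script>
```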
Unity: As part of Release 1.7.4, an enhancement has been deployed to Unity build release v.1.7.4-1 adding peel-away mode. When this feature is enabled, the AR object remains on the screen even after the user moves the camera away from the marker. For more information, refer to 'Build a marker tracking experience'.
Face tracking released for all rendering engines: A-Frame, Babylon.js, and PlayCanvas.
Enhanced UX for PlayCanvas Surface Tracking: Introduced a new and improved user experience for PlayCanvas surface tracking, streamlining and enhancing user interactions. For more information, refer to the article 'Build a Surface tracking experience using A-frame - Step 6'.
Lazy-Mode Support for Face Tracking
This update brings lazy-mode support, optimizing face tracking for efficiency and performance.
WEBARSDK.EnableTrackingOnDesktop() Function: A new function to enable desktop-only tracking specifically for face tracking. This is particularly useful in scenarios like BB, where the app initially starts in lazy-mode and subsequently determines the appropriate tracking mode.
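A minimal sketch of how this function might be wired in; the desktop check and call site are illustrative, while WEBARSDK.EnableTrackingOnDesktop() is the documented call:

```html
<!-- Illustrative only: the user-agent check is a placeholder for the app's own
     logic that decides the tracking mode after starting in lazy-mode. -->
<script>
  const isDesktop = !/Android|iPhone|iPad|Mobile/i.test(navigator.userAgent);
  if (isDesktop) {
    // Allow face tracking to run on desktop browsers.
    WEBARSDK.EnableTrackingOnDesktop();
  }
</script>
```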
webar-face-pivot - A-Frame Attribute
A new attribute that allows an entity to move in sync with the user's head movements without rotating with the head. This enhances the realism and interactivity of AR experiences. For more information, refer to the article 'Build a Face Tracking experience using A-frame'.
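An illustrative A-Frame snippet, assuming webar-face-pivot is set directly on the entity that should follow head position; the asset id and path are placeholders, and the rest of the face-tracking scene setup is omitted:

```html
<!-- Illustrative only: swap in your own model; scene/SDK setup not shown. -->
<a-assets>
  <a-asset-item id="hud-model" src="models/hud.glb"></a-asset-item>
</a-assets>

<!-- Moves with the user's head position but keeps its own orientation. -->
<a-entity webar-face-pivot gltf-model="#hud-model" position="0 0 -0.3"></a-entity>
```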
webar-raycaster - A-Frame Attribute
A specialized attribute for improved AR interaction in face-tracking mode. Developers can now use el.addEventListener('click', (evt) => {}) to receive raycast events. This attribute replaces A-Frame's native raycaster and cursor attributes in face-tracking mode, addressing compatibility issues with A-Frame's built-in raycaster component in face-tracking scenarios. Our custom raycaster component ensures better performance and reliability.
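A minimal sketch, assuming webar-raycaster is set on the interactive entity and the click handler is attached from a plain script; the entity id, model, and handler body are placeholders:

```html
<!-- Illustrative only: webar-raycaster and the 'click' event are the documented
     pieces; everything else is a placeholder. -->
<a-entity id="try-on-button" webar-raycaster gltf-model="#button-model"></a-entity>

<script>
  const el = document.querySelector('#try-on-button');
  // Raycast hits arrive as ordinary 'click' events in face-tracking mode.
  el.addEventListener('click', (evt) => {
    console.log('Entity tapped', evt);
  });
</script>
```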
New UX for Surface Tracking: Added a ‘safe zone’ that appears once SLAM anchors, allowing users to place objects anywhere using the cursor. Just tap on the screen to place an object.
Interactive Animations & Cues: Introduced animations and visual cues to further enhance the user experience during interaction.
Model Interaction:
Pinch to Zoom: Implemented pinch-to-zoom functionality to allow users to zoom in and out for a better model viewing experience.
Swipe to Rotate: Added the ability to rotate the model by swiping, making it easier to adjust the model’s orientation.
Dependency on webar-ux-control
All new functionalities, including callbacks, visibility controls, and interactive attributes, require a properly configured webar-ux-control. This is essential for activating the full range of UX and interactive features in both A-Frame and Babylon.js environments.
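An illustrative A-Frame configuration, assuming webar-ux-control is declared as a component on an entity and that stageCursorUX: true (referenced by the reset-button features below) is the relevant flag; the host entity and any other options are assumptions:

```html
<!-- Illustrative only: where webar-ux-control is attached and which options it
     accepts beyond stageCursorUX are assumptions based on this page. -->
<a-scene>
  <a-entity webar-ux-control="stageCursorUX: true">
    <!-- Surface-tracked content goes here. -->
  </a-entity>
</a-scene>
```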
Face tracking released for the A-Frame rendering engine; support for other rendering engines will follow. Blippar is excited to announce that Face Tracking is now available within our WebAR SDK. The all-new Blippar WebAR SDK is an industry-leading facial detection system which allows you to use any face as a digital canvas to build exciting AR experiences and increase user engagement for your brand.
The webar-sdk zip file includes face.html, which contains a simple 3D face model and a facemesh filter. The default face.html example in the zip is kept simple so that the face tracking tutorial documentation is easy to follow. For more complex demo testing, you may try the Newsletter Try-now examples.
The new face.html example includes try-ons (coolers and lipstick) to test face and facemesh tracking accuracy.
SetARModelPlaceCallback(callback)
Assigns a callback to execute after an AR model is placed within the scene, enabling custom post-placement interactions.
SetResetButtonCallback(callback)
Links a custom callback to the reset button's action for customized reset logic. Requires webar-ux-control with stageCursorUX: true.
SetResetButtonVisibility(isVisible)
Controls the reset button’s visibility within the AR interface, allowing for UI customization.
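A minimal sketch of how these callbacks might be registered, assuming the three functions are exposed on the WEBARSDK global; the handler bodies are placeholders:

```html
<!-- Illustrative only: assumes these functions live on the WEBARSDK global. -->
<script>
  // Runs after the user places the AR model in the scene.
  WEBARSDK.SetARModelPlaceCallback(() => {
    console.log('Model placed - start post-placement interactions here');
  });

  // Runs when the reset button is tapped (requires webar-ux-control
  // with stageCursorUX: true).
  WEBARSDK.SetResetButtonCallback(() => {
    console.log('Scene reset requested');
  });

  // Hide the built-in reset button in favour of custom UI.
  WEBARSDK.SetResetButtonVisibility(false);
</script>
```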
rotation-speed
Controls the sensitivity of the AR object’s rotation in response to user gestures. Accepts values greater than 0 and less than 1.
gesture-scale-max
Sets the maximum scale factor for enlarging an AR object using pinch gestures.
gesture-scale-min
Sets the minimum scale factor for reducing an AR object’s size in a single pinch gesture.
hide-reset-button
Optionally hides the reset button from the UI, active only when webar-ux-control with stageCursorUX: true is configured.
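An illustrative A-Frame snippet combining these options; exactly which entity each attribute attaches to (the gesture-controlled model versus the webar-ux-control entity) is an assumption here, so treat the placement as a sketch rather than the reference layout:

```html
<!-- Illustrative only: attribute placement and values are assumptions;
     the attribute names come from the list above. -->
<a-entity webar-ux-control="stageCursorUX: true" hide-reset-button>
  <a-entity gltf-model="#product-model"
            rotation-speed="0.5"
            gesture-scale-min="0.5"
            gesture-scale-max="3"></a-entity>
</a-entity>
```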
webar-mode="face-tracking"
sets to face tracking mode. For more information refer article 'Build a face tracking experience using A-frame'
<a-entity webar-face>
to display 3D models on the tracked face
GetMouthOpenedMagnitude()
- To get the magnitude of the distance between the upper and lower lips. It gives a constant value in the range of 0 to 1.5.
<a-plane webar-facemesh>
to display image/video texture on the tracked facemesh
<a-plane webar-plane>
to display webcam video background(optional). Developer can hide this to display a face model on a white background.
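Putting these pieces together, an illustrative face-tracking scene might look like the following; the asset ids, where webar-mode is declared, and the assumption that GetMouthOpenedMagnitude() lives on the WEBARSDK global are all placeholders, with face.html in the SDK zip being the reference example:

```html
<!-- Illustrative only: see face.html in the webar-sdk zip for the reference version. -->
<a-scene webar-mode="face-tracking">
  <!-- Optional webcam video background; remove it to show the model on white. -->
  <a-plane webar-plane></a-plane>

  <!-- 3D model anchored to the tracked face (e.g. coolers/glasses). -->
  <a-entity webar-face gltf-model="#glasses-model"></a-entity>

  <!-- Image/video texture mapped onto the tracked facemesh (e.g. lipstick). -->
  <a-plane webar-facemesh src="#makeup-texture"></a-plane>
</a-scene>

<script>
  // Poll mouth openness (0 to 1.5), for example to drive an animation.
  setInterval(() => {
    console.log('Mouth openness:', WEBARSDK.GetMouthOpenedMagnitude());
  }, 100);
</script>
```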