Implement Virtual Try-on With Hand Skeleton Tracking


You have likely seen user reviews complaining about online shopping experiences, in particular the inability to try on items before purchase. Augmented reality (AR) enabled virtual try-on has resolved this longstanding issue, making it possible for users to see how items look on them before they buy.

Virtual try-on allows users to try on clothing, accessories like watches and glasses, or even makeup, virtually on their phones. Apps that offer AR try-on empower their users to make informed purchases based on which items look and fit best, considerably improving the online shopping experience. For merchants, AR try-on can both boost conversion rates and reduce return rates, as customers are more likely to be satisfied with what they have purchased after trying it on. That is why so many online stores and apps now provide virtual try-on features of their own.

When developing an online shopping app, AR is a technology you can't afford to miss. For example, if you are building an app or platform for watch sellers, you will want to provide a virtual watch try-on feature, which depends on real-time hand recognition and tracking. This can be done with remarkable ease using HMS Core AR Engine, which provides a wide range of basic AR capabilities, including hand skeleton tracking, human body tracking, and face tracking. Once you have integrated this toolkit, users will be able to try on different watches virtually within your app before purchase. Better yet, the development process is highly streamlined. During the virtual try-on, the engine recognizes the user's hand skeleton in real time with a high degree of precision, and virtual objects are superimposed on the hand. The user can even choose to place an item on their fingertip! Next, I will show you how to implement this capability.

Demo

[Demo GIF: virtual watch try-on]

Implementation

AR Engine provides a hand skeleton tracking capability, which identifies and tracks the positions and postures of up to 21 hand skeleton points, forming a hand skeleton model.

Thanks to the gesture recognition capability, the engine is able to provide AR apps with fun, interactive features. For example, your app can allow users to place virtual objects in specific positions, such as on a fingertip or in the palm, and have those objects follow even intricate hand movements; a sketch of this anchoring step follows below.
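
To make this concrete, here is a minimal sketch of how a skeleton point could serve as the anchor position for a virtual watch. The getHandskeletonArray() method is the one used later in this article; the WatchAnchorHelper class, the ANCHOR_POINT_INDEX value, and the assumption that points are packed as (x, y, z) triples are illustrative, so verify them against the AR Engine documentation for your SDK version.

import com.huawei.hiar.ARHand;

public class WatchAnchorHelper {
    // Hypothetical skeleton-point index. Look up the exact index of the
    // point you want to anchor to (for example, a fingertip) in the docs.
    private static final int ANCHOR_POINT_INDEX = 8;

    // Returns the {x, y, z} coordinates of the chosen skeleton point,
    // or null if the current frame carries no skeleton data.
    public static float[] getAnchorPosition(ARHand hand) {
        float[] skeleton = hand.getHandskeletonArray();
        if (skeleton == null || skeleton.length < (ANCHOR_POINT_INDEX + 1) * 3) {
            return null;
        }
        // Skeleton points are assumed to be packed as consecutive (x, y, z) triples.
        return new float[] {
            skeleton[ANCHOR_POINT_INDEX * 3],
            skeleton[ANCHOR_POINT_INDEX * 3 + 1],
            skeleton[ANCHOR_POINT_INDEX * 3 + 2]
        };
    }
}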

Now I will show you how to develop an app that implements AR watch virtual try-on based on this engine.

Integration Procedure

Requirements on the development environment:

  • JDK: 1.8.211 or later

  • Android Studio: 3.0 or later

  • minSdkVersion: 26 or later

  • targetSdkVersion: 29 (recommended)

  • compileSdkVersion: 29 (recommended)

  • Gradle version: 6.1.1 or later (recommended)

Make sure that you have downloaded the AR Engine APK from AppGallery and installed it on the device.

If you need to use multiple HMS Core kits, use the latest versions required for these kits.

Preparations

i. Before getting started, register as a Huawei developer and complete identity verification on the HUAWEI Developers website, where the detailed registration and identity verification procedure is described.

ii. Integrate the AR Engine SDK into your development environment via the Maven repository.

iii. The procedure for configuring the Maven repository address in Android Studio differs for Gradle plugin versions earlier than 7.0, version 7.0, and version 7.1 or later. Configure it according to your Gradle plugin version.

iv. Take Gradle plugin 7.0 as an example:

Open the project-level build.gradle file in your Android Studio project and configure the Maven repository address.

Go to buildscript > repositories and configure the Maven repository address for the SDK.

buildscript {
    repositories {
        google()
        jcenter()
        maven { url "https://developer.huawei.com/repo/" }
    }
}

Open the project-level settings.gradle file and configure the Maven repository address for the HMS Core SDK.

dependencyResolutionManagement {
    repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
    repositories {
        google()
        jcenter()
        maven { url "https://developer.huawei.com/repo/" }
    }
}

v. Add the following build dependency in the dependencies block.

dependencies {
    implementation 'com.huawei.hms:arenginesdk:{version}'
}

App Development

i. Check whether AR Engine has been installed on the current device. If so, your app can run properly on the device. If not, prompt the user to install it, for example by redirecting them to AppGallery. The sample code is as follows:

boolean isInstallArEngineApk = AREnginesApk.isAREngineApkReady(this);
if (!isInstallArEngineApk) {
    // ConnectAppMarketActivity.class is the activity for redirecting users to AppGallery.
    startActivity(new Intent(this, com.huawei.arengine.demos.common.ConnectAppMarketActivity.class));
    isRemindInstall = true;
}

ii. Initialize an AR scene. AR Engine supports five scenes: motion tracking (ARWorldTrackingConfig), face tracking (ARFaceTrackingConfig), hand recognition (ARHandTrackingConfig), human body tracking (ARBodyTrackingConfig), and image recognition (ARImageTrackingConfig).

Create an ARHandTrackingConfig object to initialize the hand recognition scene.

mArSession = new ARSession(context);
ARHandTrackingConfig config = new ARHandTrackingConfig(mArSession);

iii. After obtaining an ARHandTrackingConfig object, you can set the front or rear camera. The sample code is as follows:

config.setCameraLensFacing(ARConfigBase.CameraLensFacing.FRONT);

iv. Pass the config object to the ARSession to apply the configuration, and start hand recognition.

mArSession.configure(config);
mArSession.resume();
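
In a real app, the session should also follow the activity lifecycle, so the camera is released while the app is in the background. Below is a minimal sketch, assuming mArSession and the config object (kept here as a field named mConfig) belong to the activity; configure() and resume() can fail if the AR Engine service APK is missing or outdated, hence the catch-all guard.

@Override
protected void onResume() {
    super.onResume();
    try {
        mArSession.configure(mConfig);
        mArSession.resume();
    } catch (Exception e) {
        // Thrown, for example, when the AR Engine service APK is missing
        // or incompatible; fail gracefully instead of crashing.
        Log.e("HandActivity", "Failed to start the AR session", e);
        mArSession = null;
    }
}

@Override
protected void onPause() {
    super.onPause();
    if (mArSession != null) {
        mArSession.pause(); // release the camera while in the background
    }
}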

v. Initialize the HandSkeletonLineDisplay class, which draws the hand skeleton based on the coordinates of the hand skeleton points.

class HandSkeletonLineDisplay implements HandRelatedDisplay {
    // Initialization method.
    @Override
    public void init() {
    }

    // Method for drawing the hand skeleton. When calling this method, you need to
    // pass the ARHand objects to obtain data.
    @Override
    public void onDrawFrame(Collection<ARHand> hands) {
        for (ARHand hand : hands) {
            // Call getHandskeletonArray() to obtain the coordinates of hand skeleton points.
            float[] handSkeletons = hand.getHandskeletonArray();
            // Pass handSkeletons to the method for updating data in real time.
            updateHandSkeletonLinesData(handSkeletons);
        }
    }

    // Method for updating the hand skeleton point connection data. Call this method
    // when any frame is updated.
    private void updateHandSkeletonLinesData(float[] handSkeletons) {
        // Create and initialize the data stored in the buffer object.
        GLES20.glBufferData(…, mVboSize, …);
        // Update the data in the buffer object.
        GLES20.glBufferSubData(…, mPointsNum, …);
    }
}
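
The GL calls above are placeholders for the buffer upload; the interesting part is how the line vertices get built from the skeleton data. Below is a hedged sketch of that step, assuming getHandSkeletonConnection() returns skeleton-point indices in pairs describing which points are joined; verify the exact contract against your SDK version.

// Builds a flat vertex array (x, y, z per vertex) for the skeleton lines.
private float[] buildLineVertices(ARHand hand) {
    float[] points = hand.getHandskeletonArray();
    int[] connections = hand.getHandSkeletonConnection();
    float[] vertices = new float[connections.length * 3];
    for (int i = 0; i < connections.length; i++) {
        // Each connection entry is a point index; each point is three floats.
        int pointIndex = connections[i];
        vertices[i * 3] = points[pointIndex * 3];
        vertices[i * 3 + 1] = points[pointIndex * 3 + 1];
        vertices[i * 3 + 2] = points[pointIndex * 3 + 2];
    }
    return vertices; // upload via glBufferSubData() as shown above
}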

vi. Initialize the HandRenderManager class, which is used to render the data obtained from AR Engine.

public class HandRenderManager implements GLSurfaceView.Renderer {
    private ARSession mSession;

    // Set the ARSession object to obtain the latest data in the onDrawFrame method.
    public void setArSession(ARSession arSession) {
        mSession = arSession;
    }
}
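
Note that GLSurfaceView.Renderer also requires onSurfaceCreated() and onSurfaceChanged() alongside onDrawFrame(). A minimal sketch of how they might look in this class, where mHandSkeletonLineDisplay is an assumed field holding the display object from the previous step:

// Requires: javax.microedition.khronos.egl.EGLConfig and
// javax.microedition.khronos.opengles.GL10.
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    // Create GL resources here and initialize the display classes.
    mHandSkeletonLineDisplay.init();
}

@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
    // Keep the GL viewport in sync with the surface size.
    GLES20.glViewport(0, 0, width, height);
}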

vii. Implement the onDrawFrame() method in the HandRenderManager class.

@Override
public void onDrawFrame(GL10 gl) {
    // In this method, call methods such as setCameraTextureName() and update()
    // to obtain the latest calculation result of AR Engine.
    // mTextureId is assumed to be the OpenGL texture the camera preview is bound to.
    mSession.setCameraTextureName(mTextureId);
    ARFrame arFrame = mSession.update();
    ARCamera arCamera = arFrame.getCamera();
    // Obtain the tracking result returned during hand tracking.
    Collection<ARHand> hands = mSession.getAllTrackables(ARHand.class);
    // Pass each tracked hand to the method that updates gesture recognition
    // information for processing.
    for (ARHand hand : hands) {
        updateMessageData(hand);
    }
}
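
updateMessageData() itself is not shown above. A hedged sketch of the guard logic it could start with, assuming AR Engine exposes a tracking state on ARTrackable as in its sample code (check the exact enum name in your SDK version):

private void updateMessageData(ARHand hand) {
    // Skip hands that are not tracked in the current frame.
    if (hand.getTrackingState() != ARTrackable.TrackingState.TRACKING) {
        return;
    }
    // The hand is tracked: hand.getHandskeletonArray() is now safe to read,
    // and its data can be forwarded to the skeleton display for drawing.
}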

viii. In HandActivity, set a renderer for the SurfaceView.

mSurfaceView.setRenderer(mHandRenderManager);
// Set the rendering mode.
mSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
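
Two more GLSurfaceView calls are usually needed before setRenderer() for the renderer above to work; a sketch, assuming the GLES20-based drawing shown earlier (hence an OpenGL ES 2.0 context):

// Call these before setRenderer().
mSurfaceView.setEGLContextClientVersion(2); // the demo draws with GLES20
mSurfaceView.setPreserveEGLContextOnPause(true); // keep GL state across pauses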

Conclusion

Augmented reality creates immersive digital experiences that bridge the digital and real worlds, making human-machine interactions more seamless than ever. Fields like gaming, online shopping, tourism, medical training, and interior decoration have seen surging demand for AR apps and devices. In particular, AR is expected to dominate the future of online shopping, as it offers immersive, real-time interactions with virtual products, which is exactly what younger generations are seeking. This considerably improves users' shopping experience and, in turn, helps merchants boost conversion rates and reduce return rates. If you are developing an online shopping app, virtual try-on is a must-have feature, and AR Engine gives you everything you need to build it. Try the engine to see what smart, interactive features it can bring to your users, and how it can streamline your development.