Build an Emoji Making App Effortlessly

Emojis are a must-have tool in today's online communications as they help add color to text-based chatting and allow users to better express the emotions behind their words. Since the number of preset emojis is always limited, many apps now allow users to create their own custom emojis to keep things fresh and exciting.

For example, in a social media app, users who do not want to show their faces when making video calls can use an animated character to protect their privacy, with their facial expressions applied to the character; in a live streaming or e-commerce app, virtual streamers with realistic facial expressions are much more likely to attract watchers; in a video or photo shooting app, users can control the facial expressions of an animated character when taking a selfie, and then share the selfie via social media; and in an educational app for kids, a cute animated character with detailed facial expressions will make online classes much more fun and engaging for students.

I myself am developing such a messaging app. When chatting with friends and wanting to express themselves in ways other than words, users of my app can take a photo to create an emoji of themselves, or of an animated character they have selected. The app then identifies the user's facial expression and applies it to the emoji. In this way, users are able to create an endless number of unique emojis. During the development of my app, I used the capabilities provided by HMS Core AR Engine to track users' facial expressions and convert them into parameters, which greatly reduced the development workload. Now I will show you how I managed to do this.

Implementation

AR Engine provides apps with the ability to track and recognize facial expressions in real time, which can then be converted into facial expression parameters and used to accurately control the facial expressions of virtual characters.

Currently, AR Engine provides 64 facial expressions, including eyelid, eyebrow, eyeball, mouth, and tongue movements. It supports 21 eye-related movements, including eyeball movement and opening and closing the eyes; 28 mouth movements, including opening the mouth, puckering, pulling, or licking the lips, and moving the tongue; as well as 5 eyebrow movements, including raising or lowering the eyebrows.

Demo

Demo GIF: an emoji driven by the user's facial expressions

Development Procedure

Requirements on the Development Environment

  • JDK: 1.8.211 or later

  • Android Studio: 3.0 or later

  • minSdkVersion: 26 or later

  • targetSdkVersion: 29 (recommended)

  • compileSdkVersion: 29 (recommended)

  • Gradle version: 6.1.1 or later (recommended)

Make sure that you have downloaded the AR Engine APK from AppGallery and installed it on the device.

Test device: see Software and Hardware Requirements of AR Engine Features

If you need to use multiple HMS Core kits, use the latest versions required for these kits.

Preparations

i. Before getting started, you will need to register as a Huawei developer and complete identity verification on HUAWEI Developers. You can click here to find out the detailed registration and identity verification procedure.

ii. Before development, integrate the AR Engine SDK into your development environment via the Maven repository.

iii. The procedure for configuring the Maven repository address in Android Studio varies for Gradle plugin earlier than 7.0, Gradle plugin 7.0, and Gradle plugin 7.1 or later. You need to configure it according to the specific Gradle plugin version.

iv. Take Gradle plugin 7.0 as an example:

Open the project-level build.gradle file in your Android Studio project and configure the Maven repository address.

Go to buildscript > repositories and configure the Maven repository address for the SDK.

buildscript {
    repositories {
        google()
        jcenter()
        maven { url "https://developer.huawei.com/repo/" }
    }
}

Open the project-level settings.gradle file and configure the Maven repository address for the HMS Core SDK.

dependencyResolutionManagement {
    repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
    repositories {
        google()
        jcenter()
        maven { url "https://developer.huawei.com/repo/" }
    }
}

v. Add the following build dependency in the dependencies block.

dependencies {
    implementation 'com.huawei.hms:arenginesdk:{version}'
}

App Development

i. Check whether AR Engine has been installed on the current device. If yes, your app can run properly. If not, you need to prompt the user to install it, for example, by redirecting the user to AppGallery. The sample code is as follows:

boolean isInstallArEngineApk = AREnginesApk.isAREngineApkReady(this);
if (!isInstallArEngineApk) {
    // ConnectAppMarketActivity.class is the activity for redirecting users to AppGallery.
    startActivity(new Intent(this, com.huawei.arengine.demos.common.ConnectAppMarketActivity.class));
    isRemindInstall = true;
}
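A natural place for this check is the activity's onResume() callback, before the AR session is created or resumed. The following is a minimal sketch of that arrangement; it reuses the ConnectAppMarketActivity and isRemindInstall flag from the snippet above.

@Override
protected void onResume() {
    super.onResume();
    // Redirect to AppGallery only once. If AR Engine is still missing afterwards,
    // skip the AR-related initialization instead of looping back to the market.
    if (!AREnginesApk.isAREngineApkReady(this) && !isRemindInstall) {
        startActivity(new Intent(this,
            com.huawei.arengine.demos.common.ConnectAppMarketActivity.class));
        isRemindInstall = true;
        return;
    }
    // AR Engine is available: create or resume the ARSession here (see the next steps).
}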

ii. Create an AR scene. AR Engine supports five scenes: motion tracking (ARWorldTrackingConfig), face tracking (ARFaceTrackingConfig), hand recognition (ARHandTrackingConfig), human body tracking (ARBodyTrackingConfig), and image recognition (ARImageTrackingConfig).

The following takes creating a face tracking scene by calling ARFaceTrackingConfig as an example.

// Create an ARSession object.
mArSession = new ARSession(this);
// Select a specific config to initialize the ARSession object based on the application scenario.
mArConfig = new ARFaceTrackingConfig(mArSession);

Set scene parameters using the mArConfig.setXXX methods.

// Set the camera input mode, which can be external or internal. External input is supported only for face tracking (ARFace); for other scenarios, use the internal mode. The line below enables external input for the face tracking scene.
mArConfig.setImageInputMode(ARConfigBase.ImageInputMode.EXTERNAL_INPUT_ALL);

iii. Set the AR scene parameters for face tracking and start face tracking.

mArSession.configure(mArConfig);
mArSession.resume();
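To keep the camera and tracking resources in step with the activity lifecycle, the session should also be paused and stopped when the activity goes to the background or is destroyed. A minimal sketch, assuming the standard ARSession lifecycle methods pause() and stop():

@Override
protected void onPause() {
    super.onPause();
    if (mArSession != null) {
        // Suspend camera capture and face tracking while the activity is not visible.
        mArSession.pause();
    }
}

@Override
protected void onDestroy() {
    super.onDestroy();
    if (mArSession != null) {
        // Release the camera and tracking resources for good.
        mArSession.stop();
        mArSession = null;
    }
}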

iv. Initialize the FaceGeometryDisplay class to obtain the facial geometric data and render the data on the screen.

public class FaceGeometryDisplay {
    // Initialize the OpenGL ES rendering related to face geometry, including creating the shader program.
    void init(Context context) {
        ...
    }
}
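The init() method mainly compiles and links the OpenGL ES shaders used to draw the face mesh. The snippet below is a generic GLES20 sketch of that step; the shader source strings FACE_VERTEX_SHADER and FACE_FRAGMENT_SHADER are placeholders rather than part of the AR Engine SDK.

private int mProgram;

void init(Context context) {
    // Compile the vertex and fragment shaders from the placeholder source strings.
    int vertexShader = GLES20.glCreateShader(GLES20.GL_VERTEX_SHADER);
    GLES20.glShaderSource(vertexShader, FACE_VERTEX_SHADER);
    GLES20.glCompileShader(vertexShader);

    int fragmentShader = GLES20.glCreateShader(GLES20.GL_FRAGMENT_SHADER);
    GLES20.glShaderSource(fragmentShader, FACE_FRAGMENT_SHADER);
    GLES20.glCompileShader(fragmentShader);

    // Link both shaders into the program used later by drawFaceGeometry().
    mProgram = GLES20.glCreateProgram();
    GLES20.glAttachShader(mProgram, vertexShader);
    GLES20.glAttachShader(mProgram, fragmentShader);
    GLES20.glLinkProgram(mProgram);
}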

v. Implement the onDrawFrame method in the FaceGeometryDisplay class, and call face.getFaceGeometry() to obtain the face mesh.

public void onDrawFrame(ARCamera camera, ARFace face) {
    // Obtain the face mesh for the current frame.
    ARFaceGeometry faceGeometry = face.getFaceGeometry();
    updateFaceGeometryData(faceGeometry);
    updateModelViewProjectionData(camera, face);
    drawFaceGeometry();
    // Release the mesh data once it has been rendered.
    faceGeometry.release();
}
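The same callback is also a convenient place to read the facial expression parameters (blend shapes) that drive a virtual character. The fragment below is only a sketch: it assumes ARFace.getFaceBlendShapes() and an ARFaceBlendShapes getter that returns a map from expression name to intensity; check the API reference of your SDK version for the exact method signatures.

// Each entry maps an expression (for example, an eye or mouth movement)
// to a value describing how strongly it is currently expressed.
ARFaceBlendShapes blendShapes = face.getFaceBlendShapes();
for (Map.Entry<String, Float> entry : blendShapes.getBlendShapeDataMap().entrySet()) {
    // Feed the key/value pairs into the animation rig of your virtual character.
    Log.d(TAG, "expression " + entry.getKey() + " = " + entry.getValue());
}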

vi. Implement updateFaceGeometryData() in the FaceGeometryDisplay class.

Pass in the face mesh data and set the facial expression parameters using OpenGL ES.

private void updateFaceGeometryData(ARFaceGeometry faceGeometry) {
    FloatBuffer faceVertices = faceGeometry.getVertices();
    // Obtain an array of face mesh texture coordinates, which is used together with the vertex data returned by getVertices() during rendering.
    FloatBuffer textureCoordinates = faceGeometry.getTextureCoordinates();
}
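Uploading the returned buffers to GPU memory is plain OpenGL ES work. A minimal sketch, assuming a vertex buffer object mVerticeId that was generated earlier with GLES20.glGenBuffers():

// Copy the face mesh vertices into the VBO so that drawFaceGeometry() can render them.
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, mVerticeId);
GLES20.glBufferData(GLES20.GL_ARRAY_BUFFER,
    faceVertices.capacity() * 4,   // 4 bytes per float
    faceVertices,
    GLES20.GL_DYNAMIC_DRAW);
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, 0);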

vii. Initialize the FaceRenderManager class to manage facial data rendering.

public class FaceRenderManager implements GLSurfaceView.Renderer {
    public FaceRenderManager(Context context, Activity activity) {
        mContext = context;
        mActivity = activity;
    }

    // Set ARSession to obtain the latest data.
    public void setArSession(ARSession arSession) {
        if (arSession == null) {
            LogUtil.error(TAG, "Set session error, arSession is null!");
            return;
        }
        mArSession = arSession;
    }

    // Set ARConfigBase to obtain the configuration mode.
    public void setArConfigBase(ARConfigBase arConfig) {
        if (arConfig == null) {
            LogUtil.error(TAG, "setArFaceTrackingConfig error, arConfig is null.");
            return;
        }
        mArConfigBase = arConfig;
    }

    // Set the camera opening mode.
    public void setOpenCameraOutsideFlag(boolean isOpenCameraOutsideFlag) {
        isOpenCameraOutside = isOpenCameraOutsideFlag;
    }

    ...

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        mFaceGeometryDisplay.init(mContext);
    }
}
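Inside the renderer's onDrawFrame() callback, each frame is fetched from the session and every tracked face is handed to FaceGeometryDisplay. The sketch below assumes ARSession.update(), ARFrame.getCamera(), ARSession.getAllTrackables(), and the ARTrackable tracking state constants; names may differ slightly between SDK versions.

@Override
public void onDrawFrame(GL10 gl) {
    if (mArSession == null) {
        return;
    }
    try {
        // Obtain the latest frame and camera data from AR Engine.
        ARFrame arFrame = mArSession.update();
        ARCamera arCamera = arFrame.getCamera();

        // Render the geometry of every face that is currently being tracked.
        Collection<ARFace> faces = mArSession.getAllTrackables(ARFace.class);
        for (ARFace face : faces) {
            if (face.getTrackingState() == ARTrackable.TrackingState.TRACKING) {
                mFaceGeometryDisplay.onDrawFrame(arCamera, face);
            }
        }
    } catch (Throwable t) {
        // Keep unexpected exceptions from crashing the OpenGL thread.
        LogUtil.error(TAG, "Exception on the OpenGL thread.");
    }
}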

viii. Implement the face tracking effect by calling methods like setArSession and setArConfigBase of FaceRenderManager in FaceActivity.

public class FaceActivity extends BaseActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mFaceRenderManager = new FaceRenderManager(this, this);
        mFaceRenderManager.setDisplayRotationManage(mDisplayRotationManager);
        mFaceRenderManager.setTextView(mTextView);

        glSurfaceView.setRenderer(mFaceRenderManager);
        glSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
    }
}
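For the renderer to work, the GLSurfaceView itself needs to be configured before setRenderer() is called. A minimal sketch using standard GLSurfaceView APIs (the view ID R.id.faceSurfaceView is a placeholder):

// Configure the GLSurfaceView before attaching the renderer.
glSurfaceView = findViewById(R.id.faceSurfaceView);   // placeholder view ID
glSurfaceView.setPreserveEGLContextOnPause(true);
// Request an OpenGL ES 2.0 context to match the GLES20 calls sketched above.
glSurfaceView.setEGLContextClientVersion(2);
glSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);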

Conclusion

Emojis allow users to express their moods and excitement in a way words can't. Instead of providing users with a selection of the same old boring preset emojis that have been used a million times, you can now make your app more fun by allowing users to create emojis themselves! Users can easily create an emoji with their own smiles, simply by facing the camera, selecting an animated character they love, and smiling. With such an ability to customize emojis, users will be able to express their feelings in a more personalized and interesting manner.

If you are interested in developing such an app, AR Engine is a great choice. With accurate facial tracking capabilities, it is able to identify users' facial expressions in real time, convert them into parameters, and apply them to virtual characters. Integrating this capability can considerably streamline your app development, leaving you with more time to focus on providing more interesting features and improving your app's user experience.

References

AR Engine Sample Code

AR Engine Development Guide

Face Tracking Capability