Augmented reality (AR) bridges the real and virtual worlds by integrating digital content into real-world environments, allowing people to interact with virtual objects as if they were real. Examples include product displays in shopping apps, interior design layouts in home design apps, accessible learning materials, real-time navigation, and immersive AR games. AR technology makes digital services and experiences more accessible than ever.
This has enormous implications in daily life. For instance, when shooting short videos or taking selfies, users can switch between special effects or control the shutter with specific gestures, sparing them from having to touch the screen. When browsing clothes or accessories on an e-commerce website, users can use AR to "wear" the items virtually and determine which clothing articles fit them, or which accessories match which outfits. All of these services depend on precise hand gesture recognition, which HMS Core AR Engine provides via its hand skeleton tracking capability. If you are considering developing an app with AR features, you would be remiss not to check out this capability, as it can substantially streamline your app development process.
The hand skeleton tracking capability works by detecting and tracking the positions and postures of up to 21 hand skeleton joints, and generating a true-to-life hand skeleton model with attributes such as fingertip endpoints and palm orientation, in addition to the skeleton itself. Note that when more than one hand appears in an image, the service returns results and coordinates only for the hand it tracks with the highest degree of confidence. Currently, this service is supported only on certain Huawei phone models that are capable of obtaining image depth information.
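To make the description above concrete, here is a minimal sketch of what a single tracking result might look like as a data model, including the engine's behavior of reporting only the most confident hand. All names here are hypothetical for illustration; the actual AR Engine classes and method names differ.

```java
import java.util.Comparator;
import java.util.List;

/** Illustrative model of one hand-tracking result (hypothetical names,
 *  not the actual AR Engine API). */
class HandResult {
    static final int JOINT_COUNT = 21;                  // joints tracked per hand
    final float[][] joints = new float[JOINT_COUNT][3]; // x, y, z per joint
    final float[] palmNormal = new float[3];            // palm orientation vector
    float confidence;                                   // tracking confidence, 0..1

    /** Mirrors the service's behavior: with several hands in frame,
     *  only the highest-confidence hand is reported. */
    static HandResult bestOf(List<HandResult> candidates) {
        return candidates.stream()
                .max(Comparator.comparingDouble(h -> h.confidence))
                .orElse(null);
    }
}
```

The key takeaway is that downstream code can assume exactly one hand's worth of joint coordinates per frame, which simplifies gesture logic considerably.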
AR Engine detects the hand skeleton precisely, allowing your app to superimpose virtual objects on the hand, including on the fingertips or palm, with a high degree of accuracy. You can also perform more precise operations on virtual hands, enriching your AR app with fun new experiences and interactions.
Hand skeleton diagram
Application Scenarios
Simple Sign Language Translation
The hand skeleton tracking capability can also be used to translate simple sign language gestures. By detecting key hand skeleton joints, it predicts how the hand posture changes and, using a set of algorithms, maps movements such as finger bending to predefined gestures. For example, a fist with the index finger extended maps to the number one (1). This means that the kit can help equip your app with sign language recognition and translation features.
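The mapping described above can be sketched in two steps: first decide, from the joint positions, whether each finger is extended (joints roughly collinear) or bent, then match the resulting per-finger pattern against predefined gestures. This is a simplified illustration of the idea, not AR Engine's internal algorithm; the gesture names and the 0.8 collinearity threshold are assumptions.

```java
import java.util.Arrays;

/** Sketch of mapping per-finger extension states to predefined gestures. */
class GestureMapper {
    /** A finger counts as extended when its joints are roughly collinear:
     *  the cosine of the bend angle at the middle joint is near 1. */
    static boolean isExtended(float[] mcp, float[] pip, float[] tip) {
        float[] a = {pip[0] - mcp[0], pip[1] - mcp[1], pip[2] - mcp[2]};
        float[] b = {tip[0] - pip[0], tip[1] - pip[1], tip[2] - pip[2]};
        float dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
        float na = (float) Math.sqrt(a[0] * a[0] + a[1] * a[1] + a[2] * a[2]);
        float nb = (float) Math.sqrt(b[0] * b[0] + b[1] * b[1] + b[2] * b[2]);
        return dot / (na * nb) > 0.8f;  // assumed "straight enough" threshold
    }

    /** Finger order: thumb, index, middle, ring, pinky. true = extended. */
    static String classify(boolean[] extended) {
        if (match(extended, false, true, false, false, false)) return "ONE";
        if (match(extended, false, true, true, false, false))  return "TWO";
        if (match(extended, true, false, false, false, false)) return "THUMBS_UP";
        if (match(extended, false, false, false, false, false)) return "FIST";
        return "UNKNOWN";
    }

    private static boolean match(boolean[] actual, boolean... pattern) {
        return Arrays.equals(actual, pattern);
    }
}
```

For example, the "fist with the index finger extended" posture from the text yields the pattern {false, true, false, false, false}, which classifies as "ONE".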
Building a Contactless Operation Interface
In science fiction movies, it is quite common to see a character controlling a computer panel with air gestures. With the skeleton tracking capability in AR Engine, this mind-bending technology is no longer out of reach.
With the phone's camera tracking the user's hand in real time, key skeleton joints such as the fingertips are identified with a high degree of precision, allowing the user to interact with virtual objects through simple gestures. For example, pressing down on a virtual button can trigger an action, pressing and holding a virtual object can display the menu options, spreading two fingers apart on a small object can enlarge it to show its details, and a virtual object can be resized and placed in a virtual pocket.
Such contactless gesture-based controls have been widely used in fields as diverse as medical equipment and vehicle head units.
Interactive Short Videos & Live Streaming
The hand skeleton tracking capability in AR Engine can help with adding gesture-based special effects to short videos or live streams. For example, when the user is shooting a short video or starting a live stream, the capability enables your app to identify their gestures, such as a V-sign, thumbs-up, or finger heart, and then apply the corresponding special effects or stickers. This makes the interactions more engaging and immersive, and gives your app an edge over competitor apps.
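Once a gesture label is recognized per frame, routing it to an overlay effect can be as simple as a lookup table. This is a minimal sketch; the gesture labels and effect names are hypothetical placeholders for whatever your rendering layer uses.

```java
import java.util.HashMap;
import java.util.Map;

/** Sketch: route per-frame gesture labels to overlay effects
 *  (labels and effect names are illustrative). */
class EffectDispatcher {
    private final Map<String, String> effects = new HashMap<>();

    EffectDispatcher() {
        effects.put("V_SIGN", "confetti_sticker");
        effects.put("THUMBS_UP", "like_animation");
        effects.put("FINGER_HEART", "heart_particles");
    }

    /** Returns the effect to apply for this gesture, or null for no change. */
    String onGesture(String gesture) {
        return effects.get(gesture);
    }
}
```

Keeping the mapping in data rather than branching logic makes it easy to ship new sticker packs without touching the recognition code.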
Hand skeleton tracking is also ideal in contexts like animation, course material presentation, medical training and imaging, and smart home controls.
The rapid development of AR technologies has made gesture-based human-computer interaction a hot topic throughout the industry. Implementing natural, human-friendly gesture recognition is key to making these interactions more engaging, and hand skeleton tracking is the foundation for gesture recognition. By integrating AR Engine, you will be able to use this tracking capability to develop AR apps that provide users with engaging, effortless features. Apps that offer such outstanding AR features will undoubtedly deliver an enhanced user experience that helps them stand out from the myriad of competitor apps.
Conclusion
Augmented reality is one of the most exciting technological developments of the past few years, and a proven method for presenting a variety of digital content, including text, graphics, and videos, in a visually immersive manner. An increasing number of apps are now opting to provide AR-based features of their own, in order to deliver an interactive and easy-to-use experience in fields as diverse as medical training, interior design and modeling, real-time navigation, virtual classrooms, health care, and entertainment. Hand gesture recognition is at the core of this trend. If you are currently developing an app, the right development kit, one that offers all the preset capabilities you need, is key to reducing the development workload and building the features you want, letting you focus on optimizing the app's feature design and user experience. AR Engine offers an effective and easy-to-use hand gesture tracking capability for AR apps. By integrating this kit, your app will be able to identify user hand gestures precisely and in real time, implement responsive user-device interactions based on the detected gestures, and therefore provide users with a highly immersive and engaging AR experience.