Researchers present a new hand gesture recognition algorithm combining a hand-type adaptive algorithm and the effective area ratio for efficient edge computing

Almost all of our computer interactions occur through mice, keyboards, and touch screens. A key step toward more efficient human-machine interaction would be a move to non-contact forms of communication such as speech, facial expressions, and gestures, which we typically use when communicating with other humans. However, previous studies on hand gesture recognition have failed to achieve high accuracy while maintaining low computational complexity. The poor recognition accuracy stems mainly from the rotation, translation, and scaling that often occur when capturing images of hand gestures, and from the differences in hand shape from one person to another.

To improve hand gesture recognition while keeping the computational load low, a group of researchers from Sun Yat-sen University developed a hand-type adaptive algorithm. The algorithm first classifies input images by hand type into thin, normal, and wide categories, using three characteristics: palm length, palm width, and finger length. The recognition task addressed in the research covers 9 hand gestures. To build the overall gesture library, 360 images of hand gestures were captured (9 from each of the 40 volunteers who participated), and dedicated libraries for each hand type are then used to perform further classification. As features, the research group selected the area-to-perimeter ratio (C) and the effective area ratio (E), as well as the seven Hu moments, which offer low computational complexity and are largely invariant to rotation, translation, and scaling. The feature vector of each hand gesture image can thus be denoted {C, E, Hu1, …, Hu7}.
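As a rough illustration of what such a feature vector involves, the sketch below computes {C, E, Hu1, …, Hu7} from a binary hand mask. The exact definitions of C and E used in the paper are not given here, so the perimeter estimate and the choice of the bounding box as the reference area for E are assumptions; the seven Hu moments follow their standard formulas.

```python
import numpy as np

def hu_moments(img):
    """The seven standard Hu moment invariants of a grayscale/binary image."""
    img = img.astype(float)
    y, x = np.mgrid[: img.shape[0], : img.shape[1]]
    m00 = img.sum()
    xbar, ybar = (x * img).sum() / m00, (y * img).sum() / m00
    # Normalized central moments eta_pq = mu_pq / m00^(1 + (p+q)/2)
    def eta(p, q):
        mu = (((x - xbar) ** p) * ((y - ybar) ** q) * img).sum()
        return mu / m00 ** (1 + (p + q) / 2)
    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    h1 = n20 + n02
    h2 = (n20 - n02) ** 2 + 4 * n11 ** 2
    h3 = (n30 - 3 * n12) ** 2 + (3 * n21 - n03) ** 2
    h4 = (n30 + n12) ** 2 + (n21 + n03) ** 2
    h5 = ((n30 - 3 * n12) * (n30 + n12) * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          + (3 * n21 - n03) * (n21 + n03) * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    h6 = ((n20 - n02) * ((n30 + n12) ** 2 - (n21 + n03) ** 2)
          + 4 * n11 * (n30 + n12) * (n21 + n03))
    h7 = ((3 * n21 - n03) * (n30 + n12) * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          - (n30 - 3 * n12) * (n21 + n03) * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    return [h1, h2, h3, h4, h5, h6, h7]

def hand_features(mask):
    """Feature vector {C, E, Hu1..Hu7} for a binary hand mask (assumed layout)."""
    mask = mask.astype(bool)
    area = mask.sum()
    # Crude perimeter: foreground pixels with at least one background 4-neighbour.
    p = np.pad(mask, 1)
    interior = p[:-2, 1:-1] & p[2:, 1:-1] & p[1:-1, :-2] & p[1:-1, 2:]
    C = area / max((mask & ~interior).sum(), 1)   # area-to-perimeter ratio
    ys, xs = np.nonzero(mask)
    bbox = (ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1)
    E = area / bbox                                # effective area ratio (assumed: vs bounding box)
    return np.array([C, E, *hu_moments(mask)])
```

Because the Hu moments are rotation invariants, the tail of the vector is unchanged under a 90-degree rotation of the mask, which is exactly the robustness property the paper relies on.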

The first step in the recognition pipeline is hand-type adaptation, carried out using the area-to-perimeter ratio. The user's C values for the 9 gestures are compared against the thin, normal, and wide classes, and the class with the lowest Euclidean distance is selected. Next, gesture pre-recognition (a shortcut step) uses only the effective area ratio (E) to select 3 candidate gestures out of the 9, again based on Euclidean distance. Finally, a more complex, high-precision step using the seven Hu moment features, also based on Euclidean distance, is applied to the 3 candidate gestures to recognize the final hand gesture. This pipeline design can reduce computational load, increase recognition speed, and improve accuracy, provided the set of hand gestures is chosen carefully.
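The three stages above can be sketched as a cascade of nearest-neighbour selections. The data layout below (a per-hand-type table with feature columns [C, E, Hu1..Hu7] for each of the 9 gestures, plus a stored 9-gesture C profile) is an assumption for illustration, not the paper's actual data structure.

```python
import numpy as np

def recognize(user_C_profile, feat, libraries):
    """Three-stage recognition sketch.

    user_C_profile: the user's area-to-perimeter ratios for all 9 gestures.
    feat:           feature vector [C, E, Hu1..Hu7] of the input image.
    libraries:      {hand_type: {"C": (9,) profile, "features": (9, 9) table}}
                    for hand types "thin", "normal", "wide" (assumed layout).
    """
    # Stage 1: hand-type adaptation -- nearest C profile by Euclidean distance.
    hand_type = min(libraries,
                    key=lambda t: np.linalg.norm(libraries[t]["C"] - user_C_profile))
    table = libraries[hand_type]["features"]

    # Stage 2: shortcut pre-recognition on E alone -- keep 3 closest gestures.
    dE = np.abs(table[:, 1] - feat[1])
    candidates = np.argsort(dE)[:3]

    # Stage 3: high-precision match over the seven Hu moments.
    dHu = np.linalg.norm(table[candidates, 2:] - feat[2:], axis=1)
    return hand_type, int(candidates[np.argmin(dHu)])
```

The shortcut in stage 2 is what keeps the load low: the expensive 7-dimensional Hu comparison runs on only 3 gestures instead of all 9.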

With the images in fixed positions, recognition accuracy was better than 94% across the different test cases. Even with deformations added to the gesture images, accuracy remained above 93%. The algorithm was also implemented on an FPGA to verify that it works on hardware with limited resources; the FPGA implementation achieved an accuracy of 94.99%, comparable to that achieved on an Intel Core-series CPU. With smart technologies such as smart TVs, contactless vending machines, virtual in-store displays, and other camera-equipped devices entering our daily lives, it is becoming ever more important to be able to control these devices using hand gestures. There is therefore a need for lightweight algorithms capable of running on the embedded systems installed in such smart devices.

Unlike the current trend of developing algorithms based on AI and deep learning (DL), the method proposed by the researchers has much lower hardware requirements and is ideally suited to embedded computing platforms. The research team plans to continue innovating in this area, increasing the number of recognizable hand gestures and handling more complex backgrounds and poor lighting conditions. Remarkable breakthroughs in gesture-based human-machine interaction can be expected in the near future.

Article: DOI 10.1117/1.JEI.30.6.063026
