Using The Hausdorff Algorithm to Enhance Kinect's Recognition of Arabic Sign Language Gestures

International Journal of Experimental Algorithms (IJEA), 7 (1): 1-18 (April 2017)


The objective of this research is to use mathematical algorithms to overcome the limitations of the Kinect depth sensor in detecting the movement and details of the fingers and joints used in the Arabic alphabet sign language (ArSL). This research proposes a model to accurately recognize and interpret a specific set of ArSL alphabet signs using Microsoft's Kinect SDK Version 2 and a supervised classifier based on the Hausdorff distance, implemented with the Candescent library. The model's dataset, which serves as the algorithm's prior knowledge, was collected by having volunteers gesture selected letters of the Arabic alphabet. To classify a gestured letter accurately, the algorithm first filters candidate signs by the number of visible fingers, then computes the Euclidean distance between the contour points of the gestured sign and those of each stored sign, and compares the result against an appropriate threshold. To evaluate the classifier, participants gestured different letters, and the Euclidean distance values computed for each gesture were compared against those of the stored gestures. The class name closest to the gestured sign appeared directly in the display window, and the classifier's results were then analyzed mathematically. When unknown incoming gestures were compared with the stored gestures in the collected dataset, the model matched each gesture with the correct letter with high accuracy.
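The matching step described above (contour-to-contour Euclidean distances reduced to a single Hausdorff score, then thresholded) can be sketched in Python. This is a minimal illustration, not the paper's implementation: the function names, the example templates, and the threshold value are all hypothetical, and the paper's additional filtering by finger count is only noted in a comment.

```python
import math

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two 2-D contour point sets.

    For each point in one set, find the Euclidean distance to its nearest
    point in the other set; the Hausdorff distance is the largest such
    nearest-neighbour distance, taken in both directions.
    """
    def directed(p, q):
        return max(min(math.dist(x, y) for y in q) for x in p)
    return max(directed(a, b), directed(b, a))

def classify(gesture, templates, threshold):
    """Match a gestured contour against stored sign templates.

    gesture:   list of (x, y) contour points from the incoming sign.
    templates: dict mapping a letter name to its stored contour points.
               (In the paper, templates would first be filtered by the
               number of visible fingers before distances are computed.)
    Returns the closest letter, or None if no template is within threshold.
    """
    best_letter, best_dist = None, float("inf")
    for letter, contour in templates.items():
        d = hausdorff(gesture, contour)
        if d < best_dist:
            best_letter, best_dist = letter, d
    return best_letter if best_dist <= threshold else None
```

For example, with two toy templates, a vertical stroke matches the vertically shaped class:

```python
templates = {
    "alif": [(0, 0), (0, 1), (0, 2)],   # hypothetical vertical contour
    "ba":   [(0, 0), (1, 0), (2, 0)],   # hypothetical horizontal contour
}
classify([(0, 0), (0, 1.1), (0, 2)], templates, threshold=0.5)  # → "alif"
```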
