Abstract
The purpose of this paper is to design, develop, and evaluate four devices capable of identifying the configuration, orientation, and movement of the hands, and to verify which one performs best at recognizing sign language gestures. The methodology proceeds from the definition of the layout and the components for data acquisition and processing, through the construction of a curated database for each gesture to be recognized, to the validation of the proposed devices. Signals are collected from flex sensors, accelerometers, and gyroscopes, positioned differently on each device. The patterns of each gesture are recognized using artificial neural networks. After training, validation, and testing, the neural network interconnected with the devices achieves a hit rate of up to 96.8%. The validated device identifies sign language gestures effectively and efficiently, demonstrating that this sensor-based approach is promising.