Research Article | Open Access

Arabic Static and Dynamic Gestures Recognition Using Leap Motion

Basma Hisham¹ and Alaa Hamouda¹
  • ¹ Al-Azhar University, Egypt

Abstract

Across the world, several million people use sign language as their main means of communicating with their society, and they face daily obstacles with their families, teachers, neighbours and employers. According to the most recent statistics of the World Health Organization, 360 million people worldwide have disabling hearing loss (5.3% of the world's population), around 13 million of them in the Middle East. Hence, the development of automated systems capable of translating sign languages into words and sentences becomes a necessity. We propose a model that recognizes both static gestures, such as numbers and letters, and dynamic gestures, which involve movement and motion in performing the signs. Additionally, we propose a segmentation method that segments a sequence of continuous signs in real time by tracking the palm velocity, which makes the model useful for translating not only pre-segmented signs but also continuous sentences. We use an affordable and compact device, the Leap Motion controller, which detects and tracks the position and motion of the hands and fingers accurately. The proposed model applies several machine learning algorithms, namely Support Vector Machine (SVM), K-Nearest Neighbour (KNN), Artificial Neural Network (ANN) and Dynamic Time Warping (DTW), on two different feature sets. This research will increase the chance for Arabic hearing-impaired and deaf persons to communicate easily using Arabic Sign Language (ArSL). The proposed model works as an interface between hearing-impaired persons and hearing persons who are not familiar with Arabic Sign Language, bridging the gap between them; it is also valuable for social inclusion. The model is applied to Arabic signs comprising 38 static gestures (the 28 letters and the numbers 1 through 10), 16 static words and 20 dynamic gestures. A feature selection process yields two different feature sets: palm features and bone features. For static gestures, the KNN model dominates the other models on both the palm feature set and the bone feature set, with accuracies of 99% and 98% respectively. For dynamic gestures, the DTW model dominates the other models on both the palm feature set and the bone feature set, with accuracies of 97.4% and 96.4% respectively.
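To make the palm-velocity segmentation idea concrete, the sketch below shows one minimal way it could work, assuming a stream of per-frame palm speed magnitudes such as those obtainable from Leap Motion frames. The function name, threshold and pause length are illustrative placeholders, not the parameters used in the paper.

```python
from typing import List, Tuple

def segment_by_palm_speed(
    speeds: List[float],
    threshold: float = 100.0,  # hypothetical rest/motion boundary (mm/s)
    min_pause: int = 3,        # hypothetical count of still frames that ends a sign
) -> List[Tuple[int, int]]:
    """Split a stream of per-frame palm speeds into candidate sign segments.

    A segment opens when the speed rises to `threshold` or above and
    closes once the speed stays below it for `min_pause` consecutive
    frames. Returned pairs are (start_frame, end_frame_exclusive).
    """
    segments: List[Tuple[int, int]] = []
    start = None   # index where the current motion burst began
    still = 0      # consecutive low-speed frames seen since the burst
    for i, speed in enumerate(speeds):
        if speed >= threshold:
            if start is None:
                start = i           # motion begins: open a segment
            still = 0
        elif start is not None:
            still += 1
            if still >= min_pause:  # pause long enough: close the segment
                segments.append((start, i - still + 1))
                start, still = None, 0
    if start is not None:           # stream ended while still in motion
        segments.append((start, len(speeds)))
    return segments

# Toy usage: two bursts of motion separated by near-stillness.
speeds = [10, 20, 250, 300, 280, 40, 30, 20, 10, 5, 220, 260, 30, 10, 5, 5]
print(segment_by_palm_speed(speeds))  # -> [(2, 5), (10, 12)]
```

Each returned frame range would then be passed to a classifier (e.g., DTW for dynamic gestures); the pause requirement keeps brief dips in speed mid-sign from splitting one gesture into two.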

Journal of Computer Science
Volume 13 No. 8, 2017, 337-354

DOI: https://doi.org/10.3844/jcssp.2017.337.354

Submitted On: 15 May 2017
Published On: 12 August 2017

How to Cite: Hisham, B. & Hamouda, A. (2017). Arabic Static and Dynamic Gestures Recognition Using Leap Motion. Journal of Computer Science, 13(8), 337-354. https://doi.org/10.3844/jcssp.2017.337.354

Keywords

  • Sign Language
  • Leap Motion Controller
  • Static Gestures
  • Dynamic Gestures