Development of a PC-based sign language translator
Abstract
While a hearing-impaired individual depends on sign language and gestures, a non-hearing-impaired person uses verbal language. Thus, there is a need for a means of arbitration to forestall situations in which a non-hearing-impaired individual who does not understand sign language wants to communicate with a hearing-impaired person. This paper is concerned with the development of a PC-based sign language translator to facilitate effective communication between hearing-impaired and non-hearing-impaired persons. A database of hand gestures in American Sign Language (ASL) is created using Python scripts. TensorFlow (TF) is used to create a pipeline configuration model for machine learning, which compares annotated images of gestures in the database with real-time gestures. The implementation is done in a Python software environment and runs on a PC equipped with a web camera that captures real-time gestures for comparison and interpretation. The developed sign language translator is able to translate ASL gestures into written text along with corresponding audio renderings in an average duration of about one second. In addition, the translator is able to match real-time gestures with the equivalent gesture images stored in the database even at 44% similarity.
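The 44% similarity figure above suggests a threshold on the detection score returned by the trained model. A minimal sketch of that matching step is shown below; the function name `match_gesture` and the `(label, score)` detection format are assumptions for illustration, not the authors' actual code, and the score pairs would in practice come from a TensorFlow object-detection model run on each webcam frame.

```python
# Hypothetical sketch of the matching step described in the abstract:
# a real-time gesture is accepted as a match for a database gesture
# when the model's similarity score reaches the 44% threshold.

SIMILARITY_THRESHOLD = 0.44  # lowest score at which a match is still accepted

def match_gesture(detections):
    """Return the label of the best detection at or above the threshold,
    or None when no database gesture is similar enough.

    `detections` is a list of (label, score) pairs, e.g. as produced by
    running a TensorFlow detection model on a single webcam frame.
    """
    best = max(detections, key=lambda d: d[1], default=None)
    if best is not None and best[1] >= SIMILARITY_THRESHOLD:
        return best[0]  # label to display as text and render as audio
    return None

# Example: the model is 51% confident the frame shows the sign "hello".
print(match_gesture([("hello", 0.51), ("thanks", 0.20)]))  # → hello
print(match_gesture([("yes", 0.30)]))                      # → None
```

The accepted label would then be displayed as text and passed to a text-to-speech component for the audio rendering the abstract describes.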
Keywords
Communication aid; Gestures; Hearing-impaired; Sign language; Translator
DOI: http://doi.org/10.11591/ijict.v12i1.pp23-31
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
The International Journal of Informatics and Communication Technology (IJ-ICT)
p-ISSN 2252-8776, e-ISSN 2722-2616
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).