General Information
    • ISSN: 1793-8201 (Print), 2972-4511 (Online)
    • Abbreviated Title: Int. J. Comput. Theory Eng.
    • Frequency: Quarterly
    • DOI: 10.7763/IJCTE
    • Editor-in-Chief: Prof. Mehmet Sahinoglu
    • Associate Editor-in-Chief: Assoc. Prof. Alberto Arteta, Assoc. Prof. Engin Maşazade
    • Managing Editor: Ms. Mia Hu
    • Abstracting/Indexing: Scopus (Since 2022), INSPEC (IET), CNKI, Google Scholar, EBSCO, etc.
    • Average Days from Submission to Acceptance: 192 days
    • E-mail: ijcte@iacsitp.com

Editor-in-chief
Prof. Mehmet Sahinoglu
Computer Science Department, Troy University, USA
I am happy to take on the position of Editor-in-Chief of IJCTE. We encourage authors to submit papers on any branch of computer theory and engineering.

IJCTE 2012 Vol.4(6): 998-1001 ISSN: 1793-8201
DOI: 10.7763/IJCTE.2012.V4.625

An Intelligent Multi-Gesture Spotting Robot to Assist Persons with Disabilities

Amarjot Singh, Devinder Kumar, Phani Srikanth, Srikrishna Karanam, and Niraj Acharya

Abstract—Sign language is used all over the world by hearing-impaired and disabled people to communicate with each other and with the rest of the world. Sign language is more than just moving the fingers or hands; it is a viable and visible language in which gestures and facial expressions play a very important role. These signs can be used effectively for communication with humans, but for communication with machines, better methodologies and algorithms have to be developed. This paper presents a system that disabled people who communicate only through sign language can use to interact with machines in their daily lives. The system focuses on pose estimation of the gestures used by physically disabled people to give suitable signals to machines. The pose is generated using silhouettes, followed by gesture recognition using the Mahalanobis distance metric. The system was further used to control a wireless robot through the different gestures used by disabled people. The basic American Sign Language symbols I Hand, II Hand, Aboard, All Gone, etc. were recognized, and for each gesture a specific signal was transmitted to control the robot. The robot performs forward, backward, left, right, and stop actions in response to the corresponding gesture presented by the disabled person. The present system can also be extended to send wireless SMS messages or e-mails in emergency situations. The system demonstrated effectiveness in estimating motion and pose for interpreting sign language from the silhouettes of the pose. The proposed system recognizes sign language, thus providing disabled people a medium to communicate with machines and simplifying their day-to-day work.

Index Terms—Silhouette, motion segmentation, wireless robot, gesture recognition.
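
The gesture-matching step outlined in the abstract (comparing a feature vector extracted from a pose silhouette against stored gesture classes using the Mahalanobis distance, then mapping the matched class to a robot command) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the feature layout, class labels, command mapping, and regularization constant are all assumptions.

```python
import numpy as np

# Hypothetical mapping from gesture-class labels to robot commands
# (the paper maps five gestures to these five actions; labels assumed).
COMMANDS = {0: "forward", 1: "backward", 2: "left", 3: "right", 4: "stop"}

def fit_class_stats(features_by_class):
    """For each gesture class, compute the mean feature vector and the
    inverse covariance matrix from its training silhouettes' features."""
    stats = {}
    for label, feats in features_by_class.items():
        X = np.asarray(feats, dtype=float)
        mean = X.mean(axis=0)
        # Small ridge keeps the covariance invertible for few samples.
        cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
        stats[label] = (mean, np.linalg.inv(cov))
    return stats

def classify(stats, x):
    """Return the class label minimizing the Mahalanobis distance
    d(x) = sqrt((x - mu)^T S^{-1} (x - mu)) to the class mean mu."""
    x = np.asarray(x, dtype=float)
    def mdist(item):
        mean, inv_cov = item[1]
        d = x - mean
        return float(np.sqrt(d @ inv_cov @ d))
    return min(stats.items(), key=mdist)[0]
```

In use, the label returned by `classify` would index into `COMMANDS` to pick the signal transmitted to the wireless robot.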

The authors are with the National Institute of Technology, Warangal 506004, India (e-mail: amarjotsingh@ieee.org).


Cite: Amarjot Singh, Devinder Kumar, Phani Srikanth, Srikrishna Karanam, and Niraj Acharya, "An Intelligent Multi-Gesture Spotting Robot to Assist Persons with Disabilities," International Journal of Computer Theory and Engineering, vol. 4, no. 6, pp. 998-1001, 2012.


Copyright © 2008-2024. International Association of Computer Science and Information Technology. All rights reserved.