Recognition of gestures in Arabic sign language using neuro-fuzzy systems (Q5958394)

Property / describes a project that uses: ANFIS
Property / MaRDI profile type: MaRDI publication profile
Property / cites work: Q3999407
Revision as of 23:23, 3 June 2024

scientific article; zbMATH DE number 1715395
Language: English
Label: Recognition of gestures in Arabic sign language using neuro-fuzzy systems
Description: scientific article; zbMATH DE number 1715395

    Statements

    Recognition of gestures in Arabic sign language using neuro-fuzzy systems (English)
    3 March 2002
    Hand gestures play an important role in everyday human communication, but their most extensive use as a means of communication is found in sign languages. Sign language is the primary communication method among deaf people, and a translator is usually needed when a hearing person wants to communicate with a deaf one. The work presented in this paper aims at developing a system for the automatic translation of gestures of the manual alphabet in Arabic sign language. To this end, we have designed a collection of ANFIS networks, each of which is trained to recognize one gesture. Our system does not rely on gloves or visual markings to accomplish recognition; instead, it operates on images of bare hands, which allows the user to interact with the system in a natural way. An image of the hand gesture is processed and converted into a set of features consisting of the lengths of vectors selected to span the fingertips' region. The extracted features are rotation-, scale-, and translation-invariant, which makes the system more flexible. The subtractive clustering algorithm and the least-squares estimator are used to identify the fuzzy inference system, and training is achieved using the hybrid learning algorithm. Experiments showed that our system recognized the 30 Arabic manual alphabets with an accuracy of \(93.55\%\).
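    The invariance property described above can be illustrated with a minimal sketch (a hypothetical helper, not the paper's actual code): if features are taken as the lengths of vectors from the hand centroid to boundary points, then subtracting the centroid removes translation, using lengths alone removes rotation, and normalizing by the maximum length removes scale.

```python
import numpy as np

def invariant_features(boundary_points, n_vectors=30):
    """Hypothetical sketch of rotation-, scale-, and translation-
    invariant features: lengths of vectors from the hand centroid
    to sampled boundary points, normalized by the maximum length."""
    pts = np.asarray(boundary_points, dtype=float)
    centroid = pts.mean(axis=0)             # removes translation
    vecs = pts - centroid
    lengths = np.linalg.norm(vecs, axis=1)  # lengths ignore rotation
    # sample a fixed number of lengths evenly along the boundary
    idx = np.linspace(0, len(lengths) - 1, n_vectors).astype(int)
    feats = lengths[idx]
    return feats / feats.max()              # removes scale
```

    Rotating, scaling, or shifting the same contour leaves the feature vector unchanged, which is what lets a per-gesture ANFIS classifier ignore how the hand happens to be positioned in the image.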
    hand gestures
    sign language
    recognition
    neuro-fuzzy
    Arabic sign language
    deaf people