Deaf and Mute Sign Language Translator on Static Alphabets Gestures using MobileNet

Venkatesh Kandukuri, Srujal Reddy Gundedi, Vipin Kamble, Vishal Satpute

2nd International Conference on Paradigm Shifts in Communications Embedded Systems, Machine Learning and Signal Processing (PCEMS), 2023

Abstract
Sign language is the language used by deaf and mute people to communicate with others. Deaf and mute people express their thoughts and ideas through hand movements, facial expressions, or gestures. However, interpreting sign language can be challenging for individuals who are not fluent in it. Current sign language recognition methods often rely on expensive hardware such as depth cameras or specialized gloves, which can be a barrier to widespread adoption. In this paper, we propose a low-cost solution for sign language recognition using MobileNet, a lightweight convolutional neural network architecture. This paper deals with the static American Sign Language alphabet (J and Z are excluded because they are dynamic). The proposed model extracts features from the input image and classifies them. The model is able to predict the alphabet letter corresponding to each sign. A fingerspelling dataset is used to train and test the model. The proposed model recognized signs with an accuracy of 99.93%. The obtained results and graphs show that the system is able to predict the sign correctly.
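The paper does not publish its training code, but the described pipeline (a MobileNet backbone classifying the 24 static ASL letters) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation; the image size, optimizer settings, and the train_ds/val_ds dataset pipelines are assumptions.

```python
import tensorflow as tf

NUM_CLASSES = 24          # static ASL alphabet: 26 letters minus dynamic J and Z
IMG_SIZE = (224, 224)     # MobileNet's standard input resolution

# Pretrained MobileNet as the feature extractor; a small dense head
# classifies the pooled features into the 24 static letters.
base = tf.keras.applications.MobileNet(
    input_shape=IMG_SIZE + (3,),
    include_top=False,
    weights="imagenet",
    pooling="avg",
)
base.trainable = False

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-3),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

# train_ds / val_ds are placeholder tf.data pipelines over a fingerspelling
# dataset (image, label) pairs; they are not specified in the paper.
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```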
Keywords
Depthwise Convolution, Depthwise Separable Convolution, Pointwise Convolution, Width Multiplier, MobileNet
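The keywords name the building block that makes MobileNet lightweight: a depthwise convolution followed by a pointwise (1x1) convolution, optionally thinned by a width multiplier. A hedged sketch of such a block is shown below; the layer ordering and batch-norm placement follow the standard MobileNet design, not code from this paper.

```python
import tensorflow as tf

def depthwise_separable_block(x, filters, width_multiplier=1.0, stride=1):
    """One MobileNet-style block: depthwise conv then 1x1 pointwise conv.

    The width multiplier (alpha) scales the number of output channels,
    trading accuracy for reduced computation and model size.
    """
    filters = int(filters * width_multiplier)

    # Depthwise convolution: one 3x3 filter applied per input channel.
    x = tf.keras.layers.DepthwiseConv2D(3, strides=stride, padding="same",
                                        use_bias=False)(x)
    x = tf.keras.layers.BatchNormalization()(x)
    x = tf.keras.layers.ReLU()(x)

    # Pointwise convolution: 1x1 conv mixes channels into `filters` outputs.
    x = tf.keras.layers.Conv2D(filters, 1, padding="same", use_bias=False)(x)
    x = tf.keras.layers.BatchNormalization()(x)
    return tf.keras.layers.ReLU()(x)
```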