MAVI: Mobility Assistant for Visually Impaired with Optional Use of Local and Cloud Resources

2019 32nd International Conference on VLSI Design and 2019 18th International Conference on Embedded Systems (VLSID), 2019

Abstract
Independent mobility of visually impaired people is key to making society inclusive for them. Unstructured infrastructure in developing countries poses significant challenges in developing aids that address the mobility problems of the visually impaired. Most assistive devices available internationally assume a structured and controlled environment, which severely restricts their applicability. In this paper, we assess the ability of state-of-the-art assistive devices to address the independent outdoor mobility needs of the visually impaired in an unstructured environment. We have created realistic datasets for various scenarios and evaluated deep neural networks for object detection on these datasets. We also present a portable prototype for the task. Further, we have developed a cloud-based solution to address the mobility requirements. We compare the local device-based and cloud-based solutions in terms of accuracy, latency, and energy. We present and discuss results from these two implementations that can provide insights for an effective solution. The results and insights open up novel research problems for embedded systems.
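The abstract contrasts on-device and cloud-offloaded object detection by accuracy, latency, and energy. Below is a minimal sketch, not the authors' implementation, of how such a latency comparison could be instrumented: a generic pretrained detector stands in for the paper's networks, and the cloud endpoint URL, its request format, and the image path are hypothetical placeholders.

```python
# Sketch: compare on-device detection latency with a cloud-offload round trip.
# The cloud endpoint and its API are assumptions for illustration only.
import time

import requests
import torch
import torchvision
from PIL import Image

CLOUD_ENDPOINT = "https://example.com/mavi/detect"  # hypothetical endpoint


def local_detect(model, image_tensor):
    """Run object detection locally and return (detections, elapsed seconds)."""
    start = time.perf_counter()
    with torch.no_grad():
        detections = model([image_tensor])[0]
    return detections, time.perf_counter() - start


def cloud_detect(image_path):
    """Upload the image to a (hypothetical) cloud detector and time the round trip."""
    with open(image_path, "rb") as f:
        payload = f.read()
    start = time.perf_counter()
    response = requests.post(CLOUD_ENDPOINT, files={"image": payload}, timeout=10)
    return response.json(), time.perf_counter() - start


if __name__ == "__main__":
    image_path = "scene.jpg"  # any outdoor road-scene photograph (placeholder path)
    image = Image.open(image_path).convert("RGB")
    tensor = torchvision.transforms.functional.to_tensor(image)

    # Generic pretrained detector used as a stand-in for the evaluated networks.
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    model.eval()

    _, local_s = local_detect(model, tensor)
    _, cloud_s = cloud_detect(image_path)
    print(f"local inference: {local_s:.2f}s, cloud round trip: {cloud_s:.2f}s")
```

In practice, the cloud path trades network round-trip time and connectivity requirements for lower on-device compute and energy, which is the trade-off the paper's measurements examine.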
Key words
MAVI, visually impaired people, unstructured infrastructure, deep neural networks, object detection, assistive devices, independent outdoor mobility, cloud resource, local resource, portable prototype, embedded system