Real-Time Navigation Smart Devices Can Be Worn by the Visually Impaired

In an article recently published in the Alexandria Engineering Journal, researchers discussed the development of an intelligent and real-time wearable navigation support system for blind and visually impaired people (BVIP).


Background

According to the World Health Organization (WHO), 1.3 billion individuals worldwide have some form of visual impairment.

For these people to perform everyday tasks, assistive devices must deliver information quickly, in an easy-to-understand format, and without relying on visuals. From a design perspective, the device itself should be light, simple to operate, and not a source of annoyance or distraction.

The white cane is the most widely used navigation aid for blind people. Cane-based navigation, however, is subject to a number of size and weight limitations, and many techniques based on smart sensor technology and digital data processing have been developed to overcome them.

About the Study

In this study, the authors discussed the development of a sophisticated navigational tool for BVIP. The proposed design combined fuzzy logic-based decision assistance, a collection of high-performance sensors, a Raspberry Pi 4 board for real-time processing, and a haptic-voice interface for user guidance.

The control architecture was built on the Robot Operating System (ROS), which connected the various system nodes and delivered the final decision as a voice and haptic message. The safe path for the user was determined by a security assessment system that combined sensor data with a fuzzy classifier.
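The article does not detail the node graph or message types used in the paper, but a minimal sketch of one such ROS node might look like the following. The topic names (/ultrasonic_front, /guidance_decision), the use of Python/rospy, and the 0.8 m threshold are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of one node in a ROS-based guidance architecture.
# Topic names, message types, and thresholds are illustrative assumptions.
import rospy
from sensor_msgs.msg import Range
from std_msgs.msg import String


class GuidanceNode:
    def __init__(self):
        rospy.init_node("guidance_node")
        # Latest distance (in meters) reported by the front ultrasonic sensor.
        self.front_range = None
        rospy.Subscriber("/ultrasonic_front", Range, self.on_range)
        self.decision_pub = rospy.Publisher("/guidance_decision", String, queue_size=1)

    def on_range(self, msg):
        self.front_range = msg.range

    def spin(self):
        rate = rospy.Rate(10)  # evaluate the scene at 10 Hz
        while not rospy.is_shutdown():
            if self.front_range is not None:
                # The real system fuses several sensors through a fuzzy
                # classifier; a single threshold stands in for that step here.
                decision = "stop" if self.front_range < 0.8 else "clear"
                self.decision_pub.publish(String(data=decision))
            rate.sleep()


if __name__ == "__main__":
    GuidanceNode().spin()
```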

The team demonstrated the efficiency of the proposed system for BVIP through experimental tests conducted in various contexts. The study proposed a new wearable device design that instantly converted obstacle depth into speech and vibration: the embedded fuzzy controller converted the distance measured by the sensors into vibration intensity and used speech technology to inform the subject of the final decision.
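The article does not give the fuzzy rules or membership functions themselves. The sketch below only illustrates the general idea of mapping obstacle distance to vibration intensity with triangular membership sets; the "near/medium/far" breakpoints and output strengths are assumptions made for the example.

```python
# Illustrative fuzzy mapping from obstacle distance (m) to vibration duty
# cycle (%). Membership functions and output strengths are assumptions.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def vibration_duty_cycle(distance_m):
    # Degree to which the obstacle is "near", "medium", or "far".
    near = tri(distance_m, -0.5, 0.0, 1.0)
    medium = tri(distance_m, 0.5, 1.5, 2.5)
    far = tri(distance_m, 1.5, 3.0, 4.5)
    # Weighted-average (Sugeno-style) defuzzification:
    # near obstacles -> strong vibration, far obstacles -> weak vibration.
    strengths = {"near": 90.0, "medium": 50.0, "far": 10.0}
    num = near * strengths["near"] + medium * strengths["medium"] + far * strengths["far"]
    den = near + medium + far
    return num / den if den > 0 else 0.0

print(vibration_duty_cycle(0.4))   # close obstacle -> high duty cycle
print(vibration_duty_cycle(2.5))   # distant obstacle -> low duty cycle
```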

The proposed system was built from three primary parts: a ROS-based embedded controller; an eyeglass frame carrying three ultrasonic sensors and the associated vibrator modules for obstacle detection; and a similar accessory worn on the hand for detecting ramps and up-down barriers.

The researchers employed the vibrations produced by the vibrator modules as tactile interfaces to stimulate the plantar area. The study provided an accessible navigation system to let BVIP move around both indoor and outdoor environments. The materials and methods, along with the prototyping process, mechanical design, and electronic design of the embedded controller with its integrated obstacle detection mechanism, were also demonstrated.
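As a rough picture of one sensor-to-vibrator channel on the Raspberry Pi, the sketch below reads an HC-SR04-style ultrasonic sensor and drives a vibration motor with PWM so that closer obstacles vibrate harder. The sensor model, pin assignments, and distance-to-duty mapping are assumptions for the sketch; the paper's exact hardware wiring is not described in this article.

```python
# Sketch of one sensor/vibrator channel on a Raspberry Pi, assuming an
# HC-SR04-style ultrasonic sensor and a PWM-driven vibration motor.
# Pin numbers and the sensor model are assumptions, not from the paper.
import time
import RPi.GPIO as GPIO

TRIG, ECHO, MOTOR = 23, 24, 18  # hypothetical BCM pin assignments

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)
GPIO.setup(MOTOR, GPIO.OUT)
pwm = GPIO.PWM(MOTOR, 200)  # 200 Hz PWM for the vibration motor
pwm.start(0)

def read_distance_m():
    """Trigger the ultrasonic sensor and convert the echo time to meters."""
    GPIO.output(TRIG, True)
    time.sleep(10e-6)            # 10-microsecond trigger pulse
    GPIO.output(TRIG, False)
    start = stop = time.time()
    while GPIO.input(ECHO) == 0:
        start = time.time()
    while GPIO.input(ECHO) == 1:
        stop = time.time()
    return (stop - start) * 343.0 / 2.0  # speed of sound, out and back

try:
    while True:
        d = read_distance_m()
        # Closer obstacles produce stronger vibration (duty cycle capped at 100%).
        duty = max(0.0, min(100.0, 100.0 * (1.5 - d) / 1.5))
        pwm.ChangeDutyCycle(duty)
        time.sleep(0.1)
finally:
    pwm.stop()
    GPIO.cleanup()
```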

Observations

For the signaling system to alert the subject to an obstacle in the designated scanning region using haptic feedback, both vibrating bands must be firmly fastened at the head and hand. The designed guidance system underwent a preliminary evaluation with 15 visually impaired individuals. Prior to the test, each participant received a brief description of the proposed navigation aid system so they would be familiar with its overall setup and the operation of the test. All individuals provided positive feedback on their initial impressions of the device, owing to its high degree of flexibility and light weight.

The results showed that people using the proposed navigation system walked faster than those using a traditional white cane in both indoor and outdoor settings, demonstrating the system's ability to guide users quickly through unfamiliar environments. Obstacles hanging at low and mid-air heights were also difficult for the white cane to detect, especially when the obstacle's shape was complex. In contrast, with the proposed method, the subjects could avoid such collisions even with dynamic obstacles.

However, due to the dynamic behavior of the obstacles and the difficulty of distinguishing small items whose depth value was roughly equal to that of the ground, subjects using the proposed system still experienced a few collisions on the outdoor paths.

The proposed system exhibited a good capacity to recognize risky obstacles in the way, typically gaps and ramps in the ground, and to plan a new safe path that avoided the hazard. The proposed technology was also safe for navigation.

Conclusions

In conclusion, this study elucidated the development and testing of a navigation system for BVIP in both indoor and outdoor settings. The team defined an assessment indicator, based on the total number of collisions and the total number of risky barriers detected, to evaluate subjects' use of the proposed system; the average path score constituted the rated score given to the user. The proposed navigation system comprised a hand-band accessory, an eyeglass frame, and an embedded controller running ROS.
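The article does not give the scoring formula itself. Purely as a hypothetical illustration of how a per-path score might combine collisions and detected hazards before being averaged into a rated score, one could imagine something like the following; every number and weight here is invented for the example.

```python
# Hypothetical path-scoring illustration; the study's actual formula is not
# given in this article. Each path starts from a full score, loses points
# per collision, and is scaled by the fraction of hazards detected.
def path_score(collisions, hazards_detected, hazards_total):
    detection_rate = hazards_detected / hazards_total if hazards_total else 1.0
    return max(0.0, 100.0 - 10.0 * collisions) * detection_rate

paths = [(0, 3, 3), (1, 2, 3), (0, 4, 4)]  # (collisions, detected, total) per path
rated_score = sum(path_score(*p) for p in paths) / len(paths)
print(round(rated_score, 1))
```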

Advanced exteroceptive and proprioceptive sensors in the architecture provided information on obstructions in front of the user or on the ground, as well as navigational parameters. The safe path was determined by a safety assessment system that fused the sensor data with fuzzy classifiers. The proposed guidance system was tested in real time by BVIP in various indoor and outdoor settings.

According to these tests with BVIP, the authors reported that the suggested navigation aid system improved obstacle identification, ramp and cavity recognition, and interaction with dynamic obstacles.

Further Reading

Bouteraa, Y. (2022). Smart real time wearable navigation support system for BVIP. Alexandria Engineering Journal, 62, 223-235. https://www.sciencedirect.com/science/article/pii/S1110016822004434


Written by

Surbhi Jain

Surbhi Jain is a freelance Technical writer based in Delhi, India. She holds a Ph.D. in Physics from the University of Delhi and has participated in several scientific, cultural, and sports events. Her academic background is in Material Science research with a specialization in the development of optical devices and sensors. She has extensive experience in content writing, editing, experimental data analysis, and project management and has published 7 research papers in Scopus-indexed journals and filed 2 Indian patents based on her research work. She is passionate about reading, writing, research, and technology, and enjoys cooking, acting, gardening, and sports.

