Project
| # | Title | Team Members | TA | Documents | Sponsor |
|---|---|---|---|---|---|
| 3 | Wearable mobility-assistance device for the blind and visually impaired (BVI) | Darui Xu, Haoyu Zhu, Jiashen Ren, Jinnan Zhang | Bo Zhao | design_document1.pdf | |
# Problem

Blind and visually impaired (BVI) individuals rely heavily on hearing to navigate safely in daily environments. While walking, they must continuously monitor critical environmental sounds such as approaching vehicles, crosswalk signals, bicycles, and nearby pedestrians. However, many existing assistive navigation devices communicate obstacle information mainly through audio alerts or voice prompts. This creates a major usability and safety issue because the device competes with the same auditory channel that the user depends on for situational awareness. In addition, audio-based systems often require earphones or louder playback in noisy environments, which can further reduce a user’s ability to perceive surrounding hazards. As a result, these systems may unintentionally compromise safety instead of improving it.

The problem addressed by this project is therefore: **how to provide intuitive and timely obstacle-location information to BVI users without occupying their auditory channel**.

This problem is important because an effective mobility-assistance device must do more than detect obstacles. It must communicate actionable information in a way that is fast, intuitive, wearable, and compatible with the user’s natural navigation behavior. Our project focuses on preserving hearing for environmental awareness while shifting obstacle communication to the tactile channel.

# Solution Overview

This project proposes a **wearable haptic navigation-assistance device** for blind and visually impaired users. The system will detect nearby obstacles in front of the user using an AI vision-based sensing approach and communicate their relative direction and distance through **vibration feedback** rather than sound or speech.

The proposed system will use a camera and onboard processing hardware to capture visual information from the environment.
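To make the sensing idea concrete, the sketch below shows one simple way a detected obstacle could be reduced to a coarse direction and proximity cue. The sector split and the use of bounding-box height as a closeness proxy are illustrative assumptions, not the project's chosen detection method:

```python
def obstacle_cue(bbox_center_x, bbox_height, frame_width, frame_height):
    """Map one detected obstacle's bounding box to a coarse cue.

    bbox_center_x: horizontal center of the detection, in pixels.
    bbox_height:   height of the detection, in pixels (rough proxy
                   for how close the obstacle is).
    Returns (direction, proximity) with proximity in [0, 1].
    """
    # Split the camera's field of view into three sectors.
    x = bbox_center_x / frame_width
    if x < 1 / 3:
        direction = "left"
    elif x < 2 / 3:
        direction = "center"
    else:
        direction = "right"

    # Taller boxes generally mean closer obstacles; clamp to [0, 1].
    proximity = min(bbox_height / frame_height, 1.0)
    return direction, proximity
```

A real object detector would supply the bounding boxes; this function only illustrates how its output could be condensed into the two quantities the haptic channel needs to convey.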
AI-based vision algorithms will analyze the scene in real time to identify nearby obstacles and estimate their relative position with respect to the user. Based on this information, the system will activate vibration motors to convey obstacle direction and distance. For example, vibration on the left side may indicate an obstacle on the left, while stronger or faster vibration may indicate a closer obstacle.

The key innovation of this design is a **non-auditory feedback mapping** that allows users to receive obstacle information while keeping their hearing fully available for environmental sounds. Compared with conventional audio-based systems, this approach is intended to improve safety, reduce sensory conflict, and provide a more intuitive navigation aid in realistic walking scenarios. To keep the project feasible within the course scope, the prototype will focus on short-range obstacle awareness and vibration-based haptic communication rather than full autonomous navigation or large-scale scene understanding.

# Components

The system will be organized into the following major subsystems:

## AI Vision Sensing Subsystem

The sensing subsystem detects nearby obstacles and estimates their relative position using visual data. Possible components include:

- Camera module
- Embedded AI processing unit or microprocessor
- Computer vision / object detection algorithm
- Distance or relative-position estimation logic

This subsystem is responsible for acquiring environmental information and identifying obstacles in real time.

## Processing and Control Subsystem

The processing subsystem interprets the sensing results and determines the correct haptic response. Possible components include:

- Embedded controller or processor
- Obstacle localization logic
- Haptic feedback mapping algorithm
- Timing and control logic

This subsystem converts vision-based obstacle information into control commands for the feedback device.
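The haptic feedback mapping described above (left-side vibration for left obstacles, stronger or faster vibration for closer ones) can be sketched as a small lookup-and-scale rule. The motor indices, PWM range, and pulse rates below are placeholder assumptions for illustration, not values chosen for the actual prototype:

```python
# Hypothetical motor layout: one vibration motor per direction sector.
MOTORS = {"left": 0, "center": 1, "right": 2}

def haptic_command(direction, proximity):
    """Convert an obstacle cue into a (motor, duty, pulse_hz) tuple.

    proximity is in [0, 1], where 1.0 means the obstacle is very close.
    Closer obstacles produce both stronger vibration (higher PWM duty
    cycle) and faster pulsing, following the proposal's rule that
    stronger or faster vibration indicates a closer obstacle.
    """
    proximity = max(0.0, min(proximity, 1.0))
    motor = MOTORS[direction]
    duty = int(30 + 70 * proximity)    # duty cycle in percent: 30-100
    pulse_hz = 1.0 + 4.0 * proximity   # pulse repetition rate: 1-5 Hz
    return motor, duty, pulse_hz
```

Keeping a nonzero minimum duty cycle (30% here) reflects a common design choice for eccentric-rotating-mass motors, which may not spin up reliably at very low drive levels; the exact floor would be determined experimentally.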
## Vibration Feedback Subsystem

The feedback subsystem communicates obstacle information to the user through tactile vibration cues. Possible components include:

- Vibration motors
- Motor driver circuitry
- Wearable actuator placement
- Feedback mapping design for direction and distance

This subsystem is responsible for conveying obstacle direction and relative distance in an intuitive and distinguishable way.

## Power Subsystem

The power subsystem provides portable and stable power to all electronics. Possible components include:

- Rechargeable battery
- Voltage regulation circuit
- Charging interface
- Power switch and protection circuitry

This subsystem enables continuous wearable operation.

## Wearable Integration Subsystem

The wearable integration subsystem packages the prototype into a form suitable for real use. Possible components include:

- Wearable mounting structure
- Sensor and actuator supports
- Wiring and enclosure management
- Adjustable fastening mechanism

This subsystem ensures that the device is practical, lightweight, and wearable.

# Criteria of Success

The project will be considered successful if the final prototype satisfies the following criteria:

1. The device must detect nearby obstacles within the intended range with reliable performance during indoor testing.
2. The system must communicate obstacle direction and relative distance through vibration feedback in a way that users can correctly interpret during controlled testing.
3. The device must provide obstacle information without using audio output, thereby preserving the user’s auditory awareness of the surrounding environment.
4. The final prototype must function as a wearable, battery-powered system capable of real-time operation during demonstration.
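Since the success criteria require continuous battery-powered operation through a demonstration, a rough power budget is worth sketching early. All of the numbers below (pack capacity, average current draw, derating factor) are placeholder assumptions for planning, not measurements of the actual hardware:

```python
def runtime_hours(battery_mah, avg_current_ma, derating=0.8):
    """Estimate runtime as usable capacity divided by average draw.

    derating accounts for regulator losses and usable-capacity limits,
    so only a fraction of the nameplate mAh is counted.
    """
    return battery_mah * derating / avg_current_ma

# Example: an assumed 2000 mAh pack driving the camera, embedded
# processor, and vibration motors at an assumed 500 mA average draw.
hours = runtime_hours(2000, 500)  # 2000 * 0.8 / 500 = 3.2 hours
```

Even with these placeholder figures, the estimate suggests the vision processor will dominate the power budget, so measuring its real average draw should be one of the first bench tests.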