# Project 16: Smart Assistive Walking Stick for the Visually Impaired

Team Members: Haoyang Zhou, Sanhe Fu, Yihan Huang, Yucheng Zhang

Documents: design_document1.pdf, final_paper1.pdf, final_paper2.pdf, proposal1.pdf

Sponsor: Yushi Cheng

# Problem
More than 250 million people worldwide live with some degree of visual impairment, which has a profound impact on their physical health, mental well-being, and overall quality of life. Individuals with impaired vision face three key challenges when navigating their surroundings: obstacle avoidance, indoor path planning, and key object localization. Road intersections are especially difficult: conditions there are complex, and visually impaired pedestrians cannot directly perceive traffic lights or other traffic signals. In China, most intersections lack audio prompt systems designed for the visually impaired, and fast-moving vehicles make these crossings particularly dangerous.
The most commonly used assistive tool for visually impaired individuals is the white cane, which provides tactile feedback. However, a standard white cane has a limited detection range: it can only sense obstacles within its physical length and cannot identify distant or elevated obstacles, which are common at intersections. Moreover, the white cane provides only basic physical feedback and cannot convey detailed environmental information, such as road intersections or navigation directions. As a result, relying solely on a white cane makes precise navigation difficult in unfamiliar or complex environments, forcing users to depend on external assistance or their memory of previously traveled routes.

# Solution overview
Our solution is an intelligent smart cane. The smart cane improves walking speed and safety both outdoors and indoors, and we have designed it with a particular focus on helping visually impaired users cross intersections. The cane is equipped with a laser sensor to measure the distance to obstacles, a GPS module for precise outdoor positioning, and a camera with computer vision to capture detailed environmental information, such as traffic signs and other critical landmarks. In addition, the cane features motor-controlled omnidirectional wheels for directional guidance and provides real-time voice feedback, helping users navigate their surroundings with greater ease, speed, and confidence. Outdoors, GPS assistance not only helps the user walk through unfamiliar environments but also gives them more confidence in familiar ones. The cane is also valuable indoors, where obstacles are numerous and often unexpected. When the user approaches an intersection, GPS triggers an alert and the camera captures information about the surrounding environment, such as the state and remaining duration of traffic lights, traffic signs, and nearby vehicles. This information is interpreted by computer vision algorithms and relayed to the user through voice prompts. Meanwhile, the laser sensor's long detection range helps the user avoid collisions with obstacles, especially people and objects that are moving quickly.
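
As a rough sketch of how these pieces could interact on the Raspberry Pi, the Python loop below polls the sensors and decides when to warn or steer the user. All function names, thresholds, and polling rates here are illustrative assumptions, not the project's actual firmware.

```python
import time

OBSTACLE_ALERT_M = 1.0   # assumed distance threshold for a warning
LOOP_PERIOD_S = 0.1      # assumed ~10 Hz polling rate

def control_loop(read_distance_m, read_gps, near_intersection, speak, steer):
    """Poll sensors, then trigger voice prompts and steering corrections.

    The five arguments are callables standing in for the real laser-sensor,
    GPS, map-lookup, text-to-speech, and motor drivers.
    """
    while True:
        distance = read_distance_m()
        if distance < OBSTACLE_ALERT_M:
            speak(f"Obstacle {distance:.1f} meters ahead")
            steer("avoid")

        lat, lon = read_gps()
        if near_intersection(lat, lon):
            speak("Approaching an intersection, please stop and wait for guidance")

        time.sleep(LOOP_PERIOD_S)

# Example run with stand-in stubs:
# control_loop(lambda: 2.5, lambda: (30.26, 120.12), lambda la, lo: False,
#              print, lambda cmd: None)
```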

# Solution components
## Main Control Module
- A Raspberry Pi serves as the core processor, receiving and processing data from the other modules and controlling the feedback components. It fuses the distance measured by the laser sensor, the location reported by GPS, and the environmental cues from the vision system to determine the direction of travel, then sends commands to the motor and voice prompts to the earphones.
- Computer vision algorithms (e.g., YOLO) process the images captured by the camera to recognize traffic signals and other cues in the environment, providing the information needed for guidance (a minimal detection sketch follows this list).
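
The proposal names YOLO but does not fix a particular implementation; as one possibility, the sketch below uses the ultralytics package together with OpenCV to grab a camera frame and list detected objects. The pretrained weights file and camera index are assumptions.

```python
import cv2                    # camera capture
from ultralytics import YOLO  # one common YOLO implementation (assumed choice)

model = YOLO("yolov8n.pt")    # generic pretrained weights; a model fine-tuned
                              # on traffic lights and signs would replace this

def detect(frame):
    """Return (class_name, confidence) pairs for objects found in one frame."""
    result = model(frame, verbose=False)[0]
    return [(model.names[int(box.cls)], float(box.conf)) for box in result.boxes]

cap = cv2.VideoCapture(0)     # assumed camera index
ok, frame = cap.read()
if ok:
    for name, conf in detect(frame):
        print(f"{name}: {conf:.2f}")
cap.release()
```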

## Data Collection Module
- A laser sensor measures the distance between the user and obstacles.
- An inertial measurement unit (IMU) provides orientation estimates.
- A GPS module provides precise outdoor positioning (see the parsing sketch after this list).
- A camera captures images of the environment.
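
As a concrete example from this module, the sketch below reads position fixes from a serial GPS receiver on the Raspberry Pi using pyserial and pynmea2; the port name, baud rate, and library choice are assumptions rather than the project's confirmed hardware setup.

```python
import serial    # pyserial
import pynmea2   # NMEA-0183 sentence parser

def read_gps_fix(port="/dev/serial0", baud=9600):
    """Block until a GGA sentence with a valid fix arrives; return (lat, lon) in degrees."""
    with serial.Serial(port, baudrate=baud, timeout=1.0) as ser:
        while True:
            line = ser.readline().decode("ascii", errors="ignore").strip()
            if line.startswith(("$GPGGA", "$GNGGA")):
                msg = pynmea2.parse(line)
                if int(msg.gps_qual or 0) > 0:   # quality 0 means no fix yet
                    return msg.latitude, msg.longitude

# Example: lat, lon = read_gps_fix()
```
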
## Feedback Module
- A motor drives the omnidirectional wheels for direction control and guidance.
- Earphone voice prompts convey environmental information to the user (a text-to-speech sketch follows this list).
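
The proposal does not specify a speech engine; as one option that runs offline on a Raspberry Pi, the sketch below wraps pyttsx3 to play prompts through the connected earphones. The speech rate and example message are assumptions.

```python
import pyttsx3   # offline text-to-speech engine (assumed choice)

engine = pyttsx3.init()
engine.setProperty("rate", 160)   # slightly slower than default for clarity

def speak(message: str) -> None:
    """Queue a voice prompt and block until it has been spoken."""
    engine.say(message)
    engine.runAndWait()

# Example prompt at a crossing:
# speak("Green light, twelve seconds remaining. Safe to cross.")
```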

# Criterion For Success
Our design combines sensor-based distance detection to prevent collisions, camera-based object recognition to identify road signs and other key elements of the environment, and GPS-based navigation for accurate positioning. Compared to a traditional white cane, its success can be measured by several key improvements: faster walking speeds indoors, outdoors, and at intersections; more precise and efficient navigation; more accurate road information; and an overall enhancement in user independence and mobility.
