# Hazard Detection for Cyclists
Erik Ji, Adam Snedden, Ozgur Tufekci

[Google Docs mirror](https://docs.google.com/document/d/15jHzsdSbN0LpCIDwTOREneb6Yw3i28q1tb8iM1LLuuc/preview)

## Problem
According to a study from the U.S. Department of Transportation, only 17 percent of personal vehicles have blind spot technology as a standard feature, and 57 percent offer it as an upgrade option [1]. The number of personal vehicles equipped with these capabilities is on the rise, preventing an estimated 50,000 accidents [2]. The same can't be said for cyclists. While some detection methods are being implemented, the cycling market offers far less variety and is much newer to the game than automotive technology. On top of that, bicycle injuries can turn fatal very quickly if a rider is unaware of their surroundings. Why should bicyclists go without the same capabilities?

[1] S. Zhu, “Blind spot warning technology contributes to a 23 percent reduction in lane change injury crashes,” Real-world benefits of car safety technology, 2019

[2] J. B. Cicchino, “Effects of blind spot monitoring systems on police-reported lane-change crashes,” Traffic injury prevention vol. 19,6, 2018

## Solution
To address the problem, we propose to develop and implement a hazard detection system for bicyclists with a few upgrades on top of normal "blind spot" systems. The system will utilize an Infineon radar module to detect objects in the cyclist's blind spots. After detecting an object (presumably an approaching vehicle), a visual alert along with a buzzer will warn the cyclist of the potential hazard behind them. Depending on the hazard's distance from the rider, one or both of the alerts will switch on: the buzzer will be used for more imminent threats, while the light can notify the rider early on. Another light will be attached to the rear of the bike to warn the approaching vehicle. With no permanent power source on a bike (unlike a car's battery), we will need to incorporate a battery system that can be easily operated. The majority of the system will be housed under the seat, with the remaining components routed to user-friendly positions as needed. The main goal is to avoid hindering the cyclist's comfort or maneuverability while maintaining reliability.
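The two-tier alert logic described above can be sketched as a simple threshold check. This is a minimal illustration only; the 8 m and 3 m thresholds are placeholder assumptions, not final design values.

```python
# Sketch of the distance-based alert logic: light comes on early,
# buzzer joins in for imminent threats. Thresholds are placeholders.

LED_THRESHOLD_M = 8.0     # hazard this close: turn on the visual alert
BUZZER_THRESHOLD_M = 3.0  # hazard this close: add the audible alert

def select_alerts(distance_m):
    """Return (led_on, buzzer_on) for a hazard at the given distance."""
    if distance_m <= BUZZER_THRESHOLD_M:
        return (True, True)    # imminent threat: light and buzzer
    if distance_m <= LED_THRESHOLD_M:
        return (True, False)   # early warning: light only
    return (False, False)      # nothing in range
```

In the real system this function would run on the processing unit each time the radar reports a new range, with the result forwarded to the indication system controller.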

## Project Components
### Processing Unit/Indicator System
The processing unit for our project will be composed of two primary subunits: the data collection and processing unit, and the indication system. Both will be battery powered.

#### Processing Unit
The processing unit will need to take in critical sensor data, such as distance measurements and camera input, to accurately pinpoint what poses a threat to the cyclist.

#### Indication System Controller (Wireless)
The indication system is composed of a microcontroller and a wireless interface that drive the final cyclist-facing alerts. The wireless interface accepts signals from the processing unit and delivers them to the microcontroller, which in turn drives the LED and buzzer in the indication system.

### Sensor Suite
This subsystem will be the "eyes on the back of your head," collecting a variety of data on the cyclist's surroundings.

Sensors we intend to implement are:
- Infineon Radar Module
  - Utilized for tracking surrounding objects and measuring distance to approaching vehicles

This sensor will be mounted directly to the processing unit, which will be responsible for processing its data and triggering the indicators.


### Indication System
This subsystem will provide audio-visual cues for imminent hazards detected by the sensors.

Indicators we intend to implement are:
- LED
  - Utilized for strong visual cueing of the direction and proximity of potential hazards
- Piezoelectric Buzzer
  - Utilized for strong auditory cueing of urgent and imminent threats to the cyclist
  - Variable pitch and rhythm can be used to convey the level of urgency
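The variable pitch and rhythm idea can be sketched as a mapping from an urgency score to a buzzer tone. The frequency range and beep timing below are illustrative assumptions, not measured or finalized values.

```python
# Illustrative mapping from hazard urgency (0.0 = none, 1.0 = imminent)
# to a piezo buzzer tone. Frequency range and beep period are placeholders.

def buzzer_pattern(urgency):
    """Return (frequency_hz, beep_period_s) for the piezo buzzer."""
    urgency = max(0.0, min(1.0, urgency))  # clamp to the valid range
    freq = 1000 + 2000 * urgency           # higher pitch as the threat closes in
    period = 1.0 - 0.8 * urgency           # faster beeping as the threat closes in
    return (freq, period)
```

The microcontroller in the indication system would feed this frequency and period to a PWM output driving the buzzer.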

## Criterion for Success
At the end of our project, we hope to have a functioning hazard detection system. For the system to be deemed functional, the sensor will need to detect (or correctly ignore) objects at various distances. When the sensor detects these objects, the processing unit will need to interpret the data and decide whether to alert the cyclist. The alert system will then notify the cyclist through visual and audio cues, with the LED indicator coming on when danger is low and the buzzer joining in as more imminent threats approach. In addition, our camera will record incidents and provide visual evidence of situations so that cyclists aren't taken advantage of, which happens often. Beyond core functionality, we will address potential external issues, including weatherproofing, rider hindrance, ease of use, reliability, and installation time and method.


# Oxygen Delivery Robot

Team Members:

- Rutvik Sayankar (rutviks2)

- Aidan Dunican (dunican2)

- Nazar Kalyniouk (nazark2)

# Problem

Children's interstitial and diffuse lung disease (ChILD) is a collection of diseases or disorders. These diseases cause a thickening of the interstitium (the tissue that extends throughout the lungs) due to scarring, inflammation, or fluid buildup. This eventually affects a patient’s ability to breathe and distribute enough oxygen to the blood.

Numerous children live with these conditions, requiring supplemental oxygen for their daily activities. This hampers the mobility and freedom of young children, diminishing their growth and confidence. Moreover, parents face an increased burden: not only caring for their child, but also having to be directly involved in managing the oxygen tank as their child moves around.

# Solution

Given the absence of relevant solutions in the current market, our project aims to ease the challenges faced by parents and provide the freedom for young children to explore their surroundings. As a proof of concept for an affordable solution, we propose a three-wheeled omnidirectional mobile robot capable of supporting filled oxygen tanks in the size range of M-2 to M-9, weighing 1 - 6kg (2.2 - 13.2 lbs) respectively (when full). Due to time constraints in the class and the objective to demonstrate the feasibility of a low-cost device, we plan to construct a robot at a ~50% scale of the proposed solution. Consequently, our robot will handle simulated weights/tanks with weights ranging from 0.5 - 3 kg (1.1 - 6.6 lbs).

The robot will have a three-wheeled omni-wheel drive train, incorporating two localization subsystems to ensure redundancy and enhance child safety. The first subsystem focuses on the drivetrain and chassis of the robot, while the second subsystem utilizes ultra-wideband (UWB) transceivers for triangulating the child's location relative to the robot in indoor environments. As for the final subsystem, we intend to use a camera connected to a Raspberry Pi and leverage OpenCV to improve directional accuracy in tracking the child.

As part of the design, we intend to create a PCB in the form of a Raspberry Pi hat, facilitating convenient access to information generated by our computer vision system. The PCB will incorporate essential components for motor control, with an STM microcontroller serving as the project's central processing unit. This microcontroller will manage the drivetrain, analyze UWB localization data, and execute corresponding actions based on the information obtained.

# Solution Components

## Subsystem 1: Drivetrain and Chassis

This subsystem encompasses the drive train for the 3 omni-wheel robot, featuring the use of 3 H-Bridges (L298N - each IC has two H-bridges therefore we plan to incorporate all the hardware such that we may switch to a 4 omni-wheel based drive train if need be) and 3 AndyMark 245 RPM 12V Gearmotors equipped with 2 Channel Encoders. The microcontroller will control the H-bridges. The 3 omni-wheel drive system facilitates zero-degree turning, simplifying the robot's design and reducing costs by minimizing the number of wheels. An omni-wheel is characterized by outer rollers that spin freely about axes in the plane of the wheel, enabling sideways sliding while the wheel propels forward or backward without slip. Alongside the drivetrain, the chassis will incorporate 3 HC-SR04 Ultrasonic sensors (or three bumper-style limit switches - like a Roomba), providing a redundant system to detect potential obstacles in the robot's path.
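The zero-degree turning property of the 3 omni-wheel layout comes from its inverse kinematics: any body-frame velocity maps directly to three wheel speeds. A minimal sketch follows, assuming wheels at 0°, 120°, and 240° and a placeholder chassis radius; the actual geometry would come from the final mechanical design.

```python
import math

# Inverse kinematics sketch for a 3 omni-wheel drivetrain: body-frame
# velocity (vx, vy, omega) -> individual wheel surface speeds.
# Wheel angles and chassis radius R are illustrative assumptions.

WHEEL_ANGLES = [math.radians(a) for a in (0.0, 120.0, 240.0)]
R = 0.15  # metres from chassis centre to each wheel (placeholder)

def wheel_speeds(vx, vy, omega):
    """Return the three wheel surface speeds (m/s) for the commanded motion."""
    return [-math.sin(t) * vx + math.cos(t) * vy + R * omega
            for t in WHEEL_ANGLES]
```

For example, a pure rotation command drives all three wheels at the same speed, while a pure translation command splits the motion across the wheels so the lateral components cancel; the microcontroller would convert each speed to a PWM duty cycle for the corresponding H-bridge.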

## Subsystem 2: UWB Localization

This subsystem suggests implementing a module based on the DW1000 Ultra-Wideband (UWB) transceiver IC, similar to the technology found in Apple AirTags. We opt for UWB over Bluetooth due to its significantly superior accuracy, attributed to UWB's precise distance-based approach using time-of-flight (ToF) rather than mere signal strength as in Bluetooth.

This project will require three transceiver ICs, with two acting as "anchors" fixed on the robot. The distance to the third transceiver (referred to as the "tag") will always be calculated relative to the anchors. With the transceivers we are currently considering, at full transmit power they must be at least 18 inches apart to report a range; at minimum power, they work at least 10 inches apart. For the "tag," we plan to create a compact PCB containing the transceiver, a small coin battery, and other essential components to ensure proper transceiver operation. This device can be attached to a child's shirt using Velcro.
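With two anchors on a known baseline, the tag position follows from the two ranges by planar trilateration. The sketch below assumes anchors at (0, 0) and (d, 0) in the robot frame; two anchors leave a front/back mirror ambiguity, which we resolve here by assuming the tag stays on one side of the robot.

```python
import math

# Two-anchor trilateration sketch: anchors A0 at (0, 0) and A1 at (d, 0)
# on the robot, with measured ranges r0 and r1 to the tag. Returns the
# tag position assuming it is on the y >= 0 side (mirror ambiguity).

def locate_tag(d, r0, r1):
    """Return the tag position (x, y) in the robot frame, with y >= 0."""
    x = (d * d + r0 * r0 - r1 * r1) / (2.0 * d)
    y_sq = r0 * r0 - x * x
    y = math.sqrt(max(0.0, y_sq))  # clamp small negatives from range noise
    return (x, y)
```

The STM microcontroller would run this on each pair of UWB range reports and steer the drivetrain toward the resulting (x, y).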

## Subsystem 3: Computer Vision

This subsystem involves using the OpenCV library on a Raspberry Pi equipped with a camera. By employing pre-trained models, we aim to enhance the reliability and directional accuracy of tracking a young child. The plan is to perform all camera-related processing on the Raspberry Pi and subsequently translate the information into a directional command for the robot if necessary. Given that most common STM chips feature I2C buses, we plan to communicate between the Raspberry Pi and our microcontroller through this bus.
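Before anything goes over the I2C bus, the Raspberry Pi must reduce a detection to a compact directional command. A minimal sketch of that reduction follows; the one-byte command encoding, frame width, and dead-band are hypothetical choices for illustration, not a finalized protocol.

```python
# Sketch of turning an OpenCV detection (bounding box in pixels) into a
# one-byte steering command for the STM32 over I2C. The encoding
# (0 = left, 1 = straight, 2 = right), frame width, and dead-band
# are assumptions, not a final protocol.

FRAME_WIDTH = 640   # camera frame width in pixels (placeholder)
DEAD_BAND = 0.1     # fraction of frame width treated as "centred"

CMD_LEFT, CMD_STRAIGHT, CMD_RIGHT = 0, 1, 2

def steering_command(bbox_x, bbox_w):
    """Map the child's bounding box to a steering command byte."""
    centre = bbox_x + bbox_w / 2.0
    offset = (centre - FRAME_WIDTH / 2.0) / FRAME_WIDTH
    if offset < -DEAD_BAND:
        return CMD_LEFT
    if offset > DEAD_BAND:
        return CMD_RIGHT
    return CMD_STRAIGHT
```

On the Pi, the resulting byte would be written to the STM32's I2C slave address each frame; the dead-band keeps the robot from oscillating when the child is roughly centred.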

## Division of Work:

We already have a 3 omni-wheel robot; it is slightly smaller than our 50% scale, but it allows us to begin work on UWB localization and computer vision immediately until a new iteration can be built. Simultaneously, we'll reconfigure the drivetrain to ensure compatibility with the additional systems we plan to implement and the ability to move the desired weight. To streamline the process, we'll allocate specific tasks to individual group members: one focusing on UWB, another on computer vision, and the third on the drivetrain. This division of work allows parallel progress on the different aspects of the project.

# Criterion For Success

- Omni-wheel drivetrain that can drive in a specified direction.

- Close-range object detection system working (can detect objects inside the path of travel).

- UWB localization down to an accuracy of < 1 m.

## Current considerations

We are currently in discussion with Greg at the machine shop about switching to a four-wheeled omni-wheel drivetrain due to the increased weight capacity and integrity of the chassis. To address the safety concerns of this particular project, we are planning to implement the following safety measures:

- Limit robot max speed to <5 MPH

- Using empty tanks/simulated weights. At NO point will we be working with compressed oxygen. Our goal is just to prove that we can build a robot that can follow a small child.

- We are planning to work extensively on designing the base of the robot to be bottom-heavy and wide to prevent tipping hazards.