Project 43: Autonomous Featherweight (30lb) Battlebot (TA: Michael Gamota)
# Autonomous Featherweight (30lb) Battlebot

Team Members:
- Jason Mei (jasonm5)
- Qinghuai Yao (qyao6)
- Michael Ko (ykko2)

# Problem

iRobotics, an RSO on campus, has built multiple battlebots that compete in events across the U.S. One of its robots, "CRACK?", is a 30 lb hammer-axe battlebot. The robot has already been designed and built; this project would upgrade it from manual control to autonomous control.

# Solution

For this project, the plan is to mount a camera just outside the polycarbonate arena walls to send a live view of the arena to a computer. The computer applies image transforms to obtain an accurate top-down view of the field, then calculates the robot's next movements, initially with a pure-pursuit algorithm and potentially with a machine learning algorithm later. Control is then handed to a microcontroller board mounted inside the robot, which sends signals to the motors to drive the robot or fire the hammer.

# Solution Components

## Camera Subsystem

The main computer takes in data from a camera (ALPCAM 2MP Varifocus USB Camera) mounted on the outside of the arena. The camera streams frames at a standard resolution (640x480 or 1280x720) to the computer. For every frame, a Python program (utilizing OpenCV) creates a binary image using perspective transforms, color filters, and other processing. It will also scan for AprilTags, which will be mounted on specific sides of each robot, allowing the computer to identify both robots' full poses (position and orientation) within the arena.
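The perspective-transform step can be sketched with plain homography math. In the real pipeline the 3x3 matrix would come from OpenCV's `cv2.getPerspectiveTransform` applied to the four arena-corner pixels; the matrices below are illustrative placeholders, not calibrated values:

```python
# Minimal sketch: project an image pixel through a 3x3 homography H
# to get top-down arena coordinates. The matrices here are
# placeholders; in practice H comes from cv2.getPerspectiveTransform.

def apply_homography(H, x, y):
    """Map pixel (x, y) through a row-major 3x3 homography H."""
    xp = H[0][0] * x + H[0][1] * y + H[0][2]
    yp = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xp / w, yp / w  # divide out the projective scale

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(apply_homography(identity, 320, 240))  # → (320.0, 240.0)

# A non-trivial bottom row rescales points by a depth-dependent
# factor, which is what corrects an oblique camera view to top-down.
halver = [[1, 0, 0], [0, 1, 0], [0, 0, 2]]
print(apply_homography(halver, 320, 240))  # → (160.0, 120.0)
```

Once both AprilTags are detected, running each tag's center through the same homography yields arena-frame positions directly.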

## Autonomous Control Subsystem

After obtaining both robots' poses, the computer will identify the next actions for our robot to perform. Initially, we will use a standard pure-pursuit algorithm, in which the robot simply minimizes the distance between itself and the opponent without regard for orientation. We may later switch to a reinforcement learning approach, trained within a custom OpenAI Gym environment. The computer will then connect wirelessly to the robot over Bluetooth and send the instructions.
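A minimal sketch of that pursuit step, assuming a differential drive with normalized wheel commands in [-1, 1]; the turn gain and the slow-down-when-facing-away heuristic are illustrative choices, not the team's tuned values:

```python
import math

def pursuit_command(robot_pose, target_xy, k_turn=2.0, max_speed=1.0):
    """Given our pose (x, y, heading in radians) and the opponent's
    position, return (left, right) drive commands in [-1, 1].
    Simple pursuit: steer proportionally to heading error, slow down
    when facing away from the target."""
    x, y, heading = robot_pose
    tx, ty = target_xy
    desired = math.atan2(ty - y, tx - x)
    # Wrap the heading error into [-pi, pi]
    err = math.atan2(math.sin(desired - heading), math.cos(desired - heading))
    turn = max(-1.0, min(1.0, k_turn * err))
    speed = max_speed * max(0.0, math.cos(err))
    left = max(-1.0, min(1.0, speed - turn))
    right = max(-1.0, min(1.0, speed + turn))
    return left, right

print(pursuit_command((0.0, 0.0, 0.0), (1.0, 0.0)))  # → (1.0, 1.0)
```

With the target directly ahead the robot drives at full speed; with the target directly behind, `speed` drops to zero and the commands become (-1.0, 1.0), a spin in place toward the opponent.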

## On-robot Subsystem

The motors on the robot itself are typically controlled by a receiver, which uses PWM signals (a 1.5 ms pulse at 50 Hz, i.e. on a 20 ms period, is a "neutral" signal). We will insert a microcontroller board (ESP32-S3) between the receiver and the motor ESCs (electronic speed controllers) to process the information from both the receiver and the computer. Additionally, to maximize the information available to the user, we will add a voltage divider to monitor battery voltage, as well as an accelerometer (MPU6050) to report the robot's movement.
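Two small helpers sketch the signal handling, assuming the usual hobby-RC conventions (1000-2000 µs pulses around a 1500 µs neutral) and placeholder divider resistor values rather than the team's actual parts:

```python
def pulse_to_throttle(pulse_us, neutral_us=1500, half_range_us=500):
    """Convert an RC PWM pulse width (microseconds) to a throttle in
    [-1, 1]: 1500 us is neutral, 1000 us full reverse, 2000 us full
    forward -- the common hobby-ESC convention. Out-of-range pulses
    are clamped."""
    t = (pulse_us - neutral_us) / half_range_us
    return max(-1.0, min(1.0, t))

def battery_voltage(adc_counts, adc_max=4095, v_ref=3.3,
                    r_top=10_000, r_bottom=1_000):
    """Recover battery voltage from a divider reading on the ESP32's
    12-bit ADC. Resistor values are illustrative placeholders."""
    v_pin = adc_counts / adc_max * v_ref
    return v_pin * (r_top + r_bottom) / r_bottom

print(pulse_to_throttle(1500))  # → 0.0
print(pulse_to_throttle(2000))  # → 1.0
```

The microcontroller would run this mapping in both directions: decoding receiver pulses when under manual control and generating them for the ESCs when the computer is driving.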

# Criterion For Success

We would define a successful project by a specific set of goals:
- The system must identify the robot and track its location and pose live (using the AprilTags).
- The system must be able to drive the robot to any specified location at close to full speed, similarly to how a human driver would.
- The system must shut off safely and immediately if any safety violation occurs.
- The robot will compete at Robobrawl X, an event held on campus this year (April 4th and 5th, 2025).

Featured Project
# PROJECT TITLE: Bracelet Aid for deaf people/hard of hearing

# TEAM MEMBERS:

- Aarushi Biswas (abiswas7)

- Anit Kapoor (anityak3)

- Yash Gupta (yashg3)

# PROBLEM

We are constantly hearing sounds that notify us of events around us: doorbells, fire alarms, phone calls, alarms, or vehicle horns. These sounds cannot catch the attention of a d/Deaf person, and some of them (emergency and fire alarms) are serious enough to require the person's instant attention. In addition, many everyday devices such as washing machines, stoves, microwaves, and ovens produce small sounds that d/Deaf people cannot notice unless they are observing these machines constantly.

Many people in the d/Deaf community address some of these problems, such as the doorbell, by installing devices that flicker the lights in a room. However, these devices are generally not installed in every room and cannot notify people who are asleep. Another common solution is purchasing a smartwatch that interacts with a mobile phone to provide notifications; however, smartwatches are usually expensive, do not fulfill all of these needs, and require nightly charging cycles that diminish their usefulness for exactly these situations.

# SOLUTION

A low-cost bracelet aid that converts sounds into haptic feedback in the form of vibrations will give d/Deaf people the independence of recognizing notification sounds around them. The bracelet will recognize a set of these sounds and play a different vibration pattern for each, both to catch the wearer's attention and to identify the cause of the notification. Additionally, the bracelet will have a visual component in the form of an OLED display, which will provide visual cues as emojis. The bracelet will also have buttons for stopping the vibration and for showing the battery level on the OLED.

For instance, when the doorbell rings, the bracelet will pick up the doorbell sound after filtering out any other unnecessary background noise. On recognizing the doorbell sound, the bracelet will vibrate with the pattern associated with the sound in question which might be something like alternating between strong vibrations and pauses. The OLED display will also additionally show a house emoji to denote that the house doorbell is ringing.

# SOLUTION COMPONENTS

Based on this solution we have identified that we need the following components:

- INMP441 (Microphone Component)

- Brushed ERM (Vibration Motor)

- Powerboost 1000 (Power subsystem)

- 1000 mAh LiPo battery x 2 (hot swappable)

- SSD1306 (OLED display)

## SUBSYSTEM 1 → SOUND DETECTION SUBSYSTEM

This subsystem will consist of a microphone and will be responsible for picking up sounds from the environment and conducting a real-time FFT on them. After this, we will filter out lower frequencies and use a frequency-matching algorithm to infer if a pre-programmed sound was picked up by the microphone. This inference will be outputted to the main control unit in real-time.
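The plan above calls for a real-time FFT plus frequency matching. When only a handful of pre-programmed frequencies need checking, the single-bin Goertzel algorithm is a common lightweight alternative, sketched here in plain Python (the 1 kHz test tone and 8 kHz sample rate are illustrative, not the actual target sounds):

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Power of a single frequency bin (Goertzel algorithm) -- a
    cheap stand-in for a full FFT when only a few known
    notification frequencies must be matched."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)   # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

# A pure 1 kHz tone scores far higher at 1 kHz than at 3 kHz.
rate = 8000
tone = [math.sin(2 * math.pi * 1000 * i / rate) for i in range(400)]
p_hit = goertzel_power(tone, rate, 1000)
p_miss = goertzel_power(tone, rate, 3000)
print(p_hit > 100 * p_miss)  # → True
```

Thresholding a few such bin powers (one per pre-programmed sound) would give the main control unit its real-time inference; a full FFT (e.g. via NumPy) remains the more general option if the sound signatures turn out to span many frequencies.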

## SUBSYSTEM 2 → VIBRATION SUBSYSTEM

This subsystem will be responsible for vibrating the bracelet on the wearer’s wrist. Using the vibration motor mentioned above, we should have a frequency range of 30Hz~500Hz, which should allow for the generation of a variety of distinguishable patterns. This subsystem will be responsible for the generation of the patterns and control of the motor, as well as prompting the Display subsystem to visualize the type of notification detected.
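One way the pattern generation could be represented is as a table of (intensity, duration) steps per notification type; the pattern shapes and names below are illustrative guesses, not the team's final designs:

```python
# Sketch: each notification type maps to a vibration pattern given
# as (intensity 0-1, duration in seconds) steps. Intensity would be
# realized as a PWM duty cycle on the ERM motor driver. Pattern
# shapes and names here are illustrative placeholders.
PATTERNS = {
    "doorbell":   [(1.0, 0.3), (0.0, 0.3)] * 3,    # strong pulse / pause
    "fire_alarm": [(1.0, 1.0), (0.0, 0.1)] * 5,    # long urgent buzzes
    "microwave":  [(0.5, 0.15), (0.0, 0.15)] * 2,  # short gentle taps
}

def pattern_length(name):
    """Total play time of a pattern, in seconds."""
    return sum(duration for _, duration in PATTERNS[name])

print(round(pattern_length("doorbell"), 3))  # → 1.8
```

Keeping patterns as data rather than code makes them easy to tune per sound, and the same table entry can key the Display subsystem's emoji lookup.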

## SUBSYSTEM 3 → DISPLAY SUBSYSTEM

The Display subsystem will act as a set of visual cues in addition to the vibrations, as well as a visual feedback system for user interactions. This system should not draw a lot of power as it will be active only when prompted by user interaction or by a recognized sound. Both of these scenarios are relatively uncommon over the course of a day, which means that the average power draw for our device should still remain low.

## SUBSYSTEM 4 → USER INTERACTION SUBSYSTEM

This subsystem is responsible for the interaction of the user with the bracelet. This subsystem will include a set of buttons for tasks such as checking the charge left on the battery or turning off a notification. Checking the charge will also display the charge on the OLED display thus interacting and controlling the display subsystem as well.

## SUBSYSTEM 5 → POWER SUBSYSTEM

This subsystem is responsible for powering the device. One of our success criteria is long battery life with low downtime. To achieve this, we will use a power boost circuit in conjunction with two rechargeable 1000 mAh batteries: while one is charging, the other can be used, so the user never has to go without the device for more than a few seconds at a time. We expect the device to draw roughly 20-50 mA, which gives an effective use time of around a day per battery. The power boost circuit and the LiPo batteries' JST connectors also allow for secure and quick battery swaps.
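The runtime claim follows from a quick capacity calculation; the 85% derating for boost-converter losses and unusable capacity is an assumed figure, not a measured one:

```python
def runtime_hours(capacity_mah, draw_ma, derate=0.85):
    """Estimated runtime for one battery. The derate factor is an
    assumed allowance for boost-converter losses and unusable
    LiPo capacity."""
    return capacity_mah * derate / draw_ma

print(runtime_hours(1000, 50))  # worst-case draw: about 17 hours
print(runtime_hours(1000, 20))  # best-case draw: about 42 hours
```

So a single 1000 mAh battery covers roughly a full day at typical draw, and the second hot-swappable battery covers the worst case.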

# CRITERION FOR SUCCESS

- The bracelet should accurately identify only the crucial sounds in the wearer’s environment, with each type of sound having a fixed, unique vibration pattern and OLED cue associated with it

- The vibration patterns should be distinctly recognizable by the wearer

- Should be relatively low cost

- Should have prolonged battery life (so the power should focus on only the use case of converting sound to vibration)

- Should have a small profile and a sleek form factor
