Project 37: Ant-Weight Battlebot - DC Hammer
TA: Haocheng Bill Yang
Documents: design_document1.pdf, other1.pdf, proposal1.docx, proposal2.pdf
# Ant-Weight Battlebot - DC Hammer

Team Members:
- Ian Purkis (ipurkis2)
- Carson Sprague (cs104)
- Gage Gathman (gagemg2)
# Problem Statement
Many battlebot designs struggle to balance movement control, durability, offense, and defense within the limits of competition regulations. We need to design a robust, versatile battlebot that meets the competition requirements (most notably the weight limit) and can outlast and subdue a variety of competitors. The primary design challenge stems from the diversity of opponent designs and abilities: most bots lean on a single design element to win. Our bot must remain competitive throughout the full match regardless of the opponent or any damage sustained.
# Solution
Our proposed design takes a well-rounded approach to offense and defense, ensuring that our bot can sustain damage and last the full length of the match. Our primary offensive tool is a motor-powered, sensor-enabled hammer-and-wedge attachment that houses two “attack modes,” letting the driver adapt attack strategy to the design of the opposing bot. Our design also includes a significant defensive tool, inversion adjustment: sensors and the bot’s physical shape prevent knockouts via flips, and the bot remains functional even when fully inverted. Physical components, especially the hammer, must be modular for quick replacement between matches if damage is taken. By automating the offensive tool (hammer/wedge) and the defensive tool (flip adjustment), this well-rounded design frees the driver’s creativity during the match and provides a significant competitive advantage against all types of opposing bots.
# Solution Components
## Subsystem 1 - Ultrasonic Sensor Enabled Hammer/Wedge Attachment (Attack Arm)
We will embed an ultrasonic sensor into the front of our bot. The sensor will act as a proximity detector that triggers the attack-arm motion. The attack arm will have two default configurations, one for each orientation of the bot. In the low position, resting nearly parallel to the arena surface, the arm serves as a wedge: on sensor or driver input it executes an upward swing, flipping objects in front of the bot. In the other resting position the arm points upward, perpendicular to the ground, and on sensor or driver input it performs a downward hammer strike on objects in front of the robot.
- Ultrasonic Sensor
If we can use a pre-implemented sensor: Adafruit 4007 (https://www.digikey.com/en/products/detail/adafruit-industries-llc/4007/9857020).
If not, an infrared LED/detector combo could be used instead.
- Motor (Weapon)
TBD; the primary requirement is a high-torque motor for flipping/smashing, something like:
12V 50RPM 694 oz-in Brushed DC Motor (210 grams)
(https://www.robotshop.com/products/12v-50rpm-694-oz-in-brushed-dc-motor)
- Microcontroller Unit
ESP32-S3-WROOM-1 (not dev board, just chip + antenna)
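The trigger logic described above can be sketched in a few lines. This is a minimal illustration, not firmware: the function names and the 10 cm strike range are assumptions to be tuned on the real bot.

```python
# Sketch of the attack-arm trigger logic: the arm fires on sensor proximity
# OR manual driver input, and its swing direction depends on the selected
# mode and the bot's orientation. All names and thresholds are illustrative.

WEDGE, HAMMER = "wedge", "hammer"
TRIGGER_DISTANCE_CM = 10.0  # assumed strike range; tune against the real sensor

def should_swing(distance_cm, driver_pressed):
    """Fire the arm when an object is in range or the driver triggers manually."""
    return driver_pressed or distance_cm <= TRIGGER_DISTANCE_CM

def swing_direction(mode, inverted):
    """Wedge swings up, hammer swings down; both flip when the bot is inverted."""
    up = (mode == WEDGE)
    if inverted:
        up = not up
    return "up" if up else "down"
```

On the actual ESP32-S3 firmware this would run in the main control loop, with `distance_cm` read from the ultrasonic sensor and `inverted` supplied by Subsystem 2.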

## Subsystem 2 - Gyroscopic Sensor Enabled Control Inversion
We will embed a gyroscopic sensor inside the body of the robot. This will allow the software that translates driver input into motor movement to adjust based on the bot's orientation. If the bot is flipped over, left turns become right turns and vice versa, which would be difficult for the driver to compensate for quickly. This subsystem lets the software make the appropriate adjustments to maintain driver-input continuity. Additionally, the orientation measured by the gyroscopic sensor will modify the resting/default positions of the attack arm so it can continue operating (both the resting positions and the rotation direction must be inverted).
- Gyroscopic Sensor
(A potential alternate sensor is an accelerometer; something like the MC3416 should detect orientation satisfactorily.)
(https://www.digikey.com/en/products/detail/memsic-inc/MC3416/15292804)
- Microcontroller Unit - ESP32-S3 see above
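The inversion adjustment above amounts to a sign check plus a control remap. A minimal sketch, assuming an accelerometer whose body z-axis reads negative when the bot is upside down (the sign convention would need calibration against the real mount):

```python
def is_inverted(accel_z):
    """Bot is upside down when gravity reads negative on the body z-axis.
    The sign convention is an assumption; calibrate on the real hardware."""
    return accel_z < 0

def remap_drive(command, inverted):
    """Swap left/right turns when flipped so driver input stays consistent;
    forward/backward and all other commands pass through unchanged."""
    if not inverted:
        return command
    return {"turn_left": "turn_right", "turn_right": "turn_left"}.get(command, command)
```

The same `is_inverted` flag would also feed the attack arm's default-position selection in Subsystem 1.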

## Subsystem 3 - Wireless Control/Driver Input + Steering and Wheel Configuration
Our driver will utilize a keyboard for robot control and steering. The W and S keys will control forward and backward motion, with A and D controlling left and right rotation. We will also program the F key to switch attack modes between the hammer and wedge, and the Space bar as an alternative manual attack trigger. These inputs will be wirelessly communicated to the onboard PCB and microcontroller via Bluetooth and translated into the appropriate motor commands. To enable tank turning we will use four-wheel drive, since each wheel/motor requires isolated control. The robot's body will be thinner than the diameter of the wheels, with the wheel axles fixed at the midpoint of the body's thickness. This allows all four wheels to contact the ground regardless of orientation, maintaining drivability.
- Microcontroller Unit - ESP32-S3 see above
- Keyboard (Simply from a laptop, Laptop will also run the “server” that communicates with the MCU/PCB)
- Drive Motors
12mm Diameter 50:1 Micro Metal Gearmotor 12V 600RPM (2 x 10 grams)
(https://www.robotshop.com/products/dyna-engine-12mm-diameter-501-micro-metal-gearmotor-12v-600rpm)
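The laptop-side "server" mainly has to translate keypresses into commands before sending them over Bluetooth. A minimal sketch of that mapping (command names and the stop-on-unknown-key behavior are our assumptions, not a fixed protocol):

```python
# Keypress-to-command mapping for the laptop-side control server.
# The command strings here are placeholders for whatever wire format
# the ESP32 firmware ends up expecting.
KEYMAP = {
    "w": "forward",
    "s": "backward",
    "a": "turn_left",
    "d": "turn_right",
    "f": "toggle_mode",     # switch between hammer and wedge
    " ": "manual_attack",   # Space bar: manual attack trigger
}

def encode_command(key):
    """Translate a keypress into a command for the MCU; unknown keys map
    to 'stop' so stray input never moves the bot."""
    return KEYMAP.get(key.lower(), "stop")
```

In practice the server would also need to send a periodic "stop" heartbeat on key release so a dropped Bluetooth link cannot leave a motor running.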
## Subsystem 4 - Battery/Power
Onboard power source for sensors/controllers/motors as well as components to regulate and distribute power.
- Battery
3S (11.1 V) around 500 mAh battery (starting-point estimate)
(https://hobbyking.com/en_us/turnigy-nano-tech-450mah-3s-45c-lipo-pack-w-xt30)
- Control Circuit Regulator
AZ1117CH-3.3TRG1 - 3.3 V output with 18 V max input; output current from 1.7 mA min to 1 A max, well within range
(https://www.digikey.com/en/products/detail/diodes-incorporated/AZ1117CH-3-3TRG1/4470985)
- Gate Drivers
DGD0211C - 3.3 V to 12 V gate drivers, plenty of headroom in capability
(https://www.digikey.com/en/products/detail/diodes-incorporated/DGD0211CWT-7/12702560)
- H-Bridge MOSFETs
FDC655BN - 30 V, 6.3 A N-channel MOSFETs
(https://www.digikey.com/en/products/detail/onsemi/FDC655BN/979810)
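As a sanity check on the proposed ~500 mAh pack, a rough runtime estimate can be computed. All current-draw figures below are assumptions for illustration, not measured values:

```python
# Back-of-the-envelope runtime estimate for battery sizing.
# Every draw figure is an assumption to be replaced with bench measurements.
CAPACITY_MAH = 500        # proposed 3S pack capacity
DRIVE_MA = 4 * 150        # four micro gearmotors, assumed ~150 mA each under load
WEAPON_MA = 500           # weapon motor, duty-cycled average (assumption)
LOGIC_MA = 120            # ESP32-S3 + sensors + regulator overhead (assumption)

total_ma = DRIVE_MA + WEAPON_MA + LOGIC_MA          # 1220 mA average
runtime_min = CAPACITY_MAH / total_ma * 60          # capacity / draw, in minutes
print(f"{total_ma} mA average draw -> ~{runtime_min:.0f} min runtime")
```

Even under these pessimistic assumptions the pack comfortably outlasts a single match, leaving margin for peak weapon-motor stall currents.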

# Criteria for Success
- Ultrasonic sensor accurately triggers attack arm when an object comes into close proximity
- Gyroscopic sensor accurately registers when robot has been flipped and inverts controls
- Microcontroller takes in driver keyboard inputs for fluid steering
- Attack arm’s default position changes based on driver input (horizontal for wedge, vertical for hammer)
- Attack arm’s default position changes based on gyroscopic sensor input (default position adjusts to bot’s orientation)
- Tank turning and wheel alignment allows for 360 degree rotation
- Robot movements follow driver input: i.e. forward/backward motion, turns etc.

Electronic Replacement for COVID-19 Building Monitors @ UIUC

Patrick McBrayer, Zewen Rao, Yijie Zhang

Featured Project


Problem Statement:

Students who volunteer to monitor buildings at UIUC are at increased risk of contracting COVID-19 and of passing it on to others before they are aware of the infection. We therefore propose a project that creates a technological solution to this issue: physical two-factor authentication through the “airlock”-style doorways we have at ECEB and across campus.

Solution Overview:

As we do not have access to the backend of the Safer Illinois application, or the ability to use campus buildings as a workspace for our project, we will be designing a proof of concept 2FA system for UIUC building access. Our solution would be composed of two main subsystems, one that allows initial entry into the “airlock” portion of the building using a scannable QR code, and the other that detects the number of people that entered the space, to determine whether or not the user will be granted access to the interior of the building.

Solution Components:

Subsystem #1: Initial Detection of Building Access

- QR/barcode scanner capable of reading the code presented by the user, which tells the system whether that person has been granted or denied building access. (An example of this type of sensor: https://www.amazon.com/Barcode-Reading-Scanner-Electronic-Connector/dp/B082B8SVB2/ref=sr_1_11?dchild=1&keywords=gm65+scanner&qid=1595651995&sr=8-11)

- QR code generator using C++/Python to support the QR code scanner.

- Microcontroller to receive the information from the QR code reader and decode the information, then decide whether to unlock the door, or keep it shut. (The microcontroller would also need an internal timer, as we plan on encoding a lifespan into the QR code, therefore making them unusable after 4 days).

- LED Light to indicate to the user whether or not access was granted.

- Electronic locking mechanism to open both sets of doors.
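The microcontroller's lifespan check described above is a simple timestamp comparison. A minimal sketch, assuming the QR payload embeds an issue timestamp (the payload format is our assumption):

```python
import time

LIFESPAN_SECONDS = 4 * 24 * 3600  # codes expire after 4 days, per the design

def code_is_valid(issued_at, now=None):
    """A scanned code grants entry only within its 4-day lifespan.
    issued_at is a Unix timestamp assumed to be embedded in the QR payload."""
    now = time.time() if now is None else now
    # Reject codes from the future as well as expired ones.
    return 0 <= now - issued_at <= LIFESPAN_SECONDS
```

In the real system the payload would also need a signature or server check, since a plain timestamp in a QR code is trivially forgeable.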

Subsystem #2: Airlock Authentication of a Single User

- 2 aligned sensors (one tx, the other rx) at the bottom of the door that count the number of people crossing a certain line. (We are possibly considering two sets of these so a person could not jump over or crawl under the sensors, most likely with the second set around the middle of the door frame.)

- Microcontroller to decode the information provided by the door sensors, and then determine the number of people who have entered the space. Based on this information we can either grant or deny access to the interior building.

- LED Light to indicate to the user if they have been granted access.

- Possibly a speaker at this stage as well, to tell the user the reason they have not been granted access, and to let them know the incident has been reported if they attempted to let someone into the building.
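The people-counting step can be sketched as a small state machine over beam events. This is a simplified illustration that treats each break/restore pair on the tx/rx sensors as one crossing; the event names are placeholders, and the real system would need debouncing and the second sensor pair described above:

```python
def count_entries(beam_events):
    """Count crossings from break/restore events on the paired tx/rx sensors.
    Each 'break' followed by a 'restore' is treated as one person (simplified)."""
    count, broken = 0, False
    for event in beam_events:
        if event == "break" and not broken:
            broken = True
        elif event == "restore" and broken:
            broken = False
            count += 1
    return count

def grant_access(beam_events):
    """The inner door opens only if exactly one person entered the airlock."""
    return count_entries(beam_events) == 1
```

Two people tailgating through the beam produce two break/restore pairs, so `grant_access` denies entry and the speaker/LED can report why.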

Criteria for Success:

- Our system generates valid QR codes that can be read by our scanner, and the encoded data, such as the code's lifespan and building-access status, is transmitted to the microcontroller.

- Our 2FA detection of multiple entries into the space works across a wide range of users, including wheelchair users and people of a wide range of heights and body sizes.