Projects
# | Title | Team Members | TA | Professor | Documents | Sponsor |
---|---|---|---|---|---|---|
1 | RFA: Any-Screen to Touch-Screen Device |
Ganesh Arunachalam Sakhi Yunalfian |
Chi Zhang | Arne Fliflet | design_document1.pdf proposal1.pdf |
|
# Any-Screen to Touch-Screen Device Team Members: - Sakhi Yunalfian (sfy2) - Muthu Arunachalam (muthuga2) - Zhengjie Fan (zfan11) # Problem While touchscreens are becoming increasingly popular, not all screens come equipped with touch capabilities. Upgrading or replacing non-touch displays with touch-enabled ones can be costly and impractical. Users need an affordable and portable solution that can turn any screen into a fully functional touchscreen. # Solution The any-screen-to-touch-screen device uses four ultra-wideband sensors attached to the four corners of a screen to detect the position of a specially designed pen or hand wearable. Ultra-wideband (UWB) is a positioning technology that is lower-cost than lidar/camera systems yet more accurate than Bluetooth/Wi-Fi/RFID. Since UWB is highly accurate, we will use these sensors to track the location of a UWB antenna placed in the pen. In addition to the UWB tag, the pen will also feature a touch-sensitive tip to detect contact with the screen (along with a redundant button to simulate screen contact if the user prefers not to constantly make contact with the screen). The pen will also have a gyroscope and low-profile buttons to track tilt data and offer customizable hotkeys/shortcuts. The pen and sensors communicate wirelessly with the microcontroller, which converts the pen's input data along with its location on the screen into touchscreen-like interactions. # Solution Components ## Location Sensing Subsystem (Hardware) This subsystem will employ the Spark Microsystems SR1010 digitally programmable ultra-wideband wireless transceiver. The transceiver will be housed in an enclosure that can be attached to the corners of a screen or monitor. Each sensor unit will also need a Bluetooth module in order to communicate with the microcontroller. ## Signal Processing Subsystem (Hardware and Software) This subsystem is built around an STM32F4-series microcontroller (STM32F407 or STM32F429). Real-time sensor data processing requires a considerable amount of computing power, and the STM32F4 series includes DSP instructions that streamline raw data processing and noise reduction. This subsystem will perform triangulation to accurately estimate the pen's location on the screen while handling real-time data processing, latency minimization, sensitivity, and noise reduction. A Bluetooth module lets each sensor send its raw data to the microcontroller, since we are planning for the communication between the sensors, the pen, and the microcontroller to be wireless. One Bluetooth module we are considering is the HC-05. The microcontroller itself will be wired to the relevant computer system via USB 2.0 for data transfer of touchscreen interactions. ## Pen/Hand Wearable Subsystem (Hardware) The pen subsystem will employ a simple spring switch as a pen tip to detect pen-to-screen contact. We will also use a SparkFun DEV-08776 LilyPad button to simulate a press/pen-to-screen contact for redundancy and for users who wish to control the pen without contacting the screen. The pen will also contain several low-profile buttons and an STMicroelectronics LSM6DSO32TR gyroscope/accelerometer sensor to provide further customizable pen functionality and potentially aid in motion-tracking calculations. The pen will contain a Taoglas UWC.01 ultra-wideband tag to allow detection by the location sensing subsystem and a Bluetooth module to allow communication with the microcontroller. 
The unit will need to be enclosed within a plastic or 3D-printed housing. ## Touch Screen Emulation Subsystem (Software) This subsystem uses a microcontroller with embedded HID device functionality to control the cursor of the device connected to it. We are planning to utilize the STM32F4-series microcontroller with built-in USB HID libraries to help emulate the touchscreen effects. We will also include a simple GUI to allow the user to customize the shortcuts mapped to the pen buttons and specify optional parameters like screen resolution, screen curve, etc. ## Power Subsystem (Hardware) The power subsystem is not localized in one area since our solution consists of multiple wireless devices; however, we specify all power requirements and solutions here for organizational purposes. For the wireless sensors in our location sensing subsystem, we plan on using battery power. Given that the UWB transceiver has ultra-low power consumption and an internal DC-DC converter, it makes sense to power each sensor unit with a small 3.3V 650mAh rechargeable battery (potential option: [https://a.co/d/acFLsSu](https://a.co/d/acFLsSu)). We will include recharging capability via a micro-USB charging port. For our pen, we plan on using battery power as well. The gyroscope module, UWB antenna, and Bluetooth module all have low power consumption, so we plan on using the same rechargeable battery system as specified above. The microcontroller will be wired via USB 2.0 directly to the computer in order to transmit mouse data/touchscreen interactions and will receive a 5V 0.9A power supply through this connection. # Criterion For Success ## Hardware The UWB sensor system is able to track the pen's location on the screen. The pen is able to detect clicks, screen contact, and tilt. The microcontroller is able to take input from the wireless pen and the wireless sensors. Each battery-powered unit is successfully powered and able to be charged. ## Software The pen's input and sensor location data can be converted to mouse clicks and presses. The pen's buttons can be mapped to customizable shortcuts/hotkeys. ## Accuracy and Responsiveness Touch detection and location accuracy is the most crucial criterion for our project's success. We expect our device to have 95% touch detection precision. In order to correctly drive a device's HID protocol, the data sent by the microcontroller must keep the error low when comparing cursor movements with the wearable's location. Touch recognition and responsiveness is the next most important criterion. We want our system, within a given distance threshold, to detect the device with a relatively low margin of error of about 1% or less. More specifically, this criterion evaluates whether our communication protocol between the sensors, USB HID peripherals, and the microcontroller can transfer data efficiently in real time so that the device can interpret it as cursor location updates, scrolls, clicks, and more. Latency and lag should stay under 60 milliseconds, judged based on the DSP pipeline formed in the STM32F4 microcontroller. ## Reliability and Simplicity We want our device to be easy for users to operate. It should be intuitive and straightforward to start the device and utilize its functionality. We also want our device to be durable, with low chances of battery failures, mechanical failures, and systematic degradation. 
## Integration and Compatibility We want our device to integrate with screens of different sizes and form factors and with different operating systems. |
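As a rough illustration of the triangulation step described in the Signal Processing Subsystem, the sketch below solves for a 2-D pen position from three corner anchors and their measured UWB ranges by linearizing the range equations. The anchor coordinates, units, and the use of only three of the four corner sensors are illustrative assumptions, not the final algorithm.

```cpp
#include <cstdio>

struct Point { double x, y; };

// Solve for the tag position from three corner anchors and their measured
// ranges (d1..d3) by subtracting the circle equations to get a linear system
// and applying Cramer's rule.
Point trilaterate(Point a1, double d1, Point a2, double d2, Point a3, double d3) {
    double A = 2.0 * (a2.x - a1.x), B = 2.0 * (a2.y - a1.y);
    double C = 2.0 * (a3.x - a1.x), D = 2.0 * (a3.y - a1.y);
    double E = d1*d1 - d2*d2 - a1.x*a1.x + a2.x*a2.x - a1.y*a1.y + a2.y*a2.y;
    double F = d1*d1 - d3*d3 - a1.x*a1.x + a3.x*a3.x - a1.y*a1.y + a3.y*a3.y;
    double det = A * D - B * C;            // zero only if the anchors are collinear
    return { (E * D - B * F) / det, (A * F - C * E) / det };
}

int main() {
    // Hypothetical screen: anchors at three corners (metres), ranges in metres.
    Point p = trilaterate({0.0, 0.0}, 0.25, {0.53, 0.0}, 0.40, {0.0, 0.30}, 0.35);
    std::printf("estimated pen position: (%.3f, %.3f) m\n", p.x, p.y);
}
```

With all four corner sensors available, the same linearized system becomes overdetermined and could be solved by least squares to average out UWB ranging noise.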
||||||
2 | Antweight Battlebot Project |
Avik Vaish Jeevan Navudu Keegan Teal |
Jason Zhang | Cunjiang Yu | design_document1.pdf proposal1.pdf proposal2.pdf |
|
# Antweight Battlebot Team Members: - Keegan Teal (kteal2) - Avik Vaish (avikv2) - Jeevan Navudu (jnavudu2) # Problem In order to compete in Professor Gruev’s robot competition, there are many constraints that need to be met, including: - Maximum weight (2lbs) - Allowed materials (3D-printed thermoplastics) - Locomotion system and fighting tool - Wireless control via Bluetooth or Wifi The main goal of this competition is to design a Battlebot that is capable of disrupting the functionality of the other Battlebots with our fighting tool while maintaining our own functionality. # Solution For the project, we plan to build a battlebot with a custom electronic speed controller (ESC) that can independently control three brushless motors: two for the drive system, and one for the fighting tool. This ESC will be controlled by an STM32 microcontroller, to which we will add a Bluetooth module to connect to it and specify how much power we want to send to each motor. To communicate with our robot, we will use a laptop that can connect to Bluetooth. # Solution Components ## Vehicle Controller The main subsystem of the robot will be a combined vehicle control board and ESC. This subsystem will contain an STM32 Microcontroller that will serve as the brain for the whole robot. With this MCU, we’ll be able to flash our whole software package that will be able to control the speed and direction of the robot, the robot’s weapon, and the Bluetooth communication. ## Power Module This subsystem includes the battery, the voltage regulators/converters needed to power the electronics, and the necessary battery monitoring circuitry. Specifically, for the battery, we will use a 14.8V 4S2P LiPo pack to power all the components. There will also be a voltage short detection circuit for the battery that will shut down the robot in case of a short to ensure safe practices. This subsystem also contains a 5V linear regulator and 3.3V linear regulator to power the low voltage electronics. ## Drivetrain/Powertrain This subsystem includes the motors and H-bridges needed to control both the wheels and weapon of the robot. The H-bridges will be made with regular N-MOSs that will be controlled by a PWM signal sent from the STM32 MCU. This H-bridge setup will be able to control the voltage and polarity sent to the motors, which will be able to control the speed of the wheels or weapon. This subsystem will also include the mechanical wheels of the robot and actual hardware of the weapon, which will be a spinning object. Since all the wheels and the weapon have the same mechanical motion, they can all use the same hardware and software electronically, with minor adjustments in motor selection and the actual mechanical hardware/peripheral. ## Bluetooth Module One big requirement for this project is the ability for the robot to be controlled wirelessly via laptop. The STM32 MCU has bluetooth capabilities, and with additional peripheral hardware, the robot will be able to communicate over bluetooth with a laptop. The goal for the laptop is to be able to control the speed, direction, and weapon of the robot wirelessly and also have a display for live telemetry. ## Mechanical Design The last part of our project would be the mechanical design of the robot chassis and weapon. For the chassis and weapon material, we decided to go with PLA+ as it offers a blend of being strong and robust but not being too brittle. The drive system will be a 2-wheeled tank style drive with one motor controlling each side of the robot. 
For the weapon, we are looking to utilize a fully 3D-printed drum with 100% infill to maximize the rotational inertia, which can lead to bigger impacts. ## Criterion for Success We would consider our project a success if we are able to communicate with the robot from our computer by sending throttle and steering commands, those commands are processed on the robot's microprocessor, and the motors are sent the power needed to move and behave the way we want during a match. ## Alternatives The most commonly used electronics in current antweight battlebots consist mostly of RC drone parts. We plan to create an ESC very similar to those on the market, but it will have integrated Bluetooth wireless capability as well as telemetry monitoring. We also want to focus on minimizing packaging size to lower weight and increase flexibility as much as possible. |
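As a sketch of how the STM32 firmware might turn throttle/steering commands into the drive outputs for the two H-bridges in the Drivetrain/Powertrain subsystem, the snippet below shows a standard tank-drive mixing function. The input/output ranges and scaling are assumptions for illustration, not the team's final control law.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>

struct DriveOutput { int16_t left; int16_t right; };  // signed duty, -1000..1000

// Mix throttle and steering commands (each -1.0..+1.0) into left/right motor
// outputs for the 2-wheel tank drive; the sign would select H-bridge polarity
// and the magnitude the PWM duty written to the corresponding timer channel.
DriveOutput mixTankDrive(float throttle, float steering) {
    float left  = throttle + steering;
    float right = throttle - steering;
    // Normalize so neither side exceeds full scale when both inputs are large.
    float m = std::max(std::fabs(left), std::fabs(right));
    if (m > 1.0f) { left /= m; right /= m; }
    return { static_cast<int16_t>(left * 1000.0f), static_cast<int16_t>(right * 1000.0f) };
}

int main() {
    DriveOutput d = mixTankDrive(0.8f, 0.3f);   // forward with a gentle right turn
    std::printf("left=%d right=%d\n", d.left, d.right);
}
```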
||||||
3 | RFA: Smart Plant Pot |
Gavin Tian Morgan Sukalo Trisha Murali |
Surya Vasanth | Cunjiang Yu | design_document1.pdf other1.PDF proposal1.pdf proposal2.pdf |
|
# **Smart** Plant Pot **Team Members** - Morgan Sukalo (msukalo2) - Trisha Murali (tmurali2) - Gavin Tian (gtian3) # **Problem** Growing plants in any capacity is a maintenance-intensive task and requires a lot of varying inputs and knowledge to accomplish successfully. Automating this process would be a step toward automating agriculture in a controlled environment. # **Solution** Our solution is to build a hydroponic plant pot that can sustain and grow a plant from sprout, or one that has been transplanted, while automating all the processes associated with hydroponic plant care. It will have a variety of sensor- and actuator-based subsystems that perform action items related to hydroponic maintenance. We will then have a central control unit which will interpret all gathered data and ensure an optimized growth environment. The user will be able to monitor the current status of the plant and its environment via the UI subsystem. This system will display sensor data from every other subsystem, report issues requiring more intensive maintenance, and display existing conditions of the plant (example: humidity levels, water temperature, water levels), ultimately making hydroponic agriculture more user-friendly. # **Solution** Components *Diagram of System* https://drive.google.com/file/d/1SQtptcK4uriIv2zN9ECVHA81-U5mPIS0/view *Humidity Subsystem* Most house plants need a humidity of 50-60% to protect against transpiration, and tropical plants require a higher humidity than this. The humidity subsystem will maintain a desired humidity level given the type of plant being grown and will consist of a variety of sensors/actuators to do so. * We will have a humidity sensor (SHT35-DIS-F) which will constantly monitor the humidity within the Smart Pot. For the specified plant, this humidity will have to stay within a certain range, and the sensor will monitor this. * On top of the clear lid for the pot, there will be an adjustable vent operated by a motor that will remain slightly open at all times to allow for air circulation, but if there are ever conditions that are not ideal for the plant (ex: humidity too high), the opening of the vent will adjust accordingly (ex: open up more to air out the plant if needed). * All data will be sent to the central control unit. If the humidity is too low, a humidifier will be turned on until a desired humidity level is reached within the Smart Pot enclosure. * If the humidity is too high, the vent at the top of the Smart Pot enclosure will open wide, allowing some of the water vapor to escape (lowering humidity). * SENSOR: humidity sensor for Smart Pot enclosure. * ACTUATOR: humidifier, servo motor for vent @ top of enclosure * CONTROL UNIT BEHAVIOR: If the humidity is too high: open vent via servo motor; if humidity is too low: turn on humidifier *Oxygenation Subsystem* For hydroponics, water needs to be oxygenated and agitated to facilitate plant growth and impede algae and bacteria. * An air stone attached to a pump will be placed in the Smart Pot to continuously agitate the water. * A small fan will be integrated into the side of the Smart Pot lid. This fan will push out the old air from inside of the enclosure. * An air vent at the top of the Smart Pot enclosure will always be slightly open to allow fresh air into the enclosure. Closing/opening the air vent will be controlled by a servo motor. If the Smart Pot enclosure needs to be aired out, the vents can open wide. * This system will be connected to the central control unit. 
The central control unit will allow the air stone to remain on all of the time. The fan will be turned on/off throughout the day as needed. * SENSOR: n/a * ACTUATOR: air stone, fan attached to wall of Smart Pot lid, servo motor for vent @ top of enclosure * CONTROL UNIT BEHAVIOR: The air stone will always be running. The fan will only run during set intervals enforced by the control unit. If airing-out of the Smart Pot is needed, the servo motor attached to the vents will be actuated to maximize Smart Pot air release. *Grow Lights Subsystem* As we know, plants require sunlight for growth. During winter, the amount of sunlight and other environmental factors a plant receives may not always be ideal. To offset this and provide for plant growth despite the season, we will be using grow lights. * As part of this subsystem, we will also have light sensors to keep track of the amount of light the plant receives during a 24hr span. If this value ever becomes too high, the control unit will ensure that the LED growth light is off, and shades to shield the plant from light will be raised. If the control unit detects that the plant did not receive enough light, the LED growth light will be turned on for some amount of time until the light exposure requirement is fulfilled. * SENSOR: light sensor for plant * ACTUATOR: LED grow light (pre-made), shades with stepper motor * CONTROL UNIT BEHAVIOR: If the value tracked by light sensors become too high, grow light is turned off and shades move to cover the plant If plant did not receive enough light, the LED grow light will be turned on and the shades will be moved out *Water and Nutrient Subsystem* Automated hydroponics requires a water and nutrient module to ensure water stability and sufficient nutrient levels. To deter algae and bacteria growth, water changes will need to be made every two weeks to ensure an optimized growth environment. This subsystem will monitor water level and water temperature. * This is a gravity-based system and will include a water reservoir canister, a waste canister, and one nutrient canister. * The water reservoir canister, nutrient canister and waste canister will be connected to the Smart Pot via 12V solenoid valves. Opening these valves will allow for the contents of one container to flow into the other. * The water level of the Smart Pot will be monitored via a float switch (PLS-041A-3PAI), and a temperature sensor (DFR0198) will be used to observe the temperatures of the Smart Pot water and the water reservoir canister. * If the water level falls below a specified threshold, the water reservoir temperature will be compared to the Smart Pot water temperature. If so, the valve connecting the water reservoir canister to the Smart Pot will be opened, and the Smart Pot will be filled to the desired level. * Every two weeks, the water in the Smart Pot will be drained. Prior to draining, the temperature of the Smart Pot water and the water reservoir will be compared. If these values are similar, draining will commence. * While the valve between the water reservoir and the Smart Pot remains closed, the valve between the Smart Pot and the waste canister will open when draining begins. All water from the Smart Pot will flow into the waste canister. This valve will then close, and the Smart Pot is refilled with freshwater from the water reservoir. Nutrients are then injected into the Smart Pot. 
* SENSOR: temperature sensors, water level sensor (float sensor) * ACTUATOR: Valves that are between all canisters and the Smart Pot * CONTROL UNIT BEHAVIOR: will keep track of time passes since last water change, monitor water temperatures and water level. Will control when valves open/close. *User Interface Subsystem* The User Interface is a helpful feature for the users to track the environment of the plant. * The user interface will allow the user to specify what plant type the user is currently trying to grow in the Smart Pot. These plant types will be displayed on a TFT LCD (DFR0664) and can be selected using a rotary encoder (PEC11R-4115F-S0018). Once selected, this information goes to the control unit, and all desired humidity levels, pH levels, light levels, etc. are set. The user interface will also display real time data coming from all of the sensors listed above. This gives the user information on the current status of their plant’s environment in the Smart Pot. In addition to this, any maintenance alerts will be displayed. This is where any alerts concerning manual overrides or human intervention will show. The overall idea here is that the rotary encoder will be used to browse through these different selections and the push button feature will be used to select an option. * SENSOR: rotary encoder with push button feature * ACTUATOR: TFT LCD screen to display real time temperature, humidity, etc. specs of Smart Pot * CONTROL UNIT BEHAVIOR: display hydroponic plant types, plant status/stats, warning and alerts for the user *Power Subsystem:* The power subsystem aims to take wall power and convert it to voltages that are safe and usable for all sensors/actuators/devices that are needed to keep the Smart Pot up and running. * The power system comprises a custom PCB with a step down IC, 12V DC wall adapter. * The wall adapter is currently under contention as the amperage needs of the project need to be determined to pick the correct product. * The PCB will route 12V and limit to the solenoids that control water flow as those require the highest voltage. It will also step down 12V to 3.3V DC for the microcontroller and sensors. * SENSOR: n/a * ACTUATOR: 12V, 3.3V, GND power rails * CONTROL UNIT BEHAVIOR: Step down voltages and supply power to all the subsystems # **Criterion** For Success The Smart Pot subsystems prove individual functionality as well as collaborative functionality. Ultimately, the Smart Pot will be able to sustain all subsystems and care for the plant for 4 weeks without human intervention. To test, we want to check that all systems respond accurately and appropriately to their related sensor stimuli. The sensor readings themselves should match real life conditions in a reasonable manner and will be displayed on a UI for easy monitoring. |
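A minimal sketch of the control unit behavior described for the Humidity Subsystem, written as a pure decision function so it can be tested on its own. The 50-60% band comes from the proposal; the servo angles and the exact thresholds (per plant profile) are illustrative placeholders.

```cpp
#include <cstdio>

// Map a relative-humidity reading to actuator commands for the humidifier and
// the lid vent servo. Angles are placeholders: ~10 deg = cracked open for
// circulation, ~90 deg = wide open to air the enclosure out.
struct HumidityCommand {
    bool humidifierOn;
    int  ventAngleDeg;
};

HumidityCommand humidityControl(float rh) {
    if (rh < 50.0f) return { true,  10 };   // too dry: humidify, keep vent cracked
    if (rh > 60.0f) return { false, 90 };   // too humid: open the vent wide
    return { false, 10 };                   // in band: just maintain circulation
}

int main() {
    HumidityCommand c = humidityControl(47.2f);   // example SHT35 reading in %RH
    std::printf("humidifier=%d vent=%d deg\n", c.humidifierOn, c.ventAngleDeg);
}
```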
||||||
4 | Switch Wizard for Troubleshooting Pinball Machines |
Aditya Gupta Gina Li Logan Henderson |
Pusong Li | Arne Fliflet | design_document1.pdf proposal3.pdf proposal2.pdf |
|
## Pinball Machine Diagnostic Device - RFA Team Members: - Logan Henderson (loganeh2) - Gina Li (ginafl2) - Aditya Gupta (aditya22) ## Problem: Pinball machines are complex, multifaceted devices that require both electrical and mechanical skills to troubleshoot, and I want to make that easier. Pinball machines operate primarily through switches that tell the machine what to show on the screen, which solenoids to fire, etc. The issue is that troubleshooting these switches can be tough, because these machines have about 60-100 switches that all need to function properly. Pinball machines from 1977 and newer have rudimentary switch diagnostic features, but everything produced before 1977 is designated as electromechanical, meaning there is no software, only physical relays and switches. This makes electromechanical pinball machines extremely difficult to troubleshoot. ## Solution: The "SWITCH WIZARD," as I am calling it, would consist of multiple ports that would connect to the suspected faulty switch or switches. During gameplay or other testing, the device would record all switch hits with a timestamp and send them via WiFi to a nearby laptop or phone. Gavin of Gavin's Game Service (a master technician in the Chicagoland area) has demonstrated interest in the product. I simply cannot overstate how much easier this would make troubleshooting, and if the product is successful, I could genuinely see it having a place in this industry. ## Subsystem 1: Switch state detection Switches are either open or closed, and this subsystem would detect and process the state of each switch. A multimeter can perform this task, but this PCB would be able to track multiple switches at once, which is extremely useful for score motor debugging. The score motor is present on all electromechanicals and essentially acts as a rudimentary clock for the game. The score motor is a cammed motor and usually has 3-6 switches on each cam, with 6-8 cams depending on the game. ## Subsystem 2: Wireless Component We plan to include a wireless component on our PCB to send the switch-state data tracked by the detection circuitry to a nearby laptop or device through WiFi. On the device, the switches are coded to corresponding names. The detection circuitry will sense whether each switch is open or closed. Then, the device will display the name of the switch where closure was detected. This information is essential to the troubleshooter. # Criterion For Success The Switch Wizard needs to properly detect switch closures and stuck switches and report them to a nearby device with a timestamp. The user must be able to set a specific port to a temporary name, so as to aid in troubleshooting. Electromechanicals are the main focus of this device and they run on 6 VAC, but it also needs to be compatible with modern games, which run on 5 VDC. |
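A minimal Arduino-style sketch of the switch state detection subsystem's core loop: poll each port, detect open/close edges, and emit a timestamped event. The pin numbers, the 8-port count, and the use of Serial in place of the WiFi link are placeholder assumptions; the real board would also need isolation and conditioning for the 6 VAC electromechanical switch circuits.

```cpp
#include <Arduino.h>

const uint8_t NUM_PORTS = 8;                           // placeholder channel count
const uint8_t portPins[NUM_PORTS] = {2, 3, 4, 5, 6, 7, 8, 9};
bool lastState[NUM_PORTS];

void setup() {
  Serial.begin(115200);                                // stands in for the WiFi link
  for (uint8_t i = 0; i < NUM_PORTS; i++) {
    pinMode(portPins[i], INPUT_PULLUP);                // switch pulls the pin low when closed
    lastState[i] = digitalRead(portPins[i]);
  }
}

void loop() {
  for (uint8_t i = 0; i < NUM_PORTS; i++) {
    bool now = digitalRead(portPins[i]);
    if (now != lastState[i]) {
      lastState[i] = now;
      // With pull-up wiring, LOW means closed. Emit "port,state,timestamp_ms".
      Serial.print(i);
      Serial.print(',');
      Serial.print(now == LOW ? "closed" : "open");
      Serial.print(',');
      Serial.println(millis());
    }
  }
}
```

On the receiving laptop, each port number would be mapped to the user-assigned temporary name before display.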
||||||
5 | Bicycle Lighting System |
Jack Nelson Quentin Mooney Sloan Abrams |
Sanjana Pingali | Kejie Fang | design_document1.pdf proposal2.pdf proposal1.pdf |
|
# Project: Bicycle Lighting System ## Team: Quentin Mooney (qmooney2) Jack Nelson (jnels9) Sloan Abrams (sloanaa2) ## Problem: We are all cyclists and feel that road safety would be improved significantly by a robust lighting system to communicate with other cars, bikes, and pedestrians. Hand signals work decently well, but not everyone is confident enough on a bike to take a hand off their handlebars while riding. Hand signals are also significantly less effective at night when visibility is lower. ## Solution: We want to design a control system for a bicycle lighting system. Headlights and taillights are already widely used, and in many places required by law. We would like to expand upon that by adding brake lights that make the taillights brighter when the brakes are engaged, as well as turn signals so cyclists can signal their intended changes in direction more easily. # Solution Components ## Brake System: - Brake taillights that are automatically activated when the brakes are engaged. We plan to use the ALS31313 Hall sensor in conjunction with a magnet on either the brake lever or brake calipers to sense brake engagement and trigger the brake lights. ## Turn Signal System: - Turn indicator lights on the front and rear of the bicycle - Easy-to-use, accessible buttons or switches for the rider to turn on their signals - Turn indicators automatically turn off after the turn is complete (the same way a car's will). We will use an Inertial Measurement Unit (ICM-42670-P) for sensing when the turning action is completed. ## User controls/Interfacing - The rider can see if their turn signals are on or off. This will either be accomplished by a small light indicator on the handlebars, or the turn indicators on the front of the bicycle will be positioned in such a way as to be visible to the rider. - On/Off controls for the entire lighting system. ## General System - Hazard lights (both turn indicators simultaneously) that can be turned on and off by the rider. - Front headlights for visibility to other road users. - On/Off controls for the entire lighting system. ## Power System - Battery powered. - Batteries are easy to remove and replace. ## Additional Stretch Goals/Possibilities: - Ability to control brightness of lights / power conservation mode / brights. - Bluetooth/wireless system. - Rechargeable battery (super stretch goal: dynamo powered). - 'Auto' mode for the lights (automatic daylight sensing). - Automatically turn off the whole system if the bike has been inactive for 15+ min and the lights were accidentally left on, using the IMU sensor for motion detection. # Criterion for Success: - Rear brake lights activate when brakes are engaged. - Turn signals turn on when activated by the rider, and automatically turn off after the turn is complete (for turns of 90 degrees or sharper). - Headlights on bike. They are bright enough to be seen at night from at least 25 yards away. - Rear taillight is always on when system is on. - Entire system can be turned on and off by the rider. |
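A sketch of the auto-cancel logic for the Turn Signal System: integrate the IMU's yaw rate while a signal is active and cancel once roughly 90 degrees of heading change has accumulated in the signalled direction. The update rate, thresholds, and sign convention are illustrative assumptions rather than the final tuning.

```cpp
#include <cstdio>

// Accumulate heading change from the gyro's yaw rate while a turn signal is
// active; cancel the signal once the 90-degree criterion from the proposal
// is met in the signalled direction.
struct TurnSignal {
    bool  active = false;
    int   direction = 0;        // +1 = left, -1 = right (sign convention assumed)
    float headingDeg = 0.0f;    // accumulated heading change since activation

    void activate(int dir) { active = true; direction = dir; headingDeg = 0.0f; }

    // Call at a fixed rate with the IMU yaw rate; returns true when cancelled.
    bool update(float yawRateDegPerSec, float dtSec) {
        if (!active) return false;
        headingDeg += yawRateDegPerSec * dtSec;
        if (direction * headingDeg >= 90.0f) {   // turn completed
            active = false;
            return true;
        }
        return false;
    }
};

int main() {
    TurnSignal sig;
    sig.activate(+1);                            // rider presses "left"
    for (int i = 0; i < 200; i++) {              // simulate a 50 deg/s left turn at 50 Hz
        if (sig.update(50.0f, 0.02f)) { std::printf("signal cancelled at step %d\n", i); break; }
    }
}
```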
||||||
6 | RFA: PTM Dome for Digitally Preserving Our Past |
Austin Meissner Philip Xie-O'Malley Stephanie Leigh |
Jason Jung | Arne Fliflet | design_document1.pdf other1.pdf proposal2.pdf proposal3.pdf proposal1.pdf |
|
# Project: Polynomial Texture Mapping Dome for Digitally Preserving Our Past # Team - Stephanie Leigh (sleig2) - Austin Meissner (alm13) - Philip Xie-O'Malley (qy10) # Problem While museums are a great way of displaying artifacts to visitors, there are geographic limitations that restrict a broader audience from being able to view or study these artifacts. There are many advantages to involving researchers from diverse backgrounds across the world because they can contribute a wide variety of perspectives due to their unique experiences. A way to effectively share these artifacts with scholars worldwide is to create high-quality digital models of the artifacts using a method called Polynomial Texture Mapping (PTM). The Spurlock Museum hopes to employ this method to digitally preserve its large collection of artifacts and gain insight from scholars outside of the Champaign area. My partners and I plan to redesign and upgrade the existing, non-functional PTM Dome. The Spurlock Museum relies heavily on the current "dome" setup, which includes 32 LEDs sequenced with a high-quality camera's shutter. The output from the camera is 32 pictures - each corresponding to a particular LED and angle of light- and these pictures are stitched together to create a 3D digital model of the artifact. Currently, the dome setup is not functional, and a new solution is required to meet the robustness, functionality, and modularity desires of the Spurlock Museum. # Solution Our solution will be centered around the use of a PCB, a control box, and a complete redesign of the current wiring system. The PCB will contain a power distribution system, a microcontroller, connections to LED drivers, and a connection to the Canon EOS 1 camera. The PCB will work with the control box to send proper high and low signals to the LEDs # Solution Components ## PCB The PCB will be designed for many applications in our project including power distribution and regulation and interfacing with the microcontroller, LED drivers, and camera. We plan to utilize a 12V power supply (L6R24-120) rated for 2A to feed into the board through a soldered connector, as well as implement a linear voltage regulator circuit to step down the voltage to 5V, a value within the microcontroller’s required operating voltage range. For the microcontroller, we plan to utilize the ATMega32u4 because of its high-speed capabilities, large i/o count, and compatibility with a USB connector. The microcontroller will interface with the control box buttons and receive command signals from the buttons, and the code will send output signals to the LEDs according to which commands the microcontroller receives. ## Control box The control box is the user interface, where the user can input signals for a sequenced flash of LEDs or individually controlled LED flashes. For individual LEDs, we plan to set up a two-digit LCD screen where each digit is manipulated by pressing a set of increment and decrement triangular buttons. Then we will use an activation button to trigger the 32 LEDs in order and another to trigger the corresponding individual LED indicated by the number displayed on an LCD screen. 2*LCD: YSD-160AR4B-8 Green & Red Buttons: TSL12121 PCB Enclosure: DC-57P ## Wiring System The wiring system is the physical wires that will take the power pulses from the LED drivers in the PCB to the actual LEDs to turn them on. 
We plan to completely redesign the existing wiring system to include labels for the LEDs and the wires that correspond to the different LED numbers, while also choosing a wire size that can handle the power pulses that need to be carried to the LEDs. At this point, we are thinking about using standard 14 or 16 AWG wire to run from the PCB to the LEDs, and then we will use standard 14-16 AWG butt splices to connect the wires to the LEDs to deliver the power. # Criterion For Success The Spurlock Museum hopes that my partners and I will accomplish the following goals: - Our team shall redesign and rewire the system so that the 32 lights are sequenced properly with the camera shutter. - The system shall function with a failure rate of less than 10%; i.e., it shall properly capture the 32 photos with proper LED sequencing at least 90% of the time. - The system shall fit in the given dome's space limitations. - Our team shall create a software program and a control box for the user interface. - The system shall be functional, capturing 32 distinct images, each corresponding to the lighting of a specific LED. Ultimately, we consider success as being aligned with the criteria above and, most importantly, being able to hand over a working product to allow the museum to go on with its digital artifact preservation work. |
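A rough Arduino-style sketch of the sequenced-capture behavior on the ATmega32u4: step through the 32 LEDs, firing the shutter once per LED. The shutter pin, delays, and the `setLed` placeholder stand in for the actual LED driver channels and the Canon remote-release wiring, which are not finalized here.

```cpp
#include <Arduino.h>

const uint8_t NUM_LEDS = 32;
const uint8_t SHUTTER_PIN = 10;           // assumed opto-isolated shutter-release output
const unsigned long EXPOSURE_MS = 500;    // hold time around the exposure (placeholder)

// Placeholder: in the real design this would drive the LED driver channel for
// `index` (e.g., over SPI or dedicated outputs from the PCB).
void setLed(uint8_t index, bool on) {
  (void)index; (void)on;
}

// For each LED: turn it on, trigger the shutter, wait out the exposure, turn it off.
void captureSequence() {
  for (uint8_t i = 0; i < NUM_LEDS; i++) {
    setLed(i, true);
    delay(50);                            // let the LED and supply settle
    digitalWrite(SHUTTER_PIN, HIGH);      // fire the camera
    delay(EXPOSURE_MS);
    digitalWrite(SHUTTER_PIN, LOW);
    setLed(i, false);
    delay(100);                           // camera write/recycle time
  }
}

void setup() {
  pinMode(SHUTTER_PIN, OUTPUT);
  digitalWrite(SHUTTER_PIN, LOW);
}

void loop() {
  // captureSequence() would be started from the control-box "sequence" button.
}
```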
||||||
7 | STORM RFA |
Abhee Jani Dev Patel Vikram Battalapalli |
Angquan Yu | Arne Fliflet | design_document1.pdf proposal1.pdf |
|
# STORM: Sprint Training Optimization and Real-time Monitoring **Team Members:** - Abhee Jani (abheej2) - Trivikram Battalapalli (tb17) - Dev Patel (devdp2) # Problem During most sprint training and high intensity cardiovascular activities, we see a lack of proper monitoring for biomechanical metrics including heart rate, VO2max, ground contact time, and stride cadence. Current solutions, including force-detecting treadmills and coaches, are not only very costly but also inaccessible to the average athlete trying to better their performance. In addition, these solutions are not all-inclusive and omit more specific data such as thigh angular velocity which is one of the most impactful metrics on sprint speed. Other solutions, such as fitness wearables, can only track average speeds over long distances but not over short sprints such as 100m. There is a need for a system that’s not only affordable but can help the user optimize their training in real-time. # Solution Our solution is a multi-sensor monitoring system that tracks various biomechanical metrics and interfaces with a mobile app to provide the user with analysis of their sprinting form. The first component is a chest strap sensor that tracks heart rate, VO2max, overall speed, and steps. It also uses haptic feedback to notify the user to stay in their desired heart rate zones during training. The second component is a knee strap sensor that tracks leg movement and ground contact time to evaluate the user’s sprint form and mechanics. The data from both sensors is wirelessly transmitted to our mobile app using Bluetooth, where the user can visualize their performance metrics and track their progress over time. This integrated solution will give the athletes actionable insights that will enhance their training regimen, sprint technique, and cardiovascular performance. # Solution Components ## Subsystem 1: Chest Strap Monitoring System **Function:** Tracks heart rate, VO2 max, tilt, speed, and steps; provides haptic feedback **Components:** - **Heart Rate Sensor:** Maxim Integrated MAX30102 - **Microcontroller:** STM32H7 for data processing and wireless communication - **Haptic Feedback:** Precision Microdrives 306-109 for vibration notifications. - **Bluetooth Module:** HC05 Bluetooth Module - **Inertial Measurement Unit:** STM iNEMO LSM6D032X for pedometer, gyroscope, and accelerometer to calculate steps, tilt, and speed - **Flash Memory:** W25Q32JVSSIQ TR to log data collected from the sensors - **Rechargeable battery:** 3.7V 500mAh Li-ion Rechargeable Battery - **Battery Power Regulator:** MCP1702 3.3V Linear Regulator ## Subsystem 2: Knee Sensor for Sprint Analysis **Function:** Monitors leg movement and ground contact time. **Components:** - **Microcontroller:** STM32H7 for data processing and wireless communication. - **Bluetooth Module:** HC05 Bluetooth Module - **Inertial Measurement Unit:** STM iNEMO LSM6D032X for pedometer, gyroscope, and accelerometer to calculate steps, tilt, and speed - **Flash Memory:** W25Q32JVSSIQ TR to log data collected from the sensors - **Rechargeable battery:** 3.7V 500mAh Li-ion Rechargeable Battery - **Battery Power Regulator:** MCP1702 3.3V Linear Regulator ## Subsystem 3: Mobile Application **Function:** Retrieve sprint related data from subsystem 1’s bluetooth sensor transfer this data to the cloud and develop a frontend analysis for sprint metrics. 
**Components:** - **React Native Framework:** For developing the mobile application with cross-platform compatibility - **AWS Cloud Services:** For securely storing and processing data in the cloud - **Kinesis:** Data streaming and analytics from the mobile app to the cloud - **Lambda:** For data processing after it's been streamed with Kinesis - **S3:** Storing the sensor data of user metrics - **DocumentDB:** Database management system in a NoSQL format - **REST API:** For data transfer between the mobile app and cloud services # Criterion For Success ### Data Accuracy and Reliability: - **High-Accuracy Biomechanical Metrics:** Track ground contact time, stride cadence, and thigh angular velocity with a margin of error within 10% compared to high-speed video analysis or industry-standard equipment - **Precision in Cardiovascular Monitoring:** Maintain heart rate zone tracking with a 95% confidence level across different levels of exertion - **System Reliability and Durability:** Ensure sensors are resilient to sweat, impact, and environmental conditions typical in high-intensity training ### Software and User Experience: - **Real-Time Feedback and Responsiveness:** Ensure the chest strap's haptic feedback system responds within 200 milliseconds to changes in heart rate zones - **User-Friendly Data Visualization:** Provide an intuitive UI with features like color-coded performance indicators and trend graphs easily interpretable by athletes without a technical background - **Seamless Data Integration and Cloud Connectivity:** Complete cloud processing and retrieval within 5 seconds for a full training session's data |
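A small sketch of the heart-rate-zone check that would drive the chest strap's haptic feedback: classify a BPM reading against the user's target zone and buzz when out of zone. The zone boundaries and maximum heart rate are illustrative values; the real thresholds would come from the mobile app's user profile.

```cpp
#include <cstdio>

enum class Zone { Below, InTarget, Above };

// Classify a heart-rate reading against the target training zone.
Zone classify(int bpm, int targetLowBpm, int targetHighBpm) {
    if (bpm < targetLowBpm)  return Zone::Below;
    if (bpm > targetHighBpm) return Zone::Above;
    return Zone::InTarget;
}

int main() {
    int maxHr = 195;                               // hypothetical athlete
    int lo = static_cast<int>(0.80 * maxHr);       // example sprint-interval target zone
    int hi = static_cast<int>(0.90 * maxHr);
    int reading = 162;                             // example MAX30102-derived BPM
    Zone z = classify(reading, lo, hi);
    bool buzz = (z != Zone::InTarget);             // haptic pulse when out of zone
    std::printf("in target=%d buzz=%d (target %d-%d bpm)\n",
                z == Zone::InTarget, buzz, lo, hi);
}
```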
||||||
8 | Budget Odor Detector |
David Lacayo Jeffrey Wong John Yan |
Chentai (Seven) Yuan | Kejie Fang | design_document2.pdf proposal4.pdf |
|
# Odor Detector Team Members: - Jeffrey Wong (jwong19) - David Lacayo (dlacayo2) - John Yan (johnyan2) # Problem Roughly 20% of the general population has a bad sense of smell ([Link](https://www.ncbi.nlm.nih.gov/books/NBK567741/)). This makes it hard to pick up odors, which may indicate a larger issue like a leak that would cause damage to the house, or potentially put the owner of the home in danger. An odor detector will compensate for this issue, but they are expensive on the market, going upwards of $200. Additionally it is common to have to buy more than 1 device to sense more than 1 gas at a time as well, increasing the cost even more. ([Link](https://www.google.com/search?sca_esv=a2b652855d7b65ae&sca_upv=1&rlz=1C1ONGR_enUS1085US1085&q=bathroom+odor+detector&tbm=shop&source=lnms&fbs=AEQNm0Dwc9VijzN-JW-4YRo_w_BUQbKUrL1mkR3HAGvFJsTEU2yTeL61j6uh8rPucZu_asD8QQkcJ2ZCfmQyRJdbMVxfwLj5E_IXO-CVkXuakqtw-13zeocNWRU3YlReVPUwaBFs7zE1oDuVzgTEJzPqBr0ACU6aqrZtnZnlm3-LAMzrU0cFrQcLXC9jCL5Okk4OO1PSKoZMcVW2fOwruwAjjqCVDGXpSQ&ved=1t:200715&ictx=111&biw=958&bih=944&dpr=1)). # Solution Our solution is an innovation– we will make a budget odor detector with sensors that detect methane, H2S, NH3, and CO. We will have an LCD screen to show each gas’s ppm level. Also, if the readings cross a dangerous threshold, the screen will display a warning notification as well as output a warning alarm to alert the user. # Solution Components ## Microcontroller We will use the STM32xx as our main microcontroller to process data sent from the sensors, and to interface that data with the LCD screen to display that information. ## Sensors NH3: ([Link](https://www.digikey.com/en/products/detail/sparkfun-electronics/SEN-17053/13252162)) H2S: ([Link](https://www.digikey.com/en/products/detail/sparkfun-electronics/SEN-17052/13252248)) Methane: ([Link](https://www.digikey.com/en/products/detail/sparkfun-electronics/SEN-09404/6161754)) CO: ([Link](https://www.digikey.com/en/products/detail/sparkfun-electronics/SEN-09403/6163653)) ## LCD Screen, LEDs We will have an LED that illuminates when the device has low battery, and an LED that illuminates when the alarm is going off. We will have an LCD screen that displays the ppm levels of NH3, H2S, Methane, and CO in the room that the device is in. If the sensors detect a dangerous level of gas in the room, the LCD will automatically turn on and display a warning message to indicate this crossed threshold. ([Link](https://www.digikey.com/en/products/detail/newhaven-display-intl/NHD-0420CW-AB3/5022951?utm_adgroup=&utm_source=google&utm_medium=cpc&utm_campaign=PMax%20Shopping_Product_Medium%20ROAS%20Categories&utm_term=&utm_content=&utm_id=go_cmp-20223376311_adg-_ad-__dev-c_ext-_prd-5022951_sig-CjwKCAjwreW2BhBhEiwAavLwfG0TbDch5eYRKwQTaY1O_lBKy-WOhhoPX9u6xC7CX5-OeFdmQ0rW8RoCHpcQAvD_BwE&gad_source=4&gclid=CjwKCAjwreW2BhBhEiwAavLwfG0TbDch5eYRKwQTaY1O_lBKy-WOhhoPX9u6xC7CX5-OeFdmQ0rW8RoCHpcQAvD_BwE)) ## Alarm The device will have an alarm that goes off if the gas sensors’ ppm levels exceed their thresholds. This is to alert the owner of the device of potential danger stemming from these gasses. ([Link](https://www.digikey.com/en/products/detail/pui-audio-inc/AI-1223-TWT-3V-2-R/5011391)) ## Power supply The device will be powered by a 9V battery. This will ensure that enough power is delivered to each of our sensors, our STM chip, & our LCD screen. 
([Link](https://www.digikey.com/en/products/detail/duracell-industrial-operations-inc/9V/21259959)) ## Power Switch / Monitor Power Switch There will be a switch to turn the whole device on/off, as well as one for turning the LCD monitor on/off to save power consumption. ## Device Enclosure All of the previous components will be contained inside of an enclosure (likely 3D printed) to protect hardware components. # Criterion For Success ## The detector will need to do the following actions: * Detect exceeding thresholds of the following gasses in compliance to OSHA or default alarm settings ([Link](https://www.indsci.com/en/blog/understanding-gas-detector-default-alarm-settings) for all): * NH3 (Ammonia) when it exceeds 25 ppm ([Link](https://ctigas.com/ammonia-gas-detection/#:~:text=Ammonia%20detectors%20located%20in%20refrigerated,levels%20is%200%2D100%20ppm.) [Link2](https://nj.gov/health/eoh/rtkweb/documents/fs/0084.pdf)). Tested by opening a container of household ammonia. * H2S (Hydrogen Sulfide) when it exceeds 20 ppm ([Link](https://www.osha.gov/hydrogen-sulfide/hazards)). Tested by opening a container of a rotten egg/food. * CH4 (Methane) when it exceeds 1000 ppm ([Link](https://www1.agric.gov.ab.ca/$department/deptdocs.nsf/all/agdex9038/$file/729-2.pdf?OpenElement=#:~:text=The%20Occupational%20Safety%20and%20Health,1%2C000%20ppm%20(0.1%20percent).)). Tested by opening a container of natural gas. * CO (Carbon Monoxide) when it exceeds 50 ppm ([Link](https://www.osha.gov/sites/default/files/publications/carbonmonoxide-factsheet.pdf)). Tested by opening a container of gas from burnt substances/exhaust pipes. * Display the ppm of the gasses on the LCD display. * Upon exceeding the sensor thresholds, display a warning message on the LCD display. * Upon exceeding the sensor thresholds, it sounds a warning alarm. * Display LEDs showing low battery and when the warning alarm triggers. |
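A minimal sketch of the threshold logic behind the warning message and alarm, using the ppm limits cited in the criteria above (NH3 25, H2S 20, CH4 1000, CO 50). Converting each sensor's raw analog reading into ppm is a separate calibration step and is assumed to happen before this check.

```cpp
#include <cstdio>

// One entry per monitored gas: current reading and its alarm threshold in ppm.
struct GasReading { const char* name; float ppm; float limitPpm; };

// Returns true if any gas exceeds its limit; in the device this would also
// drive the buzzer, the alarm LED, and the LCD warning message.
bool checkAlarms(const GasReading* readings, int n) {
    bool alarm = false;
    for (int i = 0; i < n; i++) {
        if (readings[i].ppm > readings[i].limitPpm) {
            std::printf("WARNING: %s at %.0f ppm exceeds %.0f ppm\n",
                        readings[i].name, readings[i].ppm, readings[i].limitPpm);
            alarm = true;
        }
    }
    return alarm;
}

int main() {
    GasReading r[] = { {"NH3", 3, 25}, {"H2S", 1, 20}, {"CH4", 1200, 1000}, {"CO", 12, 50} };
    if (checkAlarms(r, 4)) std::printf("alarm sounding\n");
}
```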
||||||
9 | Laser/Voice Assisted Cat Toy |
Paul Jablonski Rahul Grover Yutong Gan |
Rui Gong | Kejie Fang | design_document1.pdf proposal1.pdf |
|
# Laser/Voice Assisted Cat Toy Team Members: - Paul Jablonski (pjj3) - Yutong Gan (yutongg9) - Rahul Grover (rgrover4) # Problem Modern cat toys have some systems for automatically moving around, but rarely use any sophisticated sensors. This is commonly seen in commercial toys like balls that roll around in random patterns. However, these widespread and commercial systems could use some serious improvements as problems exist in longevity, noise generation, and a lack of interaction for the cats. These toys typically bang into walls without any preventative systems in place, causing damage to both the toy and potentially the pet owner's home - on top of being terribly loud. This is significant as owners may need to replace their cat's toys far more than desired. Furthermore, the constant speed and random directional movement of the toy detract from a cat's play experience. With the toys moving at a fixed rate and without much excitement, cats may often stare or fear these toys rather than chase them down as they would a live animal. Given the importance of engaging play for a cat's health, owners are burdened by the current market's lackluster and rudimentary options. # Solution We propose that these problems be resolved through a mouse-like toy, which has been seen before, but is now refined with multiple more advanced systems. The sensors will uniquely consist of a distance measuring laser, used for scanning ahead of the toy and triggering stopping or turning events, and a vibrational sensor that can change behavior based on the cat's interactions. The non-rolling shape of a mouse will also allow for more rigid and controllable movement in addition to stabilizing the sensors that will enable reactivity to its environment and less noisy behavior. Furthermore, a moving tail will be used to mimic the more excitatory behaviors of prey like mice and rats, making it more engaging than a typical toy's static tail. This would be accompanied by faster motorized movements and more realistic movement states in comparison to the industry standard for automated cat toys - as will be regulated by our microcontroller. Our solution will thus consist of several subsystems, which will be contained in a compact and light body such that the toy may move more freely and with more rapid control. They will additionally be powered by re-usable lithium ion batteries for convenience and ease of use. These systems may be more cleanly and broadly divided as such... 1. **Laser Range Sensor** - Time-of-flight module that will accurately measure distance up to a couple meters in order to avoid collisions and change movement. 2. **Vibration Sensor** - Picks up on changes in frequency, will trigger a "caught" state to change the toy's movement behavior. 3. **Separately Motorized Wheels** - Allow for turning and precise movement control, will be accompanied by a caster wheel for stabilizing the body and avoiding scratched floors. 4. **Motorized Tail** - A rapidly moveable tail that will trigger variably during different movement states in order to excite cats more. 5. **Arduino Nano Microcontroller** - Will receive inputs from sensors in order to output the appropriate movement states and power. 6. **Lithium Ion Batteries** - Will supply voltage to the microcontroller and the motors powering the toy's movement. 7. **Custom PCB** - Will integrate all components onto a single board and allow for easier power distribution and part mounting. 
# Solution Components ## Laser Range Sensor The laser range sensor is our most unique component, which enables the toy to detect obstacles and measure distances ahead of the body. It will actively scan the environment ahead of the toy to prevent collisions, adjust speed, or change direction as necessary. The output signals from the sensor will thus be input signals to the Arduino microcontroller and its behavior-adjusting program. Based on how close the nearest obstacle is, the state of the microcontroller will change and the movement of the wheels will also change to either turn or stop. Components: * Laser Range Sensor (VL53L1X) ## Vibration Sensor The vibration sensor adds an interactive element by detecting when the toy is being touched or played with by the cat. When the sensor picks up sharp vibrations that are presumably not due to turning, the toy can change its behavior. This includes entering a "caught" state, slowing down, or performing evasive maneuvers. This sensor generally increases the engagement factor for the cat by making the toy's movements more responsive to physical interaction. In commercial cat toys, the vibration sensors are practically solely used for turning the toy on after it has shut off from a lack of engagement. Our design thus provides additional functionality that contributes to reactivity during play, rather than simply using vibrations as a "power button". Components: * Vibration Sensor (SW-420) ## Separately Motorized Wheels Two separately motorized wheels placed on the back of the body will allow the toy to move with precise control, enabling it to turn, speed up, slow down, and change direction based on sensor input. A caster ball wheel will be included to stabilize the toy in the front and prevent damage to floors. The motorized wheels will also need to be controlled by an H-bridge motor driver in order to properly and independently divert power according to the current state of the toy. Thus, the Arduino microcontroller will have to output PWM signals to the motor driver based on the input sensors and predefined movement states. Components: * Two DC Motors (N20 Micro Metal Gear Motor) * Two DC-Fitting Wheels (Pololu 32mm Plastic Wheels) * H-Bridge Motor Driver (L298N) * Caster Wheel (Pololu 67mm Ball Caster with Plastic Ball) ## Motorized Tail The motorized tail is designed to mimic the unpredictable movements of a mouse's tail, enhancing the realism and engagement of the toy. The tail will be triggered by the toy's movement states or cat interaction, and can be used to entice the cat to chase the toy. A small servo motor will control the tail's motion, allowing for more precise movements than what is seen in a DC motor. Meanwhile, an attachable feather or string will be connected to the servo through a custom printed mounting bracket in order to extend the tail more visibly. Components: * Servo Motor (SG90 Micro Servo Motor) * Custom Printed Mounting Bracket and Feather ## Arduino Nano Microcontroller The Arduino microcontroller is going to intake all inputs from the laser sensor and the vibrational sensor and will be powered by the lithium ion batteries. It will then be programmed to form a state machine with these inputs that consists of multiple different movement types, a "caught" state, and a "rest" state. Based on the current state, the Arduino will then output different signals to the servo-powered tail and DC-powered wheels in order to generate different movements. 
The movement types will reflect whether the toy should move ahead in a quickly accelerating manner or whether it should stop and turn in order to avoid collisions. The latter state should interrupt the former if signals from the laser sensor indicate a wall is oncoming. Furthermore, different acceleration states may be included to vary the mouse's movement for the cat; these include dashing back and forth or simply egging on the cat before moving as quickly as possible. Each of these states will move the tail in unique ways as well to engage the cat further. If at any point the Arduino receives signals from the vibrational sensor that are far sharper than expected, it may initiate the "caught" state, where the toy acts dead momentarily before re-engaging with the cat. This healthily mimics real prey behavior, something not seen in other modern cat toys. A "rest" state will also be included if no significant vibration is detected for a while, to conserve power. Components: * Arduino Nano (ATmega328P) ## Lithium Ion Batteries The lithium-ion batteries provide the necessary power for all the toy's components, including the motors, sensors, and microcontroller. These batteries are chosen for their high energy density, which allows for a power source that can sustain the toy's functionality for a longer time. The batteries will be rechargeable to ensure convenience and reduce long-term costs. A battery management system will be connected to the battery pack in order to allow for recharging and to prevent issues such as overcharging the batteries. The state of charging will be displayed on the toy by connecting an LED to this battery management system's status pins. Additionally, the power button for the entire cat toy will be connected to the positive and negative lines from the battery pack in order to turn the toy on and off. Components: * Battery Pack (7.4V Lithium-Ion Battery Pack) * Battery Management System (USB-C TP4056 Module with Protection) * Mountable Charging Indicator LED (Cree CLV1A-FKB-CW) * Power Button (ADA1479) 
This is partially testable by verifying that the toy avoids colliding with walls, but also through direct comparison to another actual toy. Sound can be directly recorded from each toy on the same recording device to compare volume levels. 3. **Interactivity** - The toy must include multiple movement styles, which must vary significantly from the constant speeds seen in commercial toys. Furthermore, the tail must be capable of moving during these movement states. This can be tested by stepping through each of the microcontroller states and observing the wheel speeds changing variably, or by simply observing the toy's behavior in comparison to a standard cat toy. It may also be tested by presenting the toy to a cat and measuring time of engagement compared to another automated toy. 4. **Ease of Use For Owners** - The toy must be overall more convenient for the owner, meaning that maintenance should be minimal and parallel goals such as noise reduction are fulfilled. This is directly testable by ensuring that the charging functionality of the device works, via the indicator LED or a direct measurement. The power button functionality should also be tested to make sure the device does not function while powered off. |
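A condensed sketch of the microcontroller state machine described in the Arduino Nano section: laser range, vibration, and idle time select between chase, turn, "caught", and rest behaviors. The distance threshold, idle timeout, and reduced state set are simplified placeholders for the fuller behavior described above.

```cpp
#include <cstdio>

enum class State { Chase, AvoidTurn, Caught, Rest };

// Pick the next movement state from the sensor inputs. Thresholds are
// illustrative: 200 mm wall distance, 2-minute idle timeout.
State nextState(float rangeMm, bool sharpVibration, unsigned long idleMs) {
    if (sharpVibration)     return State::Caught;     // cat grabbed the toy: play dead briefly
    if (idleMs > 120000UL)  return State::Rest;       // no interaction for a while: save power
    if (rangeMm < 200.0f)   return State::AvoidTurn;  // wall ahead: stop and turn away
    return State::Chase;                              // default: keep moving / vary speed
}

int main() {
    State s = nextState(150.0f, false, 5000);          // example VL53L1X reading of 150 mm
    std::printf("state=%d (1 = AvoidTurn)\n", static_cast<int>(s));
}
```

Each state would then map to a set of PWM outputs for the two drive motors and a motion pattern for the tail servo.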
||||||
10 | RunCompanion |
Advaith Yeluru Arnav Jaiswal Rohan Gulur |
Chentai (Seven) Yuan | Kejie Fang | design_document1.pdf design_document2.pdf other1.pdf proposal3.pdf proposal2.pdf |
|
A wearable running companion that provides real-time, personalized feedback using advanced sensors to optimize your workout, improve form, and ensure consistent performance. | ||||||
11 | Early Response Drone for First Responders |
Aditya Patel Kevin Gerard Lohit Muralidharan |
Manvi Jha | Arne Fliflet | design_document1.pdf design_document2.pdf other1.pdf proposal2.pdf proposal1.pdf |
|
**Problem:** Every week, UIUC students receive emails from the Illini-Alert system regarding crimes that are committed, fires that are occurring, and other dangerous situations to be aware of. With the latest reported median response time of first responders to a 911 call being over 6 minutes in Champaign County ([source](https://dph.illinois.gov/topics-services/emergency-preparedness-response/ems/prehospital-data-program/emsresponsetimes.html)), the situation to which emergency personnel are responding can drastically change from the initial details that were provided. To best be able to manage the event, first responders need as much accurate information as they can possibly receive so that the situation can be handled in a timely manner and the safety of everyone involved is prioritized. **Solution Overview:** Our solution is to construct a cost-effective drone that first responders can deploy and immediately fly to the location of an emergency event. While en route, they can use the drone's onboard camera and computer vision capabilities to assess the situation at hand. There are multiple scenarios in which this drone could be particularly beneficial, such as: - Police: monitor crime scenes and track suspicious individuals; provide aerial surveillance for events with a high density of people (such as sports games, concerts, or protests) to ensure the safety of everyone - Fire: monitor the spread of fire at the location; obtain information on what kind of fire it is (electrical, chemical) and any potential hazards - Medical: assess the type and number of injuries suffered, and locations of patients Our drone system comprises four elements: cloud storage, a backend, a frontend, and the drone itself. The high-level block diagram linked below illustrates which elements communicate with which, with data transfer shown by the arrows. [[Link](https://drive.google.com/file/d/12qx_syQQH0pHcrh7uVouneDARXH_6Dbi/view?usp=sharing)] In order to create a baseline early response drone, we need to be able to control the drone as well as receive information from it such as captured frames, altitude, roll, pitch, and yaw. The captured frames and data will be visually displayed in the frontend. However, this data bundle will first be stashed in cloud storage, and when the backend is ready to receive the data, it will retrieve it. The reason we have a backend is that, time permitting, we want to perform machine learning processing using object tracking and detection models. The other data transmission that occurs is the sending of command signals from the frontend to the drone itself; whenever there is a keyboard click, the key press is reflected visually and uploaded to the cloud storage. **Solution Components:** 1. **Drone Hardware/Software:** Utilizes an ESP32 with a SIM7600 for data transmission. Retrieve roll, pitch, and yaw using the MPU6050 IMU sensor and altitude (using pressure) with the BMP280. Utilize servos to control the flaps, rudder, and ailerons, and a brushless motor + ESC for single-rotor control. 2. **Drone Structure:** We will be utilizing foam board rather than LW-PLA or PLA in general because it is easier to repair. A larger wingspan will be used for easier control of the drone. 3. **Cloud Storage:** The cloud storage will act as a medium between the drone itself and the C++ backend. EXTRA: We are trying to completely eliminate the use of cloud storage. 
There appears to be a way of using either TCP or a higher-level protocol like HTTP requests, according to YouTube tutorials, Arduino forums, and ChatGPT-4o. 4. **C++ Backend:** Uses HTTP requests to retrieve drone data from the cloud storage and WebSockets to send it to the TypeScript frontend; WebSockets are also used to receive command signals. EXTRA: Run the frames through a DeepSORT model for tracking humans, using either a pre-trained YOLO model or one we train ourselves (the training set would be generated using the drone itself). 5. **TypeScript Frontend:** Uses WebSockets to send command signals to, and retrieve drone data from, the C++ backend, and visually displays a command console for the user. **Criterion for Success:** - **Stability and Flight Controls:** Smooth operation of the drone while in flight at varying altitudes, and non-jerky response to user-controlled inputs - **Sophisticated UI:** Easy-to-use and proportional web-based user interface for viewing camera frames, sensor data, and controlling the drone’s movements - **Frame Transmission:** Ability to transmit frames back and forth to the database, which then connects to the C++ backend using a cellular connection - **Computer Vision:** Time permitting, ability to detect and track objects (people) from a high-up, aerial view based on a self-trained ML model Additionally, for testing and demonstration purposes, we plan to review the university guidelines and restrictions on drone flight. We will then find a suitable location, such as an open field or quadrangle, for launching and landing our drone. For permission, we will need to register the drone with the FAA and the university, and each of our group members will need to take a short test to obtain a drone license. |
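The attitude and altitude values named in the drone hardware item can be derived with standard formulas. Below is a minimal, illustrative C++ sketch (not the team's firmware) assuming accelerometer readings already converted to units of g and BMP280 pressure in pascals; the sample values and sea-level reference are placeholders.

```cpp
#include <cmath>
#include <cstdio>

constexpr double kPi = 3.14159265358979323846;

// Roll/pitch from raw accelerometer readings (in g). This is only valid when
// the airframe is not accelerating hard; yaw needs gyro/magnetometer fusion.
void rollPitchFromAccel(double ax, double ay, double az,
                        double &rollDeg, double &pitchDeg) {
    rollDeg  = std::atan2(ay, az) * 180.0 / kPi;
    pitchDeg = std::atan2(-ax, std::sqrt(ay * ay + az * az)) * 180.0 / kPi;
}

// International barometric formula: altitude (m) from BMP280 pressure (Pa)
// relative to a sea-level reference pressure.
double altitudeFromPressure(double pressurePa, double seaLevelPa = 101325.0) {
    return 44330.0 * (1.0 - std::pow(pressurePa / seaLevelPa, 1.0 / 5.255));
}

int main() {
    double roll = 0.0, pitch = 0.0;
    rollPitchFromAccel(0.02, 0.10, 0.99, roll, pitch);   // example sample, in g
    std::printf("roll %.1f deg, pitch %.1f deg\n", roll, pitch);
    std::printf("altitude %.1f m\n", altitudeFromPressure(98500.0));
    return 0;
}
```

On the actual ESP32 this math would run on each sensor sample before the telemetry bundle is uploaded.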
||||||
12 | Color Detecting Automatic Paint Dispenser |
Alexander Kaplich Lucky Konatham Rajeev Bommana |
Chi Zhang | Arne Fliflet | design_document1.pdf design_document2.pdf other1.pdf proposal1.pdf |
|
Team : - Rajeev Bommana (bommana2) - Alexander Kaplich (kaplich2) - Lucky Konatham (lrk4) Problem : - Whenever a painter starts a new project they always begin by mixing their desired colors on their palette using some combination of red, blue, yellow and white. However, the painter will inevitably run out of paint, and then will need to mix the exact color that they had before. Speaking from experience this part of the process is very frustrating and time consuming, especially for artists that are bad at mixing colors. Rather than wasting time learning color theory, or buying the color you were using straight from the tube, we will save time and money by designing a machine that can determine the pigments required to mix any color using RGB sensors, and it will also mix it for you using a combination of primary colors so that you don't have to. Solution : - The idea is pretty simple, the user of the device will "scan" the desired color by using a color sensor that detects the RGB values of a surface using red, green, blue and 'clear' photodiodes. The device will send the RGB values of the color to the onboard mcu which will do some simple calculations to convert the RGB value of the color to CMYK format using conversion formulas. This is the same principle behind color printers which create color images by mixing cyan, magenta, yellow, and black. The mcu will then communicate with 5 stepper drivers which are wired to 5 Stepper motors that will dispense the appropriate amount of white, cyan, magenta, yellow and black paint into a cup. The components will be powered by a non-rechargeable battery bank and the final result should be a paint cup with the color that was scanned before. Ideally the person using this tool never needs to actually do any mixing, they can just scan a color and apply it directly on the canvas/work surface. **Subsystem 1 (Color Classifier) :** - HiLetgo TCS-34725 TCS34725 RGB Light Color - Sensor Colour Recognition Module RGB Color - Sensor with IR filter and White LED for Arduino - STM32 Series MCU - 4 x WWZMDiB A4988 Stepper Motor Drive - Custom PCB for Microcontroller management - Standard button - 12 volt power supply (L6R24-120) The Color Classifier subsystem will have all the logic for the design. It will have a pcb that contains input from a color sensor and a microcontroller to process the input from the color sensor. This will all happen when a button is pressed and a high signal is sent. The motor drivers are also placed on the PCB to interface with the microcontroller to receive the processed signals once the color is processed. We will use a 12 volt power supply in order to power this system. A voltage regulator circuit will be used in order to get the voltage to the 3 volts needed for the microcontroller. **Subsystem 2 (Paint Dispenser) :** - 4 x Low flow peristaltic pump 12V dc Kamoer NKP - 4 x Nema 17 Stepper Motor Bipolar 2A 59Ncm(84oz.in) 48mm Body 4-lead - Custom 3d printed casing for holding pumps - Printing at idea lab / material to be used: PLA - Wires for connections - 3d printed reservoirs for paint - 12 volt power supply (same one as Color Classifier subsystem) - Silicon tubing for peristaltic pump The Paint Dispenser will receive signals from color classifiers in order to pump the correct materials. The motors will drive the peristaltic pumps to pump the paint from the paint reservoirs into a central location centered around the 3d printed pump holder. 
The artist can put their palette in this area and mix around their colors once every paint needed for that color is dispensed. |
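The RGB-to-CMYK conversion the proposal relies on is a standard formula; the sketch below shows one way the microcontroller could compute the mix fractions from a TCS34725 reading. Scaling the fractions to stepper/pump steps is left out, and the sample reading is illustrative.

```cpp
#include <algorithm>
#include <cstdio>

struct CMYK { double c, m, y, k; };

// Standard RGB (0-255) to CMYK (0-1) conversion; pure black maps to K = 1.
CMYK rgbToCmyk(int r, int g, int b) {
    double rf = r / 255.0, gf = g / 255.0, bf = b / 255.0;
    double k = 1.0 - std::max({rf, gf, bf});
    if (k >= 1.0) return {0.0, 0.0, 0.0, 1.0};          // avoid divide-by-zero
    return {(1.0 - rf - k) / (1.0 - k),
            (1.0 - gf - k) / (1.0 - k),
            (1.0 - bf - k) / (1.0 - k),
            k};
}

int main() {
    CMYK mix = rgbToCmyk(180, 90, 60);                  // example color sensor reading
    // Each fraction could later be scaled to stepper steps per mL of paint.
    std::printf("C %.2f  M %.2f  Y %.2f  K %.2f\n", mix.c, mix.m, mix.y, mix.k);
    return 0;
}
```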
||||||
13 | Autonomous Gardening Rover |
Dhruv Sanagaram Ryan Thammakhoune Tanishq Aryan Myadam |
Sanjana Pingali | Cunjiang Yu | design_document1.pdf proposal2.pdf proposal1.pdf |
|
# Autonomous Gardening Rover Team Members: - dhruvs7 - tmyadam2 - rct4 # Problem Our group would like to focus on gardening and agriculture. Hobbyists and farmers alike often struggle with monitoring soil quality, as it frequently relies on having accurately placed sensors where they intend to grow crops. This approach does not accommodate the varying intervals at which seeds are planted, forcing the sensors to be removed and relocated manually, which can be an arduous process. # Solution Our project is a small autonomous rover that can monitor soil quality. The rover can be operated in two steps. The first involves the user configuring the rover’s autonomous movement through a web application. They can configure the plot size and plotting intervals through the app. The second step sees the rover traversing the plot based on this configuration and creating a soil quality profile that summarizes the pH, humidity, and temperature, amongst other characteristics. This profile will be shown on the web app to inform the user’s treatment of the soil. This solution can be used across home gardens and commercial plots due to its small size and ease of use, which makes it more accessible than existing solutions. # Solution Components ## User Input Subsystem This system will allow users to input the following parameters (assuming the field is a perfect rectangle) using a React application accessible through their computer: field length and width (m), soil monitoring interval (m), and rover starting point (m, m). Our code will use the field to create a movement plan, which will be uploaded to an ESP32 microcontroller through a wired serial connection. The movement plan will be stored on the ESP32 in flash memory, specifically through LittleFS, which is the microcontroller's file system. The rover will execute the movement plan once a button is pressed on the PCB. The movement plan will consist of splitting the field into rows according to the soil monitoring interval. The rover will traverse each row in a snake pattern, turning towards the next row once it reaches the end of a row (a short movement-plan sketch follows this proposal). Components: ESP32 Microcontroller Button ## Autonomous Movement Subsystem Given a predetermined path, the rover would use an ultra-wideband system to determine its precise location. We will set up anchors around the plot and a tag on the rover. Using the time it takes for signals to travel between the anchors and the tag, we can determine the distance between the rover and the anchors, thus giving us its precise location. Using feedback from an IMU, we would then use a PID algorithm to correct any errors in movement that could be caused mechanically or by the bumpy texture of the soil. 3D Printed Chassis Wheels and motor dc-geared-motor-and-wheel-kit-3-9v-77rpm Adafruit 9-DOF Absolute Orientation IMU Fusion Breakout - BNO055 Phoenix America Universal Hub Encoder Kit Qorvo DWM1000 Module ## Soil Monitoring Subsystem Another subsystem of the smart gardening rover will focus on soil monitoring. This subsystem will use a combination of moisture, pH, and temperature sensors to assess soil conditions in real time. The data collected will help inform the user’s decisions on how to treat the soil, which the user can do through soil distribution, watering, or pesticide disbursement. We will embed the sensor into the soil using a linear actuator, which will be activated according to the input interval.
Components: Moisture Sensor: Adafruit STEMMA Soil Sensor - I2C Capacitive Moisture Sensor pH Sensor: Atlas Scientific GRAVITY ANALOG ISOLATOR Temperature Sensor: MCP9808 High Accuracy I2C Temperature Sensor Linear Actuator, Electric Micro Linear Actuator (Stroke 100mm-8mm//s-70N) ## Visual Application Subsystem Using the data collected by the rover, we will show a heatmap of the plot. The heatmap will distinguish areas of concern and areas that are in a healthy state. Data will be sent over USB connection once the rover is done with its movement plan. The data will be accessible through the file system LittleFS. Our algorithm will use the precise location data along with the soil data to create the heatmap. Data received on the React application will be used to generate and show the heatmap. ## Power Subsystem The power subsystem for the smart gardening rover will utilize a rechargeable lithium-ion battery pack that can provide consistent energy to the microcontroller, sensors, motors, and dispensing mechanisms. The battery pack will help ensure that the system lasts for a long period and can be recharged as needed, minimizing the cost and need for frequent battery replacements. Additionally, to protect the components and manage power distribution effectively, we will create a comprehensive BMS system containing a Battery Management IC to monitor the battery’s health, ensuring that it doesn’t discharge or overcurrent too quickly. A voltage regulator and step-down converters will also be needed to help distribute appropriate battery voltage levels for different components, such as sensors and ESP 32 microcontrollers. Additionally, power from these lithium-ion batteries will be stepped down to a specific voltage for the actuators, motors, and servos we plan to implement. Components: Rechargeable Lithium-Ion Battery Pack: 10.8V (11.1V) 3500 mAH 10A Lithium Ion Battery with Wire Leads 3S1P from Liion Wholesale Battery Management IC: TI BQ769X0 Voltage Regulator/Step down: TI MC34063ADR Power Switch: Standard 2N2222 NPN TO-92 Plastic-Encapsulate Power Transistors # Criterion For Success We will place the rover in a dirt field and set the field size to a small rectangular region. Then, we will set our testing interval to a reasonable amount so that the rover will be able to test the soil multiple times per row for multiple rows. The React web application will have a two-fold approach: Control and Configuration: Users can set intervals for soil monitoring and adjust various parameters for the rover’s operation directly from the web interface. Data Monitoring and Analysis: The application will be able to receive data from the rover, allowing users to monitor soil conditions and other key metrics, providing insights and analysis for better decision-making in gardening tasks for the user. |
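As a concrete illustration of the movement plan described in the User Input Subsystem, here is a minimal sketch (an assumed structure, not the actual firmware) that turns the field length, width, and monitoring interval into a snake-pattern list of sample waypoints; the ESP32 would store something equivalent in LittleFS.

```cpp
#include <cstdio>
#include <vector>

struct Waypoint { double x, y; };   // metres from the configured starting corner

// Build a boustrophedon ("snake") movement plan: rows spaced by the soil
// monitoring interval, with a sample stop every interval along each row.
std::vector<Waypoint> buildPlan(double lengthM, double widthM, double intervalM) {
    std::vector<Waypoint> plan;
    bool leftToRight = true;
    for (double y = 0.0; y <= widthM + 1e-9; y += intervalM) {
        if (leftToRight)
            for (double x = 0.0; x <= lengthM + 1e-9; x += intervalM) plan.push_back({x, y});
        else
            for (double x = lengthM; x >= -1e-9; x -= intervalM) plan.push_back({x, y});
        leftToRight = !leftToRight;   // reverse direction for the next row
    }
    return plan;
}

int main() {
    for (const Waypoint &w : buildPlan(4.0, 2.0, 1.0))   // 4 m x 2 m plot, 1 m interval
        std::printf("(%.1f, %.1f)\n", w.x, w.y);
    return 0;
}
```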
||||||
14 | AA/AAA Universal Charge/Discharger |
Aditya Prabhu Jonathan Biel Stan Hackman |
Jason Jung | Arne Fliflet | design_document1.pdf design_document2.pdf proposal1.pdf proposal2.pdf |
|
# Universal Battery Charge/Discharger ## Problem: Batteries are a common and underestimated fire hazard in many homes, especially where a lack of knowledge meets convenience. A partially charged battery in a trash compactor could lead to devastating damage, large costs, and loss of life. ## Solution: A battery discharger that rapidly discharges a battery for safe disposal by using variable paths to maximize current flow within normal battery operating temperatures. The system would also, when directed by the user, charge LA or lithium rechargeable batteries. ## System overview: Our discharger would use variable resistance paths to adjust the level of discharge in order to maximize current for a given temperature (a simple path-selection sketch follows this proposal). Unlike other types of battery dischargers, which seek to extend the life of the battery, the goal of ours would be to rapidly make a battery safe for disposal. The excess energy, then, would be dissipated as heat. When directed to, the system would also use a specialized IC to charge the battery using user input while dynamically monitoring system conditions. ## Subsystems: **Battery Receptacle** : Holds the battery and connects it to the system. - Custom-made battery trays which will allow the system to switch between AA & AAA battery usage. **Cooling System** : A fan and heat sink for use in dissipating heat more effectively - Motor part number : Tower Pro MG996 - needs 5-7VDC **Temperature Monitoring System** : Monitors system and battery temperature for use by the control system - Temperature probe part number : LM235Z - needs 5VDC **Current and Battery Monitoring system** : Monitors battery charge and output current - Current sensor : part number LAH 25-NP - Voltage sensor on battery output : **Charge System** : An IC designed to effectively charge LA and lithium batteries. **Discharge System** : Accepts inputs from the Control System to cycle through circuits in a current divider in order to maintain the discharge rate and limit temperature buildup - Custom PCB by us. It will function as a current divider, and will shift layout using IGBTs controlled by the Control System. **Control System** : Accepts sensory data from the monitoring systems and alters the current paths and possibly fan speed - An ATMEGA328 will serve as the microcontroller. **User interface** : The User Interface subsystem will accept user input to determine the system’s mode of operation, and relay system conditions to the user. - A switch in the casing to break the circuit on opening so that the battery can be safely removed and inserted. - A switch on the outside of the casing to turn the whole system off. - Two switches: charge/discharge and nickel/lithium - LCD display depicting current charge/discharge status **Power Supply** : Use a USB phone charger as a 5VDC input. - Wall to USB adaptor 2YHA11B8018669 ## Criterion For Success: - Be able to rapidly (within an hour) deplete a battery from 50% charge to a condition in which it can be considered safe with respect to common trash-borne hazards. - Maintain temperature within safe battery limits to enable the maximum sustained discharge rate without exceeding hazard thresholds (120F steady state, 140F transient). - Be able to cycle active circuits based on system conditions to maximize discharge, minimize system temperature (as much as is feasible, to at least be safe), and maximize the system's operating lifetime.
## Extra Considerations: - Every member will read the battery safety guidelines thoroughly, and review them at least monthly - Each member will be certified with fire safety training and fire extinguisher training. ## NetIds: - Stan Hackman (shackma2), - Jonathan Biel (jbiel2), - Aditya Prabhu (aprabhu3) |
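To illustrate how the control system might cycle discharge paths against the temperature limits in the success criteria, here is a simple threshold-based sketch; the resistance values and switch-over temperatures are invented placeholders, not the team's design.

```cpp
#include <cstdio>

// Pick one of several parallel discharge resistances so the discharge current
// stays as high as possible while the pack stays under the temperature limits
// quoted in the success criteria (120 F steady state, 140 F transient).
// The path resistances and thresholds below are illustrative only.
double chooseDischargeResistance(double tempF) {
    const double pathsOhm[] = {1.0, 2.2, 4.7, 10.0};   // lowest = fastest discharge
    if (tempF < 100.0) return pathsOhm[0];
    if (tempF < 110.0) return pathsOhm[1];
    if (tempF < 120.0) return pathsOhm[2];
    return pathsOhm[3];                                 // back off hard near the limit
}

int main() {
    const double tempsF[] = {85.0, 105.0, 118.0, 130.0};
    for (double t : tempsF) {
        double r = chooseDischargeResistance(t);
        std::printf("%5.1f F -> %4.1f ohm path, I = %.2f A at 1.2 V\n", t, r, 1.2 / r);
    }
    return 0;
}
```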
||||||
15 | Real-Time Golf Swing Tracker |
Ben Kim Ryan Leuba Tamir Battsogt |
Sanjana Pingali | Arne Fliflet | design_document1.pdf design_document2.pdf other1.pdf proposal2.pdf proposal3.pdf proposal1.pdf |
|
# Real-Time Golf Swing Tracker Team Members: - tamirb2 - leuba2 - kijungk3 # Problem Mastering the golf swing is a complex challenge with nuances that can be difficult to grasp without precise feedback. Current training methods often rely on professional coaching and visual observation, which might not be readily accessible or affordable for all golfers. Additionally, the subtle mechanics of a golf swing, including swing path, speed, and force, are not easily quantifiable through mere observation. There's a growing need for a more accessible and scientific approach to golf training that leverages modern technology to provide real-time, detailed feedback directly to the golfer. # Solution We propose to develop the Real-Time Golf Swing Tracker, equipped with an integrated sensor system and a companion mobile application to analyze and improve golf swings. The core of our solution involves embedding accelerometers, gyroscopes, and force sensors within the grip of a standard golf club. These sensors will capture critical data points such as swing speed, angle, and grip pressure during each stroke. This data is then processed by a microcontroller that filters and interprets the raw sensor outputs. The processed information is wirelessly transmitted to a mobile application that provides the golfer with immediate visual feedback and historical data analysis. # Solution Components ## Sensor Subsystem This subsystem includes accelerometers, gyroscopes, and force sensors integrated into the golf club's grip. These sensors capture real-time data on swing speed, angle, and grip pressure (a rough swing-speed estimate is sketched after this proposal). We plan to use the MPU9250, which includes the gyroscope and the accelerometer together. This sensor will be attached just above the club head, allowing for accurate sensing when the club swings up and down. Also, FSR06BE sensors will be utilized to sense the grip force from our hands on the golf grip. We would insert sensors under the grip such that the pressure from our hands will be transmitted to the microcontroller, which will be on the golf shaft. - Accelerometer : Measures the acceleration and deceleration of the golf club to track swing speed. - Gyroscope : Tracks the orientation and angular velocity to track the angle of the club throughout the swing. - Magnetometer : Tracks absolute orientation in relation to Earth’s magnetic field to work in tandem with the gyroscope and gather a more accurate reading. - Force Sensor : Monitors the grip pressure applied by the golfer throughout the swing. If the scope of the project is too small, we also plan to add force sensors on the golf club face to measure the point of contact made between the club and the ball. ## Microcontroller Subsystem The microcontroller subsystem processes data from the sensors, executes filtering algorithms, and manages wireless data transmission to the mobile application. - Microcontroller (ESP32-S3-WROOM-1 MCU): Manages real-time data processing from all sensors and supports Bluetooth communication to transmit swing data to the mobile application. Sensors will be connected to the microcontroller through GPIO and ADC pins, allowing for digital and analog transmission. The microcontroller/PCB will most likely be placed and screwed onto the shaft of the club to allow for even distribution of wires between grip sensors and club sensors. ## Power Subsystem This subsystem ensures that all electronic components within the golf club are adequately powered during use.
- Rechargeable Cable/Battery (5V) : A lightweight, durable battery capable of providing consistent power to necessary subsystems for extended periods, ensuring usability through multiple rounds of golf. A USB port will be utilized to allow for the battery to be recharged. ## Mobile Application and Data Analysis Subsystem A comprehensive app that receives data from the golf club’s microcontroller. The user interface displays real-time analytics and historical trend analysis to help golfers understand and improve their swing techniques. High-level/process-intensive code will be run from the mobile application to perform any algorithms or potential ML to analyze golf swings. The application will most likely be exclusively hosted as an Android application for easier development. # Criterion For Success - **Precision**: The sensor data must be accurate to within a few degrees or percentage points, ensuring that feedback is reliable. - **User Interface**: The mobile application must be intuitive and easy to use, providing clear and actionable insights without overwhelming the user. A section will be dedicated to user data & numerics so that users can quickly digest raw data. - **Durability**: The Real-Time Golf Swing Tracker must withstand regular use in various weather conditions without sensor or system failure. # Alternatives Current training aids mostly involve static tools like swing trainers or mats that do not provide dynamic, real-time feedback. While some digital solutions exist, such as swing analyzers that attach to a club, they often require additional devices or do not integrate seamlessly. Our solution improves upon these by integrating all necessary technologies directly into the club and accompanying app, providing a more holistic and user-friendly experience. |
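As a rough illustration of how swing speed could be estimated from the gyroscope readings described above, the sketch below treats the swing as rotation about a pivot near the grip; the lever-arm length and example peak rate are assumptions, and a real implementation would fuse accelerometer and magnetometer data as well.

```cpp
#include <cstdio>

constexpr double kPi = 3.14159265358979323846;

// Rough club-head speed estimate from the gyroscope: treat the swing as a
// rotation about a pivot near the grip, so v = omega * r.  The lever-arm
// length and the example peak rate below are assumptions, not measured values.
double headSpeedMph(double peakGyroDegPerSec, double sensorToPivotM) {
    double omegaRadPerSec = peakGyroDegPerSec * kPi / 180.0;
    double metersPerSec = omegaRadPerSec * sensorToPivotM;
    return metersPerSec * 2.23694;   // m/s -> mph
}

int main() {
    double peakRate = 1600.0;   // deg/s, example peak reading during a swing
    double leverArm = 1.0;      // metres from grip pivot to the sensor near the head
    std::printf("estimated head speed: %.1f mph\n", headSpeedMph(peakRate, leverArm));
    return 0;
}
```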
||||||
16 | Mobile Deployable Smart Doorbell |
Charles Lai Ricky Chen Victor Lu |
Rui Gong | Kejie Fang | design_document1.pdf proposal2.pdf proposal1.pdf |
|
# Mobile Deployable Smart Doorbell Team Members: - Ricky Chen (pohsuhc2) - Charles Lai (jiayeyl2) - Victor Lu (vclu2) # Problem: As a college student living in a dorm/apartment complex, the absence of a doorbell poses an inconvenience for both myself and my visitors, such as my friends, neighbors, or anyone who would come to my house. My room is located far from the entrance; therefore, every time they knock on the door, I can’t respond promptly. Moreover, regular doorbells will fail to notify me if I am either too far from the door or there are barriers in between. # Solution: Our project is a small, smart doorbell that can be easily deployed and notify the resident via their phone. The doorbell will be connected to the internet and to the resident’s phone. When a visitor presses the doorbell, the resident will be notified via phone, which is almost always with the resident in this technological society. Therefore, the resident can be notified in real time regardless of where they are. Furthermore, our doorbell will support a variety of features, such as voicemail and video recording. The resident can respond to their visitors despite not being at home and in many different circumstances. # Solution Components **Subsystem 1 - Camera** This project will contain a built-in camera enclosed within the package. This camera will record everything outside the apartment in real time and upload the video data to the mobile app. The camera will automatically take a picture whenever the button is pressed. With this feature, the user will be alerted immediately when someone is outside the apartment. Moreover, the user will also be able to identify the visitor with the photo taken by the camera. **Subsystem 2 - Internet Connection** We will connect our device to the internet, so all the data collected by the doorbell will be sent to the users’ phones. One feature we want to achieve is that the property owner can listen to and watch the person at the door in real time, in other words making a video call in which only the phone user can see the other person. Therefore, we will have to send both video and audio data from the doorbell to users’ phones. **Subsystem 3 - Phone App** In order to offer users the best experience, this project will also include a mobile application. Once the module is deployed onto the user’s door, the user can utilize the phone app to receive pertinent, real-time information regarding the situation outside the apartment. Moreover, the user can also monitor his or her house from anywhere in the world with an internet connection, further ensuring the safety of the apartment. **Subsystem 4 - Audio Transmission** Since we want the property owner and the person at the door to talk with each other, we will deploy a microphone and a speaker in the doorbell. The microphone will get the audio data from the person at the door, and the speaker will play audio sent from the property owner through the phone app. On top of this, we will also have to access the audio components in a phone, so that we can make sure the microphone and speaker on both sides can receive and play data. **Subsystem 5 - Power** The battery of the doorbell has to support several modules, including a camera, internet connection system, microphone, mini speaker, and the PCB itself. We want to make sure that it has enough power so that the user doesn’t have to constantly replace the batteries.
Additionally, for the sake of convenience, we also make the doorbell powered with dry cells so that the users don’t have to wait and charge the batteries. **Subsystem 6 - Button** There will be a button in addition to all the other features within this project. When this button is pressed, the microphone embedded inside the doorbell will start recording the surrounding voice. When the button is released, the recording will stop and the whole audio data will be uploaded to the user’s mobile device. Thus, the visitor can leave a voice message even if the user is not home at the moment. Furthermore, the button will also send a notification to the user once it is pressed, informing the user that someone is waiting for him or her at the doorstep. **Subsystem 7 - Deployment Device** We will make our device as light as possible so that it can be stuck on a door. The backside of the doorbell will be a side of velcro tape to make it easy to install and remove. Users can easily buy Velcro tapes anywhere, so it will be convenient if they have to move to another place or replace Velcro tapes. [Velcro tapes](https://www.amazon.com/Command-Picture-Decorate-Damage-Free-PH206-14NA/dp/B073XR4X72/ref=sr_1_1?crid=2B5FDX12XH14Q&dib=eyJ2IjoiMSJ9.1ACfERMwVE_d9OrKAbQNTVcQQllbw9HsgrVPtNcqxwcRB5HLjDf8VDmscXwG3gTHJ7NB0US4TQtDIQCSYfHHoxuoYuEP22ZXkVz8Vsp0ZHMJuTbGxvTYmwFZ3nMoB1AAIziEDzmXASbvxiRFuV64dn9twhcbzFACHCdBAi6EGeYc0us2vNChK1Efn-RmgdPjskD_OOgLfdYsKTG--1xWb58eooKSQUvhYIoP-4iZNWUtsbaGAfClvM56YWaKivI0rj0pvhIJGbcgvmqxzX0KfZF5Eqx2Guu_23Iycvp0zqM.IWzCDO7NG9HXaUU5hM8VSmkOG-AUGEGEzM09aBWfMU0&dib_tag=se&keywords=3m%2Bmagic%2Btape%2Bcommand&qid=1725939954&sprefix=3m%2Bmagic%2Btape%2Bcomman%2Caps%2C138&sr=8-1&th=1) # Criterion For Success - Straightforward User Interface: The UI for the mobile application needs to be as straightforward as possible. Thus, even customers who are not familiar with technology can use this product easily. - Data Transmission: The module will need to enable a smooth transmission of visual data from the doorbell to the application. It also will have to allow simultaneous transmission and reception of audio data to and from the mobile device. - Low Power Consumption As the module will be running on battery, we need to make sure that the power consumption of the device is low enough to sustain a long period of continuous operation. - Simple Setup Procedure The setup procedure needs to be simple but robust. Thus, the product can be deployed on almost every door. |
||||||
17 | Firefighter Health Monitoring Network |
Bryan Chang Kevin Huang Steven Y M Chang |
Surya Vasanth | Cunjiang Yu | design_document1.pdf design_document3.pdf design_document4.pdf proposal1.pdf proposal2.pdf |
|
# Team Members Bryan Chang chchang9 Steven Y M Chang sychang5 Kevin Huang kuanwei2 # Problem Firefighters operate in extremely hazardous environments where their health and safety are constantly at risk. Current methods of monitoring firefighter health during active duty are limited, often relying on periodic check-ins or self-reporting. This can lead to delayed responses to health emergencies, such as heat exhaustion, overexertion, or cardiac events. Incident commanders lack real-time, comprehensive health data on their team, making it challenging to make informed decisions about resource allocation and firefighter safety. # Solution We propose the development of a "Firefighter Health Monitoring Network" - a system of wearable devices integrated into firefighters' gear that continuously monitors vital signs and environmental conditions. The system uses a mesh network of ESP32-based devices to transmit real-time health data to a central monitoring hub. This allows incident commanders to have immediate, comprehensive awareness of their team's health status, enabling quick decision-making and potentially life-saving interventions. # Solution Components ## Hardware Subsystems 1. Wearable Sensor Subsystem This subsystem is responsible for continuously collecting real-time health and environmental data from individual firefighters. The sensors track vital signs like heart rate, blood oxygen level, body temperature, and motion, as well as external factors such as temperature and smoke density. The data is sent over the ESP32 mesh network to the central hub via reliable communication methods (ESP-MESH and LoRa). The rugged design ensures it functions in extreme conditions without compromising firefighter mobility or safety. - ESP32 microcontroller - Heart rate sensor (Photoplethysmography (PPG) sensor) - Blood oxygen level sensor (Photoplethysmography (PPG) sensor) - Body temperature sensor (e.g., MLX90614) - Accelerometer/gyroscope for motion detection - Environmental sensors (e.g., external temperature, smoke density) - LoRa module for extended communication - Small, rechargeable, heat-resistant battery - Rugged, heat-resistant enclosure - Audio jack to connect to the firefighter's communication system - Buttons and LEDs for simple setting configuration 2. Central Monitoring Hub Subsystem The central hub acts as the control center for the network, gathering and visualizing health data from all firefighters in real time. It allows incident commanders to monitor the team’s health status, detect potential health risks, and respond quickly to emergencies. Its extended battery life and rugged design ensure that it remains operational during prolonged operations in harsh environments. - ESP32-based device with larger battery capacity - 7" TFT touch screen for data visualization and input - LoRa module for extended communication - Rugged, portable enclosure - Buzzer to send critical alerts to the watch commander - Buttons and LEDs for simple setting configuration 3. Power Subsystem The power subsystem ensures that both the wearable units and the central hub have the energy to operate continuously in extreme conditions. Larger batteries in the central hub support extended use, while the power management circuitry optimizes battery life. Heat-resistant lithium-ion batteries will power the wearable units.
- Larger capacity battery for the central hub - Power management circuitry for efficient operation ## Software - Embedded software for wearable units to collect and transmit sensor data - Mesh networking protocol implementation (ESP-MESH) - Data processing algorithms for health status assessment - Central hub software for data visualization and alert management - Health analytics/algorithms for abnormal health data detection - Mesh Network Integration - Utilize ESP32's ESP-MESH capabilities for a self-forming, self-healing network - Implement secure, low-latency data transmission protocols - Develop network management software for the central hub ## Subsystem Integration - Wearable units continuously collect and transmit health data through the mesh network - Central hub receives, processes, and displays data from all connected firefighters - The mesh system should alert every firefighter on site for faster response times # Criteria for Success 1. The system shall continuously monitor and transmit vital signs data 2. Wearable units shall operate for at least 8 hours on a single charge in typical firefighting conditions. 3. The mesh network shall maintain connectivity in challenging environments (e.g., inside buildings, around obstacles) 4. The mesh network shall automatically form and maintain connectivity with no manual configuration required. 5. The system shall generate automatic alerts for abnormal vital signs or lack of movement within 10 seconds of detection |
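The 10-second alert requirement in the success criteria implies a simple per-packet check on the central hub. Below is an illustrative threshold sketch; the vital-sign limits are placeholders for discussion, not medical guidance, and the packet fields are assumed names.

```cpp
#include <cstdio>

// One sample of wearable data, as it might arrive over the mesh (assumed fields).
struct Vitals {
    double heartRateBpm;
    double spo2Pct;
    double bodyTempC;
    double secondsSinceMove;   // time since motion last exceeded a threshold
};

// Threshold-based health check of the kind the central hub could run on every
// incoming packet.  The limits here are illustrative placeholders only.
bool needsAlert(const Vitals &v, const char **reason) {
    if (v.heartRateBpm < 40.0 || v.heartRateBpm > 180.0) { *reason = "heart rate"; return true; }
    if (v.spo2Pct < 90.0)                                { *reason = "blood oxygen"; return true; }
    if (v.bodyTempC > 39.5)                              { *reason = "body temperature"; return true; }
    if (v.secondsSinceMove > 10.0)                       { *reason = "no movement"; return true; }
    *reason = nullptr;
    return false;
}

int main() {
    Vitals sample{175.0, 93.0, 38.0, 12.0};
    const char *why = nullptr;
    if (needsAlert(sample, &why))
        std::printf("ALERT: %s out of range -> sound central hub buzzer\n", why);
    return 0;
}
```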
||||||
18 | PowerBox Technology Power Meter |
Abraham Benzaquen Garcia Arisa Aramratsameekul Frank Lu |
Jason Jung | Kejie Fang | design_document1.pdf design_document2.pdf proposal1.pdf proposal2.pdf |
PowerBox Technology |
# PowerBox Technology Power Meter Team Members: - Abraham Benzaquen (ADB9) - Arisa Aramratsameekul (arisaa2) - Frank Lu (yuzhelu2) # Problem This team will work under the supervision of Oscar Castillo, the founder of PowerBox Technology, in order to create a power meter for the PowerBox. This power meter is to be used in an industrial setting, ensuring that the high-power machinery receives a consistent, clean source of energy. The purpose of this project is to resolve the common industrial problem of having unstable power for heavy machinery. In a large factory, for example, a power outage can cause hours of downtime and thousands of dollars to be lost. # Solution Our solution will be to create a power meter. This power meter will connect to the 3-phase output of an inverter and will be used to measure the 3-phase RMS current/voltage, real power, reactive power, and apparent power. The voltage/current will also get stepped down and output to be used as an instruction signal for a DSP. All of our data will be recorded for use in optimizing power delivery to the machinery that requires it. # Solution Components ## Subsystem 1 ### RMS Voltage Measurement Circuit The RMS voltage measurement circuit uses the output of the inverter to measure the three-phase voltage. This and the AC current measurement circuit will be used to calculate the real power, reactive power, and apparent power. This circuit will also step down the voltage and output three analog signals to the digital signal processor. This high voltage will be stepped down to a value that can be output as a signal through the use of a PCB step-down transformer. In order to do this safely, we will also incorporate regulators and protection circuits. ## Subsystem 2 ### AC Current Measurement Circuit The current for the three phases will pass through current transformers, then to the AC current measurement circuit. The current transformer is used to accurately monitor the current while not damaging the equipment with the high current flow. The AC current measurement circuit and the RMS voltage measurement circuit will be used to calculate the real power, reactive power, and apparent power. The circuit will also step down the current and output three analog signals to the digital signal processor. The current transformer is given in this project, but we will have to set the current limits. We will include circuitry that will ensure that the stepped-down values are clean and accurate. This may include precision regulators, overcurrent protection, and low-pass filters. ## Subsystem 3 ### Power Calculation This part of the interface board will take the voltage and current from the previous circuits and calculate real power, reactive power, and apparent power (a short calculation sketch follows this proposal). The power data will be recorded. This could be done with an Arduino board, and the power data could be stored in registers for the other software to use if necessary. ## Subsystem 4 ### Communication Protocol In this subsystem, we will use a communication protocol like Modbus TCP to transmit power data (real power, reactive power, and apparent power) from the power meter to the Power Node. Modbus TCP is a well-established protocol in industrial environments and provides a reliable solution for data transmission. The Arduino board (or an equivalent microcontroller) will be equipped with an Ethernet shield to support TCP/IP communication.
This microcontroller will interface with the power calculation circuit to collect power data while the Ethernet connection will facilitate communication using Modbus TCP. The Arduino will act as a Modbus TCP server that regularly updates power parameters, sending data in response to requests from the Modbus TCP client of the Power Node. # Criterion For Success The power meter should take the input of three-phase high voltage and three-phase current from the inverter to accurately measure the real power, reactive power, and per-phase voltage and current of the machine. The meter should also output the correct stepped-down three-phase analog voltage and current to the digital signal processor. In addition to the proper outputs, the meter should be reliable in its consistency of accurate measurements and reasonably cost-effective for production and usage. |
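The power quantities named in Subsystem 3 follow directly from synchronously sampled voltage and current over whole cycles: real power P is the mean of v·i, apparent power S is Vrms·Irms, and the reactive power magnitude is sqrt(S² − P²). The sketch below demonstrates the calculation on a synthetic 60 Hz cycle; on the actual board the samples would come from the stepped-down analog outputs.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct PhasePower { double vrms, irms, real, apparent, reactive; };

// Per-phase power from sampled voltage/current over an integer number of cycles:
// P = mean(v*i), S = Vrms*Irms, |Q| = sqrt(S^2 - P^2).
PhasePower computePower(const std::vector<double> &v, const std::vector<double> &i) {
    double sumVV = 0.0, sumII = 0.0, sumVI = 0.0;
    const size_t n = v.size();
    for (size_t k = 0; k < n; ++k) {
        sumVV += v[k] * v[k];
        sumII += i[k] * i[k];
        sumVI += v[k] * i[k];
    }
    PhasePower p;
    p.vrms = std::sqrt(sumVV / n);
    p.irms = std::sqrt(sumII / n);
    p.real = sumVI / n;
    p.apparent = p.vrms * p.irms;
    p.reactive = std::sqrt(std::max(0.0, p.apparent * p.apparent - p.real * p.real));
    return p;
}

int main() {
    // Synthesize one cycle with the current lagging the voltage by 30 degrees.
    std::vector<double> v, i;
    for (int k = 0; k < 256; ++k) {
        double wt = 2.0 * 3.14159265358979 * k / 256.0;
        v.push_back(325.0 * std::sin(wt));            // ~230 Vrms peak
        i.push_back(14.1 * std::sin(wt - 0.5236));    // ~10 Arms, 30 degree lag
    }
    PhasePower p = computePower(v, i);
    std::printf("P=%.0f W  S=%.0f VA  Q=%.0f var\n", p.real, p.apparent, p.reactive);
    return 0;
}
```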
||||||
19 | Autonomous Golf Green Divot Locator Robot |
Akhil Bonela Michael Cherry Ved Eti |
Pusong Li | Kejie Fang | other2.pdf proposal1.pdf proposal2.pdf |
|
# Autonomous Golf Green Divot Locator Robot Team Members: - Ved Eti (vedeti2) - Michael Cherry (mcherry3) - Akhil Bonela (abonela2) # Problem For those familiar with golf, one of the biggest problems golfers face is ball marks on the green. When a golfer lands their ball on the green of the hole, it leaves a divot in the ground. It is common etiquette to use a tool to repair the mark your ball left behind to flatten the green back out. Sadly, many golfers do not follow this etiquette, which leaves golf greens full of divots and uneven bumps. This leads to worse quality greens and an overall worse quality experience while golfing. # Solution Our idea is to create an autonomous robot to identify these divots at the end of the golf day and mark all of the locations on the green. We will create our own custom tool to mark these divots. We would dispatch the robot after all of the golfers are done with the course at the end of the day, and traverse the golf green similar to a Roomba. We plan to use stereo cameras to pinpoint the exact locations of divots and plan to use an infrared camera as a failsafe to ensure we are able to identify all divots. # Solution Components Stereo Camera - Used to detect the edges of golf greens, and divots (Amazon.com: Synchronized Dual Lens Stereo USB Camera 1.3MP HD 960P Webcam 3D VR Web Camera Module with 1/3 CMOS OV9715 Image Sensor Industrial Camera USB2.0 Lightburn Camera Plug&Play for Android,Linux,Windows : Electronics) Micro-Controller - (A chip from the ESP32-S3 series) Raspberry Pi - Used for additional computer power for computer vision tasks Servo - Divot Marker Dropper (Amazon.com: Miuzei 20KG Servo Motor High Torque RC Servo Metal Gear Waterproof for 1/6, 1/8, 1/10, 1/12 R/C Model DIY Car Robot, DS3218, Control Angle 270° : Toys & Games) Battery - Zeee 14.8V 4S Lipo Battery (50C 3300mAh) with an XT60 plug BMS System - 14.8V 4S 30A 18650 Lithium Battery BMS PCB Integrated Circuits Protection Board Robot Chassis - Prebuilt chassis with motors (Amazon.com: Metal Smart Robotic RC Tank Chassis Kit with 4pcs DC TT Motors for Arduino UNO R3, Raspberry Pie, STEAM Education, TT04 Crawler Tank Car Chassis Platform for Adults Teens (Black) : Everything Else) Ultrasonic Sensor - Failsafe mechanism to also help detect divots in the green (Amazon.com: WWZMDiB 2Pcs HC-SR04 Ultrasonic Sensor Module for Arduino R3 MEGA Mega2560 Duemilanove Nano Robot XBee ZigBee (2Pcs HC-SR04 with housing) : Industrial & Scientific) Casing for Components - Plan to use 3-D printed materials ## Autonomous Traversal This subsystem is mostly going to be interfacing with our microcontroller, our motors, and the stereo cameras. We plan to have the microcontroller controlling the motors to continue going forward until the edge of the green is detected. Once it is, we will turn around and look at the ground next to the place we just checked. This will act very similar to a common roomba, and robotic vacuum cleaners. We repeat this process until we check the entire green. We plan on using a pre-produced chassis from either a toy car or an RC car, so that we don't have to spend time making and manufacturing our own car. We will add our own microcontroller, PCB for power distribution, and battery to the chassis. ## Image Processing and Sensing The image processing module will mostly have two tasks, identify divots, and identify edges of golf greens. 
It will pass along information about what it detects to the Raspberry Pi so that we can either use the traversal module to move the robot, or the marker placement module to place markers down. In addition to the computer vision tasks, we plan to add an ultrasonic sensor to detect distances to the divots as well. This is going to be used more as a failsafe, in case the stereo camera does not detect the divot. ## Power The robot utilizes a Zeee 14.8V 4S Lipo Battery (50C 3300mAh) with an XT60 plug, paired with a 14.8V 4S 30A 18650 Lithium Battery BMS PCB Integrated Circuits Protection Board. This combination provides reliable power management and safety features for the robot. LiPo batteries are chosen for their high energy density, which allows for a compact and lightweight battery pack, ideal for mobile robots. The BMS safeguards against overcharging, over-discharging, short circuits, and overcurrent, ensuring the battery's longevity and the robot's safe operation. ## Marker Placement This subsystem will use input from the microcontroller module to allow a marker to drop from a tube. We plan on using a tube filled with markers with an arm at the bottom, driven by a servo, that blocks markers from falling. We will 3D print our own brightly colored markers, and a container for the markers. ## Remote Power On / Return Controller We plan on developing a Bluetooth/WiFi-based controller that tells the robot to begin traversing the course and also tells it when to stop and return to its “dock”. We intend to create or purchase a cage of some sort for the robot to safely reside in near the golf green and return to when done. It would also come with an RF transmitter whose signal the ESP32’s RF receiver can pick up to direct the robot to return to the cage. # Criterion For Success For this to be effective, we need to first be successful at identifying the divots and the edge of the greens. We plan to test this by buying a portable green and artificially making divots. Task 1 is to correctly identify a divot on a fake golf green using computer vision. We also need to mark this using the marker placement subsystem. Task 2 is to make sure we can detect the edges of the green accurately, and turn the robot around to continue traversing the course. |
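To pinpoint divot locations with the stereo camera, depth can be recovered from disparity with Z = f·B/d once the pair is calibrated. The sketch below is purely illustrative; the focal length and baseline values are placeholders, not the calibration of the actual camera module.

```cpp
#include <cstdio>

// Depth from a calibrated stereo pair: Z = f * B / d, where f is the focal
// length in pixels, B the baseline between the two lenses, and d the disparity
// of the same divot feature between the left and right images.
double depthMeters(double focalPx, double baselineM, double disparityPx) {
    if (disparityPx <= 0.0) return -1.0;   // feature not matched in both images
    return focalPx * baselineM / disparityPx;
}

int main() {
    double f = 700.0;   // focal length in pixels (placeholder calibration value)
    double b = 0.06;    // 6 cm baseline between the two lenses (placeholder)
    const double disparities[] = {40.0, 25.0, 12.0};
    for (double d : disparities)
        std::printf("disparity %5.1f px -> divot at %.2f m\n", d, depthMeters(f, b, d));
    return 0;
}
```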
||||||
20 | A Better Yogurt-Maker |
Betty Nguyen Chidambara Anagani Zitao Wu |
Angquan Yu | Arne Fliflet | design_document1.pdf other1.pdf proposal1.pdf |
|
# A Better Yogurt-Maker Team Members: - Betty Nguyen (bcnguye2) - Chidambara Anagani (canaga2) - Zitao Wu (zitaowu2) # Problem There are a few smart yogurt fermentation devices on the market but these devices usually only allow the user to set a timer and a temperature. There is no or very little real time sensing happening. However, time can be an unreliable indicator due to the high amount of variability in the starting culture (backslopping) leading to differing fermentation rates or even a dead batch of yogurt. # Solution Our solution would be to build a device that uses sensors to measure the temperature, pH, and viscosity of the yogurt and use these measurements to determine the taste and texture of the yogurt. The device would then send the information via wifi to an app on the user’s phone for easy monitoring # Solution Components ## Subsystem 1: Sensors The purpose of the sensor system is to gather real time information about the taste, texture, and viscosity of the yogurt. These measurements will then be used to measure the status and quality of the fermenting yogurt. - The temperature of the yogurt will be monitored with a thermistor or other temperature sensor. - The sourness of yogurt is determined by the amount of lactic acid in a sample. Thus we can use a pH probe such as the ENV-30-PH to determine if the taste is acceptable. - The texture of the yogurt can be inferred from the viscosity. We can determine viscosity with a small rotating spindle that is submerged in the yogurt. The torque required to rotate the spindle can then be related to the yogurt's thickness. - The rotating paddle would be powered by a dc motor such as the ROB-16413 which comes with an encoder. Using information from the encoder and the hall effect, we can determine the RPM and relate RPM to viscosity. Motor control will be programmed using the MCU. - The temperature and pH sensor will be attached to the sides of the yogurt fermentation container. The rotating spindle to measure texture can be attached to the lid of the container. ## Subsystem 2: Control and Communication The purpose of the control subsystem is to read the information from the sensor, support Wi-FI communication of sensor data to a mobile application, and also provide control for the motor of the texture sensor. - Microcontroller (ESP32-S3-WROOM): Is used to get and process the data from sensors. This microcontroller also supports WiFi and bluetooth data transmission on the board and can be used to send yogurt fermentation to a mobile app. - The ESP32 MCU has GPIO and ADC pins that can read digital and analog measurements from our sensors. - Our MCU needs to be able to send a control signal to start and stop the texture sensor motor. This can be achieved using a timer on the MCU and having the MCU send an enable signal to run the texture sensor every 30 minutes. ## Subsystem 3: Power Subsystem This subsystem will provide the power for all of the electronic components in the device. It must provide enough power for the electronics to last the entire 8 to 24 hours of fermentation - A 12V Battery rechargeable battery will be used to provide power to each subsystem. The battery can be recharged with a USB port. - Voltage regulator circuits will help provide steady and consistent power. ## Subsystem 4: Mobile App and User Interface The purpose of this subsystem is to display the information received from the microcontroller in an easily digestible manner. 
An app either locally hosted on the ESP32 microcontroller or mobile (Android) will be built to use sensor data to build progress visuals and relay alerts to the user. A graph of the pH and temperature data can be displayed. For the texture measurement, the yogurt will be classified as: runny, good, thick. # Criterion For Success - Ease of Use/User Interface: We want our user interface to be easy to use and clear for our intended audience, homecooks. The mobile application must display information in a way that is easily digestible and also provide clear actionable feedback on the yogurt fermentation process. - Sensor Monitoring: Our sensors must be able to determine the taste and texture of the yogurt and let the user know when to stop fermentation. pH should be within the range 4.0 to 4.4 for food. Temperature should be within the range 105 to 112 F. Texture will be classified as runny, good, too thick. - Durability: The entire device should be able to go through a fermentation cycle (8 to 24 hours) without needing to be recharged. Since the sensors will be touching the yogurt/yogurt container, the probes need to be rated for operation at 105 to 112 F. - Real Time Monitoring and Alerts: The designed app will show the progress of the fermentation while providing any necessary alerts to the user if any of the defined conditions are out of range. |
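One way to relate the texture sensor's encoder output to texture, as described in the sensor subsystem, is to compute spindle RPM over a window and bucket it into the runny/good/thick classes; the counts-per-revolution figure and RPM thresholds below are assumptions that would need calibration against real yogurt batches, since a thicker yogurt loads the spindle more and spins slower at the same drive level.

```cpp
#include <cstdio>

// Convert encoder counts accumulated over a sampling window into RPM, then
// bucket the reading into the runny / good / thick classes used in the app.
double rpmFromEncoder(long counts, double windowSec, double countsPerRev) {
    return (counts / countsPerRev) / windowSec * 60.0;
}

const char *textureClass(double rpm) {
    if (rpm > 120.0) return "runny";   // little resistance, spindle spins fast
    if (rpm > 60.0)  return "good";
    return "thick";
}

// Target acidity window from the success criteria.
bool phInRange(double ph) { return ph >= 4.0 && ph <= 4.4; }

int main() {
    double rpm = rpmFromEncoder(980, 5.0, 140.0);   // 5 s window, 140 counts/rev (assumed)
    std::printf("spindle %.0f rpm -> %s; pH 4.3 in range: %s\n",
                rpm, textureClass(rpm), phInRange(4.3) ? "yes" : "no");
    return 0;
}
```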
||||||
21 | String/Drum Synthesizer (Repost) |
Abhi Nayak Joel Schurgin |
Manvi Jha | Cunjiang Yu | design_document1.pdf design_document2.pdf proposal1.pdf proposal2.pdf |
|
## Team Members: - Abhi Nayak (arnayak2) - Joel Schurgin (joelbs3) - Need one more partner! (Send us an email at arnayak2@illinois.edu and joelbs3@illinois.edu if interested) ## Problem A musical artist brings with them a sense of how to evoke certain emotions or create particular atmospheres, but may lack the technical knowledge to bring it to life. Sound design can be challenging as it involves crafting the unique timbres and textures that define a piece of music, which can be both an art and a science. The learning curve is steep, not only because it requires a grasp of complex technical tools but also because it demands a creative intuition that isn't easily taught. Musicians and producers must learn to navigate a vast array of software and hardware tools, each with its own set of knobs, sliders, and buttons. Understanding how to shape sounds means diving deep into the physics of sound waves, the principles of digital signal processing, and the subtleties of different synthesis methods. This process can feel overwhelming, especially for beginners who might feel lost amidst the jargon and the sheer number of options. ## Solution Our proposal is to build a synthesizer. A synthesizer is a machine that is capable of producing a range of sounds when given a simple key or noise pulse. The range of sounds are controlled by a set of parameters such as filter cutoffs, oscillator types, low frequency oscillators, etc. The proposed solution is two-fold. The first part of the project is to build emotive knobs that feature descriptive ranges such as soft to aggressive or thin to thick. These knobs would also control the internal parameters of the system to achieve the desired effects on the input sound. The other part is to create a system that records a sound and finds the best way to set the synthesizer's internal parameters to recreate it. The intended work flow of this device for a musical artist is to record a sound, tweak the knobs, and play that sound by pressing a key on a MIDI keyboard. It is up to the musician to record the sound into their own recording software for further manipulation. # Solution Components ## Subsystem 1: Power Block Purpose/Requirements: This subsystem must be able to power the processor and the USB keyboard. Capacitors will be used to stabilize the power block. Components: - 5V DC Power Supply Adapter - Capacitors - Subsystem 2: Synthesizer ## Subsystem 2: Software Synthesizer Purpose/Requirements: This subsystem will generate sounds using the Karplus Strong algorithm which features a pulse generator and an echo chamber (modeled with digital delay, feedback, and filtering). When a note is played on a USB keyboard, this component must be able to change the pitch based on which key is pressed. The algorithm’s parameters must be controllable using potentiometers, but several parameters can be determined as a function of one knob. We chose the Karplus Strong algorithm to reduce the computational load, but we are modifying certain aspects to allow for a greater range of sounds. The pulse that we will use is a blend of noise and sine wave oscillator which are added together and distorted. In addition, there will be some soft distortion applied on the output. In addition, the echo chamber will use 3 IIR filters: highpass, lowpass, and a bell. The choice to use IIR filters is to lower the amount of computation. We made a demo of the synthesizer algorithm using a digital audio workstation in order to play with parameters and determine the names of the emotive knobs. 
Below are the emotive knobs and how the corresponding synth parameters should change to implement them: Aggression - Synth Params: Pulse distortion, output distortion - Knob Left: Low distortion - Knob Right: High distortion Brightness - Synth Param: Low pass filter cutoff - Knob Left: 20 Hz - Knob Right: 20 kHz Boominess - Synth Param: High pass cutoff - Knob Left: 20 kHz - Knob Center: Note pitch - Knob Right: 20 Hz Dampening - Synth Param: Bell Q - Knob Left: High Q - Knob Right: Low Q *Not sure what to call this yet, but it affects how the echo chamber resonates. Note: setting this to around 3.6 kHz sounds good, so it could be fine to keep this completely internal and forgo the corresponding knob. - Synth Param: Bell frequency - Knob Left: 20 Hz - Knob Right: 20 kHz Scratchiness - Synth Param: Length of pulse - Knob Left: 0 - Knob Center: 1/(Note pitch) - Knob Right: 4/(Note pitch) Sharpness - Synth Param: Mix between pulse noise and oscillator - Knob Left: Pulse consists of only oscillator - Knob Right: Pulse consists of only noise *Strike weight => Corresponds to the simulated weight of the object striking the string/drum head. - Synth Param: Pulse oscillator pitch - Knob Left: 20 kHz - Knob Right: 20 Hz Notes: - The value "note pitch" is determined by USB keyboard input. - Bell gain is fixed and should be set to attenuate the signal. Components: - Microprocessor - DAC - Potentiometers for parameters (linear; should snap to the center position if the center value has meaning, otherwise smooth all the way through) - USB Port ## Subsystem 3: Amplifier Purpose/Requirements: This subsystem should output a signal with as little added noise as possible (i.e., the highest practical signal-to-noise ratio) through a guitar cable or headphones. In order to achieve this we will experiment with using several inverting op-amp amplifier circuits in parallel and summing them together with an inverting mixer. The gain of each amplifier will be set, but the volume control will be part of the inverting mixer circuit. Components: - Mono 1/4in audio jack - 4 Op-amps - Logarithmic potentiometer (for volume control) ## Subsystem 4: Screen/Monitor Purpose/Requirements: This subsystem should display the waveform of the sound being processed. The purpose of this subsystem is for the user to be able to receive a visual representation of the sound they are developing using their input pulse and emotive knob manipulations. The screen could enhance the usability of the synthesizer by providing visual indicators or icons for each emotive knob, making it easier for users to identify how different settings affect the overall sound. This would bridge the gap between technical sound parameters and intuitive control. Components: - Mono 1/4in audio jack - Op-amp/Transistors - Raspberry Pi Screen, 7 inch portable monitor external display # Criterion For Success High-Level Goals: 1) Get a simple sound, such as a sine wave at 440 Hz, out of the DAC output from the synthesizer subsystem. It should be viewable on an oscilloscope, but it is not important to hear it yet. 2) Add potentiometers to the synthesizer and use them to control the pitch of the simple sound for testing. 3) Build the amplifier in order to hear the previously described sound. 4) Add the USB port and code for parsing keyboard input. 5) Use synthesizer algorithms to implement the emotive knobs, starting with the pulse generator and then building the echo chamber. Reach Goal: 6) Implement the screen into the entire system. It will show the waveform of the generated sound, including its amplitude, frequency, and potential distortions.
This helps users better grasp the effect of parameters like distortion, brightness, dampening, etc. This visualization would assist the user in creating more precise and desired sounds. Important Notes: -The screen is peripheral and not required for the rest of the system to operate. Therefore implementing it is a reach goal. |
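For reference, the core Karplus-Strong loop the software synthesizer is built around is only a few lines: a delay line seeded with an excitation burst and a two-point averaging low-pass in the feedback path. The sketch below is a minimal version of that core, not the team's final algorithm; the modified pulse shaping, distortion, and bell filter described above would be layered on top of it.

```cpp
#include <cstdio>
#include <cstdlib>
#include <vector>

// Minimal Karplus-Strong pluck: a delay line seeded with a noise burst, with a
// two-point average acting as the loop low-pass ("echo chamber" damping).
std::vector<float> karplusStrong(double freqHz, double sampleRate, int numSamples,
                                 float feedback = 0.996f) {
    int delayLen = static_cast<int>(sampleRate / freqHz);   // loop length sets the pitch
    std::vector<float> delay(delayLen);
    for (float &s : delay)                                   // excitation: white noise burst
        s = 2.0f * std::rand() / RAND_MAX - 1.0f;

    std::vector<float> out(numSamples);
    int idx = 0;
    for (int n = 0; n < numSamples; ++n) {
        int next = (idx + 1) % delayLen;
        float avg = 0.5f * (delay[idx] + delay[next]);       // low-pass inside the loop
        out[n] = delay[idx];
        delay[idx] = feedback * avg;                         // write back, then advance
        idx = next;
    }
    return out;
}

int main() {
    std::vector<float> note = karplusStrong(440.0, 48000.0, 48000);   // one second of A4
    std::printf("first samples: %.3f %.3f %.3f\n", note[0], note[1], note[2]);
    return 0;
}
```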
||||||
22 | Smart Stick System (Triple S) |
Pranav Nair Ritvik Manda Shivam Patel |
Dongming Liu | Cunjiang Yu | design_document1.pdf proposal1.pdf proposal2.pdf |
|
# Smart Stick System (Triple S) Team Members: - Ritvik Manda (rsmanda2) - Pranav Nair (pranavn7) - Shivam Patel (shivamp6) # Problem Lacrosse players and coaches currently lack real-time, detailed performance metrics to help improve gameplay. Traditional training methods rely heavily on subjective observation, which is not very consistent. No tools such as those available for other sports like baseball, golf, soccer, etc are available to monitor and improve lacrosse form and accuracy, especially with the player alone. Since lacrosse is not a well known sport, it becomes difficult for beginners and enthusiasts to start learning the mechanics of the stick and being proficient in it. # Solution This project aims to address the need for a smart, data-driven tool that can measure shot speed, accuracy, and stick form, providing players with accurate and immediate feedback to enhance their training and technique. By incorporating motion tracking, the system will enable players to adjust to their game, fostering more efficient and targeted improvement. This will allow experienced players to obtain performance data and also aid beginners in strengthening their form and tactics. As an overview, our system will include two overall subsystems: one “base” including the pcb, a microcontroller, an LCD screen, and a camera which overall exists to act as the processing unit of the system and use computer vision to analyze a player’s form. This base alone will be able to process and provide general feedback via the LCD screen or more specific feedback via an application. The second subsystem is meant to be mounted to the back end of the lacrosse stick and must be relatively small and lightweight. It will include a small microcontroller with low energy bluetooth capability as well as an accelerometer and gyroscope to transmit more detailed info about swing speed and stick angle to the base. This detailed dataset can lead to enough information to process form and more important information like how fast and what trajectory a ball would have been thrown. # Solution Components ## Subsystem 1: LaxHub (external, box unit) LaxHub is the main processing unit of this system and contains the custom PCB, microcontroller, LCD screen, and camera, as well as necessary functionality to talk to subsystem 2 via bluetooth. The LaxHub will need to be powered by a rechargeable battery. - Microcontroller: ESP 32 - LCD Screen: ST7735R SPI LCD Screen - Camera: Focus 5MP OV5647 Sensor - Rechargeable Battery: Jameco ReliaPro Lithium Ion Polymer Battery 3.7V 500mAh Rechargeable ## Subsystem 2: LaxSense (stick unit) LaxSense is a subsystem that mounts on the lacrosse stick, which will contain the microcontroller, accelerometer, and the gyroscope. These parts will work in conjunction to keep track of performance metrics such as shot speed, stick angle, and form. Because this is a standalone device, this will need to be powered by a small battery system. - Microcontroller: LOLIN D1 mini (based on ESP-8266EX) - Accelerometer + Gyroscope: MPU6050 OR WT901BLE MPU9250 - Rechargeable Battery: B0143KH9KG, 3.7V-2600mAh-9.62Wh,18650 Rechargeable Li-ion Battery Pack ## Subsystem 3: TripleS (Application) Since the LCD display in LaxHub can’t show all metrics and history, this app will manage data display and analysis. - React: Front-end framework for the application. - Kinesis Data Streams: Real-time data streaming from the Smart Lacrosse Stick. - Kinesis Data Analytics: Real-time analysis of the streamed data. 
- AWS Lambda: Process data from Kinesis streams. - DynamoDB: Store historical data for retrieval. - AWS Amplify: For app deployment and hosting. # Criterion For Success 1. Accuracy of Metrics: Ensure the stick unit measures shot speed and stick angle with a precision within ±5% of actual values, validated through calibration and expert comparison. 2. Real-Time Feedback: Provide performance feedback with a latency of less than 5 seconds from sensor data capture to display on the app, ensuring immediate and actionable insights. 3. Scalability: Ensure the cloud infrastructure can handle varying loads and scale automatically to accommodate increasing data and user activity without performance degradation. |
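To illustrate how stick angle might be derived from the LaxSense accelerometer and gyroscope readings, here is a minimal Python sketch of a standard complementary filter. It is not part of the team's proposal; the sensor values, sample rate, and filter coefficient are placeholder assumptions.

```python
import math

def complementary_filter(prev_angle_deg, gyro_rate_dps, accel_x_g, accel_z_g, dt_s, alpha=0.98):
    """Fuse the gyro rate (deg/s) with an accelerometer tilt estimate (deg)."""
    accel_angle = math.degrees(math.atan2(accel_x_g, accel_z_g))  # tilt from gravity
    gyro_angle = prev_angle_deg + gyro_rate_dps * dt_s            # integrate the gyro rate
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# Simulated 100 Hz samples while the stick tilts forward (placeholder values).
angle = 0.0
for accel_x, accel_z, gyro_rate in [(0.10, 0.99, 5.0), (0.20, 0.98, 5.0), (0.30, 0.95, 5.0)]:
    angle = complementary_filter(angle, gyro_rate, accel_x, accel_z, dt_s=0.01)
print(f"estimated stick angle: {angle:.2f} deg")
```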
||||||
23 | RNG Challenge Alarm Clock |
Allen Zhu Rithvik Kopparapu Zinovy Alecksandrovich |
Sanjana Pingali | Kejie Fang | design_document1.pdf proposal2.pdf proposal1.pdf |
|
# Randomly Generated Challenge Alarm Clock Team Members: - Rithvik Kopparapu (rithvik9) - Allen Zhu (allenz2) - Zinovy Alecksandrovich (zinovya2) # Problem For college students, industry professionals, and people from all walks of life, alarms or other methods of waking up on time have become essential. Sleep is an essential need that no one wants to give up, yet there are numerous demands in our lives that take us away from the comfort of our beds. In order to stay on top of their schedules, people resort to various kinds of alarms: setting many alarms 2-3 minutes apart, downloading an app that forces them to take a picture, or using a smartwatch alarm. However, the human body often adjusts to the routine of a regular alarm, allowing people to snooze or turn off alarms in their sleep, turn off their phone, or get used to the vibration of a smartwatch alarm. # Solution To solve this problem and force users to actually wake up before they can turn off their alarm, we wish to make an alarm clock with 4 different challenges based on simple sensors (load cell, gyroscope, temperature, pedometer) that must be completed to turn it off, with the clock randomly picking which challenge needs to be completed every morning. Randomly picking among 4 different challenges every morning keeps the user on their toes, while keeping everything in one compact device so that minimal effort is required to set the alarm. # Solution Components ## Subsystem 1 - Alarm clock and speaker The first part of our solution is the physical alarm clock that we will be modifying to add our challenges. We wish to use a simple AA-powered alarm clock with a clear LCD display so the user can easily program times and use power efficiently. To inform the user which challenge is to be completed that morning, pressing the clock's snooze button will play an instruction on a separate speaker that we will add (e.g., "SHAKE CLOCK FOR ONE MINUTE"). ## Subsystem 2 - Challenge Deck with sensors The second part of our solution is our challenge deck with the associated sensors: Gyroscope sensor - MPU-9250 with built-in gyroscope and accelerometer sensors. The challenge we want to incorporate here is to shake the clock for 1 minute, and we will use the data from the sensor to verify the shaking of the clock. Temperature sensor - TSYS03 temperature sensor. The challenge we want to incorporate here is to get up and put the clock in the fridge for 2 minutes while waiting there for the alarm to turn off. We will check to see if the clock holds a temperature below 40 degrees Fahrenheit (the average fridge temperature is 37 degrees) for at least 1 minute, to account for the time it takes for the clock to cool. Pedometer sensor - MIKROE-3567 pedometer sensor. The challenge we want to incorporate here is to get up and take 250 steps with the alarm clock. Load cell - SparkFun SEN-10245 load cell. The challenge we want to incorporate here is to apply an even and constant force for 3 minutes, long enough to be too inconvenient to do in one's sleep. ## Subsystem 3 - Linkage to alarm clock To link all the sensors, we will be using an ATMEGA324PB microcontroller. To make the alarm clock stop ringing when the challenge is completed, we will send the alarm the signal that is normally generated by its "stop alarm" button. 
Once the challenge is completed, we will also use the previously mentioned speaker to give the user simple audio feedback, a "ding" sound, confirming that they have completed the challenge. # Criterion For Success Our criteria for success are as follows: 1) Each challenge needs to work appropriately and actually stop the alarm from ringing. 2) Challenges must successfully switch randomly every morning. 3) Alarm must only deploy one challenge at a time. |
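As a rough sketch of the random challenge selection and one of the verification checks described above (illustrative only; the thresholds and sample rate are placeholder assumptions, and the real logic would run on the ATMEGA324PB):

```python
import random

CHALLENGES = ["shake_1min", "fridge_2min", "walk_250_steps", "press_3min"]

def pick_daily_challenge(rng=random):
    """Choose one of the four sensor challenges uniformly at random each morning."""
    return rng.choice(CHALLENGES)

def fridge_check(temps_f, required_samples=60, threshold_f=40.0):
    """True once a 1 Hz temperature log stays below threshold_f for required_samples
    consecutive readings (about a minute), mirroring the TSYS03 verification step."""
    streak = 0
    for t in temps_f:
        streak = streak + 1 if t < threshold_f else 0
        if streak >= required_samples:
            return True
    return False

print(pick_daily_challenge())
print(fridge_check([72, 55, 41] + [38] * 60))   # True: a full minute below 40 °F
```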
||||||
24 | Four Point Probe |
Dorian Tricaud Ming-Yan Hsiao Simon Danthinne |
Dongming Liu | Arne Fliflet | design_document1.pdf proposal1.pdf proposal2.pdf |
|
# Four Point Probe Team Members: Simon Danthinne (simoned2) Ming-Yan Hsiao (myhsiao2) Dorian Tricaud (tricaud2) # Problem: In the manufacturing process of semiconductor wafers, numerous pieces of test equipment are essential to verify that each manufacturing step has been correctly executed. This requirement significantly raises the cost barrier for entering semiconductor manufacturing, making it challenging for students and hobbyists to gain practical experience. To address this issue, we propose developing an all-in-one four-point probe setup. This device will enable users to measure the surface resistivity of a wafer, a critical parameter that can provide insights into various properties of the wafer, such as its doping level. By offering a more accessible and cost-effective solution, we aim to lower the entry barriers and facilitate hands-on learning and experimentation in semiconductor manufacturing. # Solution: Our design will use an off-the-shelf four-point probe head, chosen for its precision manufacturing tolerances, to make contact with the wafer. This wafer contact solution will then be connected to a current source precisely controlled by an IC as well as an ADC to measure the voltage. For the user interface, we will have an array of buttons for user input as well as an LCD screen to provide measurement readout and parameter setup regarding wafer information. This will allow us to make better approximations for the wafer based on size and doping type. # Solution Components: ## Subsystem 1: Measurement system We will utilize a four-point probe head (HPS2523) with 2 mm diameter gold tips to measure the sheet resistance of the silicon wafer. A DC voltage regulator (DIO6905CSH3) will be employed to force current through the two outer tips, while a 24-bit ADC (MCP3561RT-E/ST) will measure the voltage across the two inner tips, with expected measurements in the millivolt range and current operation lasting several milliseconds. Additionally, we plan to use an AC voltage regulator (TPS79633QDCQRQ1) to transiently sweep the outer tips to measure capacitances between them, which will help determine the dopants present. To accurately measure the low voltages, we will amplify the signal using a JFET op-amp (OPA140AIDGKR) to ensure it falls within the ADC's specifications. Using these measurements, we can apply formulas with corrections for real-world factors to calculate the sheet resistance and other parameters of the wafer. ## Subsystem 2: User Input To enable users to interact effectively with the measurement system, we will implement an array of buttons that offer various functions such as calibration, measurement setup, and measurement polling. This interface will let users configure the measurement system to ensure that the approximations are suitable for the specific properties of the wafer. The button interface will provide users with the ability to initiate calibration routines to ensure accuracy and reliability, and set up measurements by defining parameters like type, range, and size tailored to the wafer's characteristics. Additionally, users can poll measurements to start, stop, and monitor ongoing measurements, allowing for real-time adjustments and data collection. The interface also allows users to make approximations regarding other wafer properties so the user can quickly find out more information on their wafer. 
This comprehensive button interface will make the measurement system user-friendly and adaptable, ensuring precise and efficient measurements tailored to the specific needs of each wafer. ## Subsystem 3: Display To provide output to users, we will utilize a monochrome 2.4 inch 128x64 OLED display driven over SPI from the MCU. This display will not only present data clearly but also serve as an interface for users to interact with the device. The monochrome display will be instrumental in displaying measurement results, system status, and other relevant information in a straightforward and easy-to-read format. Additionally, it will facilitate user interaction by providing visual feedback during calibration, measurement setup, and polling processes. This ensures that users can efficiently navigate and operate the device, making the overall experience intuitive and user-friendly. # Criterion for Success: - A precise constant current can be run through the wafer for various samples. - The measurement system can identify voltages (10 mV range minimum) across the wafer. - Measurement data and calculations can be viewed on the display. - Button inputs allow us to navigate and set up measurement parameters. - Total part cost per unit must be less than the cheapest readily available four-point probes (≤ 650 USD). |
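For reference, the standard four-point-probe relations that the measurement subsystem would apply are sketched below in Python; the correction factor and the example numbers are placeholders for illustration, not values from the proposal.

```python
import math

def sheet_resistance(v_volts, i_amps, correction=1.0):
    """Sheet resistance (ohms/square) from the standard relation
    R_s = (pi / ln 2) * (V / I), times a geometry correction factor for finite
    sample size and probe placement (1.0 for an ideal infinite thin sheet)."""
    return (math.pi / math.log(2)) * (v_volts / i_amps) * correction

def resistivity(v_volts, i_amps, thickness_cm, correction=1.0):
    """Bulk resistivity (ohm*cm) for a wafer of known thickness."""
    return sheet_resistance(v_volts, i_amps, correction) * thickness_cm

# Example: 2 mV across the inner tips with 1 mA forced current, 500 um thick wafer.
print(f"{sheet_resistance(2e-3, 1e-3):.2f} ohm/sq")   # ~9.06 ohm/sq
print(f"{resistivity(2e-3, 1e-3, 0.05):.3f} ohm*cm")  # ~0.453 ohm*cm
```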
||||||
25 | SpoilSense |
Azim Shad Sarthak Shah Vikram Harish |
Rui Gong | Kejie Fang | design_document1.pdf proposal1.pdf |
|
An automated device designed to monitor the freshness of produce inside a refrigerator by detecting humidity levels and ethylene gas emissions, which are key indicators of spoilage. | ||||||
26 | Wearable Air Quality Monitor |
Xin Yang Ziheng Li Zonghan Yang |
Chentai (Seven) Yuan | Kejie Fang | design_document1.pdf proposal2.pdf proposal1.pdf |
|
# **Wearable Air Quality Monitor** Team Members: • Ziheng Li (zihengl5) • Xin Yang (xiny9) • Zonghan Yang (zonghan2) # **Problem** Air pollution has been a growing global concern. The World Health Organization estimates that 9 out of 10 people breathe air containing high levels of pollutants, leaving billions of people suffering from related health issues. Despite this severe situation, most individuals lack real-time information about the air quality in their current environment. Existing air quality monitors are also often expensive, with prices ranging from $100 to several hundred dollars, which is not affordable for everyone. In addition, most air quality monitors are designed for fixed locations and often provide limited information. # **Solution** We propose a wearable air quality monitor that can track crucial air quality parameters such as temperature, humidity, PM2.5, PM10, and CO2. Our solution aims to address the following key points: 1. Affordability: By optimizing component selection, we aim to keep the price of our device between $50-80, roughly half the price of current market alternatives. 2. Portability: The compact and wearable design ensures users can monitor air quality wherever they go. 3. Comprehensive monitoring: Our device will track multiple air quality parameters to provide an overview of the environment. 4. Real-time data and notifications: The device will connect to smartphones via Bluetooth or Wi-Fi to provide real-time data and send notifications when air quality is poor. 5. User guidance: Based on the detected air quality, the device will suggest actions such as wearing a mask, closing windows, or avoiding outdoor activities. # **Solution Components** **Sensor Subsystem** This subsystem will handle all data measurements, including temperature, humidity, CO2 level, and pollutants like PM2.5 and PM10. Components: - Temperature and Humidity Sensor: SHTC3 - Particulate Matter Sensor: PMS5003 - CO2 Sensor: (Specific part number to be determined) **Processing Subsystem** The core of our processing subsystem will be responsible for collecting sensor data, performing necessary calculations, and evaluating whether air quality thresholds are exceeded. Components: - Microcontroller: (specific model to be determined) **Communication Subsystem** This subsystem will allow the device to communicate with a user's smartphone via Bluetooth or Wi-Fi. It will send data to the connected mobile app and send notifications if air quality gets worse. Components: - Built-in Bluetooth and Wi-Fi capabilities of the microcontroller **User Interface Subsystem** This subsystem will provide immediate visual feedback to users. Components: - OLED Display: (Specific part number to be determined) **Power Subsystem** This subsystem will manage power supply, charging, and discharging. Components: - 5V Rechargeable Lithium Battery: (Specific part number to be determined) - Power Management IC: (Specific part number to be determined) - Voltage regulator **Outer-packaging Subsystem** This subsystem will focus on the physical aspects of the device, including protection and wearability. Components: - 3D-printed outer shell - Clip for attachment to backpack or clothing # **Criterion For Success** 1. Cost Effectiveness: The final product cost should not exceed $80, making it at least 50% cheaper than the lowest-priced comparable product on the market. 2. 
Accuracy: The device should achieve accuracy rates within ±10% of readings from professional-grade air quality monitors for PM2.5, PM10, and CO2 measurements. 3. Battery Life: The device should operate continuously for at least 24 hours on a single charge under normal usage conditions. 4. Response Time: The device should detect significant changes in air quality and send notifications to the connected smartphone within 60 seconds. 5. Durability: The device should continue to function normally after exposure to temperatures from -10 to 120 degrees Fahrenheit. 6. User Interface: Users should be able to read and interpret the OLED display data on first use. 7. Connectivity: The device should maintain a stable Bluetooth or Wi-Fi connection with the smartphone app at a distance of up to 5 meters. 8. Size and Weight: The final product should not exceed the dimensions of 15cm x 15cm x 15cm and should weigh less than 500 grams. 9. Custom PCB Design: Design a custom PCB that integrates all necessary components while meeting the size and power requirements of the device. |
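The threshold check that would drive the notification feature can be sketched in a few lines of Python; the limits below are placeholders for illustration, not calibrated values from the proposal.

```python
# Placeholder alert thresholds (to be calibrated against published air-quality guidance).
THRESHOLDS = {"pm2_5_ug_m3": 35.0, "pm10_ug_m3": 150.0, "co2_ppm": 1000.0}

def check_air_quality(readings, thresholds=THRESHOLDS):
    """Return the parameters whose readings exceed their thresholds,
    i.e. the conditions that should trigger a phone notification."""
    return [name for name, limit in thresholds.items() if readings.get(name, 0.0) > limit]

alerts = check_air_quality({"pm2_5_ug_m3": 48.2, "pm10_ug_m3": 80.0, "co2_ppm": 1250.0})
if alerts:
    print("Notify user: elevated " + ", ".join(alerts))
```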
||||||
28 | Establishing an Intelligent Square Stepping Exercise System for Cognitive-Motor Rehabilitation in Older Adults with Multiple Sclerosis (Pitched Project) |
Hank Zhou Junmin Liu |
Jason Jung | Arne Fliflet | design_document1.pdf design_document2.pdf proposal3.pdf proposal4.pdf proposal1.pdf proposal2.pdf |
|
**Problem** Persons with multiple sclerosis (MS) may experience declines in balance, mobility, strength, and sensory, cognitive, and mental health function. Despite its benefits, exercise participation remains low in persons with MS due to personal, environmental, and societal barriers. Even though there are now various devices that let healthy people monitor and log their exercise, these devices may not be well suited to people with MS. Therefore, there is a need for a system that specifically helps people with MS do more physical exercise safely. **Solution** Our project aims to develop an intelligent home exercise system so that people with MS can exercise safely at home, helping to rehabilitate their cognitive-motor abilities. This system consists of a smart mat which can monitor users' activities when they step on it and will synchronize the data to wireless devices such as smartphones and laptops. In this way, the system can evaluate the user's condition and alert medical facilities if it detects a fall or other anomalies. **Solution Subsystems** **Hardware** - A smart mat with pressure sensor arrays and light emitting diodes (LEDs) for providing feedback on stepping pattern. - Analog-to-digital converters - A microprocessor to process sensors' output - Device to transmit/store data **Software** - Data collection and integration - The software will gather data from the pressure sensors on the smart mat and convert it into useful information regarding the user's balance, weight distribution, and movement patterns. - A wireless transmission protocol (e.g., Bluetooth, Wi-Fi) will synchronize the data to a connected device (smartphone, laptop, etc.). - User interface - A mobile or desktop application will be developed to display the user's exercise progress, balance, and movement patterns. - It will include user-friendly visualizations of data in real time, such as balance graphs, step analysis, and fall risk assessments. - Health monitoring and alerts - The system will analyze the user's movements and flag any anomalies such as instability or unusual gait patterns. - If the software detects a fall or abnormal movement, it will immediately notify the user's medical team via a pre-set communication method (SMS, email, or app notification). - Data storage and analysis - The software will store the data securely for long-term analysis, allowing medical professionals to track the user's progress over time. **Criterion for Success** For the project to be successful, the following criteria must be met: 1. Smart Mat Performance: The smart mat must accurately detect and track user movement (e.g., weight distribution, balance shifts). The pressure sensors should be responsive and capable of detecting subtle changes in foot placement and weight. 2. Real-time Data Synchronization: The system should reliably transmit data to connected devices with minimal latency, ensuring that real-time feedback is provided to users. The data should be synced wirelessly to a smartphone or laptop for easy access. 3. Ease of Use: The system must be easy to set up, with minimal technical expertise required for users to begin exercising safely at home. 4. Robustness: The system must remain intact and function properly under frequent use. |
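One simple balance metric the software could derive from the pressure-sensor array is the center of pressure; a minimal Python sketch (illustrative only, using a made-up 3x3 grid rather than the mat's actual layout) follows.

```python
def center_of_pressure(grid):
    """Center of pressure (row, col) for a 2-D pressure-sensor grid,
    a common balance metric for a sensing mat."""
    total = sum(sum(row) for row in grid)
    if total == 0:
        return None                      # nobody on the mat
    r = sum(i * sum(row) for i, row in enumerate(grid)) / total
    c = sum(j * v for row in grid for j, v in enumerate(row)) / total
    return r, c

# Example: weight biased toward the lower-middle of a 3x3 mat.
print(center_of_pressure([[0, 0, 0],
                          [0, 5, 1],
                          [0, 3, 1]]))
```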
||||||
29 | Automated Tea Maker |
Kyle Humphries Milan Patel Tanmay Mittal |
Chi Zhang | Kejie Fang | design_document1.pdf design_document2.pdf proposal1.pdf proposal2.pdf |
|
# Automatic Tea Maker ## Team Members: - Tanmay Mittal (tmitta3) - Kyle Humphries (kylebh2) - Milan Patel (milanhp2) ## Problem Preparing tea can be a tedious process that requires knowing how to balance the right amount of tea leaves and monitoring steeping times. Measuring specific amounts of leaves requires tea drinkers to pack ingredients into a mesh/metal infuser. They then have to wait, depending on how strong they want the final drink to be, which can be an additional inconvenience. New drinkers can have a hard time knowing what types of tea to mix and how long to steep them. The current process for preparing tea from loose leaves is very hands-on but can be automated. ## Solution Our proposed solution is to introduce a device that is capable of storing a variety of dry leaves and dispensing them depending on the drink the user wishes to make. This blend would be loaded into a mesh chamber that lies above a space where a hot cup/mug of liquid will be placed. A motorized subsystem would then lower the mesh chamber into the cup and steep the blend for some amount of time. Once complete, it would rise back up, allowing the user to take their cup. Finally, it would dispose of the wet tea leaves by dropping them into a waste compartment. Our microcontroller will be responsible for controlling motors and measuring out specific amounts of leaves as well as dispensing them into the mesh chamber. An application component would allow users to select blends, control steep times, and get notified when the drink is prepared. # Solution Components ## Steeping Subsystem The purpose of this subsystem is to steep the tea leaves in the water for a specified amount of time so that we can make tea in various strengths and flavors. The subsystem will steep a measured amount of tea leaves to ensure consistent flavor across multiple uses. We want to have a retractable system that can lower into a cup of water and retract after about 2-3 minutes depending on the strength of the tea desired. It will also automatically load the spoon with the specified amount of tea leaves and discard them after use. In addition, a motor will press a lever to compress the leaves in the spoon while they are in the water. ## Storage and Dispensing Subsystem This subsystem will need to store the dry tea leaves in separate compartments and leverage weight sensors to dispense specific amounts. The backside of this component can be clear to allow users to see when specific ingredients are running low or spoiling. The dispensing requires a valve that will open/close depending on what step of the process we are in. ## Sensors We will require a weight-measuring sensor to measure the weight of the specified type of tea to accurately ensure tea strength across multiple uses. We will also require a timer to keep track of how long the steeping spoon has been in the water to ensure consistent batches. ## Control and Communication The microcontroller will be responsible for controlling multiple aspects of this device. We will use the ESP32-S3-WROOM for its Wi-Fi/Bluetooth capabilities. This will communicate with the sensors that will help measure specific amounts of leaves. The motors that control the valves that dispense the blend of dry leaves will need to be controlled by the MCU as well. Finally, the subsystem responsible for steeping the recipe will have motors involved as well as a timing component. 
The microcontroller will need to be able to process data from the sensors and relay the correct commands to different subsystems. The application component will need to be able to communicate with it as well. ## Mobile Application This subsystem will be responsible for providing users with a friendly interface. Through a lightweight application, users can have access to preset blends that they can select. They can also create their own. Steeping times can be selected as well. Once a user wishes to dispense and steep a recipe, they can use the application to communicate wirelessly with the microcontroller to begin the process. Once the process is complete, we can notify users through the app. Additional features we could add include a live countdown of how long is left until the drink is ready, as well as the ability to save specific blends. ## Power This subsystem will provide power for all the motors and sensors for at least 10 minutes so the tea process can be set up, run, and reset. A 12V battery will supply the power, since all of the motors are relatively small. The battery will be either rechargeable or housed in a compartment so it can be easily switched out. To compensate for the inconsistent output of batteries, we will add a voltage regulator circuit. # Criterion For Success: - Fully Automated Steeping Process: A working system to lower the ingredients into the water and remove them once the tea is at the desired strength. - Measuring: An accurate way to control/measure the ingredients being put into the steeping system. The system should take the user-set ingredient amount and correctly dispense it. - An interactive interface: The consumer can input the amount, time duration, and emergency stop/hold. All information should be clearly displayed and easy to use. Notifications should indicate when the process is done or when any errors have occurred. - Longevity: The overall system should be able to handle high temperatures since it will operate around a cup of boiling water. The device should also tolerate water, both because it works near it and for cleaning purposes. |
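A minimal sketch of the dispense-to-weight loop described above, assuming a load-cell read function and valve callbacks (hypothetical names; the actual firmware would run on the ESP32 and poll at a fixed rate):

```python
def dispense_until_target(read_weight_g, open_valve, close_valve, target_g, tol_g=0.5):
    """Keep the dispensing valve open until the load cell reads within tol_g of
    the requested tea-leaf mass, then close it."""
    open_valve()
    try:
        while read_weight_g() < target_g - tol_g:
            pass                          # firmware would poll here at a fixed rate
    finally:
        close_valve()

# Simulated run: the "scale" gains 1 g per poll.
readings = iter(range(10))
dispense_until_target(lambda: next(readings),
                      lambda: print("valve open"),
                      lambda: print("valve closed"),
                      target_g=5)
```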
||||||
30 | Antweight BattleBot Champion Destroyer |
Aarav Singh Hrishi Kini Neel Acharya |
Chi Zhang | Arne Fliflet | other2.pdf other3.pdf other4.pdf |
|
# Antweight BattleBot Champion Destroyer ### Team Members: - Hrishi Kini (hkini2) - Aarav Singh (aaravs2) - Neel Acharya (iaa6) ## Problem We will be designing and building a PC-controlled battlebot as per the instructions provided by Prof. Gruev. However, several constraints need to be met, which introduce challenges to the design process. These restrictions include: - The battlebot must weigh under 2 lbs. - Only 3D-printed parts made from PET, PETG, ABS, or PLA/PLA+ are allowed for the chassis and weapon. - The robot must be wirelessly controlled via a Bluetooth or WiFi-enabled microcontroller. - It must demonstrate visible mobility and have an indicator light showing when power is on, with an optional secondary light for the wireless connection. - The battery voltage must not exceed 16V, and the system must include a manual disconnect for safety. - If a pneumatic weapon is used, the pressure must remain under 250 psi, and the system must have an easily accessible bleed valve. It would also be heavier than a plastic option due to the need for a metal pressurized tank. - If a spinning weapon is employed, it must come to a complete stop within 60 seconds of power being disconnected. - A custom PCB must be implemented. We were motivated to choose this project as soon as it was pitched by Professor Gruev. All three of us thought it was an incredibly interesting concept that also allowed us to apply our engineering and design skills in a hands-on, competitive environment, while also challenging us to work within real-world constraints such as weight, materials, and safety regulations. ## Solution Adhering to the above restrictions, our proposed solution involves the development of a battlebot using an STM32 microcontroller paired with a WiFi module for wireless control from a laptop. The bot will utilize three motors: two for the drivetrain and one for the weapon, a horizontal spinning blade. ## Solution Components ### Subsystem 1: Chassis Design Choices The chassis will serve as the structural foundation for the battlebot, providing support for the motors, weapon, and electronic components. We will be 3D-printing the chassis to adhere to the weight restrictions while ensuring a sturdy structure. We have chosen to use **PETG** for the chassis due to its superior strength and durability compared to **PLA+**, which was our other shortlisted material. While PETG is slightly heavier, we believe that using a stronger material is the right choice for our 2-wheel drive design, where durability is crucial. If for cost/weight reasons **PLA+** seems like a better option in the future, we will make the transition to it. The chassis design will make sure to cover all electrical components to prevent any damage to them during the competition. To enhance both protection and offensive capabilities, we are incorporating **ramps** on the front and sides of the chassis. The widened base will shield the wheels from direct attacks, minimizing vulnerability to opponents aiming to disable our bot's mobility. The ramps will allow the bot to slide underneath opposing bots, lifting them slightly off the ground. This design will expose more of their undercarriage to our spinning blade, significantly increasing the damage potential. ### Subsystem 2: Mobility and Drivetrain **Purpose** Mobility is a key factor for success in the competition, allowing the bot to outmaneuver opponents and react quickly to control inputs. 
Our bot will feature a **2-wheel drive system**, with anti-friction pads at the front to facilitate smooth, agile turns. The system is designed to ensure that the bot can traverse the arena quickly and accurately. **List of components** - 2 strong brushless motors - 2 wheels will be 3D printed with hollow rims and fitted with rubber treads - 3D printed drivetrain ### Subsystem 3: Wireless Control **Purpose** The main purpose of the WiFi module attached to the microcontroller is to wirelessly communicate with the battlebot via our laptop. We plan to use a controller plugged into a laptop that transmits each joystick forward and backward to one motor each, thus controlling left and right movements when one is forward and the other is backward. **List of components** - WiFi module (more research on the specific chip is required, however, a couple of us have worked with WiFi modules before, so we will be choosing that over a Bluetooth module) - Laptop & Controller ### Subsystem 4: Weaponry **Purpose** To attack, disrupt, and disable other bots we will be competing against. We will use a 3rd motor to power a downward-leaning blade (3D Printed) aimed at the base of any opponent bot in an attempt to flip it over. We will design the blade to have prongs on the ends that can carry weight and do more damage. We will use a powerful motor with strong responsiveness to our controls. ### Subsystem 5: Power Our initial power source will be a **9V D-cell battery**, chosen for its balance between size and output. However, should performance demands require more power, we will upgrade to a **15V LiPo battery**, provided the weight limits allow it. ## Criterion For Success We will consider this project a success if: - We can establish wireless communication with the battlebot. - The battlebot demonstrates precise and responsive movement within the arena. - The spinning blade operates effectively. **Hopefully, we win!!** :) |
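The joystick-per-motor control scheme described above amounts to tank drive; a minimal Python sketch of that mapping (placeholder PWM range, not the team's firmware) is:

```python
def tank_drive(left_stick, right_stick, max_pwm=255):
    """Map two joystick axes in [-1, 1] directly to signed PWM commands
    for the left and right drive motors."""
    clamp = lambda x: max(-1.0, min(1.0, x))
    return int(clamp(left_stick) * max_pwm), int(clamp(right_stick) * max_pwm)

print(tank_drive(1.0, -1.0))   # one stick forward, one back: spin in place (255, -255)
print(tank_drive(0.5, 0.5))    # both forward: drive straight at half speed
```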
||||||
31 | Moving Alarm Clock |
Karthik Bagavathy Teja Nerella |
Rui Gong | Cunjiang Yu | design_document1.pdf proposal2.pdf proposal1.pdf |
|
**Team Members:** Karthik Bagavathy (kb42), Teja Nerella (nerella2) **Problem:** Alarm clocks are essential in modern life, helping us wake up for important early morning commitments. However, due to issues with heavy sleeping, sleep deprivation, and more, many individuals struggle to wake up to traditional alarms. Instead, they repeatedly press the snooze button, trying to get more sleep but missing important commitments in the process. Certain innovative alarm clocks with online tasks to wake up the user might not be effective for everyone, as those tasks can be easily gamed. Because of these issues, another innovative and dynamic alarm clock solution is needed. **Solution:** The solution being proposed is a moving alarm clock that will begin its movement around the room as soon as the alarm has been triggered, requiring the user to physically move in order to catch and disable the alarm. The movement of the clock can be achieved through a motorized base with omnidirectional wheels. The electronics of the alarm clock will involve a custom PCB that will be programmed to manage the alarm sound and movement. The alarm sound will be handled by a small speaker which will produce the noise once the time has been reached. The PCB will be programmed such that once the sound starts playing, the motors are activated. In order to deal with obstacles in a typical bedroom, ultrasonic sensors can be added as well for obstacle detection. An important aspect of this alarm clock is the unpredictability of movement, which can be achieved through the use of accelerometers to alter the clock's motion randomly. In addition to these features, an LED display will be included so that the user can set alarms for the day. In addition, the clock will be powered by rechargeable batteries to eliminate the need for a power outlet - this is necessary for free movement of the alarm clock. The body of the clock should also be designed to ensure no damage occurs to the internal electronics. The easiest solution would be to 3D print the housing, but more durable materials can be explored as well during the course of project development. **Solution Components:** Power Subsystem - The device will be powered by rechargeable batteries to enable untethered movement User Interface System - An LED screen for the user to input time information will be the primary method for the user to interface with the system Mechanical Movement System: A motorized base with omnidirectional wheels and an accelerometer to enable unpredictable movement around the room. Obstacle Detection System: Ultrasonic sensors to detect nearby obstacles and enable smoother movement Alarm Sound System: Speaker controlled by the PCB and microcontroller to produce the alarm sound at a given time Physical Housing System: 3D-printed durable housing that will make sure that the internal electronics are shielded from damage Microcontroller System: An ESP32 microcontroller to manage motor control, alarm activation, sensor input, and user interface interactions. **Criteria for Success:** The main criterion of this project is that the alarm clock should make a sound within 10 seconds of the time set by the user. This is the most important criterion as this is what convinces the user to wake up. The next criterion for success is that the robot should be able to move while the alarm is sounding. The robot should also be able to avoid crashing into walls and other obstacles using ultrasonic sensors. 
Another criterion for success is that the user should be able to set an alarm for any time of day and the robot should accurately keep track of the time. The alarm sound system should also turn off if the user manages to catch the robot. This gives the user an incentive to get up, waking them up in the process. |
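A toy decision rule for the unpredictable-movement and obstacle-avoidance behavior might look like the Python sketch below; the distances and angles are placeholder assumptions, not values from the proposal.

```python
import random

def next_move(distance_cm, min_clear_cm=25.0, rng=random):
    """Back off and turn a random amount when the ultrasonic sensor sees an
    obstacle; otherwise keep driving forward for a random interval."""
    if distance_cm < min_clear_cm:
        return ("turn", rng.uniform(90, 270))    # unpredictable escape angle (deg)
    return ("forward", rng.uniform(0.5, 2.0))    # seconds to drive before re-checking

print(next_move(12.0))   # obstacle close: turn
print(next_move(80.0))   # path clear: keep moving
```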
||||||
32 | Independently Controlled Auto-Watering System for Garden Plants |
Aditya Adusumalli Ary Indarapu Sneh Chandak |
Surya Vasanth | Kejie Fang | design_document1.pdf proposal2.pdf proposal1.pdf |
|
Team Members: - Aryan Indarapu (awi2) - Sneh Chandak (snehc2) - Aditya Adusumalli (adityaa8) # Problem Maintaining an optimal environment for a diverse range of plants can be challenging, especially for gardening enthusiasts who may struggle to monitor and cater to the specific needs of each plant. Different plant species require varying levels of soil moisture, and ensuring that each plant receives the ideal amount of water can be time-consuming and error-prone. Overwatering or underwatering can harm plants, leading to poor growth or even plant death. # Solution Our project aims to address this challenge by implementing a smart, automated watering system tailored to the unique needs of different plants or areas of a garden. The system will consist of a central water supply connected to individual plants or plot sections, each managed by its own valve. These valves will regulate water flow based on real-time soil moisture data, ensuring that water is released only when the soil moisture levels drop below a predefined threshold. Each section will be equipped with multiple soil moisture sensors to provide a more accurate reading of the area's overall moisture level. The sensor data will be averaged to determine the exact water requirements for that section. This information will then be transmitted wirelessly using Bluetooth technology to a central PCB, which will manage the specific valve, ensuring precise and efficient water distribution across that area. This solution minimizes water waste and ensures that each plant receives the appropriate amount of moisture, leading to healthier growth and easier garden maintenance. # Solution Components ## Subsystem #1: Sensor Modules The sensor modules are simple edge devices that monitor soil moisture levels and communicate that data back to the host system. Each module will read the soil moisture data and broadcast it (via the chosen communication protocol) to the irrigation module, while also acting as a "hop" point for other incoming data. Each pot will have 3 modules, each independently broadcasting data. These modules are low-power, i.e., they only need 5 V. Thus, we will power the ESP32 with three AA batteries. Components: ESP32, soil moisture sensor ## Subsystem #2: Irrigation System The system will include a water pump that is connected to a three-way hose splitter. The hose splitters will be attached to flow sensors, creating a closed-loop system. The water will flow to a watering can head, which will evenly water the plant. The plant itself will have a moisture sensor subsystem. Components: Water pump, valves, flow sensors, hose watering can head ## Subsystem #3: Irrigation Module Our host system will be the core of our project, analyzing sensor data and controlling the irrigation system. Analyzing data: The host system will receive data from the sensor modules in the field using the Bluetooth protocol. It will aggregate the data by pot and average the moisture level to determine if the plant needs to be watered. If so, it will determine how much water to send. ### Closed-Loop Irrigation System: Once the central host system determines that a plant or section needs watering, it will open the corresponding valve to allow water to flow. Flow sensors attached to each valve will monitor the amount of water being delivered. This data is sent back to the host system to confirm that the correct amount of water has been supplied. 
After the desired amount of water has been delivered, the system will close the valve, completing the watering cycle for that section. Both the valve and the pump will be controlled using H bridges connected to the ESP32 module. The flow sensor data will be read by the ESP32 over the I2C protocol. This feedback loop between the sensors, valves, and the central host ensures that plants receive the optimal amount of water without waste, creating a highly efficient and automated watering process. Components: ESP32, 2x Half-H Bridge ## Subsystem #4: Power The power subsystem will differ for the host system and the edge sensor modules. The irrigation module will require 12V power for the valves and the pumps, while the flow sensors and soil moisture sensors only require 3.3V power. As such, the soil moisture sensors on the sensor module will be powered by the accompanying ESP32. The flow sensor will be powered by the ESP32 on the irrigation module. The valves and pump will be powered through the H bridge. Batteries for both subsystems will be lithium-ion. ## Subsystem #5: BLE communication protocol The sensor modules will use Bluetooth to communicate. Since they may not always be within range of the irrigation module, the data will hop from one Bluetooth module to the next until it reaches the irrigation module. Once the data arrives, the irrigation module will send a "stop" message to prevent further broadcasting of that specific data. In case BLE doesn't work, we will instead use ESP-NOW or WiFi. However, the goal is to have this working in areas without WiFi, so BLE or ESP-NOW is preferred. ## Subsystem #6: User Interface We will have a simple application on Flutter/React which will display current soil moistures for each pot, along with trends over time. It will also indicate if any pot is currently being watered. # Criterion For Success **Accurate Moisture Detection:** The system must correctly read and transmit soil moisture levels from three different sensors placed in three pots. The soil moisture data must be relayed to the host PCB without errors, and the system should respond to moisture levels falling below a predefined threshold for each plant. **Precise Water Delivery:** When the soil moisture level in a pot drops below the threshold, the corresponding valve must activate and release water. The water released should be measured using a flow sensor to confirm that the amount dispensed matches the required level based on the soil's moisture deficit. **Independent Valve Control:** Each valve must operate independently, ensuring that only the plant requiring water receives it. No unintended watering should occur in adjacent pots or zones when the moisture threshold is met in one area. **Real-Time System Response:** The system should react to changes in soil moisture levels within a specified time frame (e.g., each pot relays information every 30 mins) to ensure timely water distribution. **Consistent Performance:** Over multiple tests, the system must consistently provide the correct amount of water across various trials, with no more than a 15% deviation from the calculated requirement as measured by the flow sensor. |
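The per-pot watering decision (average the three moisture readings, compare to a threshold, convert the deficit to a water volume) can be sketched as follows; the threshold and ml-per-percent scaling are placeholder assumptions, not values from the proposal.

```python
def water_needed_ml(readings_pct, threshold_pct=35.0, ml_per_pct=20.0):
    """Average the three soil-moisture readings for a pot; if the average is below
    the threshold, return how much water to request (0 otherwise)."""
    avg = sum(readings_pct) / len(readings_pct)
    deficit = max(0.0, threshold_pct - avg)
    return deficit * ml_per_pct

print(water_needed_ml([30.0, 28.0, 33.0]))   # below threshold: request ~93 ml
print(water_needed_ml([42.0, 40.0, 45.0]))   # moist enough: 0.0
```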
||||||
33 | AMADEUS - Augmented Modular AI Dialogue and Exchange User System |
Chengyuan Peng Ryan Fu Wesley Pang |
Jason Zhang | Cunjiang Yu | design_document1.pdf proposal1.pdf |
|
# AMADEUS - Augmented Modular AI Dialogue and Exchange User System # Team members: · Ryan Fu (ryfu2) · Qiran Pang (qpang2) · Chengyuan Peng (cpeng14) # Problem For many years, people have dreamed of having natural, everyday conversations with robots to fulfill their emotional and lifestyle needs. However, current interactive AI systems are often bulky, and even the most portable solutions still rely on smartphone interactions. Regarding emotional needs, we don't want to talk to a cold, lifeless screen. Instead, we hope for a more tangible medium, like a child chatting with a SpongeBob toy embedded with AI. Thus, the needs are clear: we require a more compact AI platform that can easily integrate into various devices. On top of that, it should be as affordable as possible to make it widely accessible. # Solution We are designing an AI-based audio interactive interface. The baseline feature of the project is an inexpensive PCB interface that receives audio from the user and sends it over WiFi to a model on a computer; the AI model processes the audio and replies with audio, which is sent back to the board and played. We will use an ESP32 microcontroller with WiFi and audio input/output capability to achieve this. Additional features include indoor and outdoor modes: outdoors, the user speaks while a button is pressed, and the input is denoised. Another additional feature could be integrating the board with headphones or Bluetooth earbuds. Moreover, a text display interface can be embedded on the PCB to display the converted audio as text. Please view our block diagram via the Google link: https://docs.google.com/document/d/1Uv_b5SzeoN7boqyMyB3Kkgl7XGVAnuv50S6DZ1e3PhY/edit # Solution Components # Subsystem 1: AI Web Client Our language model will be hosted on a cloud-based server. The local MCU will transmit audio to the server via a WiFi module. We are collaborating with a local start-up that will provide the AI model and handle the audio training. However, we also have the option to train our own AI model to create additional characters using their interface. # Subsystem 2: ESP32 with WiFi Capability We will use an ESP32 as the processor for signal processing. Before use, it will receive a password from the user's device through Bluetooth to connect to WiFi. It will receive an audio signal from the ADC and send it to the PC for AI Web Client input. After receiving the output audio signal from the PC, it will be sent to the audio codec for audio output. # Subsystem 3: Power System The system can be powered through either a USB connection or a 5V battery. The 5V supply directly powers the I/O devices and the programming module. To provide 3.3V power for the microcontroller and audio processing module, a 5V to 3.3V LDO voltage regulator is used to step down the voltage. # Subsystem 4: Bluetooth Communication A Bluetooth transceiver module will be connected to the ESP32 processor to receive user input for configuring the internet connection. The user will transmit the internet passcode to the Bluetooth transceiver, which will then relay this information to the microcontroller to establish the connection. # Subsystem 5: Audio I/O & Processing The microphone on the board will capture the audio input, which will be processed by an audio codec module. Once the audio output is fetched from the internet into the MCU, it will be transmitted through the audio codec and played through a speaker. 
# Subsystem 6: Text Display An additional feature of our project will be a text display. After the ESP32 module converts the audio input/output into text, an LCD screen attached to the microcontroller will display the text output. # Subsystem 7: Debug Module A serial port will be temporarily integrated into the PCB for debugging the output from the ESP32 processor. Additionally, a programmer will be connected to the MCU for programming purposes. |
||||||
34 | Portable Plotter Robot |
Matthew Paul Sagnik Chakraborty Shinan Calzoni |
Dongming Liu | Cunjiang Yu | design_document1.pdf other1.pdf proposal1.pdf proposal2.pdf |
|
# Portable Plotter Robot Team Members: - Sagnik Chakraborty (sagnik3) - Shinan Calzoni (calzoni2) - Matthew Paul (mjpaul3) # Problem One of the biggest problems with plotter machines is the bulky rails needed to guide the tool head. This makes transportation a hassle and limits their use cases to offices with the space to house them. # Solution To solve this issue, we propose a portable plotting system that uses a small robot to hold a tool head and drive around the writing surface. This eliminates the need for any rails and allows the user to plot on both small and large scales. The system will also have 4 reflective markers to put on the corners of the writing surface and sensors on the device to make sure it does not leave the area. There will be a web app to communicate to the device what it should draw. # Solution Components ## Subsystem 1: Driving and Motion There will be 4 wheels with their respective stepper motors (Nema 17) and motor drivers (DRV8825) to control the plotter's movement. It will also contain an additional servo motor (HS-55) to actuate the tool head up and down onto the writing surface. Most of the subsystems will be connected to an ESP32 microcontroller; this subsystem will use it to control each of the motors. ## Subsystem 2: Boundary Detection and Positioning This subsystem will contain the ultrasonic sensor (HC-SR04) to accurately position and keep the device within the reflective boundary markers. It will communicate with the ESP32 microcontroller to relay sensor information and determine positioning. *Note: We are also looking into using UWB signals for more accuracy per Prof. Fliflet's suggestion, specifically the RYUW122 module. ## Subsystem 3: Communication and Control Since the ESP32 has built-in WiFi capabilities, we will communicate with a web app to relay instructions for plotting and handling different commands. The web app will have a simple interface that allows the user to input measurements for simple shapes. The user will be able to remotely start the plotter from the app. *Note: We are looking into the possibility of having this machine read G-code that could be generated from vectorized files. If we go this route, an additional Raspberry Pi could be used for the image processing. ## Subsystem 4: Power Management This subsystem will use rechargeable lithium-ion batteries and various voltage regulators so that the correct amount of power can be delivered to the motors, sensors, and microcontroller. # Criterion For Success 1. The device can communicate with the web app to plot closed, single-lined shapes. 2. The device footprint can stay within the boundary dictated by the corner markers. 3. The tool head can actuate up and down to draw discontinuous lines. |
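For reference, an HC-SR04-style sensor converts an echo pulse width to distance with the usual round-trip formula; a one-function Python sketch (speed of sound at room temperature assumed) is:

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343   # ~343 m/s at room temperature

def ultrasonic_distance_cm(echo_pulse_us):
    """HC-SR04-style conversion: the echo pulse covers the round trip, so halve it."""
    return echo_pulse_us * SPEED_OF_SOUND_CM_PER_US / 2

print(ultrasonic_distance_cm(1166))   # ~20 cm to the nearest boundary marker
```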
||||||
35 | Schnorr Signature Key Fob |
Michael Gamota Pedro Ocampo Vasav Nair |
Pusong Li | Cunjiang Yu | design_document1.pdf design_document2.pdf proposal1.pdf proposal2.pdf |
|
# Schnorr Identification Protocol Key Fob Team Members: - Michael Gamota (mgamota2) - Vasav Nair (vasavbn2) - Pedro Ocampo (pocamp3) # Problem Current car fobs are susceptible to different types of attacks. Rolling-jam attacks are one such attack, in which an attacker jams and records a valid "unlock" signal for later use. Cars with passive keys/cards can be stolen using relay attacks. Since a car can be the most expensive item someone owns, it is unreasonable to allow cars to be stolen so discreetly by hacking the fob/lock combination. # Solution By leveraging public key cryptography, specifically the Schnorr identification protocol, it is possible to create a key fob which is not susceptible to either attack (rolling jam or relay) and that gives no information about the fob's private key even if the signal is intercepted. # Solution Components # Key Fob ## Subsystem 1 Random number generation - We will use a transistor circuit to generate random numbers. This is required by the Schnorr protocol to ensure security. ## Subsystem 2 Microcontroller - The MCU will run all the computation to calculate the messages. We will likely use an ATtiny MCU so we can use the Arduino IDE for programming. However, some group members have experience with the STM32 family so that is another option. ## Subsystem 3 Power - We plan on using either a 5V battery or a 3.3V battery with a boost converter to power the fob. ## Subsystem 4 Wireless Communication - We plan on using the 315 MHz frequency band, which is currently used by some car fobs. We will need a transmitter and receiver, since the protocol is interactive. # Lock ## Subsystem 1 Random number generation - We will use a transistor circuit to generate random numbers. This is required by the Schnorr protocol to ensure security. ## Subsystem 2 Microcontroller - This MCU will also run all the computation to calculate the messages. We will likely use an ATtiny MCU so we can use the Arduino IDE for programming. However, some group members have experience with the STM32 family so that is another option. This MCU will need to have PWM output to control the lock. ## Subsystem 3 Linear Actuator - We plan on using a linear actuator as a deadbolt lock for demonstration purposes. ## Subsystem 4 Wireless Communication - We plan on using the 315 MHz frequency band, which is currently used by some car fobs. We will need a transmitter and receiver, since the protocol is interactive. ## Subsystem 5 Power - This subsystem will also likely require 5V, but power sourcing is not an issue since this system would be connected to the car battery. During a demo it would be acceptable to have this plugged into a power supply or a barrel jack connector from an AC-DC converter. # Criterion For Success Our first criterion for success is a reasonably sized fob. There is some concern about the power storage and consumption of the fob. The next criterion for success is communication between the fob and the lock. This will be the first milestone in our design. We will need to have a message sent from one MCU that is properly received by the other; we can verify this in the debug terminal. Once we are sure that we can communicate between the fob and the lock, we will implement the Schnorr protocol on the two systems, where the fob will act as the prover and the lock as the verifier. 
If the Schnorr signature implementation is correct, then we will always be able to unlock the lock using the fob whose public key is associated with full privileges. |
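To make the prover/verifier exchange concrete, here is a toy Python walkthrough of one Schnorr identification round. The tiny group parameters are for illustration only; a real fob would use a standardized large prime-order group, and this sketch is not the team's implementation.

```python
import secrets

# Toy parameters: g = 4 has prime order q = 11 in the multiplicative group mod p = 23.
p, q, g = 23, 11, 4

x = secrets.randbelow(q - 1) + 1      # fob's private key
y = pow(g, x, p)                      # public key stored in the car's lock

# 1. Fob (prover) commits to a fresh random nonce.
r = secrets.randbelow(q - 1) + 1
t = pow(g, r, p)

# 2. Lock (verifier) replies with a random challenge.
c = secrets.randbelow(q)

# 3. Fob responds; s reveals nothing about x without knowledge of r.
s = (r + c * x) % q

# 4. Lock checks g^s == t * y^c (mod p); a fresh challenge each time defeats replay.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("unlock")
```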
||||||
36 | Monitoring System for Older Cars |
Agrim Kataria Sachin Bhat Tommy Park |
Angquan Yu | Arne Fliflet | design_document1.pdf design_document2.pdf proposal1.pdf proposal2.pdf |
|
# **Team Members** - Sachin Bhat (sachinb3) - Agrim Kataria (agrimk2) - Tommy Park (thomasp6) **Problem** Car monitoring systems are becoming increasingly common in newer cars today. However, the majority of cars on the road are older models that lack these advanced safety features. According to the Intelligent Transportation Systems Joint Program Office, only 17% of cars on the road currently have monitoring systems, and according to the National Highway Traffic Safety Administration, there are over 800,000 blind spot accidents annually. Additionally, vehicles parked in high-risk or unfamiliar areas are often vulnerable to potential theft or vandalism. Our idea is to build a comprehensive car monitoring system that can be easily installed into any older car, providing drivers with improved safety and security both while driving and when their vehicle is parked. **Solution** We propose to develop a blindspot detection and monitoring system equipped with sensors that alert drivers when another vehicle is in their blindspot. The system will use a combination of LEDs on the side mirrors and audible alerts, such as a beeping noise, to notify drivers of potential hazards. A printed circuit board (PCB) will process real-time sensor data to determine if a vehicle is present in the blindspot. We will use ultrasonic sensors with adjustable sensing distances, which can be customized through a companion mobile application. Additionally, the mobile application will offer driving analytics, such as the number of close calls, the side on which more detections occur, and other driving metrics to help users improve their driving habits. To further enhance vehicle safety, we will integrate an "Away From Car" (AFC) subsystem. This subsystem will utilize the same sensors to detect motion or activity around the vehicle while it is turned off. If any motion is detected, the system will send an alert to the driver's mobile app. This feature provides an extra layer of security for drivers when parked in unfamiliar or high-risk areas, ensuring vehicle safety even when the car is not in use. By combining blindspot detection, driving analytics, and the AFC system, our solution provides a comprehensive monitoring and safety package that can be easily retrofitted to older vehicles, significantly enhancing road safety and vehicle security. # **Solution Components** **Subsystem 1 - Sensors** This project will feature an array of ultrasonic sensors designed to detect objects at various distances around the vehicle. The sensors will transmit real-time data to a microcontroller for processing, allowing for accurate blindspot detection. The sensors will be strategically placed on the side view mirrors for optimal coverage. Additionally, the sensors will be used in multiple modes, including active monitoring while driving and motion detection when the vehicle is stationary (as part of the AFC system). **Subsystem 2 - Mobile Application** We will develop a companion mobile app to enhance user experience and safety. Through this app, users will be able to receive driving metrics such as the number of close calls and which side of the vehicle had more detections. Additionally, the app will deliver notifications when the AFC system is triggered. The AFC system will have an on/off switch within the mobile application so that it is only used in certain situations. 
**Subsystem 3 - AFC “Away From Car” System** This subsystem uses the same sensors employed for blindspot detection to monitor motion or activity around the vehicle while it is parked and turned off. If any movement is detected, the system will send an immediate alert to the driver’s mobile app, providing a sense of security in high-risk or unfamiliar areas. The AFC system is designed to provide real-time protection and is configurable via the mobile app for activation or deactivation based on user preferences. **Subsystem 4 - Alerting System** The alerting system includes two LEDs on each side of the vehicle that will light up based on data received from the sensors to visually alert the driver of potential hazards. In addition, a speaker connected to the microcontroller will output a synchronized audible alert (e.g., beeping sound) when an object is detected in the blindspot. The alerting system will function both while driving (for blindspot warnings) and when the vehicle is parked (as part of the AFC system). The intensity and frequency of the alerts can be configured through the mobile app to suit user preferences. **Subsystem 5 - Power System** We will have dry cell batteries to power the AFC System. The on/off switch will toggle the sensors to ensure the system is long lasting. # **Criteria for Success** **High Detection Accuracy:** The system accurately detects vehicles in blindspots and motion around the car with at least 90% accuracy and minimal false alarms. **Quick Alerts:** Alerts (LED, sound, or mobile app notifications) are activated within 1 second of detecting a vehicle in the blindspot or motion around the car. **Easy to Use:** The system can be installed by any user within 30 minutes, and the mobile app is user-friendly and easy to navigate. **Enhanced Safety Perception:** At least 70% of users feel safer and more confident while driving with the system installed. **Affordable:** The system is affordable for most car owners, ideally priced under $200, with low ongoing maintenance costs. |
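A debounced detection rule, requiring several consecutive in-range readings before alerting, is one simple way to meet the false-alarm goal; the Python sketch below uses a placeholder range and count, not values from the proposal.

```python
def blindspot_detected(readings_cm, max_range_cm=150.0, required_hits=3):
    """Alert only after several consecutive ultrasonic readings fall inside the
    (app-configurable) sensing distance, to limit false alarms."""
    hits = 0
    for d in readings_cm:
        hits = hits + 1 if d < max_range_cm else 0
        if hits >= required_hits:
            return True
    return False

print(blindspot_detected([400, 120, 110, 95]))   # vehicle settling into the blindspot: True
print(blindspot_detected([400, 90, 400, 90]))    # isolated noise spikes ignored: False
```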
||||||
37 | BattleBot RFA |
Deepika Agrawal Ishanvi Lakhani Megha Esturi |
Surya Vasanth | Arne Fliflet | design_document1.pdf design_document2.pdf proposal1.pdf proposal2.pdf |
|
Team Members - Megha Esturi (mesturi2) - Deepika Agarwal (deepika7) - Ishanvi Lakhani (ishanvi2) Problem Statement According to the guidelines PDF, we must ensure that the robot adheres to several key requirements. The robot must weigh no more than 2 lbs and be constructed using 3D-printed thermoplastics. It should also feature a locomotion system and a fighting tool, with wireless control enabled via WiFi. The main goal of the competition is to design a Battlebot capable of disrupting the functionality of opponents' robots using its fighting tool while maintaining its own operational integrity. SOLUTION We plan to create a Battlebot using 3D-printed components, equipped with two omni wheels that will enable the robot to move in any direction. The Battlebot itself will spin like a beyblade, powered by brushless motors for smooth and efficient control. Additionally, we aim to incorporate a flipping mechanism, which will be driven by a combination of a spring and motor. A sliding plate, attached to a motor, will allow the plate to maneuver under the opponent's robot. Once in position, the spring will activate, flipping the opposing robot to disrupt its functionality. The ESC will serve as the central control system for the robot, managing the motor's speed and direction. Additionally, communication with the bot will be established via WiFi, allowing us to remotely control its movements and operations during the competition. SUBSYSTEMS Power Module We will be using LiPo batteries in our battlebot. As of now, we plan to use 16V batteries, but we may consider slightly lighter batteries or batteries of lower voltage, both to keep the combined weight of all components under 2 lbs and to ensure that all our hardware can handle the voltage level. WiFi Controller We will use the ESP-32 to enable communication between the robot and a PC via WiFi. Our plan is to program the ESP-32 to create a wireless connection, allowing the robot to send and receive commands from the PC. The controller will manage the rotation and directional changes of the wheels, allowing the Battlebot to navigate effectively. Additionally, the controller will feature buttons to extend and retract the plate based on user input. Another button will trigger the plate to lift and flip the opposing robot during battle. All of these motions will be triggered by keyboard inputs. Driving System The battlebot will have DC motors connected to 4-6 wheels (still deciding) to control movement. The wheels will be omnidirectional wheels to allow the robot to turn in place and move with ease. However, there is a concern that the bot might be pushed around easily, so if time permits we will install a locking mechanism or mix the types of wheels that we use, such as adding grip wheels. Spring System and Flipping Motion The flipping mechanism will feature a high-torque DC motor to control the extension and retraction of the sliding plate, allowing it to move outward and inward beneath the opponent's robot. A compression spring will be released when triggered to execute a powerful flipping motion. Together, the motor and compression spring will provide precise control over the plate's movement and a strong, reliable flipping action. This will happen when the user presses the button on the PC control. Rotating System The T-Motor MN2212 Brushless Motor is a lightweight and efficient motor and we plan to use it for our rotating motion.
It provides a good balance of torque and RPM, making it ideal for a small robot that needs to be less than 2 lbs. Its durability and precise performance allow for smooth rotational motion, which is crucial for agile and responsive movements in competitive settings. The motor will be connected to the main part of the bot, allowing only the main shell to spin. Criterion for Success We would consider our project a success if we can establish effective communication between the PC and the various motors, particularly for the extending and retracting of the plate and the flipping mechanism. Since the opponent's Battlebot will also weigh 2 lbs, it is crucial that the plate has enough strength to lift this weight during the flipping action. For this project to succeed, achieving this degree of control will be essential. |
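Because all drive, plate, and flip actions are triggered by keyboard input sent over WiFi, it can help to prototype the PC-side command mapping before writing the ESP-32 firmware. The sketch below is a rough example of how keystrokes might be translated into short command strings for the robot; the key bindings, command names, and `send` stub are hypothetical choices, not the team's final protocol.

```python
# Hypothetical PC-side mapping of keystrokes to robot commands.
# Key bindings and command strings are illustrative only; a real
# implementation would send these over a WiFi socket to the ESP-32.

KEY_BINDINGS = {
    "w": "DRIVE_FORWARD",
    "s": "DRIVE_BACKWARD",
    "a": "TURN_LEFT",
    "d": "TURN_RIGHT",
    "e": "PLATE_EXTEND",
    "q": "PLATE_RETRACT",
    "f": "FLIP",          # release the compression spring
    " ": "STOP",
}

def send(command: str) -> None:
    """Stand-in for a WiFi/TCP send to the robot's ESP-32."""
    print(f"sending -> {command}")

def handle_key(key: str) -> None:
    command = KEY_BINDINGS.get(key.lower())
    if command is None:
        return  # ignore unmapped keys
    send(command)

if __name__ == "__main__":
    for key in ["w", "w", "a", "e", "f", " "]:
        handle_key(key)
```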
||||||
38 | Adaptive Response Digital Guitar Pedal |
Arya Nagabhyru Jack Vulich Will Coombs |
Jialiang Zhang | Cunjiang Yu | design_document1.pdf proposal1.pdf |
|
# **Adaptive Response Digital Guitar Pedal** Team members: - jvulich2 - aryacn2 - wcoombs2 # Problem The original method of guitar amplification used vacuum tubes. While tube amplifiers sound great and are very responsive to the dynamics of the guitar signal (i.e. a quiet note has very little distortion and a loud one is quite distorted), they are heavy and expensive. Modern solid state amplifiers are not very responsive to the dynamics of the signal, giving the player less flexibility when playing. # Solution We propose an adaptive response digital guitar pedal. This pedal would respond to the volume of the note played and adjust the amount of effect accordingly. This would be a cheap, effective way to give the guitar player the responsiveness of a tube amplifier without the cost of one. This same principle would be applied to other effects as well, allowing the player to adjust the amount of reverb, chorus, delay, or fuzz just by how lightly they pluck a string. This opens up a world of possibilities for the player, allowing them to adjust their sound on the fly. This effects pedal is mainly for the use of the player, to allow them to change their sound without making any adjustments to their amplifier or effects board. It is not something for the listener to notice. There will also be a compression switch, allowing the user to adjust the effect amount by playing a note at a different volume while keeping the output volume at the same level. # Solution Components ## Analog Subsystem The input guitar signal will need to be adjusted to the proper voltage level before going into the ADC. It will also need to be amplified after the DAC and before it is sent out of the pedal. This will be done via analog amplification circuits. ## Microcontroller Subsystem The microcontroller is what will perform the filtering and processing of the guitar signal. It will also manage the I/O of the pedal, including the LCD display, rotary encoder, and compression switch. The microcontroller being used is the ESP32-S3. ## Power Subsystem The power for this board will come from a 9V wall adapter. We will have a regulator step the voltage down to 3.3V for the microcontroller, DAC, and ADC, and will likely use the 9V for the analog subsystem. ## ADC/DAC Subsystem The analog signal coming in must be converted to digital so that the microcontroller can process it. It also must be converted back to analog after the processing is done. We plan to use a 32-bit hardware-controlled ADC/DAC to remove some strain from the microcontroller. ## I/O Subsystem We will have various components to make up the input and output for the user. We will have an on/off switch, turning the effect on or off. We will have a rotary encoder to select the effect being used. An LCD will display the current effect. A compression switch will give the user the option to have the output compressed to the same level. We will use bluetooth to control different parameters from a phone. Parameters include volume control and responsiveness. # Criterion for Success Responsive: our unit will need to respond to the volume of the incoming signal to adjust the amount of effect being used. Bluetooth: We want to keep the I/O of the physical pedal as simple as possible, so bluetooth must be used to adjust the different parameters, including the amount of responsiveness. Easy to use: our I/O interface must be easy to use so that the player can spend more time playing and not waste time fiddling with the pedal settings. |
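To make the adaptive behavior concrete, the sketch below shows one common way such a pedal could track playing dynamics: an envelope follower with separate attack and release smoothing, whose output scales the drive of a soft-clipping fuzz, with an optional compression stage that renormalizes the output level. The coefficients, the tanh clipper, and the makeup-gain scheme are our own illustrative assumptions, not the team's chosen DSP.

```python
# Illustrative DSP sketch: envelope-controlled fuzz with optional compression.
# Smoothing coefficients, drive range, and the tanh clipper are assumptions.
import math

def envelope_follower(samples, attack=0.2, release=0.005):
    """Track signal amplitude; fast attack, slow release (coefficients assumed)."""
    env, out = 0.0, []
    for x in samples:
        level = abs(x)
        coeff = attack if level > env else release
        env += coeff * (level - env)
        out.append(env)
    return out

def adaptive_fuzz(samples, max_drive=20.0, compress=False):
    """Scale fuzz drive with the envelope; optionally normalize output level."""
    env = envelope_follower(samples)
    out = []
    for x, e in zip(samples, env):
        drive = 1.0 + max_drive * e          # louder playing -> more distortion
        y = math.tanh(drive * x)             # soft clipping
        if compress and e > 1e-4:
            y *= e / (abs(math.tanh(drive * e)) + 1e-9)  # crude makeup gain
        out.append(y)
    return out

if __name__ == "__main__":
    # Quiet then loud 440 Hz bursts at a 44.1 kHz sample rate.
    sr = 44100
    quiet = [0.05 * math.sin(2 * math.pi * 440 * n / sr) for n in range(2000)]
    loud = [0.8 * math.sin(2 * math.pi * 440 * n / sr) for n in range(2000)]
    processed = adaptive_fuzz(quiet + loud)
    print("peak (quiet region):", round(max(abs(v) for v in processed[:2000]), 3))
    print("peak (loud region): ", round(max(abs(v) for v in processed[2000:]), 3))
```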
||||||
39 | RFA: A device for showing arbitrary tokens for trading card games: The Tokenizer (Revised) |
Jackson Peterik Nathan Shin Niketh Lakshmanan |
Angquan Yu | Cunjiang Yu | design_document1.pdf design_document2.pdf proposal1.pdf proposal2.pdf |
|
# The Tokenizer RFA (Revised) ### Team Members: - Jackson Peterik (peterik3) - Niketh Lakshmanan (nikethl2) - Nathan Shin (nsshin2) --- ## Problem Statement In trading card games like *Magic: The Gathering* (MTG), players often need to generate a wide variety of "tokens." These tokens act as temporary stand-ins for creatures or other game elements that aren't part of a player's physical deck. The sheer variety and number of tokens can be difficult to manage during a game. Currently, players represent tokens using improvised items like spare cards, dice, or paper scraps. These methods are inconvenient, messy, and prone to causing confusion—especially when players need to track specific game states like power/toughness, abilities, or counters on each token. Furthermore, as games progress and the number of tokens increases, managing the game board becomes tedious. Our personal experience playing MTG has led us to frequently face these challenges. Not only is managing tokens cumbersome, but it also interrupts the flow of the game. This inspired us to create a hardware-based solution—a digital token display that streamlines gameplay and reduces the physical clutter on the table. --- ## The Need for the Tokenizer Tokens are integral to the gameplay in *Magic: The Gathering* and similar games. Often, they have different power levels, abilities, and counters, which can change during a game. Tracking all of this manually can lead to errors, slow gameplay, and detract from the overall experience. While a mobile app could solve part of this problem by displaying token images, it is not a perfect solution. Using an app would tie up a player's phone, and since games can last up to an hour or more, this may be impractical. Phones are often needed for other purposes, such as checking messages, using timers, or referencing rules. Constantly switching between these functions during gameplay would disrupt the flow of the game. In a game where the board state should be visible at all times, picking up your phone for a rule check would mean opponents would be unable to see what your true board state is. A dedicated hardware solution like the Tokenizer avoids these issues by freeing up the player's phone and providing a specialized, easy-to-use interface tailored for gameplay. Moreover, hardware allows for faster, more intuitive interactions—such as adding or removing tokens or updating their statuses in real-time—without the hassle of navigating through an app during gameplay. --- ## Solution Overview Our solution is a dedicated device with a card-sized screen that dynamically displays the tokens needed during a game. The screen will show individual tokens, groups of identical tokens, or several unique tokens, depending on the game state. Players can manage tokens using simple physical buttons to add or remove copies or update their status (such as changing their power/toughness or adding counters). The device will work in conjunction with a companion mobile app, which will allow users to select token types and upload new images or data to the device. --- ## Solution Components ### 1. The Tokenizer Device The device will consist of three main subsystems: Input/Output (IO), Power Management, and the Microcontroller. Each of these plays a critical role in ensuring the Tokenizer functions smoothly during gameplay. ### 1. Input/Output (IO) The IO subsystem is responsible for user interactions and displaying information.
It includes: - Buttons: Physical buttons that allow players to interact with the device, such as adding or removing tokens, updating their attributes, or scrolling through different token displays. - Switches: A power switch, as well as a switch to enable the editing of all identical tokens simultaneously. - Display Screen: A 4.2-inch E-Ink or LCD screen that shows the token images and associated statuses (such as counters or abilities). E-Ink is considered for its low power consumption, making it ideal for prolonged use during gameplay. LCD would allow for a full-color image, but would increase power consumption. ### 2. Power Management This subsystem ensures the device remains powered efficiently during long gaming sessions: - Battery: A flat LiPo battery will provide sufficient power for several hours of gameplay. - Battery Management System (BMS): This component manages charging and power distribution, ensuring the battery remains healthy and efficiently charges via a USB-C connection. This may end up being two circuits, a battery protection circuit and a charging circuit. - Power Regulation: A buck-boost converter and voltage regulator will ensure stable voltage to all components, even as the battery depletes, preventing any disruptions during use. ### 3. Microcontroller The microcontroller is the "brain" of the device, handling data processing and communication between subsystems: - Microprocessor: This component manages the input from buttons, displays the correct token images on the screen, and communicates with the mobile app for data transfers. It will have built-in USB communication capabilities to enable data transfer between the device and a phone. - Memory: Extra RAM or flash memory will be used to store multiple token images and any real-time updates, such as token state changes during gameplay. - USB-C Communication: The microcontroller will also handle data transmission over the USB-C port, allowing users to upload token images and data from the app seamlessly. ### 4. Companion Mobile App The mobile app will interface with online card databases to retrieve high-quality card images and corresponding game data. Users will be able to select tokens from these databases and upload them to the device via USB-C. - UI: The UI will allow users to select images from their phone or search cards online, and add them to the local device to be sent over. - USB Serial Communication: The app will send all card data from the phone to the device over a Serial USB connection. - API Access: The app will connect with online public databases to get official art for tokens if wanted, such as Scryfall for Magic. --- ## Criteria for Success To ensure the Tokenizer is both functional and practical for gameplay, we propose the following success criteria: 1. Clear and Crisp Display of Tokens: The screen should clearly display individual tokens, groups of tokens, and updates to token attributes (e.g., power, toughness, counters). - A token's art should be differentiable from a distance of 1m. 2. Real-Time Interaction: Players should be able to add or remove tokens and update token statuses in real-time using physical buttons without needing to pause the game or access the app. - Changes to card details should be visible within 1 second of buttons being pressed. 3. Adequate Data Storage: The device should be able to handle many token types, with even more unique variations per token, and a much larger number of identical tokens.
- The device should be able to store the data for 10 token types, the attributes for 16 unique tokens per token type, and up to 255 identical copies per token. 4. Seamless Data Transfer: The mobile app should efficiently transfer token images and game data to the device via USB-C, with minimal setup time. - Transferring 10 unique cards along with their attributes should take no longer than 2 minutes. 5. Battery Life: The device should be able to last for several games on a single charge. - The device should last at least 4 hours of continuous usage on a full charge. 6. User-Friendly App: The companion app should allow easy searching, selection, and transfer of token images from online databases. - Somebody who has never used the app before should be able to select a card. 7. Tactile User Interface: Physical buttons on the device should offer intuitive control, allowing players to quickly and effortlessly modify tokens during gameplay. - Buttons should be debounced, having no accidental double presses, and should have tactile feedback when actuated. By meeting these criteria, the Tokenizer will simplify token management in trading card games, enhance gameplay experience, and reduce the mess and confusion associated with manual token representation. |
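One way to sanity-check the storage criteria above before designing the firmware's memory layout is to model the token data directly. The sketch below is a hypothetical data model that enforces the stated limits (10 token types, 16 unique variants per type, up to 255 identical copies per variant); the class and field names are our own and not taken from the design.

```python
# Hypothetical token data model mirroring the stated storage limits.
# Names and structure are illustrative, not the actual firmware layout.
from dataclasses import dataclass, field

MAX_TOKEN_TYPES = 10
MAX_VARIANTS_PER_TYPE = 16
MAX_COPIES = 255  # fits in one byte per variant

@dataclass
class TokenVariant:
    power: int
    toughness: int
    counters: int = 0
    copies: int = 1

    def add_copies(self, n: int = 1) -> None:
        self.copies = min(MAX_COPIES, self.copies + n)

    def remove_copies(self, n: int = 1) -> None:
        self.copies = max(0, self.copies - n)

@dataclass
class TokenType:
    name: str
    image_id: str                       # reference to art stored in flash
    variants: list = field(default_factory=list)

    def add_variant(self, variant: TokenVariant) -> bool:
        if len(self.variants) >= MAX_VARIANTS_PER_TYPE:
            return False
        self.variants.append(variant)
        return True

class TokenStore:
    def __init__(self):
        self.types = []

    def add_type(self, token_type: TokenType) -> bool:
        if len(self.types) >= MAX_TOKEN_TYPES:
            return False
        self.types.append(token_type)
        return True

if __name__ == "__main__":
    store = TokenStore()
    soldier = TokenType("Soldier", "img_soldier")
    soldier.add_variant(TokenVariant(power=1, toughness=1, copies=4))
    store.add_type(soldier)
    soldier.variants[0].add_copies(2)
    print(soldier)
```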
||||||
40 | Item Retrieval Robotic Assistant |
Haotian Wang Peng Chen Ziyi Han |
Zutai Chen | Cunjiang Yu | design_document1.pdf design_document2.pdf proposal1.pdf proposal2.pdf |
|
**Item Retrieval Robotic Assistant** Team Members: - Peng Chen (pengc5) - Haotian Wang (hw46) - Ziyi Han (ziyihan2) ## Problem Statement: When someone with a leg injury or mobility issues needs to retrieve items from another room at home, it can be quite inconvenient. Without assistance from others, they often need to rely on crutches or wheelchairs to reach the location of the item, pick it up, and then return. While this may seem like a simple task, for those with limited mobility it can be very exhausting and inconvenient. Therefore, having a device that can help these individuals retrieve and deliver the items they need would make their lives much easier. ## Solution: Our project is a remote-controlled car equipped with a robotic arm, designed to enable people to retrieve objects without the need to move. Users can operate the movement of the car and the robotic arm using joysticks on the controller. The remote controller features a screen that displays the camera feed from the robotic arm, allowing users to see the car's surroundings and select items they want to pick up. After the robotic arm picks up an item, it will flip backwards and put the item into the box on the car. In addition, for items stored at heights, users can utilize the telescopic function of the robotic arm to reach the item. Moreover, by rotating the gripper of the robotic arm, users can open cabinets and drawers to retrieve items inside. ## Materials: Custom-designed PCB, Microcontroller, Robotic arm, Grippers, Motor Drivers, Cameras, Joysticks, LCD Screen, Batteries, Car ## Subsystem 1 - Camera The camera will allow the user to see the conditions around the robotic arm car, and the video it collects will be transmitted to the monitor on the controller. The camera module will be a separate part from the control PCB since it is not used to control the car or robotic arm. ## Subsystem 2 - Power The power for our project will be provided by the battery and separated into two branches: a mechanical power supply and an electronic power supply, each drawing a different power level from the battery. The mechanical power supply covers the electric motors of the car and robotic arm, and the electronic power supply covers the PCB and other low-power electronics. ## Subsystem 3 - Remote Controller The remote controller is used to remotely control the motion of the car and the rotation of the robotic arm. It will have one PCB containing the signal generator, which sends different signals to the robotic arm car to trigger different actions. ## Subsystem 4 - Remote Control Car The remote control car can be driven forward and backward and turned left and right by the remote controller. The car contains electric motors, a PCB, and the chassis itself, including the wheels. When the signal receiver on the PCB receives a command from the remote controller, the PCB sends signals that control which wheels turn, in which direction, and at what speed. ## Subsystem 5 - Remote Control Robotic Arm The remote control robotic arm is installed on the remote control car. It will have two joints and one robotic hand, which make it sufficiently flexible; the motion of the two joints and the hand is driven by electric motors.
The robotic arm will also contain a PCB with a signal receiver; commands received from the remote controller will drive the arm through its different actions. ## Criterion For Success - Functionality and Accuracy: the robotic arm must be able to successfully pick up and place items weighing up to 2 kilograms into the box on the car in 95% of trials - Telescopic Reach: the telescopic function of the robotic arm should be able to extend to a minimum of 1.2m and successfully retrieve items placed at this height in 90% of trials - Gripper Effectiveness: The gripper must be capable of rotating and successfully opening cabinets or drawers in 90% of trials. |
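As a rough illustration of how the controller's joystick readings (Subsystem 3) could become wheel commands on the car (Subsystem 4), the sketch below applies a standard arcade-drive mix: one axis for forward/backward, one for turning, clipped to the motor command range. The axis convention and output range are assumptions for illustration, not the team's protocol.

```python
# Illustrative arcade-drive mixing: joystick axes -> left/right wheel commands.
# Axis convention and output range (-1.0 .. 1.0) are assumptions.

def clamp(value, lo=-1.0, hi=1.0):
    return max(lo, min(hi, value))

def arcade_drive(throttle: float, turn: float):
    """throttle: forward/backward axis, turn: left/right axis, both in [-1, 1]."""
    left = clamp(throttle + turn)
    right = clamp(throttle - turn)
    return left, right

if __name__ == "__main__":
    for throttle, turn in [(1.0, 0.0), (0.5, 0.5), (0.0, -1.0)]:
        left, right = arcade_drive(throttle, turn)
        print(f"throttle={throttle:+.1f} turn={turn:+.1f} -> "
              f"left={left:+.2f} right={right:+.2f}")
```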
||||||
41 | Ambient Lighting System |
Anusha Adira Chinmayee Kelkar Manushri Dilipkumar |
Chentai (Seven) Yuan | Kejie Fang | design_document1.pdf proposal1.pdf proposal2.pdf |
|
# Ambient Lighting System Team Members: - Anusha Adira (adira2) - Manushri Dilipkumar (md38) - Chinmayee Kelkar (ckelkar2) # Problem In many environments, the ambiance is heavily influenced by lighting, which often requires manual adjustment to match the mood of the space. This can be inconvenient and limits personalization. What if we had a system that could track how you are feeling and adjust the lighting system accordingly? We propose an individual lighting experience that acts as a dynamic lighting system, reacting to sound and heart rate to provide a personalized, immersive environment. This system would eliminate the need for manual intervention, offering a more cohesive ambiance that changes based on both noise and the user's emotional state. For example, if a user is watching an action movie or playing music, the system will synchronize lighting with the intensity of the scene or sound. In addition, the system adjusts the brightness based on the user's heart rate, creating a unique experience tailored to their mood and activity. # Solution Our project proposes the development of an intelligent lighting system that connects LED strips, which can be placed behind a TV, painting, or near a speaker. The system automatically synchronizes with the background noise of the user's activity, while also adjusting intensity based on the user's heart rate. This enhances the user experience by providing adaptive lighting that is highly personal and responsive. At a high level, we have an audio system that collects background audio and sends signals to change the color of the LED strip. Additionally, a heart monitor system connects to the circuit via Bluetooth and sends signals to adjust the intensity of the LED strip—brighter for higher heart rates and dimmer for lower heart rates. # Solution Components ### Subsystems: 1. **Audio Processing (Noise Levels)** 2. **Heart Rate Monitoring** 3. **LED Control System** ## Subsystem 1 - Audio Processing This subsystem will take the audio input, process it to detect noise levels, and feed it into the MCU (Microcontroller Unit) for LED color adjustment. We will use a MEMS microphone (e.g., **INMP441**), which provides high-quality audio input and interfaces directly with the MCU. The audio data will be processed using Fast Fourier Transform (FFT) or amplitude-based algorithms to detect noise levels and changes in frequency. **Key Components:** - **MEMS Microphone (INMP441)**: Captures audio input - **MCU (e.g., ESP32)**: Processes audio data - **Audio Detection Circuit**: Amplifies and filters audio signals ## Subsystem 2 - Heart Rate Monitor This subsystem will include a heart rate monitor and a Bluetooth module. The heart rate monitor will collect heart rate information, and the Bluetooth module will transmit this data to the MCU. The MCU will use this data to adjust the intensity of the LED strip. For this, we will use the **MAX30102** pulse oximeter and heart rate sensor, which is capable of providing accurate heart rate measurements. **Key Components:** - **MAX30102 Pulse Oximeter**: Measures heart rate - **Bluetooth Module (HC-05/HC-06)**: Transmits data to the MCU - **MCU (e.g., ESP32)**: Receives heart rate data and adjusts LED intensity ## Subsystem 3 - LED Control System The LEDs will be controlled via an LED driver connected to the MCU. The LED colors and brightness will be dynamically adjusted based on the processed audio input and heart rate data.
We will use individually addressable LED strips (e.g., **WS2812B**) to allow for precise control over color and intensity. To control the LEDs, we will use **Pulse Width Modulation (PWM)** to vary the brightness based on heart rate, while color will change based on the audio analysis. **Key Components:** - **WS2812B LED Strips**: Individually addressable LEDs - **MCU (e.g., ESP32)**: Controls color and brightness - **LED Driver**: Supplies power and control signals to the LED strip ## Subsystem 4 - Power System The power requirements for the LEDs and sensors will be handled by a regulated 5V power supply capable of delivering enough current for the LED strips. The MCU and sensors will run off the same power supply with proper voltage regulation to ensure safe operation. **Key Components:** - **5V Power Supply**: Powers the LED strips and MCU - **Voltage Regulator**: Ensures stable power for sensors and MCU # Subsystem 5 - Software/Control System The MCU will run the core software that processes both the audio input and heart rate data. The audio data will be processed using FFT algorithms or simple amplitude analysis to detect noise levels and trigger color changes. Meanwhile, heart rate data will be received via Bluetooth and used to adjust brightness via PWM control. The system will also prioritize how to combine the two data inputs (audio and heart rate). For instance, audio data will mainly control color, while heart rate controls intensity. Both inputs will be processed in real-time, ensuring smooth transitions and a responsive system. # Criterion For Success Our project will be effective if we have an LED strip that reacts to sound, changing color depending on the change in noise. Our project needs to utilize a heart monitor and connect this with our system via Bluetooth and accurately identify when a user’s heart rate is beating fast, and if so, increase the intensity of the light accordingly. Overall, we need to make sure that data transmission is accurate and the LED strip changes intensity based on heart rate and changes colors based on background noise. |
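The control-system behavior described above (audio level selects color, heart rate selects brightness) can be sketched as a pure mapping function before moving it onto the ESP32. The example below is a hedged illustration: the hue mapping, heart-rate range, and thresholds are arbitrary choices, and driving actual WS2812B LEDs would require a hardware library not shown here.

```python
# Illustrative mapping: audio loudness -> hue, heart rate -> brightness.
# Ranges and color scheme are assumptions only; no LED hardware is driven here.
import colorsys

REST_BPM, ACTIVE_BPM = 50, 150      # assumed heart-rate range for scaling

def brightness_from_heart_rate(bpm: float) -> float:
    """Map heart rate to a 0.1-1.0 brightness factor (clamped)."""
    t = (bpm - REST_BPM) / (ACTIVE_BPM - REST_BPM)
    return 0.1 + 0.9 * max(0.0, min(1.0, t))

def color_from_loudness(loudness: float, brightness: float):
    """Map normalized loudness (0-1) to an RGB tuple: calm blue -> intense red."""
    hue = 0.66 * (1.0 - max(0.0, min(1.0, loudness)))  # 0.66 = blue, 0.0 = red
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, brightness)
    return int(r * 255), int(g * 255), int(b * 255)

if __name__ == "__main__":
    for loudness, bpm in [(0.1, 60), (0.5, 80), (0.9, 130)]:
        rgb = color_from_loudness(loudness, brightness_from_heart_rate(bpm))
        print(f"loudness={loudness:.1f} bpm={bpm:>3} -> RGB {rgb}")
```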
||||||
42 | Household Water Usage Monitoring System |
Advait Renduchintala Daniel Baker Jack Walberer |
Pusong Li | Kejie Fang | design_document1.pdf design_document2.pdf proposal1.pdf proposal2.pdf |
|
Team Members: - Daniel Baker (drbaker5) - Jack Walberer (johnaw4) - Advait Renduchintala (advaitr3) # Problem In our apartment, we pay an additional water fee if we go over a certain threshold. When we do, we have no specific information about where we can reduce our water usage. For example, we would like to know if one specific shower is using more water than other showers so that we can tell that roommate to reduce their shower duration. In addition, this product can encourage a reduction in water usage in general, which is good for the environment. # Solution We plan to make IoT Water Usage Monitors that individually record the usage of a single water source. These monitors will attach to the water source (shower pipe, faucet, etc.) and measure the water usage over the selected period of time. The client will have a dashboard listing the water usage monitors in their household, where they can name each device for easier identification. This dashboard will also show each device's water usage so they can quickly identify the high water usage sources. Each device will have an onboard information display with the client's name for the device and its respective usage. # Solution Components ## Subsystem 1 - Water Source Attachment Subsystem (Hardware) Each device will attach to the water source via a clamping mechanism. The clamping mechanism allows for easy attachment to various pipe sizes. ## Subsystem 2 - Water Flow Rate Measurement Subsystem (Hardware & Software) Each device will have a mounting brace and two ultrasonic sensors integrated with the clamping mechanism to measure the flow rate through the pipe. When we design the mounting braces for our bathroom faucets, we will measure the diameter of the outer pipe. This will be used to get the cross-sectional area of the inner pipe, while the ultrasonic sensors will measure the difference in time between sending and receiving signals to each other. Using this difference in time, we can calculate the velocity of the water in the pipe: v = [c^2 * (T2 - T1)] / [2 * L * cos(theta)], where v is the flow velocity of the fluid, c is the speed of sound in the fluid, L is the length of the ultrasonic path between the transducers, theta is the angle between the ultrasonic path and the direction of flow, and T2 - T1 is the transit time difference between the ultrasonic sensors/transducers. The transit times are related by Ttotal = Twalls + Tfluid, where Ttotal is the overall time for the ultrasonic pulse to travel from the transmitting transducer to the receiving transducer, Twalls = 2 * (Dwall / cwall) is the time taken for the pulse to traverse both pipe walls, and Tfluid = Ttotal - Twalls is the time taken for the pulse to travel across the fluid inside the pipe. The inner diameter then follows as Dinner = cfluid * Tfluid. Dwall is the pipe wall thickness, which will be measured manually before we mount the monitoring system; Dinner is the unknown we solve for, and it gives us the inner cross-sectional area. Combining this velocity and cross-sectional area, we can find the flow rate: Q = v * A. All these computations will be done on the onboard ESP32 microcontroller with WiFi.
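The transit-time relations above translate directly into a few lines of arithmetic, which the ESP32 would run per measurement. The sketch below is a worked example using the equations as written in the proposal; the numeric values (speed of sound, path length, angle, and the measured times) are illustrative placeholders, not calibrated figures.

```python
# Worked example of the transit-time flow equations from Subsystem 2.
# All numbers are illustrative placeholders, not calibrated values.
import math

def flow_rate(t_up, t_down, t_total, d_wall, c_wall, c_fluid, path_len, theta_deg):
    """Return (velocity m/s, inner diameter m, flow rate m^3/s)."""
    theta = math.radians(theta_deg)
    # v = [c^2 * (T2 - T1)] / [2 * L * cos(theta)]
    velocity = (c_fluid ** 2) * (t_down - t_up) / (2 * path_len * math.cos(theta))
    # Ttotal = Twalls + Tfluid, with Twalls = 2 * (Dwall / cwall)
    t_walls = 2 * (d_wall / c_wall)
    t_fluid = t_total - t_walls
    d_inner = c_fluid * t_fluid            # Dinner = cfluid * Tfluid, as proposed
    area = math.pi * (d_inner / 2) ** 2
    return velocity, d_inner, velocity * area

if __name__ == "__main__":
    v, d, q = flow_rate(
        t_up=33.76e-6, t_down=33.80e-6,    # upstream/downstream transit times (s)
        t_total=36.0e-6,                   # total pulse travel time (s)
        d_wall=2e-3, c_wall=2300.0,        # wall thickness (m), speed in PVC (m/s)
        c_fluid=1480.0, path_len=0.05,     # speed in water (m/s), path length (m)
        theta_deg=45.0,
    )
    print(f"velocity ~ {v:.3f} m/s, inner diameter ~ {d*1000:.1f} mm, "
          f"flow ~ {q*60000:.2f} L/min")
```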
## Subsystem 3 - Water Usage Dashboard (Software & Hardware) The water usage dashboard will be hosted on a website and will include a table/chart with descriptive information showing water usage statistics. This website will be constructed using React, Node, JavaScript, and HTML/CSS. We will use GitHub Pages to host this website so it can be easily accessed by the microcontroller and the rest of Subsystem #2. ## Subsystem 4 - Onboard Information Display (Hardware) We plan on using a relatively small LCD display because we want to show the current water usage and the unique ID we assign to the specific monitor. This LCD display will be connected to the microcontroller so that the data can be fetched from the webpage where the data will be stored and updated accordingly. # Criterion For Success - Devices accurately measure the amount of water from the sink each time it is turned on; when the sink is shut off, the device resets. To test this, we will fill a ½ gallon of water using the sink, time it, and follow a consistent procedure so that we get accurate and unbiased flow results. We will then compare what our monitoring system outputs to the actual amount to ensure that the system works as intended. - Devices connect to each other and successfully send measurements to the website. To ensure this, we will first set up the webpage with code that can be tested independently of the system. We will then also test whether the microcontroller is sending accurate data to the webpage. This will ensure that the webpage works and also that the microcontroller is sending accurate data. |
||||||
43 | Water-Skimming Robot for Pollution Cleanup |
Dylan Bautista Malay Rungta Zachary Krauter |
Dongming Liu | Arne Fliflet | design_document1.pdf design_document2.pdf proposal1.pdf proposal2.pdf |
|
Title: Water-Skimming Robot for Pollution Cleanup Team Members: Dylan Bautista (dylanjb5) Zachary Krauter (zpk2) Malay Rungta (mrungta2) Problem: Water pollution from man-made debris, poor waste management, and invasive species threatens aquatic ecosystems and public health. Traditional cleanup methods are often inefficient and labor-intensive, highlighting the need for an automated solution. With increasing environmental concerns, there is a pressing need for innovative solutions to protect marine ecosystems. Solution: We propose a robotic system that autonomously skims water surfaces to detect and collect small floating debris within a predefined area. The lightweight robot will float and roam a body of water to collect material in a skimming net for disposal or analysis. It will use GPS and sensors for efficient coverage and steering, allowing it to return to a set of coordinates for emptying. Additionally, the system will include water quality sensors, such as a turbidity sensor, to monitor pollution levels. The turbidity sensor will be connected to LED lights to provide real-time feedback on water clarity: a green light for normal conditions and a red light for high pollution levels. Our system can be tested in a nearby lake or pool on a small scale to evaluate both its collection capabilities and its ability to provide water quality data. Solution Components: Subsystem 1: Motor Control (hardware) The motor hardware consists of dual brushless DC motors with rotor attachments for water. We have selected the LICHIFIT RC Jet Boat Underwater Motor Thruster 7.4V 16800RPM CW, which should have sufficient torque for our slow-moving purpose. This will be attached to our power system and regulated by our microcontroller through PCB connections. Subsystem 2: Autonomous Guidance (software) The actual steering will be done using a rudder which is moved in place by a servo motor, such as the RC Boat Model Servo Steering Gear. This will also be attached to the power system and microcontroller. A control algorithm will be implemented on the Arduino Uno Rev 3 controller board, much like how an autonomous vacuum cleaner operates. The robot will roam the expanse of its body of water, adjusting its heading to avoid the GPS-defined boundaries of the body of water. This will require a NEO-6M GPS receiver module to determine when the front of the robot is nearing these edges. Additionally, after a set period of time, the robot will return to a specified set of coordinates using its GPS and IMU information in order to dispose of the contents of the net. Subsystem 3: Power Systems The 7.4 V Zeee battery should be sufficient to run all the sensors, motors, and steering servo. The components will be housed in a waterproof case to protect the electronics from any water damage. Subsystem 4: Chassis and Storage The main chassis will be made mostly of 3D-printed parts and lightweight materials like PVC pipes and styrofoam. We will use a standard plastic debris net whose entrance is mounted at the end opening of the floating device, with the rest of the net trailing behind. Subsystem 5: Onboard Sensors At minimum, our robotic system will have a turbidity sensor to monitor particle content in the water for additional environmental data. This will be connected to our power system and the accompanying microcontroller. The robot will use LED lights to provide visual feedback based on the sensor readings.
Green will indicate normal water conditions, and red will indicate water pollution. The sensors will change the lights if turbidity rises above a predefined level. Criterion for success: For our system to be deemed effective, we will test our robot in a small-scale body of water such as a university pool or a nearby lake. We will set a general outline of boundaries that the boat should not cross, and place small floatable and retrievable pieces of debris within the water. We can also add contaminants like dirt to test the turbidity sensor. Our product will be deemed a success if the robot eventually picks up these pollutants and makes it to the disposal zone with the object still in the net, and if it can relay the water turbidity through LEDs. |
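As a rough illustration of the boundary-keeping behavior in Subsystem 2, the sketch below shows a standard ray-casting point-in-polygon test that the control loop could run against the GPS-defined boundary before choosing a heading; the coordinates and function names are illustrative assumptions rather than the team's algorithm.

```python
# Illustrative geofence check: is the robot's GPS fix inside the allowed
# boundary polygon? Standard ray-casting test; coordinates are placeholders.

def inside_boundary(lat, lon, polygon):
    """Return True if (lat, lon) lies inside the polygon of (lat, lon) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        crosses = (lon1 > lon) != (lon2 > lon)
        if crosses:
            lat_at_lon = lat1 + (lon - lon1) * (lat2 - lat1) / (lon2 - lon1)
            if lat < lat_at_lon:
                inside = not inside
    return inside

if __name__ == "__main__":
    # Hypothetical rectangular boundary around a test pond.
    boundary = [(40.1000, -88.2300), (40.1000, -88.2290),
                (40.1010, -88.2290), (40.1010, -88.2300)]
    print(inside_boundary(40.1005, -88.2295, boundary))  # expected: True
    print(inside_boundary(40.1020, -88.2295, boundary))  # expected: False
```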
||||||
44 | Self Heating Bed |
Amaan Rehman Shah Hari Gopal Siddharth Kaza |
Jialiang Zhang | Cunjiang Yu | design_document1.pdf proposal2.pdf proposal1.pdf |
|
# Self Heating Bed Team Members: Siddharth Kaza (kaza3) Amaan Rehman Shah (arshah6) Hari Gopal (hrgopal2) # Problem Many people prefer a fan or heater next to their bed so as to get a restful night's sleep. Certain solutions such as the BedJet or EightSleep have been produced, but are financially out of scope for the majority of people. Additionally, standing ventilation systems can often be loud or not provide temperature control for the entire bed, leaving a non-uniform warmth or coolness which may become uncomfortable over time. # Solution A heating mattress is our answer for the many who feel uncomfortable with frigid temperatures in the middle of winter. The system would be an attachment to one's bed frame (through clamps), with hot air circulating through bed sheets to simulate a warmer environment. The project can be split into four parts: heating, circulation, safety, and power. Each will be expanded on below. # Solution Components ## Subsystem 1 We intend on implementing heating using independent and smaller heating coils, due to their cost effectiveness compared to the circulatory system in most apartments and houses. This coil is usually a resistor in most heating systems, coupled into an electric system where more power sent through the resistor results in more heat being dissipated. An infrared heater is potentially another option, but considering the space is a bit larger than what infrared is meant to hit, coils seem like the better choice. McMaster sells heat coils for around 30-40 dollars, at this link: https://www.mcmaster.com/products/heating-coils/. To measure the temperature, we will use a thermometer at the output. At the moment, we believe it is too complex/expensive to implement a cooling system for the bed; however, we'd like to discuss the idea further with a TA to understand the components needed and finalize it in our proposal. In the current implementation, we would be venting room temperature air underneath the covers, which can still serve to reduce the temperature similar to a tower fan. ## Subsystem 2 Circulation is an issue even in conventional air conditioning systems, which makes its implementation all the more pertinent in our project. Through a fan or air blower, we can circulate air under the blankets and bed sheets to increase the temperature of the bed without having the problems of Eight Sleep (leakage issues, temperature mismatches, etc.). Additionally, we intend on giving the user control of this function through a motor control system and receiver implemented on our PCB. Easy access and variability through an app or remote of some sort will most certainly satisfy user expectations and leave a good experience. This speed controller from Amazon is an example of what will be used to modulate the fan power. https://www.amazon.com/Controller-Adjustable-Portable-Interface-Accessories/dp/B0D2BJV1KY ## Subsystem 3 Safety and power are the last two issues, and largely hinge on limits that we need to implement on the heating system. The coils that we buy will likely have a wattage rating that we can abide by, and we will set hard limits using fuses within the system and on the PCB. Furthermore, checks will be built into the power system through multiple voltage and current measurements, feeding back to the main controller on the PCB and allowing us to monitor the system at all times. A potential option for the feedback system is PID based, as it provides the most flexibility and has been tested numerous times in other projects.
The feedback system will be core to how we control our fan and heating (a rough control sketch is given below), and will require fine-tuning at the end of our project to ensure that we stay within safe operating temperatures. # Criterion For Success Our project should: - Be able to modulate the temperature of its surroundings (defined as the temperature within a square box around the bed) to within 3 degrees Fahrenheit of what the user inputs - Have a quiet air ventilation system, measured around 50-60 decibels (when sleeping, noise around one should not exceed 50) - Not be power hungry, and be able to subsist off the wattage of a normal fan or heater (1500 W) |
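Since the proposal leans toward a PID feedback loop for the heater and fan, the sketch below shows a minimal discrete PID controller with output clamping, exercised against a toy first-order thermal model. The gains, limits, and plant model are placeholder assumptions for illustration, not tuned values from the design.

```python
# Minimal discrete PID sketch for the heating feedback loop.
# Gains, output limits, and the toy thermal model are placeholders only.

class PID:
    def __init__(self, kp, ki, kd, out_min=0.0, out_max=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(self.out_min, min(self.out_max, out))  # clamp duty cycle 0..1

if __name__ == "__main__":
    pid = PID(kp=0.5, ki=0.05, kd=0.1)
    temp_f, setpoint_f, dt = 65.0, 75.0, 1.0   # start temp, target, timestep (s)
    for step in range(10):
        duty = pid.update(setpoint_f, temp_f, dt)
        # Toy plant: heater raises temp, room pulls it back toward 65 F.
        temp_f += 2.0 * duty - 0.1 * (temp_f - 65.0)
        print(f"t={step:2d}s duty={duty:.2f} temp={temp_f:.1f} F")
```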
||||||
45 | Keyboard DJ Set |
Jack Prokop Manas Gandhi Milind Sagaram |
Sainath Barbhai | Cunjiang Yu | design_document1.pdf design_document2.pdf proposal3.pdf proposal1.pdf proposal2.pdf |
|
# Keyboard DJ Set *Updated PRA after taking into consideration the comments given to us.* ## Team Members: - Manas Gandhi (manaspg2) - Jack Prokop (jprokop2) - Milind Sagaram (milinds2) ## Problem DJ boards have become the "hot topic" of today's music industry, with the tool giving way to many of the greatest artists of our generation, including *John Summit* and *Twinsick*. While we have seen many great EDM artists shine due to the traditional DJ set, it has some drawbacks as well - namely the lack of portability, ease of use for new users, and high prices. ## Solution To address these challenges, we propose the DJ keyboard, a DJ board that simply uses the keys on an external keyboard connected to a computer. This makes use of a microcontroller on a PCB for processing the inputs of the keyboard and converting them into commands for the software. We will also create software for the songs and audio processing, as well as the speaker technology. This approach simplifies the complicated DJ board, reduces the cost and size of getting a DJ board, and makes it easy to take anywhere, making the DJ experience much easier for everyone. The specific DJ board elements we want to incorporate are volume control, tempo control, music slicing and looping, song skipping functionality, and, if we have time, auto-crossfade capabilities. ## Solution Components ### 1. Keyboard This subsystem will provide the inputs that our system will be receiving. - **Inputs**: the keyboard should take inputs for all keys on the keyboard, and maybe combinations of keys (if we have time). ### 2. Microcontroller This subsystem is the core of the whole system, providing the communication between the input (keyboard) and the software. - **Communication between keyboard and software:** this will be our communication system between the input and the output. - **Power:** Will be used to power the keyboard. ### 3. Volume Control This subsystem is where we control the volume using the PCB. - **Volume controller:** We will use a potentiometer (which we might connect to a slider) to increase and decrease the volume (increased resistance = lower volume). - **Speaker:** We will put two small speakers on the PCB, and connect the output of the potentiometer to the input of the speaker. ### 4. Tempo Control This subsystem is where we control the tempo of the song (in BPM) that is currently playing. - **Tempo controller:** We will use a potentiometer (which we might connect to a slider) to send a signal to the microcontroller to increase or decrease the tempo. ### 5. Skip song This subsystem is where we skip the song that is currently playing, allowing us to switch to the next song. - **Skip button:** We will use a button that sends a signal to the microcontroller to skip the current song and go to the queued up song. ### 6. Power This subsystem is where power is supplied to the system, including the microcontroller and speakers. - **Rechargeable Battery:** We will have a rechargeable battery on our PCB that will supply the power to the rest of the system, including the microcontroller and all other hardware components. ### 7. Software This subsystem is where the control and processing of the system happens. The inputs should be processed into some functionality based on a DJ board here. - **Input processing:** Inputs will be processed and translated into functions here. - **Communication with Keyboard:** embedded software will be written to communicate with the microcontroller through a USB using SPI.
- **Audio processing:** audio processing will occur here, where the audio files will be processed and manipulated based on the inputs of the keyboard. - Will have to incorporate signal processing libraries. ### 8. Laptop This subsystem will be used for all the components we cannot buy, specifically a hard drive. - **Hard drive:** the hard drive is just for storage purposes; real DJ boards typically have hard drives that plug into them. ## Criterion for Success - **Ability to read inputs:** the microcontroller should be able to read inputs from the keyboard and send that to the software to be processed. - **Ability of software to process inputs:** the software should be able to take the inputs from the keyboard through the microcontroller and translate them into functions that represent manipulations of the audio files. - **Ability to manipulate the audio files:** the software's functions should be able to manipulate the audio files based on the keyboard inputs in real time without pausing the play of the music. - **Ability of system to have music on the "hard drive":** the music should be stored as audio files on the "hard drive" (in our case we will just use the laptop file system). - **Ability to play selected music through speaker:** the system should play the selected music through the speaker without fail whilst taking input from the keyboard. |
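Because the core of the software subsystem is translating keystrokes into audio actions, a small dispatch layer is worth prototyping early. The sketch below is a hypothetical mapping from keys to DJ functions (volume, tempo, skip) operating on a simple playback-state object; the key bindings, step sizes, and class names are illustrative assumptions, and real audio output would go through a signal-processing library not shown here.

```python
# Hypothetical keyboard-to-DJ-function dispatch; bindings and step sizes
# are illustrative. Real playback would use an audio/DSP library.
from dataclasses import dataclass

@dataclass
class PlaybackState:
    volume: float = 0.8          # 0.0 .. 1.0
    tempo_bpm: float = 120.0
    track_index: int = 0
    playlist: tuple = ("track_a.wav", "track_b.wav", "track_c.wav")

    def current_track(self) -> str:
        return self.playlist[self.track_index % len(self.playlist)]

def volume_up(s): s.volume = min(1.0, s.volume + 0.05)
def volume_down(s): s.volume = max(0.0, s.volume - 0.05)
def tempo_up(s): s.tempo_bpm += 2.0
def tempo_down(s): s.tempo_bpm = max(40.0, s.tempo_bpm - 2.0)
def skip(s): s.track_index += 1

KEY_ACTIONS = {"=": volume_up, "-": volume_down,
               "]": tempo_up, "[": tempo_down, "n": skip}

def handle_key(state: PlaybackState, key: str) -> None:
    action = KEY_ACTIONS.get(key)
    if action:
        action(state)

if __name__ == "__main__":
    state = PlaybackState()
    for key in ["=", "=", "]", "n", "-"]:
        handle_key(state, key)
        print(f"key '{key}': vol={state.volume:.2f} "
              f"tempo={state.tempo_bpm:.0f} bpm track={state.current_track()}")
```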