# Project 11: Early Response Drone for First Responders

**Team Members:** Aditya Patel, Kevin Gerard, Lohit Muralidharan

**TA:** Manvi Jha

**Documents:** design_document1.pdf, design_document2.pdf, other1.pdf, proposal1.pdf, proposal2.pdf
**Problem:**
Every week, UIUC students receive emails from the Illini-Alert system regarding crimes, fires, and other dangerous situations to be aware of. With the latest reported median first-responder response time to a 911 call in Champaign County being over 6 minutes ([source](https://dph.illinois.gov/topics-services/emergency-preparedness-response/ems/prehospital-data-program/emsresponsetimes.html)), the situation emergency personnel are responding to can change drastically from the initial details provided. To best manage the event, first responders need as much accurate information as possible so that the situation can be handled in a timely manner and the safety of everyone involved is prioritized.

**Solution Overview:**
Our solution is to construct a cost-effective drone that first responders can deploy and immediately fly to the location of an emergency. While it is en route, they can use the drone’s onboard camera and computer vision capabilities to assess the situation at hand. There are multiple scenarios in which this drone could be particularly beneficial, such as:

- Police: monitor crime scenes and track suspicious individuals; provide aerial surveillance for events with a high density of people (such as sports games, concerts, or protests) to ensure the safety of everyone

- Fire: monitor the spread of fire at the location; obtain information on what kind of fire it is (electrical, chemical) and any potential hazards

- Medical: assess the type and number of injuries suffered, and locations of patients

Our drone system comprises four elements: cloud storage, a backend, a frontend, and the drone itself. The high-level block diagram linked below illustrates which elements communicate with one another; the arrows represent data transfer.

[[Link](https://drive.google.com/file/d/12qx_syQQH0pHcrh7uVouneDARXH_6Dbi/view?usp=sharing)]

In order to create a baseline early response drone, we need to be able to control the drone and to receive information from it, such as captured frames, altitude, roll, pitch, and yaw. The captured frames and sensor data will be visually displayed in the frontend. However, this data bundle will first be stashed in cloud storage, and the backend will retrieve it when it is ready. We include a backend because, time permitting, we want to perform machine learning processing using object tracking and detection models. The other data transmission is the command signals sent from the frontend to the drone: whenever a key is pressed, the keystroke is uploaded to the cloud storage and reflected visually in the interface.

**Solution Components:**
1. **Drone Hardware/Software:**
Uses an ESP32 with a SIM7600 cellular module for data transmission.
Retrieves roll, pitch, and yaw from an MPU6050 IMU, and altitude (via pressure) from a BMP280.
Uses servos to control the flaps, rudder, and ailerons, and a brushless motor + ESC for single-rotor control.
2. **Drone Structure:**
We will build the airframe from foam board rather than LW-PLA or PLA because it is easier to repair.
A larger wingspan will make the drone easier to control.

3. **Cloud Storage:**
The cloud storage will act as a medium between the drone itself and the C++ backend.
EXTRA: We are trying to eliminate the use of cloud storage entirely. According to YouTube videos, Arduino forums, and ChatGPT-4o, it appears possible to use either raw TCP or a higher-level protocol such as HTTP requests instead.

4. **C++ Backend:**
Uses HTTP requests to retrieve drone data from the cloud storage and WebSockets to send it to the TypeScript frontend.
Uses WebSockets to receive command signals.
EXTRA: Run the frames through a DeepSORT model for tracking humans, using either a pre-trained YOLO model or one we train ourselves (the training set would be generated with the drone itself).

5. **TypeScript Frontend:**
Uses WebSockets to send command signals to, and retrieve drone data from, the C++ backend.
Visually displays command controls for the user.

**Criterion for Success:**
- **Stability and Flight Controls:** Smooth operation of drone while in flight at varying altitudes, and non-jerky response to user-controlled inputs
- **Sophisticated UI:** Easy-to-use and proportional web-based user interface for viewing camera frames, sensor data, and controlling the drone’s movements
- **Frame Transmission:** Ability to transmit frames over a cellular connection to the cloud storage, from which the C++ backend retrieves them
- **Computer Vision:** Time permitting, the ability to detect and track objects (people) from a high aerial view using a self-trained ML model

Additionally, for testing and demonstration purposes, we plan to review the university guidelines and restrictions on drone flight. We will then find a suitable location, such as an open field or quadrangle, for launching and landing our drone. For permission, we will need to register the drone with the FAA and the university, and each group member will need to take a short test to obtain a drone license.

# Cypress Robot Kit

Todd Nguyen, Byung Joo Park, Alvin Wu

Featured Project

Cypress is looking to develop a robotic kit with the purpose of interesting the maker community in the PSoC and its potential. We will develop a shield that attaches to a PSoC board and interfaces with our motors and sensors. To make the shield, we will design our own PCB that mounts directly on the PSoC board. The end product will be a remote-controlled, rover-like robot (driven over Bluetooth) with sensors to achieve line following and obstacle avoidance.

The modules that we will implement:

- Motor Control: H-bridge and PWM control

- Bluetooth Control: Serial communication with PSoC BLE Module, and phone application

- Line Following System: IR sensors

- Obstacle Avoidance System: Ultrasonic sensor

Cypress wishes to use as many off-the-shelf products as possible in order to achieve a “kit-able” design for hobbyists. Building the robot will be a plug-and-play experience so that users can focus on exploring the capabilities of the PSoC.

Our robot will offer three modes, which can be toggled through the app: a line-following mode, an obstacle-avoiding mode, and a manual-control mode. In manual-control mode, the user controls the motors directly from the app. In the autonomous modes, the robot is controlled based on input from its sensors.