# 25: Building Interior Reconnaissance Drone (BIRD)

Team Members: Jack Lavin, Jacob Witek, Mark Viz

TA: Shiyuan Duan

Documents: design_document1.pdf, proposal1.pdf
# Building interior reconnaissance drone proposal

Team Members:
- Mark Viz (markjv2)
- Jack Lavin (jlavin4)
- Jacob Witek (witek5)

# Problem

There are many situations in which law enforcement or emergency medical professionals need quick, real-time information about a location they cannot see, but where the risks involved make it unacceptable to send in a person to gather it. One of the most important things to know in these situations is whether there are people in a room or area and, if so, where they are located. While these professionals do have promising solutions available today, those tools can rarely be operated by one person, and they pull time and manpower away from situations that usually demand both. Our solution attempts to address these issues while providing an easy-to-use interface for critical information.

# Solution

Our solution to this issue is to use a reconnaissance drone equipped with a camera and other sensing components and simple autonomous behavior capabilities, and process the video feed on a separate laptop to determine an accurate location of all people in view of the drone relative to the location of a phone or viewing device nearby. This phone or viewing device would run an augmented-reality application using position information from the drone system to overlay the positions of people near the drone over first-person perspective video. The end result would allow someone to slide/toss the drone into a room, and after a second or two, be able to "see through the wall" where anyone in the room is.

# Solution Components

## Drone and Sensors

The drone itself will be a basic lightweight quadcopter. The frame will be cut as a 2D design from a sheet of carbon fiber and assembled with aluminum hardware and thread locker. The total volume, including the rotor blades, should not exceed 4" H by 8" W by 8" L (ideally much less). This simple frame will consist of a rectangular section to mount the PCB and a 2S (7.4 V) LiPo pack of about 2" x 2" or less, plus four identical limbs mounted to the corners. Each limb will carry a brushless DC motor (EMAX XA2212 2-3S) driven by an electronic speed controller on the PCB (assuming ESCs cannot be pre-purchased).

The PCB will have two-pin DuPont/JST connectors for the battery leads, a TP4056 LiPo charging circuit, and buck converters for the necessary voltages, all on the underside. On top, the PCB will house an ESP32-S3 microcontroller, an IMU with decent accuracy, a set of mmWave 24 GHz human-presence sensors (such as the LD2410), and ultrasonic transducers arranged as a phased-array sensor with an accurate, narrow beam to scan for human presence with range. These components will allow the drone to be programmed with very simple, limited autonomous flight behaviors (fly up 5 feet, spin 360 degrees, land) and to control itself properly and safely. The ultrasonic transducers and human-presence radars will be the primary method of detecting people, with most of the processing done on the ESP32-S3; additional computation on the received data will still be needed on the AR end. If time and budget allow, we may also include a small 2 MP or 5 MP camera for a WiFi video stream, or a composite camera for an analog video stream, as a backup/failsafe to the other sensors.
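As a rough illustration of the phased-array idea, the firing delay of each transducer in a uniform linear array follows from the extra acoustic path length at the steering angle. The sketch below computes those delays in Python; the element count, spacing, and angle are placeholder values, not a final design:

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 C
FREQ_HZ = 40_000.0       # transducer resonant frequency

def steering_delays(num_elements, spacing_m, angle_deg):
    """Per-element firing delays (seconds) that steer the beam of a
    uniform linear ultrasonic array to angle_deg off boresight."""
    theta = math.radians(angle_deg)
    return [n * spacing_m * math.sin(theta) / SPEED_OF_SOUND
            for n in range(num_elements)]

# Example: 10 transducers at half-wavelength spacing, steered 30 degrees.
wavelength_m = SPEED_OF_SOUND / FREQ_HZ      # ~8.6 mm at 40 kHz
delays = steering_delays(10, wavelength_m / 2, 30.0)
```

Half-wavelength spacing is the usual choice because it steers without grating lobes; on the ESP32-S3 these microsecond-scale offsets would map to timer compare values.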

A working rough breakdown of the expected mass of each component will go as follows:

- 4 hobby motors: ~ 50 grams (based on consumer measurements)
- Carbon fiber frame: ~ 40 grams (estimate based on similar style and sized frames)
- 2S 500 mAh battery: ~30 grams (based on common commercial LiPo product info)
- PCB with MCU & peripherals: ~50 grams (based on measurements of similar boards)
- 10-20 ultrasonic transducers: ~50 grams (based on commercial component info)
- Metal hardware/fasteners & miscellaneous: ~25 grams (accounting for error as well)
- Total mass: ~245 grams
- Total thrust (at 7.6 V 7.3 A): ~2000 grams (from manufacturer ratings)
- The resulting thrust-to-weight ratio is well over 2.0, which should allow quick movement and considerable stability together with the frame considerations above, and leaves extra room for more weight if needed.
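As a quick sanity check, the itemized estimates above can be re-summed and compared against the rated thrust (a sketch only; the figures simply mirror the list):

```python
# Mass budget from the breakdown above, in grams.
masses_g = {
    "motors (x4)": 50,
    "carbon fiber frame": 40,
    "2S 500 mAh battery": 30,
    "PCB with MCU & peripherals": 50,
    "ultrasonic transducers": 50,
    "hardware & misc": 25,
}
MAX_THRUST_G = 2000          # combined thrust from manufacturer ratings

total_mass_g = sum(masses_g.values())
thrust_to_weight = MAX_THRUST_G / total_mass_g
hover_fraction = total_mass_g / MAX_THRUST_G   # throttle fraction to hover
```

A hover fraction near 12% suggests generous headroom for aggressive maneuvers or added payload, at the cost of oversized motors for the airframe.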

## AR Viewer or Headset

To create a useful augmented-reality display of the collected position data, the simplest approach is to write an app that uses a smartphone's digital camera and gyroscope/IMU APIs to overlay highlighted human-position data on a live camera view. We would use Android Studio to create this custom app, which would interface with the data coming from the drone and, building on the Android APIs, overlay it on the phone's camera view. If we have more time, a headset or AR glasses could make the experience more useful (hands-free) and immersive. We may also use a laptop at this stage to run a server alongside the app for heavier processing.
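As a sketch of the overlay math, a person's position expressed in the phone's camera frame can be mapped to preview pixels with a standard pinhole projection. The intrinsics below are placeholders, not measured values:

```python
def project_to_screen(point_cam, fx, fy, cx, cy):
    """Pinhole projection of a 3-D point in the camera frame
    (x right, y down, z forward, in metres) to pixel coordinates.
    Returns None for points behind the camera."""
    x, y, z = point_cam
    if z <= 0.0:
        return None
    return (fx * x / z + cx, fy * y / z + cy)

# Example: a person 4 m ahead and 1 m to the right of the phone,
# drawn on a 1080x1920 preview with placeholder intrinsics.
pixel = project_to_screen((1.0, 0.0, 4.0), fx=1500.0, fy=1500.0,
                          cx=540.0, cy=960.0)  # -> (915.0, 960.0)
```

In the real app the drone-relative positions would first be rotated into the phone's frame using the IMU orientation; this projection is only the final step.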

# Working Supply List

*Some parts can be found in student self-service; others need to be ordered.*
- Carbon fiber sheet (find appropriate size and 2-3 mm thick)
- Aluminum machine screws with thread locker (e.g., Loctite), or bolts/nuts with locking washers
- 4 EMAX brushless DC motors and mounting hardware
- 4 quadcopter rotor blades
- 2S (7.4 V) 500 mAh LiPo battery
- Custom PCB
- ESP32-S3 chip w/ PCB antenna
- 20 ultrasonic (40 kHz) transducer cans
- 4 mmWave 24 GHz human presence radar sensors
- TP4056 LiPo charging IC (plus other necessary SMD components)
- DuPont two-pin connector for LiPo charging/discharging (choose whether removable battery design)
- Various SMD LEDs to indicate functionalities or states on PCB
- Voltage buck converter circuit components
- ESC circuit components
- Adafruit Accelerometer

# Criterion For Success

The best criterion for the success of this project is whether our handheld device or headset can effectively communicate the positions of people in a visually obstructed location to a nearby user, within an accuracy of 1-2 meters, while still allowing the user to carry out their own tasks. The video feed should be stable, with latency low enough to remain usable, and estimated human positions should be updated only when a person is positively in view; the recency of the data should also be apparent (perhaps a red highlight for a newly detected person, yellow for a stale location, and green for a newly updated position).
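The recency indication could be as simple as a lookup on the age of each person's last confirmed position. A minimal sketch, with placeholder thresholds:

```python
def highlight_color(age_s, is_new, stale_after_s=3.0):
    """Colour code for an overlaid person marker: red for a newly
    detected person, green for a freshly re-confirmed position, and
    yellow once the last update is older than stale_after_s seconds
    (the threshold here is a guess, to be tuned in testing)."""
    if is_new:
        return "red"
    return "yellow" if age_s > stale_after_s else "green"
```

The AR app would call this per rendered marker each frame, so the colour degrades smoothly as a person leaves the drone's field of view.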

# Illini Voyager (Featured Project)

Team Members:

- Christopher Xu (cyx3)

- Cameron Jones (ccj4)

# Problem

Weather balloons are commonly used to collect meteorological data, such as temperature, pressure, humidity, and wind velocity at different layers of the atmosphere. These data are key components of today’s best predictive weather models, and we rely on the constant launch of radiosondes to meet this need. Most weather balloons cannot control their altitude and direction of travel, but if they could, we would be able to collect data from specific regions of the atmosphere, avoid commercial airspaces, increase range and duration of flights by optimizing position relative to weather forecasts, and avoid pollution from constant launches. A long endurance balloon platform also uniquely enables the performance of interesting payloads, such as the detection of high energy particles over the Antarctic, in situ measurements of high-altitude weather phenomena in remote locations, and radiation testing of electronic components. Since nearly all weather balloons flown today lack the control capability to make this possible, we are presented with an interesting engineering challenge with a significant payoff.

# Solution

We aim to solve this problem through the use of an automated venting and ballast system, which can modulate the balloon’s buoyancy to achieve a target altitude. Given accurate GPS positioning and modeling of the jetstream, we can fly at certain altitudes to navigate the winds of the upper atmosphere. The venting will be performed by an actuator fixed to the neck of the balloon, and the ballast drops will consist of small, biodegradable BBs, which pose no threat to anything below the balloon. Similar existing solutions, particularly the Stanford Valbal project, have had significant success with their long endurance launches. We are seeking to improve upon their endurance by increasing longevity from a power consumption and recharging standpoint, implementing a more capable altitude control algorithm which minimizes helium and ballast expenditures, and optimizing mechanisms to increase ballast capacity. With altitude control, the balloon has access to winds going in different directions at different layers in the atmosphere, making it possible to roughly adjust its horizontal trajectory and collect data from multiple regions in one flight.
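The venting/ballast trade-off above can be sketched as a simple deadband policy: vent helium when well above the target altitude, drop ballast when well below, and otherwise hold to conserve consumables. The deadband value is a placeholder, and a real controller would be more sophisticated (e.g., also gating on vertical speed and remaining ballast):

```python
def altitude_action(alt_m, target_m, deadband_m=250.0):
    """Deadband ('bang-bang') altitude control sketch. Returns which
    actuator, if any, to engage. Acting only outside the deadband
    avoids burning helium and ballast on small oscillations."""
    if alt_m > target_m + deadband_m:
        return "vent"
    if alt_m < target_m - deadband_m:
        return "drop_ballast"
    return "hold"
```

Widening the deadband extends endurance at the cost of looser altitude tracking, which is exactly the trade the improved control algorithm would optimize.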

# Solution Components

## Vent Valve and Cut-down (Mechanical)

A servo actuates a valve that allows helium to exit the balloon, decreasing the lift. The valve must allow enough flow when open to slow the initial ascent of the balloon at the cruising altitude, yet create a tight seal when closed. The same servo will also be able to detach or cut down the balloon in case we need to end the flight early. A parachute will deploy under free fall.

## Ballast Dropper (Mechanical)

A small DC motor spins a wheel to drop [biodegradable BBs](https://www.amazon.com/Force-Premium-Biodegradable-Airsoft-Ammo-20/dp/B08SHJ7LWC/). As the total weight of the system decreases, the balloon will gain altitude. This mechanism must drop BBs at a consistent weight and operate for long durations without jamming, or it must be able to detect jams and run an unjamming sequence.

## Power Subsystem (Electrical)

The entire system will be powered by a few lightweight rechargeable batteries (such as 18650). A battery protection system (such as BQ294x) will have an undervoltage and overvoltage cutoff to ensure safe voltages on the cells during charge and discharge.

## Control Subsystem (Electrical)

An STM32 microcontroller will serve as our flight computer, responsible for commanding actuators, collecting data, and managing communications back to our ground console. We’ll likely use an internal watchdog timer to recover from system faults. On the same board, we’ll have GPS, pressure, temperature, and humidity sensors to determine how to actuate the vent valve or ballast.

## Communication Subsystem (Electrical)

The microcontroller will communicate via serial with the satellite modem (Iridium 9603N), sending small packets back to us on the ground with a minimum frequency of once per hour. There will also be an LED beacon visible up to 5 miles at night to meet regulations. We have read through the FAA Part 101 regulations and believe our system meets all requirements to enable a safe, legal, and ethical balloon flight.
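Since Iridium SBD mobile-originated messages are small (up to 340 bytes on the 9603N), the telemetry would likely be packed into a compact fixed binary layout. The format below is a hypothetical sketch, not our final packet definition; the field scaling choices are placeholders:

```python
import struct

# Little-endian packet: lat, lon (1e-5-degree ints), altitude (m),
# pressure (Pa), temperature (0.1 C), battery (mV). 18 bytes total,
# well under the 340-byte SBD limit.
PACKET_FMT = "<iiHIhH"

def pack_telemetry(lat_deg, lon_deg, alt_m, pressure_pa, temp_c, batt_mv):
    return struct.pack(
        PACKET_FMT,
        round(lat_deg * 1e5),    # ~1 m latitude resolution
        round(lon_deg * 1e5),
        alt_m,                   # metres, 0..65535
        pressure_pa,             # pascals
        round(temp_c * 10),      # 0.1 C resolution, signed
        batt_mv,                 # millivolts
    )

packet = pack_telemetry(40.11050, -88.20730, 18500, 7000, -55.2, 3700)
```

Keeping packets this small leaves headroom to batch several samples per transmission, which matters because satellite airtime dominates both cost and power on long flights.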

## Ground Subsystem (Software)

We will maintain a web server which will receive location reports and other data packets from our balloon while it is in flight. This piece of software will also allow us to schedule commands, respond to error conditions, and adjust the control algorithm while in flight.

# Criterion For Success

We aim to launch the balloon a week before the demo date. At the demo, we will present any data collected from the launch, as well as an identical version of the avionics board showing its functionality. A quantitative goal for the balloon is to survive 24 hours in the air, collect data for that whole period, and report it back via the satellite modem.

![Block diagram](https://i.imgur.com/0yazJTu.png)