Project 25: Building Interior Reconnaissance Drone (BIRD)

TA: Shiyuan Duan
# Building Interior Reconnaissance Drone Proposal

Team Members:
- Mark Viz (markjv2)
- Jack Lavin (jlavin4)
- Jacob Witek (witek5)

# Problem

Law enforcement and emergency medical professionals often need quick, real-time, useful information about a location they cannot see into, without sending in a person to gather it because of the risks involved. One of the most important things to know in these situations is whether there are people in a room or area and, if so, where they are located. While promising solutions exist, they can rarely be operated by one person, and they drain time and manpower from situations that typically demand both. Our solution attempts to address these issues while providing an easy-to-use interface that surfaces critical information.

# Solution

Our solution is a reconnaissance drone equipped with a camera, other sensing components, and simple autonomous behaviors. A separate laptop processes the drone's feed to determine an accurate location for every person in view of the drone, relative to a nearby phone or viewing device. That phone or viewing device runs an augmented-reality application that uses position information from the drone system to overlay the positions of nearby people onto first-person video. The end result would allow someone to slide or toss the drone into a room and, after a second or two, "see through the wall" to where anyone in the room is.
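As a sketch of this data flow, the drone could report each detected person in polar coordinates (range and bearing), and the laptop or viewer would convert that into a world-frame position for the overlay. The message schema and field names below are our own illustration, not a finalized protocol:

```python
import json
import math
from dataclasses import dataclass, asdict

@dataclass
class Detection:
    """One detected person, in polar coordinates relative to the drone."""
    range_m: float      # distance from drone, meters
    bearing_deg: float  # angle from the drone's heading, degrees
    age_s: float        # seconds since this detection was last refreshed

def to_world_xy(drone_x, drone_y, drone_yaw_deg, det):
    """Convert a drone-relative detection into world-frame x/y coordinates."""
    theta = math.radians(drone_yaw_deg + det.bearing_deg)
    return (drone_x + det.range_m * math.cos(theta),
            drone_y + det.range_m * math.sin(theta))

# Example: drone at the origin facing +x, person 3 m away at 90 degrees left
msg = json.dumps(asdict(Detection(range_m=3.0, bearing_deg=90.0, age_s=0.2)))
det = Detection(**json.loads(msg))
x, y = to_world_xy(0.0, 0.0, 0.0, det)
```

Serializing detections as small JSON messages keeps the ESP32 side simple while letting the laptop and AR app evolve independently.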

# Solution Components

## Drone and Sensors

The drone itself will be a basic lightweight quadcopter. The frame will be cut as a 2D design from a sheet of carbon fiber and assembled with aluminum hardware and thread locks. The total volume, including the rotor blades, should not exceed 4" H by 8" W by 8" L (ideally much less). This simple frame consists of a rectangular section to mount the PCB and a 2S (7.4 V) LiPo pack of about 2" x 2" or less, plus four identical limbs mounted at the corners. Each limb carries a brushless DC motor (EMAX XA2212, 2-3S) driven by an electronic speed controller on the PCB (assuming ESCs can't be pre-purchased).

The underside of the PCB will have a two-pin DuPont/JST connector for the battery leads, a TP4056 LiPo charging circuit, and buck converters for the necessary voltages. On top, the PCB will house an ESP32-S3 microcontroller, an IMU with decent accuracy, a set of 24 GHz mmWave human presence sensors (such as the LD2410), and ultrasonic transducers forming a phased-array sensor with a narrow, accurate beam to scan for human presence with range. These components will let the drone run very simple, limited autonomous flight behaviors (fly up 5 feet, spin 360 degrees, land) and control itself properly and safely.

The ultrasonic transducers and human-sensing radars will be the primary method of determining human presence, with most calculation done on the ESP32-S3; additional calculation will be needed on the AR end using the received data. If time and budget allow, we may also include a small 2 MP or 5 MP camera for a WiFi video stream, or a composite video camera for an analog stream, as a backup/failsafe to the other sensors.
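The phased-array scanning works by firing each transducer with a small per-element delay, Δt = d·sin(θ)/c, so the wavefronts add constructively along the steering angle θ. A minimal sketch of that delay calculation, assuming a linear array at half-wavelength spacing (the element count and angle below are illustrative):

```python
import math

SPEED_OF_SOUND = 343.0   # m/s at room temperature
FREQ = 40_000.0          # Hz, transducer resonance
WAVELENGTH = SPEED_OF_SOUND / FREQ   # ~8.6 mm
SPACING = WAVELENGTH / 2             # half-wavelength to avoid grating lobes

def element_delays(num_elements, steer_deg):
    """Per-element firing delays (seconds) to steer a linear array to steer_deg."""
    dt = SPACING * math.sin(math.radians(steer_deg)) / SPEED_OF_SOUND
    delays = [n * dt for n in range(num_elements)]
    # Shift so all delays are non-negative regardless of steering direction
    offset = min(delays)
    return [d - offset for d in delays]

delays = element_delays(8, 20.0)   # 8 elements, steered 20 degrees off-axis
```

Per-element delays at these angles come out to a few microseconds, which is well within what an ESP32-S3 timer peripheral can resolve.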

A rough working breakdown of the expected mass of each component is as follows:

- 4 hobby motors: ~ 50 grams (based on consumer measurements)
- Carbon fiber frame: ~ 40 grams (estimate based on similar style and sized frames)
- 2S 500 mAh battery: ~30 grams (based on common commercial LiPo product info)
- PCB with MCU & peripherals: ~50 grams (based on measurements of similar boards)
- 10-20 ultrasonic transducers: ~50 grams (based on commercial component info)
- Metal hardware/fasteners & miscellaneous: ~25 grams (accounting for error as well)
- Total mass: ~245 grams
- Total thrust (at 7.4 V, 7.3 A per motor): ~2000 grams (from manufacturer ratings)
- The thrust-to-weight ratio is well over 2.0 (roughly 8:1), which should allow quick movement and considerable stability alongside the improved frame, with extra headroom for more weight if needed.
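The budget above can be sanity-checked with a few lines of arithmetic; the masses are the estimates from the list, and the hover-throttle figure is a derived illustration:

```python
# Mass estimates from the breakdown above, in grams
masses_g = {
    "motors (4x)": 50,
    "carbon fiber frame": 40,
    "2S 500 mAh battery": 30,
    "pcb + peripherals": 50,
    "ultrasonic transducers": 50,
    "hardware/misc": 25,
}
total_mass_g = sum(masses_g.values())   # ~245 g
total_thrust_g = 2000                   # combined manufacturer rating, 4 motors

twr = total_thrust_g / total_mass_g             # thrust-to-weight ratio, ~8.2
hover_throttle = total_mass_g / total_thrust_g  # fraction of max thrust to hover
```

A hover throttle around 12% of maximum thrust leaves a large margin for aggressive maneuvers or added payload.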

## AR Viewer or Headset

To create a useful augmented-reality display of the collected position data, the simplest approach is an app that uses a smartphone's digital camera and gyroscope/IMU APIs to overlay highlighted human positions on a live camera view. We would use Android Studio to create this custom app, which would interface with the incoming data from the drone and, building on the Android APIs, overlay that data on the phone's camera view. If we have more time, a headset or AR glasses could make the experience hands-free and more immersive. We may also use a laptop at this stage to run a server alongside the app for heavier processing.
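One way to realize the overlay is a simple pinhole projection: take the bearing to a person, subtract the phone's yaw from its IMU, and map the relative angle to a horizontal pixel. The field of view and resolution below are placeholder values, not measured phone parameters:

```python
import math

def screen_x(person_bearing_deg, phone_yaw_deg, hfov_deg=70.0, width_px=1080):
    """Map a world bearing to a horizontal pixel via a pinhole camera model.

    Returns None when the person is outside the camera's field of view.
    """
    # Angle of the person relative to the camera's optical axis, wrapped to +/-180
    rel = (person_bearing_deg - phone_yaw_deg + 180.0) % 360.0 - 180.0
    half = hfov_deg / 2.0
    if abs(rel) > half:
        return None
    # Pinhole projection: pixel offset = focal * tan(rel), scaled so that
    # +/- half-FOV lands exactly on the screen edges
    focal = (width_px / 2.0) / math.tan(math.radians(half))
    return width_px / 2.0 + focal * math.tan(math.radians(rel))

x_px = screen_x(30.0, 30.0)   # person dead ahead lands at screen center
```

The vertical coordinate would follow the same formula with the phone's pitch and vertical FOV; in practice the Android rotation-vector sensor supplies the yaw and pitch.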

# Working Supply List

*Some components can be found in student self-service; others will need to be ordered.*
- Carbon fiber sheet (find appropriate size and 2-3 mm thick)
- Aluminum machine screws with Loctite, or bolts/nuts with locking washers
- 4 EMAX brushless DC motors and mounting hardware
- 4 quadcopter rotor blades
- 2S (7.4 V) 500 mAh LiPo battery
- Custom PCB
- ESP32-S3 chip w/ PCB antenna
- 20 ultrasonic (40 kHz) transducer cans
- 4 mmWave 24 GHz human presence radar sensors
- TP4056 LiPo charging IC (find other necessary SMD components)
- DuPont two-pin connector for LiPo charging/discharging (choose whether removable battery design)
- Various SMD LEDs to indicate functionalities or states on PCB
- Voltage buck converter circuit components
- ESC circuit components
- Adafruit Accelerometer

# Criterion For Success

The best criterion for the success of this project is whether our handheld device or headset can effectively communicate human position data from a visually obstructed location to a nearby user, accurate to within 1-2 meters, while still allowing the user to carry out other tasks. The video feed should be stable, with latency low enough to be usable. Estimated human positions should update only when a person is positively in view, and the recency of the data should be apparent (for example, a red highlight on newly detected people, yellow on a stale location, and green for a freshly updated position).
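The color scheme suggested above could be implemented as a small mapping from detection history and data age to a highlight color; the thresholds here are illustrative assumptions:

```python
def recency_color(first_seen, age_s, stale_s=3.0):
    """Highlight color from the scheme above: red for a newly detected person,
    green for a freshly updated position, yellow for a stale location.

    first_seen: True the first time a person appears in the drone's view.
    age_s: seconds since this person's position was last refreshed.
    stale_s: age (assumed 3 s here) beyond which a position counts as stale.
    """
    if first_seen:
        return "red"
    return "green" if age_s <= stale_s else "yellow"
```

Driving the highlight color from the same `age_s` field carried in each detection message keeps the staleness indication honest even if the radio link drops.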

# S.I.P. (Smart Irrigation Project)

Team Members: Jackson Lenz, James McMahon

Our project is to be a reliable, robust, and intelligent irrigation controller for use in areas where reliable weather prediction, water supply, and power supply are not found.

Upon completion of the project, our device will be able to determine the moisture level of the soil, the water level in a water tank, and the temperature, humidity, insolation, and barometric pressure of the environment. It will perform some processing on the observed environmental factors to determine if rain can be expected soon. Comparing this knowledge to the dampness of the soil and the amount of water in reserve, it will either trigger a command to begin irrigation or maintain a command to not irrigate the fields. This device will allow farmers to make much more efficient use of precious water and also avoid dehydrating crops to death.
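The decision logic described above might be sketched as a simple rule: irrigate only when the soil is dry, rain is not expected, and the tank holds enough water. The threshold values are illustrative assumptions, not measured requirements:

```python
def should_irrigate(soil_moisture_pct, tank_level_pct, rain_expected,
                    moisture_threshold=30.0, min_tank_pct=10.0):
    """Return True when the controller should start irrigating.

    Thresholds (30% soil moisture, 10% tank reserve) are placeholder values.
    """
    if soil_moisture_pct >= moisture_threshold:
        return False    # soil is already damp enough
    if rain_expected:
        return False    # let the forecast rain do the work
    if tank_level_pct < min_tank_pct:
        return False    # preserve the remaining reserve
    return True
```

In the actual device the inputs would come from the soil-moisture probe, tank-level sensor, and the barometric-pressure trend used to predict rain.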

In developing nations, power is also of concern because it is not as readily available as power here in the United States. For that reason, our device will incorporate several amp-hours of energy storage in the form of rechargeable, maintenance-free, lead acid batteries. These batteries will charge while power is available from the grid and discharge when power is no longer available. This will allow for uninterrupted control of irrigation. When power is available from the grid, our device will be powered by the grid. At other times, the batteries will supply the required power.

The project is titled S.I.P. because it will reduce water wasted and will be very power efficient (by extremely conservative estimates, able to run for 70 hours without input from the grid), thus sipping on both power and water.

We welcome all questions and comments regarding our project in its current form.

Thank you all very much for your time and consideration!