| # | Title | Team Members | TA | Documents | Sponsor |
| --- | --- | --- | --- | --- | --- |
| 2 | Bird Simulator | Anthony Amella, Eli Yang, Emily Liu | Shiyuan Duan | design_document1.pdf, proposal1.pdf | |

# Bird Simulator

Team Members:
- Anthony Amella (aamel2)
- Emily Liu (el20)
- Eli Yang (eliyang2)

# Problem

FPV drones let people experience immersive flight through FPV goggles, which greatly improves engagement. However, this immersion is primarily visual: the pilot's body motion and orientation play no role in controlling the aircraft. The result is an experience that lacks the physical realism sought by people who want an even more exhilarating flight.

# Solution

Our bird simulator will allow the pilot to control a drone with body motion. The system will consist of a drone with a camera, FPV goggles, and a wearable suit instrumented with IMUs that measures how the wearer's body moves and is oriented. The motion captured by the suit will then be converted into commands the drone can use to maneuver in its environment.


# Solution Components

## Visuals

We will use a 5.8 GHz radio link to transmit video from the drone to the goggles using a transmitter/receiver pair (RTC6705 and RTC6715). These RF modules handle amplifying, mixing, and modulating/demodulating the signal, while leaving us the ability to configure and program them over SPI from a microcontroller. The drone will carry a camera that outputs analog video, which the RTC6705 will transmit; the RTC6715 in the goggles will receive it, convert it to composite video, and display it on a small screen.
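
As a rough illustration, the sketch below shows how a channel-select write to the RTC6705 might look from our MCU. The 25-bit frame layout (4-bit address, R/W bit, 20-bit data, LSB first) and the register address come from community documentation and should be verified against the datasheet; the GPIO helper functions and pin numbers are hypothetical.

```c
/*
 * Minimal sketch of an RTC6705 channel-set routine over bit-banged SPI.
 * Frame layout and register value are assumptions to be checked against
 * the RTC6705 datasheet; pin names are hypothetical.
 */
#include <stdint.h>

extern void gpio_write(int pin, int level);   /* hypothetical MCU HAL */
extern void delay_us(unsigned us);

#define PIN_CS   1
#define PIN_SCK  2
#define PIN_MOSI 3

static void rtc6705_write_reg(uint8_t addr, uint32_t data)
{
    /* Pack: 4-bit address, R/W = 1 (write), 20-bit data. */
    uint32_t frame = (addr & 0x0F) | (1u << 4) | ((data & 0xFFFFFu) << 5);

    gpio_write(PIN_CS, 0);
    for (int i = 0; i < 25; i++) {            /* shift out LSB first */
        gpio_write(PIN_MOSI, (frame >> i) & 1);
        delay_us(1);
        gpio_write(PIN_SCK, 1);
        delay_us(1);
        gpio_write(PIN_SCK, 0);
    }
    gpio_write(PIN_CS, 1);
}

void rtc6705_set_channel(uint32_t synth_reg_value)
{
    /* Register 0x01 holds the synthesizer dividers that select the
     * 5.8 GHz channel; the exact value comes from the datasheet formula. */
    rtc6705_write_reg(0x01, synth_reg_value);
}
```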

We expect the development of the other subsystems to require a lot of trial and error, so we will also build a virtual simulation environment in JavaScript/WebGL that allows testing with fewer safety concerns.

## Drone

We will design and manufacture a drone from scratch. The frame will be waterjet-cut from carbon fiber, similar to existing COTS racing drones; tentatively, it will be a 3-inch frame. Notably, the FPV camera will be mounted on a servo so its pitch can be changed mid-flight, allowing the camera to look forward regardless of the attitude of the drone body. This helps the pilot feel more like a bird, since birds generally look forward during flight regardless of their speed. The drone will carry the 5.8 GHz video transmitter described above, as well as a 2.4 GHz SX1280 receiver for control signals from the pilot. We will also make our own ESCs, allowing us to control the motors with a custom BLDC controller built around FDMC8010 MOSFETs. Finally, the drone will have auto-leveling capabilities using an IMU in the drone body, keeping it roughly level and making flight easier.
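
A minimal sketch of the camera-leveling idea is below, assuming the flight controller already provides a fused pitch estimate. The HAL functions, servo limits, and update rate are placeholders.

```c
/*
 * Sketch of the camera-pitch compensation loop: the servo tilts the FPV
 * camera opposite to the drone's pitch so the view stays forward/level.
 * imu_read_pitch_deg() and servo_set_angle_deg() are hypothetical HAL
 * functions; servo limits and loop rate are assumptions.
 */
#include <stdint.h>

extern float imu_read_pitch_deg(void);        /* estimated body pitch, +nose up */
extern void  servo_set_angle_deg(float deg);  /* camera servo, 0 = level        */
extern void  delay_ms(unsigned ms);

#define SERVO_MIN_DEG (-45.0f)
#define SERVO_MAX_DEG ( 45.0f)

static float clampf(float x, float lo, float hi)
{
    return x < lo ? lo : (x > hi ? hi : x);
}

void camera_level_task(void)
{
    for (;;) {
        float body_pitch = imu_read_pitch_deg();
        /* Tilt the camera opposite to the body pitch so it keeps
         * pointing at the horizon while the drone pitches forward. */
        float cam_angle = clampf(-body_pitch, SERVO_MIN_DEG, SERVO_MAX_DEG);
        servo_set_angle_deg(cam_angle);
        delay_ms(20);   /* ~50 Hz update, plenty for a hobby servo */
    }
}
```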

## Control

Four IMUs embedded in a wearable suit will collect data that is combined to determine the user's motion and orientation: one on each arm, one on the head, and one on the torso. We plan to use the IIM-20670, which includes a gyroscope and an accelerometer and communicates with the MCU over SPI. Movements such as head rotation, wing flapping, changes in body orientation, and others to be determined will be translated into the equivalent stick inputs of a normal drone controller.
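
The sketch below illustrates one possible mapping from fused suit poses to RC-style stick values. The gains, gesture assignments, full-scale angles, and the 1000-2000 microsecond channel convention are assumptions we expect to tune during testing.

```c
/*
 * Sketch of the suit-to-stick mapping, assuming each IMU's data has
 * already been fused into roll/pitch/yaw angles in degrees.
 */
#include <stdint.h>

typedef struct { float roll_deg, pitch_deg, yaw_deg; } pose_t;
typedef struct { uint16_t throttle, roll, pitch, yaw; } rc_frame_t;  /* 1000..2000 us each */

static uint16_t map_angle(float deg, float full_scale_deg)
{
    float norm = deg / full_scale_deg;                /* normalize to -1..1 */
    if (norm >  1.0f) norm =  1.0f;
    if (norm < -1.0f) norm = -1.0f;
    return (uint16_t)(1500.0f + 500.0f * norm);
}

/* flap_rate_hz is derived upstream from the arm IMUs' vertical acceleration. */
rc_frame_t suit_to_rc(pose_t torso, pose_t head, float flap_rate_hz)
{
    rc_frame_t out;
    out.roll  = map_angle(torso.roll_deg,  45.0f);    /* lean sideways to bank */
    out.pitch = map_angle(torso.pitch_deg, 45.0f);    /* lean forward to dive  */
    out.yaw   = map_angle(head.yaw_deg,    60.0f);    /* turn head to yaw      */

    /* Faster flapping = more throttle; 3 Hz flap assumed to be full power. */
    float t = flap_rate_hz / 3.0f;
    if (t > 1.0f) t = 1.0f;
    if (t < 0.0f) t = 0.0f;
    out.throttle = (uint16_t)(1000.0f + 1000.0f * t);
    return out;
}
```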

We will also build a conventional drone controller that can override the suit inputs and take over control if the drone starts behaving unexpectedly. Both the suit and the controller will transmit through 2.4 GHz SX1280 transceivers, and the drone will receive with an SX1280 of its own. Using these modules requires writing driver code to facilitate communication with the MCU.
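
A minimal sketch of the override arbitration on the drone side is shown below; the frame structure, timeout value, and helper functions are assumptions, and the actual SX1280 packet decoding happens upstream of this logic.

```c
/*
 * Failsafe arbitration sketch: if a valid packet from the handheld
 * controller has been seen recently, it wins over the suit.
 */
#include <stdbool.h>
#include <stdint.h>

#define OVERRIDE_TIMEOUT_MS 500   /* controller stays in charge this long after its last packet */

typedef struct { uint16_t throttle, roll, pitch, yaw; } rc_frame_t;

static rc_frame_t last_suit, last_controller;
static uint32_t   last_controller_ms;

extern uint32_t millis(void);     /* hypothetical system tick */

/* Called from the SX1280 receive path with a decoded frame. */
void on_rc_frame(bool from_controller, rc_frame_t frame)
{
    if (from_controller) {
        last_controller    = frame;
        last_controller_ms = millis();
    } else {
        last_suit = frame;
    }
}

/* The flight loop asks this for the frame to act on. */
rc_frame_t select_active_frame(void)
{
    if (millis() - last_controller_ms < OVERRIDE_TIMEOUT_MS)
        return last_controller;   /* manual override active */
    return last_suit;
}
```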

# Criterion For Success

At a minimum, we will build a drone that can control four BLDC motors, receive 2.4 GHz control signals, and transmit 5.8 GHz video. The drone will have some form of auto-leveling using a built-in IMU, as well as a camera with variable pitch. We will also make a bird suit with four IMUs that can generate signals capable of controlling the drone. These signals will initially be used to control a drone simulator programmed in WebGL. If time permits, they will also control the drone, allowing for real-world flight. Of note, Eli Yang has an FAA Remote Pilot Certificate, allowing for legal outdoor flight. To start, we will use off-the-shelf FPV goggles, but we will make our own if time permits.



# STRE&M: Automated Urinalysis (Pitched Project)

Team Members:

- Gage Gulley (ggulley2)

- Adrian Jimenez (adrianj2)

- Yichi Zhang (yichi7)

The STRE&M: Automated Urinalysis project was pitched by Mukul Govande and Ryan Monjazeb in conjunction with the Carle Illinois College of Medicine.

# Problem

Urine tests are critical tools in medicine for detecting and managing chronic diseases. These tests often span 24 hours and require the patient to collect their own sample and return it to a lab. Because of this inconvenience, many patients do not get tested often, which makes it difficult for care providers to catch illnesses quickly.

The tedious process of going to a lab creates demand for an "all-in-one" automated urinalysis system, and this is where the STRE&M device comes in. The current prototype can collect a sample and push it to a viewing window. However, once the sample reaches the viewing window, there is currently no automated way to analyze it other than manually looking through a microscope, which greatly reduces throughput. Our challenge is to automate data collection from the sample and provide an interface for a medical professional to view the results.

# Solution

Our solution is to build an imaging system with integrated microscopy and absorption spectroscopy that can transfer the captured images to a server. When a sample is collected by the initial prototype, our device will magnify and image it, and will use an absorbance sensor to identify and quantify the casts, bacteria, and cells in the sample. The images and spectra will then be uploaded to a server for analysis. Finally, we will integrate our device into the existing prototype.

# Solution Components

## Subsystem 1 (Light Source)

We will use a light source whose wavelength can be varied from 190 nm to 400 nm in 5 nm steps to allow spectroscopic analysis of the urine sample.
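
As a sketch of the sweep logic, the loop below steps the source across the 190-400 nm range in 5 nm increments and records one detector reading per step; the driver functions and settle time are hypothetical placeholders for our eventual firmware.

```c
/* Sketch of the spectral sweep over the tunable light source. */
#include <stdint.h>
#include <stddef.h>

#define SWEEP_START_NM 190
#define SWEEP_END_NM   400
#define SWEEP_STEP_NM  5
#define SWEEP_POINTS   (((SWEEP_END_NM - SWEEP_START_NM) / SWEEP_STEP_NM) + 1)

extern void     light_set_wavelength_nm(uint16_t nm);   /* hypothetical driver */
extern uint16_t adc_read_absorbance_counts(void);       /* hypothetical driver */
extern void     delay_ms(unsigned ms);

/* Fills `out` (length SWEEP_POINTS) with one raw reading per wavelength. */
void run_spectral_sweep(uint16_t out[SWEEP_POINTS])
{
    size_t i = 0;
    for (uint16_t nm = SWEEP_START_NM; nm <= SWEEP_END_NM; nm += SWEEP_STEP_NM) {
        light_set_wavelength_nm(nm);
        delay_ms(50);                 /* let the source settle before sampling */
        out[i++] = adc_read_absorbance_counts();
    }
}
```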

## Subsystem 2 (Digital Microscope)

This subsystem will consist of a compact microscope with auto-focus, at least 100x magnification, and a digital shutter trigger.

## Subsystem 3 (Absorbance Sensor)

To perform the spectroscopic analysis, we also need an absorbance sensor to collect the light that passes through the urine sample. The sensor will therefore be mounted in line with the light source, on the far side of the sample, to capture its absorption spectrum.
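
The absorbance at each wavelength follows the Beer-Lambert relation A = log10(I_blank / I_sample), comparing the intensity transmitted through the sample against a blank reference. A minimal sketch, assuming raw ADC counts and a previously captured blank spectrum (dark-count subtraction omitted for brevity):

```c
/* Convert raw detector counts into absorbance values per wavelength. */
#include <math.h>
#include <stddef.h>
#include <stdint.h>

void compute_absorbance(const uint16_t *sample_counts,
                        const uint16_t *blank_counts,
                        float *absorbance,
                        size_t n_points)
{
    for (size_t i = 0; i < n_points; i++) {
        float i_sample = (float)sample_counts[i];
        float i_blank  = (float)blank_counts[i];
        if (i_sample < 1.0f) i_sample = 1.0f;   /* avoid log of zero */
        absorbance[i] = log10f(i_blank / i_sample);
    }
}
```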

## Subsystem 4 (Control Unit)

The control unit will consist of a microcontroller that reads data from the microscope and the absorbance sensor and sends it to the server. We will also write code for the microcontroller to control the light source. We will use the ESP32-S3-WROOM-1 as our microcontroller since it has a built-in Wi-Fi module.
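
A minimal sketch of the upload path using ESP-IDF's esp_http_client is shown below; Wi-Fi provisioning is assumed to happen elsewhere, and the endpoint URL and content type are placeholders.

```c
/* Push a captured image (or spectrum) buffer to the server over HTTP POST. */
#include <stdint.h>
#include <stddef.h>
#include "esp_http_client.h"
#include "esp_log.h"

static const char *TAG = "uploader";

esp_err_t upload_capture(const uint8_t *data, size_t len)
{
    esp_http_client_config_t cfg = {
        .url    = "http://example.com/strem/upload",   /* placeholder endpoint */
        .method = HTTP_METHOD_POST,
    };
    esp_http_client_handle_t client = esp_http_client_init(&cfg);
    if (client == NULL) {
        return ESP_FAIL;
    }

    esp_http_client_set_header(client, "Content-Type", "application/octet-stream");
    esp_http_client_set_post_field(client, (const char *)data, (int)len);

    esp_err_t err = esp_http_client_perform(client);
    if (err == ESP_OK) {
        ESP_LOGI(TAG, "upload done, status = %d",
                 esp_http_client_get_status_code(client));
    } else {
        ESP_LOGE(TAG, "upload failed: %s", esp_err_to_name(err));
    }

    esp_http_client_cleanup(client);
    return err;
}
```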

## Subsystem 5 (Power System)

The power system mainly powers the microcontroller, for which we will use a 9 V battery.

# Criterion For Success

- The overall project can be integrated into the existing STRE&M prototype.

- There should be wireless transfer of images and data to a user interface (either a phone or a computer) for interpretation

- The system should be housed in a water-resistant covering with dimensions less than 6 x 4 x 4 inches
