# Project 52: STRE&M: Automated Urinalysis (Pitched Project)

Award: Best Project in Biomedical Devices

Documents: design_document1.pdf, final_paper1.pdf, photo1.jpg, photo2.jpg, presentation1.pptx, proposal2.pdf, video

Team Members:
- Gage Gulley (ggulley2)
- Adrian Jimenez (adrianj2)
- Yichi Zhang (yichi7)

The STRE&M: Automated Urinalysis project was pitched by Mukul Govande and Ryan Monjazeb in conjunction with the Carle Illinois College of Medicine.

# Problem
Urine tests are critical tools used in medicine to detect and manage chronic diseases. These tests often span 24 hours and require a patient to collect their own sample and return it to a lab. Because of this inconvenience, many patients are not tested as often as they should be, which makes it difficult for care providers to catch illnesses early.

The tedious process of going to a lab for urinalysis creates demand for an “all-in-one” automated system capable of performing the analysis, and this is where the STRE&M device comes in. The current prototype can collect a sample and push it to a viewing window. However, once the sample reaches the viewing window, there is currently no way to analyze it other than manually looking through a microscope, which greatly reduces throughput. Our challenge is to automate data collection from the sample and provide an interface for a medical professional to view the results.

# Solution
Our solution is to build an imaging system with integrated microscopy and absorption spectroscopy that can transfer the captured images to a server. Once the existing prototype collects a sample, our device will magnify and image it and use an absorbance sensor to identify and quantify the casts, bacteria, and cells in the sample. These images will then be uploaded to a server for analysis. We will then integrate our device into the existing prototype.

# Solution Components

## Subsystem 1 (Light Source)
We will use a light source whose output wavelength can be varied from 190 to 400 nm in 5 nm steps to allow spectroscopic analysis of the urine sample.

## Subsystem 2 (Digital Microscope)
This subsystem will consist of a compact microscope with auto-focus, at least 100x magnification, and a digital shutter trigger.

## Subsystem 3 (Absorbance Sensor)
To perform the spectroscopic analysis, we also need an absorbance sensor to collect the light that passes through the urine sample. The sensor will therefore be mounted in line with the light source, on the far side of the sample, to capture the sample's absorbance spectrum.
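
As a rough illustration of the spectroscopy math only, the sketch below converts a reference (blank) scan and a sample scan into an absorbance spectrum using the Beer-Lambert relation A(λ) = log10(I0(λ)/I(λ)), stepping the 190–400 nm range in the 5 nm increments named above. This is plain C++, and `readIntensity()` is a hypothetical stand-in for the real sensor driver, not part of the actual design.

```cpp
// Hedged sketch: absorbance spectrum from a reference scan and a sample scan.
// readIntensity() is a hypothetical placeholder for the real sensor driver.
#include <cmath>
#include <cstdio>
#include <vector>

struct SpectrumPoint {
    int wavelength_nm;   // 190-400 nm in 5 nm steps, matching the light source
    double absorbance;   // A = log10(I0 / I), Beer-Lambert relation
};

// Placeholder driver: returns synthetic intensities; the real version would
// set the light source wavelength and read the absorbance sensor.
double readIntensity(int wavelength_nm, bool sampleInPath) {
    (void)wavelength_nm;
    return sampleInPath ? 0.5 : 1.0;
}

std::vector<SpectrumPoint> scanAbsorbance() {
    std::vector<SpectrumPoint> spectrum;
    for (int wl = 190; wl <= 400; wl += 5) {
        double i0 = readIntensity(wl, /*sampleInPath=*/false);  // blank reference
        double i  = readIntensity(wl, /*sampleInPath=*/true);   // through the sample
        double a  = (i > 0.0 && i0 > 0.0) ? std::log10(i0 / i) : 0.0;
        spectrum.push_back({wl, a});
    }
    return spectrum;
}

int main() {
    for (const SpectrumPoint& p : scanAbsorbance()) {
        std::printf("%d nm: A = %.3f\n", p.wavelength_nm, p.absorbance);
    }
    return 0;
}
```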

## Subsystem 4 (Control Unit)
The control system will be built around a microcontroller that reads data from the microscope and the absorbance sensor and sends it to the server. We will also write firmware to control the light source. An ESP32-S3-WROOM-1 module will be used since it has built-in Wi-Fi.
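
To make the data path concrete, here is a minimal Arduino-ESP32-style sketch showing the Wi-Fi connection and an HTTP POST of a captured buffer to the analysis server. The SSID, password, endpoint URL, and the dummy payload are placeholders, not the project's actual configuration.

```cpp
// Hedged sketch (Arduino-ESP32 style): connect to Wi-Fi and POST a captured
// buffer to the analysis server. Credentials and URL are placeholders.
#include <WiFi.h>
#include <HTTPClient.h>

const char* kSsid     = "EXAMPLE_SSID";                // assumption: real credentials differ
const char* kPassword = "EXAMPLE_PASSWORD";
const char* kServer   = "http://example.com/upload";   // assumption: server endpoint

void setup() {
    Serial.begin(115200);
    WiFi.begin(kSsid, kPassword);
    while (WiFi.status() != WL_CONNECTED) {
        delay(500);                                    // wait for the connection
    }
}

// Send one captured frame (or absorbance scan) to the server over HTTP.
bool uploadData(const uint8_t* data, size_t length) {
    HTTPClient http;
    http.begin(kServer);
    http.addHeader("Content-Type", "application/octet-stream");
    int status = http.POST(const_cast<uint8_t*>(data), length);
    http.end();
    return status == 200;
}

void loop() {
    // In the real firmware this buffer would come from the microscope camera
    // or the absorbance sensor; here it is just a stand-in.
    static uint8_t dummy[4] = {1, 2, 3, 4};
    uploadData(dummy, sizeof(dummy));
    delay(10000);
}
```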

## Subsystem 5 (Power System)
The power system mainly powers the microcontroller; a 9 V battery will be used as the supply.

# Criterion For Success

- The overall project can be integrated into the existing STRE&M prototype.
- Images and data should be transferred wirelessly to a user interface (either a phone or a computer) for interpretation.
- The system should be housed in a water-resistant enclosure with dimensions smaller than 6 x 4 x 4 inches.

VoxBox Robo-Drummer

Craig Bost, Nicholas Dulin, Drake Proffitt

Featured Project

Our group proposes to create a robot drummer which would respond to human voice "beatboxing" input, via a conventional dynamic microphone, and translate the input into the corresponding drum hit. For example, if the human user issues a bass-kick voice sound, the robot will recognize it and strike the bass drum; likewise for the hi-hat/snare and clap. Our design will minimally cover three different drum hit types (bass hit, snare hit, clap hit) and respond with minimal latency.

This would involve amplifying the analog signal (as dynamic mics produce fairly low-level signals), which would be sampled by a dsPIC33F DSP/MCU (or a comparable chipset) and processed for trigger event recognition. This entails applying Short-Time Fourier Transform analysis to provide spectral content data to our event detection algorithm (i.e., recognizing the "control" signal from the human user). The MCU functionality of the dsPIC33F would be used to relay the trigger commands to the actuator circuits controlling the robot.
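
As a rough sketch of the detection idea only: the snippet below (plain C++, not dsPIC33F firmware) takes one short audio frame, computes spectral band energies with a naive DFT, and picks a drum hit by comparing low-, mid-, and high-band energy. Frame size, band edges, and thresholds are illustrative guesses; a real implementation would run a windowed FFT over successive frames (the STFT) on the DSP.

```cpp
// Hedged sketch: classify one audio frame into a drum hit by comparing
// spectral band energies. Band edges and thresholds are illustrative only.
#include <cmath>
#include <cstdio>
#include <vector>

const double kPi = 3.14159265358979323846;

enum class Hit { None, Bass, Snare, Clap };

// Energy of the frame between loBin and hiBin (inclusive), via a naive DFT.
double bandEnergy(const std::vector<double>& frame, int loBin, int hiBin) {
    const int n = static_cast<int>(frame.size());
    double energy = 0.0;
    for (int k = loBin; k <= hiBin; ++k) {
        double re = 0.0, im = 0.0;
        for (int t = 0; t < n; ++t) {
            double angle = -2.0 * kPi * k * t / n;
            re += frame[t] * std::cos(angle);
            im += frame[t] * std::sin(angle);
        }
        energy += re * re + im * im;
    }
    return energy;
}

// Very rough rule: bass-kick sounds concentrate energy in the low bins,
// snare-like sounds in the mid bins, and claps in the high bins.
Hit classifyFrame(const std::vector<double>& frame) {
    double low   = bandEnergy(frame, 1, 8);
    double mid   = bandEnergy(frame, 9, 32);
    double high  = bandEnergy(frame, 33, 64);
    double total = low + mid + high;
    if (total < 1e-6) return Hit::None;              // silence gate (placeholder)
    if (low >= mid && low >= high) return Hit::Bass;
    if (mid >= high)               return Hit::Snare;
    return Hit::Clap;
}

int main() {
    // Synthetic 128-sample frame: a low-frequency tone, expected to read as Bass.
    const int n = 128;
    std::vector<double> frame(n);
    for (int t = 0; t < n; ++t) {
        frame[t] = std::sin(2.0 * kPi * 4.0 * t / n);  // energy near bin 4
    }
    std::printf("classified as %d\n", static_cast<int>(classifyFrame(frame)));
    return 0;
}
```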

The robot in question would be small, about the size of a ventriloquist dummy. The "drum set" would be scaled accordingly (think pots and pans, like a child would play with). Actuators would likely be based on solenoids, as opposed to motors.

Beyond these minimal capabilities, we would add analog prefiltering of the input audio signal, and amplification of the drum hits, as bonus features if the development and implementation process goes better than expected.
