# Universal Gesture Interface

Team members:
- Kenobi Carpenter (joseph48)
- Kobe Duda (ksduda2)
- Connor Michalec (connor15)
# Problem

Since the invention of the personal computer, the interface between humans and computers has remained relatively unchanged. The keyboard and mouse layout has proven highly effective for the majority of use cases, but its mostly-discrete nature greatly restricts the possible ways humans can interact with computer applications.

Much of the way we interact with the world requires expressive, free-flowing modes of interaction. Activities like playing an instrument, martial arts, dancing, or sculpting often can’t simply be described by a series of inputs in the correct order at the correct time. They take place in continuous, 3D space—yet, the most complex expression we typically get with a computer is the 2D plane that a mouse movement provides.

Some solutions exist to address this need, the most notable being VR headsets. However, these headsets tend to be expensive and bulky, and they cause fatigue and nausea for many users. As it currently stands, there is no low-cost, low-fatigue, desk-friendly input device that allows continuous spatial interaction on a PC. Such a device would open new possibilities for how users interface with programs while also improving accessibility for users with limited fine motor skills, such as reduced finger dexterity.
# Solution

We propose a wearable gesture-detecting glove that allows users to interface with computer applications through hand and finger motions. The glove will have a wired USB connection (though wireless would be ideal, we are omitting it for the sake of scope) exposing two interfaces. The first interface is an HID-compliant mouse, allowing the glove to be used with ordinary applications; the second streams live 3D movement data to be interpreted by specialized applications. This dual-interface approach allows the glove to stand on its own as a general-purpose tool while also granting the extensibility to be leveraged to its full potential by specialized applications.
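To make the first interface concrete: a standard HID boot-protocol mouse report is three bytes (a button bitmask followed by signed relative X and Y movement). The sketch below, in Python for illustration (the real firmware would build this report in C on the MCU), shows how a gesture result might be packed into that report; the clamping range and helper name are our own choices, not part of the HID spec's required API.

```python
import struct

def mouse_report(buttons: int, dx: int, dy: int) -> bytes:
    """Pack a 3-byte HID boot-protocol mouse report:
    byte 0 = button bitmask (bit 0 left, bit 1 right, bit 2 middle),
    bytes 1-2 = signed relative X/Y movement, clamped to the int8 range."""
    clamp = lambda v: max(-127, min(127, v))
    return struct.pack("<Bbb", buttons & 0x07, clamp(dx), clamp(dy))

# Example: left button held, cursor moving right and slightly up.
report = mouse_report(buttons=0x01, dx=10, dy=-3)
```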

The sensor layout will consist of a 9-DOF IMU placed on the back of the hand for broad movements, three flex sensors on the index finger, middle finger, and thumb, and three force-sensitive resistors (FSRs) on the fingertips to detect touch.

Finally, the device will feature on-board DSP on the MCU. It will process raw sensor data, interpret a predefined set of gestures, and send those interpreted actions as discrete inputs to the host computer over USB.
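As a minimal sketch of the gesture-interpretation step, the function below classifies hand-open versus hand-closed from flex-sensor bend angles. The threshold values and function name are illustrative placeholders, not tuned parameters from the design:

```python
def detect_hand_state(flex_angles, open_thresh=20.0, closed_thresh=70.0):
    """Classify hand open/closed from flex-sensor bend angles in degrees,
    ordered (thumb, index, middle). Thresholds are illustrative only."""
    if all(a < open_thresh for a in flex_angles):
        return "HAND_OPEN"
    if all(a > closed_thresh for a in flex_angles):
        return "HAND_CLOSED"
    return "NONE"   # ambiguous posture: emit no gesture
```

In the real firmware this decision would run in the DSP pipeline after the raw ADC readings are converted to bend estimates and low-pass filtered.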
# Solution Components

## Subsystem 1: IMU Unit

Components: ICM-20948

This 9-axis IMU (3-axis accelerometer, gyroscope, and magnetometer) will be used for detecting broad-phase translational and rotational movements of the hand. It will be mounted to the back of the hand, and raw sensor data will be sent over SPI to the MCU for processing.
## Subsystem 2: Flex sensors

Components: Adafruit Industries Short Flex/Bend Sensor

We will mount three flex sensors to the thumb, index finger, and middle finger. Each will be connected to an ADC pin through a voltage divider with a 50 kOhm resistor, with a 0.1 uF capacitor for noise reduction. These sensors detect specific hand positions.
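The divider math can be sketched as follows. This assumes the fixed 50 kOhm resistor sits on the low side of the divider (the ADC measures the voltage across it), a 3.3 V reference, and a 12-bit ADC; the divider orientation and resolution are our assumptions for illustration:

```python
VREF = 3.3        # ADC reference voltage (V), assumed
R_FIXED = 50_000  # fixed divider resistor (ohms), per the design above
ADC_MAX = 4095    # 12-bit ADC full scale, assumed

def flex_resistance(adc_counts: int) -> float:
    """Recover the flex sensor's resistance from a raw ADC reading,
    assuming the fixed resistor is on the low side of the divider."""
    v_out = VREF * adc_counts / ADC_MAX
    return R_FIXED * (VREF - v_out) / v_out
```

With this orientation, a mid-scale reading (about 2048 counts) corresponds to the flex sensor matching the fixed 50 kOhm resistor, and higher bend resistance pulls the reading lower.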
## Subsystem 3: Touch sensors

Components: Geekcreit FSR402 Force Sensitive Resistor

Three force-sensitive resistors will be attached to the tips of the thumb, index finger, and middle finger. Similar to the flex sensors, they will be wired to ADC pins through voltage dividers (22 kOhm) to be read by the MCU. These sensors detect pinching, tapping, and pressing.
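Distinguishing a light tap from a firm press comes down to force level and contact duration. A minimal sketch over a window of normalized FSR samples might look like this; the thresholds, window length, and label names are illustrative assumptions, not tuned values:

```python
def classify_press(samples, thresh=0.2, firm_thresh=0.8, tap_max_len=15):
    """Classify a fingertip contact from a window of normalized FSR
    readings (0..1): brief light contact -> tap, high or sustained
    force -> firm press. All thresholds are illustrative."""
    active = [s for s in samples if s > thresh]
    if not active:
        return "NONE"
    if max(active) >= firm_thresh:
        return "FIRM_PRESS"
    if len(active) <= tap_max_len:
        return "LIGHT_TAP"
    return "FIRM_PRESS"   # long, low-force contact counts as a press
```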
## Subsystem 4: Microprocessor

Components: STM32F405 microcontroller

This microcontroller takes all of the aforementioned sensor data as input and produces USB output. It has been chosen for its DSP capabilities, as processing sensor inputs and identifying gestures will constitute a considerable portion of this project. The PCB will include a USB port for connecting to a computer, over which identified gestures are sent as inputs.

This is also where most of our design decisions will be integrated. For example, the IMU is prone to drift, meaning we'll have to make UX decisions that mitigate its influence, e.g. only moving the mouse when a finger is pressed against the desk.
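The finger-down gating idea above can be sketched in a few lines: cursor deltas derived from the IMU are passed through only while a fingertip FSR reports contact, so accumulated drift cannot move the cursor while the hand rests. The contact threshold here is an illustrative placeholder:

```python
def gated_cursor_delta(dx, dy, fsr_force, contact_thresh=0.15):
    """Suppress IMU-derived cursor motion unless a fingertip FSR
    reports desk contact, so gyro/accelerometer drift cannot move
    the cursor while the hand is at rest. Threshold is illustrative."""
    if fsr_force < contact_thresh:
        return 0, 0
    return dx, dy
```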
## Subsystem 5: Physical Frame

Another important aspect of the project will be the physical design itself. In order for our project to be even moderately successful, it has to be wearable. This presents the unique challenge of designing a glove that is both comfortable and can house the electronic components in a way that does not impede movement.
## Subsystem 6: Associated Software

This is not a part of the device itself, but a testbed to demonstrate its capabilities. We will use Unreal Engine 5 to create a very basic flight simulation in which the plane is controlled by the orientation of the user's hand.

For basic testing, we will also have a barebones program that receives gesture inputs and prints them to the screen when received over serial.
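The test program reduces to reading lines from the serial port and printing recognized gestures. The line format below ("GESTURE:&lt;NAME&gt;") is a hypothetical protocol we might adopt, not a fixed part of the design; the parser is separated out so it can be tested without hardware:

```python
def parse_gesture_line(line: str):
    """Parse one line of the glove's serial debug stream.
    The 'GESTURE:<NAME>' format is a hypothetical example."""
    line = line.strip()
    if line.startswith("GESTURE:"):
        return line.split(":", 1)[1]
    return None   # not a gesture message; ignore

# In the real test program this would feed from pyserial, e.g.:
#   for raw in serial_port:
#       g = parse_gesture_line(raw.decode("ascii", errors="ignore"))
#       if g:
#           print(g)
```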
# Criteria for Success
- Hand movements can reliably move a mouse cursor on the attached device
- The following gestures/actions can be reliably detected and mirrored to the test program:
  - Hand closed
  - Hand open
  - Light tap (index/middle/thumb)
  - Firm press (index/middle/thumb)
  - Pinching fingers (index-thumb, middle-thumb)
  - Thumbs up
  - Thumbs down
- The user can successfully navigate a plane through a basic course in the testbed program using hand orientation

# Remotely Controlled Self-balancing Mini Bike

Team Members:

- Will Chen (hongyuc5)

- Jiaming Xu (jx30)

- Eric Tang (leweit2)

# Problem

Bike-share and scooter-share services have become increasingly popular around the world in recent years, and this mode of travel is gradually gaining recognition and support. Champaign also has a company that provides this service, called Veo. Short-distance travel on shared bikes between school buildings and bus stops is convenient. However, since bikes end up parked randomly around the entire city, users often need to find where a bike is parked and walk to its location. The obvious fixes are not ideal: periodically collecting and redistributing all of the bikes would be costly and inefficient, and deploying enough bikes to saturate the region is also cost-inefficient.

# Solution

We think the best way to solve this problem is a self-balancing, self-driving bike that users can summon to their location. To make this solution possible, we first need to design a bike that can balance itself. After that, we will add a remote-control feature to control the bike's movement. Since demonstrating on a full-size bike would be complicated, we will design a scaled-down mini bicycle on which to implement our self-balancing and remote-control functions.

# Solution Components

## Subsystem 1: Self-balancing part

The self-balancing subsystem is the most important component of this project: it will use one reaction wheel driven by a brushless DC (BLDC) motor to balance the bike based on readings from the IMU.

MPU-6050 accelerometer/gyroscope sensor: it measures the acceleration and angular velocity of the body it is attached to; from these readings we can estimate the bike's orientation and implement the corresponding control algorithm on the reaction wheel to balance the bike.
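A common way to fuse the MPU-6050's two sensors into a tilt estimate is a complementary filter: integrate the gyro for short-term accuracy and blend in the accelerometer's gravity direction to cancel long-term drift. The sketch below is one filter update for the roll angle; the blend factor and time step are illustrative assumptions:

```python
import math

def complementary_filter(angle, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """One update of a complementary filter for roll angle (radians).
    gyro_rate is angular velocity (rad/s); accel_y/accel_z give the
    gravity direction. alpha=0.98 is a typical but untuned blend."""
    accel_angle = math.atan2(accel_y, accel_z)
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

With zero rotation and gravity along +Z, repeated updates pull any initial angle error toward zero, which is exactly the drift-correction behavior the balance loop needs.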

Brushless DC motor: it will be used to rotate the reaction wheel. BLDC motors tend to have better efficiency and speed control than other motors.

Reaction wheel: we will design the reaction wheel by ourselves in Solidworks, and ask the ECE machine shop to help us machine the metal part.

Battery: it will power the BLDC motor for the reaction wheel, the stepper motor for steering, and the drive motor for movement. We are considering an 11.1 V LiPo battery.

Processor: we will use STM32F103C8T6 as the brain for this project to complete the application of control algorithms and the coordination between various subsystems.
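As a sketch of the control algorithm the STM32 would run, a PD law commanding reaction-wheel torque against the measured tilt is a reasonable starting point (the real system might use LQR instead). The gains below are illustrative placeholders, not tuned values:

```python
def reaction_wheel_torque(tilt, tilt_rate, kp=12.0, kd=1.5):
    """PD control law: command a reaction-wheel motor torque that
    opposes the measured tilt (rad) and tilt rate (rad/s).
    Gains kp/kd are illustrative and would be tuned on the bike."""
    return -(kp * tilt + kd * tilt_rate)
```

The sign convention is that positive torque spins the wheel so the reaction on the frame pushes the bike back toward upright; leaning right produces a restoring command to the left, and vice versa.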

## Subsystem 2: Bike movement, steering, and remote control

This subsystem will accomplish bike movement and steering with remote control.

Servo motor for movement: it will be used to rotate one of the wheels to propel the bike, offering simple, precise speed control.

Stepper motor for steering: in general, stepper motors have better precision and provide higher torque at low speeds than other motors, which makes them perfect for steering the handlebar.

ESP32 2.4GHz Dual-Core WiFi Bluetooth Processor: it has both WiFi and Bluetooth connectivity so it could be used for receiving messages from remote controllers such as Xbox controllers or mobile phones.
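Whatever controller is used, the ESP32 ultimately hands the STM32 a small control message. The 2-byte layout below (signed steering percent, signed throttle percent) is purely a hypothetical example of such a packet, not a decided protocol:

```python
import struct

def decode_control_packet(packet: bytes):
    """Decode a hypothetical 2-byte remote-control packet:
    byte 0 = signed steering (-100..100, % of max handlebar angle),
    byte 1 = signed throttle (-100..100). Layout is an assumption."""
    steer, throttle = struct.unpack("<bb", packet)
    return steer, throttle

# Example: half steering right, full reverse throttle.
steer, throttle = decode_control_packet(b"\x32\x9c")
```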

## Subsystem 3: Bike structure design

We plan to design the bike frame structure with Solidworks and have it printed out with a 3D printer. At least one of our team members has previous experience in Solidworks and 3D printing, and we have access to a 3D printer.

3D Printed parts: we plan to use PETG material to print all the bike structure parts. PETG is known to be stronger, more durable, and more heat resistant than PLA.

PCB: the PCB will carry several of the parts mentioned above, such as the ESP32, MPU-6050, STM32, motor driver chips, and other electronic components.

## Bonus Subsystem 4: Collision detection and obstacle avoidance

To detect obstacles, we are considering ultrasonic sensors (HC-SR04) or a camera such as the OV7725 working with the STM32 and an obstacle-detection algorithm. Based on the data received from these sensors, the bicycle could steer left or right to avoid obstacles.

# Criteria for Success

- The bike can balance itself.

- The bike can recover from small external disturbances and maintain its balance.

- The bike's movement and steering can be remotely controlled by the user.
