Project

| # | Title | Team Members | TA | Documents | Sponsor |
|---|---|---|---|---|---|
| 79 | Universal Gesture Interface | Connor Michalec, Kenobi Carpenter, Kobe Duda | Lukas Dumasius | proposal1.pdf | |
# Universal Gesture Interface

Team members:
- Kenobi Carpenter (joseph48)
- Kobe Duda (ksduda2)
- Connor Michalec (connor15)

# Problem

Since the invention of the personal computer, the interface between humans and computers has remained relatively unchanged. The keyboard-and-mouse layout has proven highly effective for the majority of use cases, but its mostly discrete nature greatly restricts the possible ways humans can interact with computer applications. Much of the way we interact with the world requires expressive, free-flowing modes of interaction. Activities like playing an instrument, martial arts, dancing, or sculpting often can't simply be described by a series of inputs in the correct order at the correct time. They take place in continuous 3D space, yet the most complex expression we typically get with a computer is the 2D plane that mouse movement provides.

Some solutions exist to address this need, the most notable being VR headsets. However, these headsets tend to be expensive and bulky, and they lead to feelings of fatigue and nausea for many users. As it currently stands, there is no low-cost, low-fatigue, desk-friendly input device that allows continuous spatial interaction on a PC. Such a device would open new possibilities for how users interface with programs while also improving accessibility for those with limited fine motor skills, e.g. limited finger dexterity.

# Solution

We propose a wearable gesture-detecting glove that allows users to interface with computer applications through hand and finger motions. The glove will have a wired USB connection (wireless would be ideal, but we are omitting it for the sake of scope) exposing two interfaces. The first interface is an HID-compliant mouse, allowing the glove to be used generally with regular applications, while the second interface streams live 3D movement data to be interpreted by specialized applications.
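To make the second interface concrete, a host-side application could read fixed-size raw-data packets from the glove and unpack the sensor fields. The packet layout below (field order, float/uint16 encoding, little-endian framing) is purely an illustrative assumption, not a finalized spec:

```python
import struct

# Hypothetical raw-data packet from the glove's second USB interface.
# Assumed layout: 3 floats accel, 3 floats gyro, 3 floats mag (from the
# 9-DOF IMU), then 3 uint16 flex ADC counts and 3 uint16 FSR ADC counts,
# all little-endian (48 bytes total).
PACKET = struct.Struct("<9f6H")

def parse_packet(buf: bytes) -> dict:
    """Unpack one 48-byte raw-data packet into named sensor groups."""
    vals = PACKET.unpack(buf)
    return {
        "accel": vals[0:3],   # m/s^2
        "gyro":  vals[3:6],   # deg/s
        "mag":   vals[6:9],   # uT
        "flex":  vals[9:12],  # ADC counts, thumb/index/middle
        "fsr":   vals[12:15], # ADC counts, thumb/index/middle
    }
```

A specialized application (such as the flight-simulator testbed) would poll these packets and interpret the continuous IMU values directly, bypassing the discrete HID mouse interface.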
This dual-interface approach allows the glove to stand on its own as a general-purpose tool while also granting the extensibility to be leveraged to its full potential by specialized applications. The sensor layout will consist of a 9-DOF IMU placed on the back of the hand for broad movements, three flex sensors on the thumb, index finger, and middle finger, and three force-sensitive resistors (FSRs) on the fingertips to detect touch. Finally, the device will feature on-board DSP on the MCU: it will process raw sensor data, recognize a predefined set of gestures, and send those interpreted actions as discrete inputs to the attached USB host.

# Solution Components

## Subsystem 1: IMU Unit

Components: ICM-20948

This 9-axis IMU (accelerometer, gyroscope, and magnetometer) will be used for detecting broad-phase translational and rotational movements of the hand. It will be mounted to the back of the palm, and raw sensor data will be sent over SPI to the MCU for processing.

## Subsystem 2: Flex Sensors

Components: Adafruit Industries Short Flex/Bend Sensor

We will mount three flex sensors to the thumb, index finger, and middle finger. Each will be connected to an ADC pin through a voltage divider with a 50 kOhm resistor, with a 0.1 uF capacitor for noise reduction. These are used for detecting specific hand positions.

## Subsystem 3: Touch Sensors

Components: Geekcreit FSR402 Force Sensitive Resistor

Three force-sensitive resistors will be attached to the tips of the thumb, index finger, and middle finger. Like the flex sensors, they will be wired to ADC pins through voltage dividers (22 kOhm) to be read by the MCU. These are used for detecting pinching, tapping, and pressing.

## Subsystem 4: Microprocessor

Components: STM32F405 Microprocessor

This microprocessor takes as input all of the aforementioned sensor data and produces USB output. The processor was chosen for its DSP capabilities, as processing sensor inputs and identifying them as gestures will constitute a considerable portion of this project.
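The flex and FSR subsystems above both recover a sensor resistance from a voltage-divider reading. A minimal sketch of that conversion, assuming a 12-bit ADC, a 3.3 V reference, and a topology where the sensor sits on the high side and the fixed resistor ties the ADC node to ground:

```python
def sensor_resistance(adc_counts, r_fixed=50_000, vref=3.3, adc_max=4095):
    """Convert an ADC reading to sensor resistance (ohms).

    Assumed topology: Vref -> sensor -> ADC node -> R_fixed -> GND,
    so V_node = Vref * R_fixed / (R_sensor + R_fixed). Defaults match
    the 50 kOhm flex-sensor divider; pass r_fixed=22_000 for the FSRs.
    """
    v_node = vref * adc_counts / adc_max
    if v_node <= 0:
        return float("inf")  # open circuit / zero reading
    # Solve the divider equation for R_sensor.
    return r_fixed * (vref - v_node) / v_node
```

With the opposite topology (fixed resistor on the high side), the same algebra applies with the roles of `v_node` and `vref - v_node` swapped; the firmware would use whichever matches the actual wiring.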
Attached to the PCB will be a USB port for connecting to a computer, over which identified gestures are sent as inputs. This is also where most of our design decisions will be integrated. For example, the IMU is prone to drift, so we will have to make UX decisions that mitigate its influence, e.g. only moving the mouse when a finger is down on the desk.

## Subsystem 5: Physical Frame

Another important aspect of the project will be the physical design itself. For our project to be even moderately successful, it has to be wearable. This presents the unique challenge of designing a glove that is both comfortable and houses the electronic components in a way that does not impede movement.

## Subsystem 6: Associated Software

This is not a part of the actual device, but a testbed to demonstrate its capabilities. We will use Unreal Engine 5 to create a very basic flight simulation in which the plane is controlled by the orientation of the user's hand. For basic testing, we will also have a barebones program that receives gesture inputs over serial and prints them to the screen.

# Criteria for Success

- Hand movements reliably move the mouse cursor on the attached device
- The following gestures/actions can be reliably detected and mirrored to the test program:
  - Hand closed
  - Hand open
  - Light tap (index/middle/thumb)
  - Firm press (index/middle/thumb)
  - Pinching fingers (index-thumb, middle-thumb)
  - Thumbs up
  - Thumbs down
- The user can successfully navigate a plane through a basic course in the testbed program using hand orientation
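The flex/FSR gestures in the list above (thumbs up/down additionally need IMU orientation) could be classified per sensor frame roughly as follows. All thresholds and field names here are placeholder assumptions for illustration, not calibrated values:

```python
FINGERS = ("thumb", "index", "middle")

def classify(flex, fsr, flex_closed=3000, press_light=500, press_firm=2500):
    """Map one frame of flex/FSR ADC counts to a gesture name.

    flex / fsr: dicts of 12-bit ADC counts keyed by finger name.
    Thresholds are illustrative placeholders, not measured values.
    """
    bent = [f for f in FINGERS if flex[f] >= flex_closed]
    pressed = [f for f in FINGERS if fsr[f] >= press_light]

    # Pinches: thumb tip pressing against another fingertip.
    if "thumb" in pressed and "index" in pressed:
        return "pinch_index_thumb"
    if "thumb" in pressed and "middle" in pressed:
        return "pinch_middle_thumb"

    # Single-finger taps vs. firm presses, split by force threshold.
    if len(pressed) == 1:
        f = pressed[0]
        return ("firm_press_" if fsr[f] >= press_firm else "light_tap_") + f

    if len(bent) == len(FINGERS):
        return "hand_closed"
    if not bent:
        return "hand_open"
    return "unknown"
```

A real implementation on the STM32 would also debounce over several frames and hysterese the thresholds so gestures do not flicker at the boundary, but the decision structure would be similar.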