# Gesture Controlled Surveillance Robot

Team Members:
- Roshni Mathew (roshnim3)
- Kushl Saboo (kushls2)
- Suvid Singh (suvids2)

TA: Argyrios Gerogiannis

# Problem
In disaster and rescue scenarios (collapsed structures, smoke-filled buildings, unstable debris fields), responders often need quick situational awareness without putting people at additional risk. Small ground robots can provide remote surveillance, but many are controlled using joysticks or complex interfaces that require training and constant fine-grained input. In high-stress environments, precise manual control becomes a liability as it increases cognitive load, slows down deployment, and makes it harder for responders to focus on interpreting the scene and coordinating rescue actions. The result is that existing teleoperated robots can be underutilized or difficult to operate effectively when time and attention are limited.

# Solution
We will build a rescue surveillance robot with an intuitive gesture-based control interface that translates simple hand motions into high-level movement commands, paired with onboard safety behaviors to reduce operator burden. The operator wears a gesture device (IMU-based glove or wrist module) that detects orientation/motion and wirelessly transmits commands such as move forward, turn, stop, rotate/scan, and return. The robot executes these commands while enforcing safety constraints (slowing/stopping near obstacles), and provides real-time situational awareness through video streaming and sensor feedback. This enables faster, more natural control than a traditional remote controller, allowing responders to deploy the robot quickly and maintain attention on the environment rather than micromanaging the robot’s motion.

# Solution Components

## Subsystem 1: Gesture Glove
We want to make a glove that recognizes different hand gestures and transmits the corresponding motion command to the robot. The motions we want the glove to recognize are forward/backward, turn left/right, and stop. Additional features, time permitting, would include “come back” and “spin/dance”.

Base System - Custom PCB
1. IMU
2. Bluetooth transmitter/receiver
3. 3-4 flex sensors (one per finger)
4. 1 MCU (e.g., a Raspberry Pi Pico-class chip)
5. Buttons for power and mode selection
6. Battery (power supply)

Additional System:
1. 1 Haptic Feedback Module

In the base system, the IMU detects pitch and roll, which map to the directional commands (e.g., tilting the hand forward/backward for forward/backward motion, rolling it left/right for turns). The flex sensors detect the “stop” and “come back” gestures. The glove's MCU classifies these movements and sends the corresponding commands to the robot.
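
As a rough sketch of this classification logic (the thresholds, command names, and normalized sensor ranges are hypothetical placeholders, not our final firmware):

```python
# Hypothetical gesture classifier: maps IMU pitch/roll (degrees) and
# normalized flex-sensor readings (0.0 = straight finger, 1.0 = fully bent)
# to high-level movement commands. All thresholds are placeholders to be
# tuned on real hardware.

PITCH_THRESHOLD = 20.0  # tilt forward/back past this to command motion
ROLL_THRESHOLD = 20.0   # roll left/right past this to command a turn
FLEX_CLOSED = 0.7       # reading above this counts as a bent finger

def classify_gesture(pitch: float, roll: float, flex: list[float]) -> str:
    # A closed fist (all fingers bent) overrides everything: stop.
    if all(f > FLEX_CLOSED for f in flex):
        return "STOP"
    if pitch > PITCH_THRESHOLD:
        return "FORWARD"
    if pitch < -PITCH_THRESHOLD:
        return "BACKWARD"
    if roll > ROLL_THRESHOLD:
        return "TURN_RIGHT"
    if roll < -ROLL_THRESHOLD:
        return "TURN_LEFT"
    return "IDLE"

# Example: hand tilted forward with open fingers -> "FORWARD"
print(classify_gesture(pitch=30.0, roll=5.0, flex=[0.1, 0.2, 0.1, 0.1]))
```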

For the bonus features, the glove would also include a receiver so the robot can send status back to it, supporting obstacle avoidance: when the robot detects an obstacle and stops, it notifies the user through haptic feedback that it cannot move in that direction. Another bonus feature would give the glove multiple modes, so the same gestures could control either the robot's motion or the camera (e.g., spinning it to view different areas).

## Subsystem 2: Robot Platform
We want to build the robot platform itself. The robot will receive commands from the glove and move in the corresponding direction. Here are the components that will be required:

Base System - no custom PCB
1. Bluetooth Transmitter/Receiver
2. Motors
3. Caterpillar Track (For multi-terrain compatibility)
4. Raspberry Pi Board

Additional System
1. Camera for surveillance
2. ToF (LiDAR) sensors
3. Thermal/night-vision camera (potentially better for seeing in smoke or around debris, though possibly too expensive)

The robot base system will accept commands from the glove and move accordingly. We chose a caterpillar track for multi-terrain capability, and a Raspberry Pi board to receive and execute the commands. Using the Pi board makes it easy to add the modules required for the additional system features.

The additional system will include a camera that transmits video to an external laptop. We will also add LiDAR sensors for obstacle avoidance: if a commanded movement would cause the robot to hit an obstacle, the robot stops and transmits that status back to the glove.
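
As an illustration of this override behavior, here is a minimal sketch of one iteration of the robot's command loop; the motor-driver and glove-notification calls are injected stand-ins, and the threshold and message names are our own placeholders:

```python
MIN_SAFE_DISTANCE_CM = 30.0  # assumed stopping threshold; to be tuned

def step(command: str, distance_cm: float, drive, notify_glove) -> None:
    """One iteration of the robot's command loop.

    drive(cmd) and notify_glove(msg) stand in for the real motor-driver
    and BLE interfaces; distance_cm is the forward ToF/LiDAR reading.
    """
    if command == "FORWARD" and distance_cm < MIN_SAFE_DISTANCE_CM:
        drive("STOP")             # override the unsafe command
        notify_glove("OBSTACLE")  # let the glove trigger haptic feedback
    else:
        drive(command)

# Example: a forward command issued 20 cm from a wall is overridden.
step("FORWARD", 20.0, drive=print, notify_glove=print)
```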

# Criterion For Success

The project will be considered successful if the following functional and performance objectives are met:

## 1. Reliable Gesture Recognition (Glove Subsystem)

The glove must accurately detect user gestures using IMU orientation (pitch and roll) and finger flex sensor inputs. The system must correctly classify and generate control commands corresponding to:

- Move forward
- Move backward
- Turn left
- Turn right
- Stop

## 2. Wireless Communication
The glove subsystem must transmit gesture commands to the robot wirelessly over Bluetooth Low Energy (BLE).
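
On the robot's Raspberry Pi, one way to receive these commands is to subscribe to notifications from the glove's BLE characteristic. The sketch below assumes the Python bleak library; the device address and characteristic UUID are hypothetical placeholders:

```python
import asyncio
from bleak import BleakClient

GLOVE_ADDRESS = "AA:BB:CC:DD:EE:FF"                         # placeholder MAC
COMMAND_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"  # placeholder UUID

def handle_command(_sender, data: bytearray) -> None:
    command = data.decode()  # e.g., "FORWARD", "STOP"
    print("received:", command)

async def main() -> None:
    async with BleakClient(GLOVE_ADDRESS) as client:
        await client.start_notify(COMMAND_CHAR_UUID, handle_command)
        await asyncio.sleep(3600)  # keep listening for commands

asyncio.run(main())
```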

## 3. Robot Motion Execution
The robot subsystem must correctly interpret received commands and translate them into motion (a drive-mapping sketch follows this list), reliably performing:
- Forward and backward motion
- Left and right turns
- A 360° surveillance spin
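
These motions map naturally onto a tracked, skid-steer base. A minimal sketch of the mapping follows; the speed values (normalized to [-1, 1]) are placeholders, and the motor-driver call is injected so the mapping stays hardware-agnostic:

```python
# Hypothetical command-to-track-speed table for a skid-steer (tracked) base.
COMMAND_TO_TRACKS = {
    "FORWARD":    (0.6, 0.6),
    "BACKWARD":   (-0.6, -0.6),
    "TURN_LEFT":  (-0.4, 0.4),   # counter-rotating tracks pivot in place
    "TURN_RIGHT": (0.4, -0.4),
    "STOP":       (0.0, 0.0),
    "SPIN":       (0.5, -0.5),   # held for one full rotation = 360° scan
}

def execute(command: str, set_track_speeds) -> None:
    # Unknown commands default to a stop for safety.
    left, right = COMMAND_TO_TRACKS.get(command, (0.0, 0.0))
    set_track_speeds(left, right)

# Example with a stand-in motor driver:
execute("TURN_LEFT", lambda left, right: print(f"left={left} right={right}"))
```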

## Stretch Goals (Advanced Success Criteria)
### 1. Safety Through Obstacle Avoidance
The robot must integrate onboard distance sensing (ToF/LiDAR) to prevent unsafe movements: it must stop before impact, overriding unsafe commands in real time.


### 2. Haptic Feedback to User (Closed-Loop System)
When the robot is unable to execute a command due to an obstacle, haptic feedback must be sent to the glove to notify the user.
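
A possible glove-side handler for this feedback is sketched below; the GPIO call that switches the vibration motor is an injected stand-in, and the pulse counts and timings are arbitrary placeholders:

```python
import time

def pulse_haptics(set_motor, pulses: int = 3,
                  on_s: float = 0.1, off_s: float = 0.1) -> None:
    """Fire short vibration pulses when the robot reports an obstacle.

    set_motor(True/False) stands in for the GPIO call driving the
    haptic feedback module.
    """
    for _ in range(pulses):
        set_motor(True)
        time.sleep(on_s)
        set_motor(False)
        time.sleep(off_s)

# Example with a stand-in motor driver:
pulse_haptics(lambda on: print("motor", "ON" if on else "OFF"))
```
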
### 3. Camera/Visual Feedback
We will add a camera or a thermal/infrared sensor to detect human presence in low-visibility environments and to give the operator visual feedback for remote control.
