| # | Title | Team Members | TA | Documents | Sponsor |
| --- | --- | --- | --- | --- | --- |
| 11 | Glove Controlled Drone | Aneesh Nagalkar, Atsi Gupta, Zach Greening | Wenjing Song | design_document1.pdf, proposal1.pdf | |

Glove Controlled Drone

Team Members
- Aneesh Nagalkar (aneeshn3)
- Zach Greening (zg29)
- Atsi Gupta (atsig2)

# Problem
Controlling drones typically requires handheld remote controllers or smartphones, which may not feel natural and can limit user interaction. A more intuitive way to control drones could increase accessibility, improve user experience, and open possibilities for new applications such as training, entertainment, or assistive technology.


# Solution
Our group proposes building a wearable gesture-control glove that sends commands to a quadcopter. The glove will use motion sensors to detect the user’s hand orientation and movements, translating them into drone commands (e.g., tilting forward moves the drone forward). The glove will transmit these commands wirelessly to the quadcopter through an ESP32 Wi-Fi module. The drone will be purchased in parts to simplify integration and ensure reliable flight mechanics, while the glove will be custom-built.

To improve on previous iterations of similar projects, we plan to:
- Use IMU sensors instead of flex sensors for more precise and complex gesture detection.
- Add haptic feedback to communicate status updates to the user (e.g., low battery, weak signal).
- Implement an emergency shutoff mechanism triggered by a specific hand gesture (e.g., closing the hand).
- Potentially integrate a camera on the quadcopter that is activated by a different hand gesture.

The system is also scalable to include advanced commands such as speed adjustments based on motion severity.

# Solution Subsystems
**Subsystem 1: Gesture Detection**
- IMU and gyroscope sensors embedded in the glove to detect orientation and movement.
- Sensor fusion algorithms to interpret gestures into defined drone commands.

1. Three-axis gyroscope: MPU-6050
2. IMU: Pololu MinIMU-9 v6
Controls:
The gesture-to-command mapping is defined as follows (a minimal firmware sketch follows this list):
- The drone maintains a constant hover height (handled by the onboard flight controller's barometer/altimeter stabilization).
- The glove controls only horizontal motion and yaw (turning):
- Pitch forward (tilt hand down): Move forward
- Pitch backward (tilt hand up): Move backward
- Roll left (tilt hand left): Strafe left
- Roll right (tilt hand right): Strafe right
- Yaw (rotate wrist clockwise/counter-clockwise): Turn left/right
- Clenched fist (or another distinct gesture): Emergency stop / shutoff
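
As a rough illustration of this mapping, here is a minimal C++ sketch of how fused glove attitude could be translated into a single dominant command. It assumes the sensor-fusion step already produces pitch/roll angles and a yaw rate; the thresholds, command names, and the fist-detection flag are placeholders, not the final firmware.

```cpp
#include <cmath>
#include <cstdint>

// Illustrative command set; the real packet format will be fixed in the design document.
enum class DroneCommand : uint8_t {
    Hover, Forward, Backward, StrafeLeft, StrafeRight, YawLeft, YawRight, EmergencyStop
};

struct Attitude {
    float pitchDeg;    // + = hand tilted down (forward)
    float rollDeg;     // + = hand tilted right
    float yawRateDps;  // wrist rotation rate, deg/s
};

// Dead-bands so small hand tremors keep the drone in hover.
constexpr float kTiltThresholdDeg    = 15.0f;
constexpr float kYawRateThresholdDps = 45.0f;

// Map fused glove attitude to the single dominant command.
DroneCommand mapGesture(const Attitude& a, bool fistClosed) {
    if (fistClosed) return DroneCommand::EmergencyStop;  // safety gesture always wins

    float pitchMag = std::fabs(a.pitchDeg);
    float rollMag  = std::fabs(a.rollDeg);
    // Scale the yaw rate so it is comparable with the tilt angles.
    float yawMag   = std::fabs(a.yawRateDps) * (kTiltThresholdDeg / kYawRateThresholdDps);

    if (pitchMag < kTiltThresholdDeg && rollMag < kTiltThresholdDeg &&
        std::fabs(a.yawRateDps) < kYawRateThresholdDps) {
        return DroneCommand::Hover;                      // inside the dead-band
    }
    if (pitchMag >= rollMag && pitchMag >= yawMag) {
        return a.pitchDeg > 0 ? DroneCommand::Forward : DroneCommand::Backward;
    }
    if (rollMag >= yawMag) {
        return a.rollDeg > 0 ? DroneCommand::StrafeRight : DroneCommand::StrafeLeft;
    }
    return a.yawRateDps > 0 ? DroneCommand::YawRight : DroneCommand::YawLeft;
}
```

Scaling the deflection beyond the dead-band into a speed value would give the motion-severity-based speed adjustments described above.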

**Subsystem 2: Communication Module**
- ESP32 microcontroller on the glove acts as the transmitter.
- Wi-Fi connection to the drone for sending control signals.

1. ESP32 microcontroller
2. Integrated ESP32 Wi-Fi chip
3. Voltage regulation
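
To make the data path concrete, below is a minimal Arduino-style ESP32 sketch of the glove-side transmitter. The SSID, password, IP address, port, 50 Hz rate, and 3-byte packet layout are all assumptions for illustration; the actual values will depend on the LiteWing's firmware and our final protocol.

```cpp
#include <WiFi.h>
#include <WiFiUdp.h>

// Placeholder network parameters -- real values depend on the drone's access point.
const char*     kSsid      = "drone-ap";
const char*     kPass      = "password";
const IPAddress kDroneIp(192, 168, 4, 1);
const uint16_t  kDronePort = 2390;

WiFiUDP udp;

// Hypothetical hooks into the gesture subsystem (stubbed here).
uint8_t readGestureCommand()   { return 0; }  // e.g., DroneCommand cast to a byte
uint8_t readGestureMagnitude() { return 0; }  // 0-255 speed scale

void setup() {
    WiFi.begin(kSsid, kPass);                 // glove joins the drone's Wi-Fi network
    while (WiFi.status() != WL_CONNECTED) {
        delay(100);
    }
    udp.begin(kDronePort);
}

void loop() {
    // Illustrative 3-byte packet: command id, speed scale, XOR checksum.
    uint8_t cmd    = readGestureCommand();
    uint8_t speed  = readGestureMagnitude();
    uint8_t packet[3] = {cmd, speed, static_cast<uint8_t>(cmd ^ speed)};

    udp.beginPacket(kDroneIp, kDronePort);
    udp.write(packet, sizeof(packet));
    udp.endPacket();

    delay(20);  // roughly a 50 Hz control rate
}
```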

**Subsystem 3: Quadcopter Hardware**
- Drone hardware purchased off-the-shelf to ensure stable flight.
- Integrated with a receiver that interprets Wi-Fi commands from the glove.

1. LiteWing – ESP32-Based Programmable Drone
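
For completeness, here is a hedged sketch of what the drone-side receive loop could look like if we extend the LiteWing's ESP32 firmware with a small UDP listener. The port, packet layout, and the commented-out motor/setpoint hooks are hypothetical; the real integration depends on the drone's stock firmware.

```cpp
#include <WiFi.h>
#include <WiFiUdp.h>

const uint16_t kListenPort = 2390;   // placeholder, must match the glove transmitter
WiFiUDP udp;

void setup() {
    WiFi.softAP("drone-ap", "password");  // drone hosts the network the glove joins
    udp.begin(kListenPort);
}

void loop() {
    uint8_t packet[3];
    if (udp.parsePacket() == (int)sizeof(packet)) {
        udp.read(packet, sizeof(packet));
        uint8_t cmd = packet[0], speed = packet[1], checksum = packet[2];
        if (checksum != (cmd ^ speed)) return;   // drop corrupted packets

        if (cmd == 7) {                          // matches EmergencyStop in the gesture sketch
            // Hypothetical hook: cut motor outputs through the flight controller.
            // motorsDisarm();
        } else {
            // Hypothetical hook: translate (cmd, speed) into roll/pitch/yaw setpoints.
            // applySetpoint(cmd, speed);
        }
    }
    delay(2);
}
```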

**Subsystem 4: Feedback and Safety Enhancements**
- Haptic motors embedded in the glove to provide vibration-based feedback.
- Emergency shutoff gesture detection for immediate drone power-down.

1. Vibrating Actuator: Adafruit 10 mm Vibration Motor
2. Driver for actuator: Precision Microdrives 310-117
3. Battery: Adafruit 3.7 V 1000 mAh Li-Po
4. Glove that components will be affixed to
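
As an example of how the feedback channel could work, the sketch below drives the vibration motor (through the 310-117 driver) in patterns that are distinguishable by feel. The pin number, pattern timings, and status codes are illustrative assumptions, not the final design.

```cpp
#include <Arduino.h>

constexpr int kHapticPin = 13;  // placeholder; the actual pin comes from the PCB layout

// Illustrative status codes the glove reports back to the wearer.
enum class GloveStatus { Ok, LowBattery, WeakSignal };

// Pulse the vibration motor in a pattern distinguishable by feel:
// low battery = two short buzzes, weak signal = one long buzz.
void hapticNotify(GloveStatus s) {
    int pulses = 0, onMs = 0, offMs = 150;
    switch (s) {
        case GloveStatus::LowBattery: pulses = 2; onMs = 120; break;
        case GloveStatus::WeakSignal: pulses = 1; onMs = 500; break;
        default: return;  // no vibration when everything is nominal
    }
    for (int i = 0; i < pulses; ++i) {
        digitalWrite(kHapticPin, HIGH);
        delay(onMs);
        digitalWrite(kHapticPin, LOW);
        delay(offMs);
    }
}

void setup() { pinMode(kHapticPin, OUTPUT); }
void loop()  { /* battery and link checks would call hapticNotify() here */ }
```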

# Criteria for Success (minimum 5 of the 7 below)
- The glove reliably detects and distinguishes between multiple hand movements.
- The drone responds in real time to glove commands with minimal delay.
- Basic directional commands (forward, back, left, right, up, down) work consistently.
- Scaled commands (e.g., varying speed/acceleration) function correctly.
- Haptic feedback provides clear communication of system status to the user.
- The emergency shutoff mechanism works reliably and immediately.
- The system demonstrates smooth, safe, and intuitive user control during a test flight.

Decentralized Systems for Ground & Aerial Vehicles (DSGAV)

Mingda Ma, Alvin Sun, Jialiang Zhang

Featured Project

# Team Members

* Yixiao Sun (yixiaos3)

* Mingda Ma (mingdam2)

* Jialiang Zhang (jz23)

# Problem Statement

Autonomous delivery over drone networks has become a growing trend that can save a tremendous amount of labor. However, it is very difficult to scale up due to the inefficiency of multi-rotor collaboration, especially when the drones are carrying payloads. To actually deploy it in big cities, we could take advantage of the large ground vehicle network that already exists through rideshare companies like Uber and Lyft. The roof of an automobile has plenty of space to hold regular-size packages with magnets, and the drone network can then optimize for flight time and efficiency while factoring in ground vehicle plans. While dramatically increasing delivery coverage and efficiency, such a strategy raises the challenging problem of docking a drone onto a moving ground vehicle.

# Solution

Given the scope and time limitations, we aim to tackle one particular component of this project. We will implement a decentralized multi-agent control system that synchronizes a ground vehicle and a drone when they are in close proximity. Assumptions such as knowledge of vehicle states will be made, since this project aims at a proof of concept of one of the core challenges. However, as we progress, we aim to lift as many of those assumptions as possible. The lab infrastructure, drone, and ground vehicle will be provided by our kind sponsor, Professor Naira Hovakimyan. When the drone approaches the target and gains visual contact with the ground vehicle, it will automatically send a docking request through an RF module. The RF receiver on the vehicle will then automatically turn on assistive devices, such as specific LED light patterns, that aid motion synchronization between the ground and aerial vehicles. The ground vehicle will also periodically send its locally planned path to the drone so that the drone can predict the ground vehicle's trajectory a couple of seconds into the future. This prediction helps the drone stay within close proximity of the ground vehicle by optimizing against a reference trajectory.
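
To illustrate the prediction step, here is a small C++ sketch of how the drone could interpolate the ground vehicle's broadcast path plan to obtain a reference point a fixed horizon into the future. The message type and the linear-interpolation choice are assumptions for illustration, not the lab's actual planner interface.

```cpp
#include <vector>
#include <algorithm>

// Illustrative message type; the real plan arrives over the XBee link.
struct PathPoint { double t; double x; double y; };  // time [s], planar position [m]

// Linearly interpolate the planned path to predict where the ground vehicle
// will be `horizon` seconds from `now`, giving the drone a reference point
// to track while staying in close proximity.
PathPoint predictVehicle(const std::vector<PathPoint>& plan, double now, double horizon) {
    double target = now + horizon;
    if (plan.empty()) return {target, 0.0, 0.0};
    if (target <= plan.front().t) return plan.front();
    if (target >= plan.back().t)  return plan.back();   // clamp beyond the planned path

    // First planned point at or after the target time.
    auto hi = std::lower_bound(plan.begin(), plan.end(), target,
                               [](const PathPoint& p, double t) { return p.t < t; });
    auto lo = hi - 1;
    double a = (target - lo->t) / (hi->t - lo->t);
    return {target, lo->x + a * (hi->x - lo->x), lo->y + a * (hi->y - lo->y)};
}
```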

### The hardware components include:

Provided by Research Platforms

* A drone

* A ground vehicle

* A camera

Developed by our team

* An LED based docking indicator

* RF communication modules (XBee)

* Onboard compute and communication microprocessor (STM32F4)

* Standalone power source for RF module and processor

# Required Circuit Design

We will integrate the power source, the RF communication module, and the LED tracking assistant together with our microcontroller on our PCB. The circuit will also automatically trigger the tracking assistant to support the docking sequence. This circuit is designed specifically to demonstrate the drone's ability to precisely track and dock onto the ground vehicle.

# Criteria for Success -- Stages

1. When the ground vehicle is moving slowly in a straight line, the drone can autonomously take off from an arbitrary location and end up following it within close proximity.

2. The drone remains in close proximity when the ground vehicle is slowly turning (or navigating arbitrarily at low speed).

3. The drone can dock autonomously onto the ground vehicle while it is moving slowly in a straight line.

4. The drone can dock autonomously onto the ground vehicle while it is slowly turning.

5. Increase the speed of the ground vehicle and successfully perform tracking and/or docking.

6. The drone can pick up packages while flying synchronously with the ground vehicle.

We consider the project complete at stage 3. The later stages are advanced features that depend on actual progress.
