# Bird Simulator

TA: Shiyuan Duan
Documents: design_document1.pdf, proposal1.pdf

Team Members:
- Anthony Amella (aamel2)
- Emily Liu (el20)
- Eli Yang (eliyang2)

# Problem

FPV drones give pilots an immersive flight experience through FPV goggles, improving engagement. However, this immersion is primarily visual: the pilot's body motion and orientation play no part in control. The result falls short of full realism for people who want an even more exhilarating experience.

# Solution

Our bird simulator will allow the pilot to control a drone with body motion. The system will consist of a drone with a camera, FPV goggles, and a wearable suit fitted with IMUs that reads how the wearer's body moves and is oriented. The motion captured by the suit will then be converted into instructions the drone can use to maneuver in its environment.


# Solution Components

## Visuals

We will use a 5.8 GHz radio link to transmit video from the drone to the goggles using a transmitter/receiver pair (RTC6705 and RTC6715). These RF modules handle amplifying, mixing, and modulating/demodulating the signal, while letting us configure and program them over SPI from a microcontroller. A camera on the drone will output analog video to be transmitted by the RTC6705; the RTC6715 module in the goggles will receive it, convert it to composite video, and display it on a small screen.
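
As a sketch of the SPI driver work involved, the RTC6705 accepts 25-bit frames consisting of a 4-bit register address, a read/write bit, and 20 data bits, clocked LSB-first. A minimal C packing helper might look like the following; how the packed word is shifted out is left to a hardware-specific HAL:

```c
#include <stdint.h>

/* Pack one 25-bit RTC6705 SPI frame: 4-bit register address,
 * 1 read/write bit (1 = write), then 20 data bits. The module
 * clocks frames in LSB-first order, so the caller shifts this
 * word out starting from bit 0. */
static uint32_t rtc6705_pack(uint8_t addr, uint8_t write, uint32_t data)
{
    return ((uint32_t)(addr & 0x0Fu))        /* bits 0..3: address   */
         | ((uint32_t)(write & 0x01u) << 4)  /* bit 4: R/W (1=write) */
         | ((data & 0xFFFFFu) << 5);         /* bits 5..24: data     */
}
```

The channel frequency itself is set by writing the synthesizer registers; the exact register values come from the datasheet's frequency tables.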

We expect development of the other subsystems to require a lot of trial and error, so we will build a virtual simulation environment in JavaScript/WebGL that allows testing with fewer safety concerns.

## Drone

We will design and manufacture a drone from scratch. The frame will be waterjet-cut from carbon fiber, similar to existing COTS racing drones, tentatively in a 3-inch size. Notably, the FPV camera will be mounted on a servo so its pitch can be changed mid-flight, letting the camera look forward regardless of the attitude of the drone body. This makes the FPV pilot feel more like a bird, since birds generally look forward during flight regardless of their speed. The drone will carry the 5.8 GHz AM video transmitter described above, as well as a 2.4 GHz SX1280 receiver for control signals from the pilot. We will also make our own ESCs, controlling the motors with a custom BLDC controller built around FDMC8010 MOSFETs. Using the IMU in the drone body, the drone will have auto-leveling capability, keeping it roughly level for easier flight.
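
For the custom ESC, the core of a trapezoidal (six-step) BLDC controller is the commutation table. A minimal sketch, assuming three half-bridges and leaving gate-driver pinout and rotor sensing (hall or back-EMF) aside as hardware-specific details:

```c
#include <stdint.h>

/* Six-step (trapezoidal) commutation sketch for a custom ESC.
 * Each entry maps an electrical step 0..5 to which phase's
 * half-bridge drives high and which drives low; the third
 * phase floats during that step. */
typedef struct { int8_t high; int8_t low; } comm_step_t;
enum { PHASE_A = 0, PHASE_B = 1, PHASE_C = 2 };

static const comm_step_t COMM_TABLE[6] = {
    { PHASE_A, PHASE_B },  /* step 0: A high, B low, C floats */
    { PHASE_A, PHASE_C },  /* step 1 */
    { PHASE_B, PHASE_C },  /* step 2 */
    { PHASE_B, PHASE_A },  /* step 3 */
    { PHASE_C, PHASE_A },  /* step 4 */
    { PHASE_C, PHASE_B },  /* step 5 */
};

/* Advance one electrical step in the requested direction. */
static uint8_t next_step(uint8_t step, int dir)
{
    return (uint8_t)((step + (dir > 0 ? 1 : 5)) % 6);
}
```

Each step change switches only one half-bridge, which keeps commutation transients small; motor speed is then set by PWM-ing the active high side.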

## Control

Four IMUs embedded in a wearable suit will collect data that is combined to determine the motion and orientation of the user: one on each arm, one on the head, and one on the torso. We plan to use the IIM-20670, which includes a gyroscope and an accelerometer and communicates with the MCU over SPI. Movements such as head rotation, wing flapping, and changes in body orientation (with others to be determined) will be translated into stick inputs equivalent to those of a normal drone controller.

We will also build a conventional drone controller that can override the suit inputs and take over control if the drone starts behaving unexpectedly. Both the suit and the controller will transmit over a 2.4 GHz transceiver (SX1280), received by a matching SX1280 on the drone. Using these modules requires writing driver code to handle communication with the MCU.
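
The override logic can be as simple as preferring the manual controller whenever it is active, or whenever the suit link goes stale. A sketch; the frame fields and the 100 ms staleness window are our assumptions, not a settled design:

```c
#include <stdint.h>
#include <stdbool.h>

typedef struct {
    int16_t roll, pitch, yaw, throttle;  /* normalized stick values */
    uint32_t timestamp_ms;               /* time the frame arrived  */
} rc_frame_t;

#define SUIT_STALE_MS 100u

/* Failsafe mixing: the manual controller wins whenever it is
 * active or the most recent suit frame is too old. */
rc_frame_t select_input(const rc_frame_t *suit, const rc_frame_t *manual,
                        bool manual_active, uint32_t now_ms)
{
    if (manual_active || (now_ms - suit->timestamp_ms) > SUIT_STALE_MS)
        return *manual;  /* safety pilot takes over */
    return *suit;
}
```

Keeping the arbitration on the receiving side means a dropped suit link fails over to the safety pilot without any coordination between the two transmitters.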

# Criterion For Success

At a minimum, we will build a drone that can drive four BLDC motors, receive 2.4 GHz control signals, and transmit 5.8 GHz video. The drone will have some form of auto-leveling using a built-in IMU, as well as a camera with variable pitch. We will also make a bird suit with four IMUs that generates signals capable of controlling the drone. These signals will initially drive a drone simulator programmed in WebGL; if time permits, they will also control the real drone, allowing for real-world flight. Of note, Eli Yang has an FAA Remote Pilot Certification, allowing for legal outdoor flight. To start, we will use off-the-shelf FPV goggles, but we will make our own if time permits.



Decentralized Systems for Ground & Aerial Vehicles (DSGAV)

Mingda Ma, Alvin Sun, Jialiang Zhang


# Team Members

* Yixiao Sun (yixiaos3)

* Mingda Ma (mingdam2)

* Jialiang Zhang (jz23)

# Problem Statement

Autonomous delivery over drone networks has become a new trend that can save a tremendous amount of labor. However, it is very difficult to scale up because multi-rotor collaboration is inefficient, especially when the drones are carrying payload. To actually deploy such a system in big cities, we could take advantage of the large ground-vehicle networks that already exist through rideshare companies like Uber and Lyft. The roof of an automobile has plenty of space to hold regular-size packages with magnets, and the drone network can then optimize flight time and efficiency while factoring in ground-vehicle plans. While dramatically increasing delivery coverage and efficiency, this strategy raises the challenging problem of docking a drone onto a moving ground vehicle.

# Solution

Given the scope and time limitations, we aim to tackle one particular component of this project: a decentralized multi-agent control system that synchronizes a ground vehicle and a drone when they are in close proximity. We will make assumptions such as knowledge of the vehicle states, since this project aims to be a proof of concept of a core challenge; as we progress, we will lift as many of those assumptions as possible. The lab infrastructure, drone, and ground vehicle will be provided by our kind sponsor, Professor Naira Hovakimyan. When the drone approaches the target and gains visual contact with the ground vehicle, it will automatically send a docking request through an RF module. The RF receiver on the vehicle will then turn on its assistant devices, such as specific LED light patterns, which aid motion synchronization between the ground and aerial vehicles. The ground vehicle will also periodically send its locally planned path to the drone so the drone can predict the vehicle's trajectory a couple of seconds into the future. This prediction helps the drone stay within close proximity of the ground vehicle by optimizing against a reference trajectory.
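
As an illustration of the prediction step, under a constant-velocity assumption the drone can extrapolate the vehicle's last reported state over a short horizon to obtain a reference point to track (the real system would use the full locally planned path rather than a single velocity):

```c
/* Constant-velocity extrapolation of the ground vehicle's position:
 * the drone tracks where the vehicle will be, not where it was. */
typedef struct { float x, y; } vec2_t;

vec2_t predict_vehicle(vec2_t pos, vec2_t vel, float horizon_s)
{
    vec2_t p = { pos.x + vel.x * horizon_s,
                 pos.y + vel.y * horizon_s };
    return p;
}
```

A horizon of a couple of seconds matches the lookahead the vehicle's planned-path broadcasts are meant to cover.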

### The hardware components include:

Provided by Research Platforms

* A drone

* A ground vehicle

* A camera

Developed by our team

* An LED based docking indicator

* RF communication modules (xbee)

* Onboard compute and communication microprocessor (STM32F4)

* Standalone power source for RF module and processor

# Required Circuit Design

Our PCB will integrate the power source, the RF communication module, and the LED tracking assistant with the microcontroller. The circuit will also automatically trigger the tracking assistant when a docking request arrives. This circuit is designed specifically to demonstrate the drone's ability to precisely track and dock onto the ground vehicle.

# Criterion for Success -- Stages

1. When the ground vehicle is moving slowly in a straight line, the drone can autonomously take off from an arbitrary location and end up following it within close proximity.

2. The drone remains in close proximity while the ground vehicle is slowly turning (or navigating arbitrarily at slow speed).

3. The drone can dock autonomously onto the ground vehicle while it is moving slowly in a straight line.

4. The drone can dock autonomously onto the ground vehicle while it is slowly turning.

5. Increase the speed of the ground vehicle and successfully perform tracking and/or docking.

6. The drone can pick up packages while flying synchronously with the ground vehicle.

We consider the project complete at stage 3; the later stages are advanced features that depend on actual progress.
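
For stages 1 and 2, the tracking loop can start as a clamped proportional controller on horizontal position error toward the (predicted) point over the vehicle. A sketch with illustrative, untuned gains, not values validated on the lab platform:

```c
/* Proportional tracking: command a horizontal velocity toward the
 * target point, clamped so the drone never out-accelerates what
 * the docking maneuver can tolerate. */
typedef struct { float x, y; } vec2;

#define KP    1.5f   /* position-error gain [1/s] */
#define V_MAX 2.0f   /* velocity clamp [m/s]      */

static float clampf(float v, float lim)
{
    return v > lim ? lim : (v < -lim ? -lim : v);
}

vec2 track_cmd(vec2 drone, vec2 target)
{
    vec2 cmd = { clampf(KP * (target.x - drone.x), V_MAX),
                 clampf(KP * (target.y - drone.y), V_MAX) };
    return cmd;
}
```

Later stages would replace this with tracking against the full reference trajectory, adding derivative or feedforward terms for the turning and higher-speed cases.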
