# Title
RF-based Long-range Motion Recognition and Communication System

Team Members:
- Joe Luo (luo42)
- Zekai Zhang (zekaiz2)
- James Tian (zeyut2)

# Problem
As society accelerates into the digital age, demand for richer, more immersive forms of communication keeps growing, especially as the world recovers from the COVID-19 pandemic. We have witnessed the emergence of many novel products, albeit with mixed reception, that embrace this idea, such as VR games, the Metaverse, and holographic projection. It is apparent that we now crave information that goes beyond text, video, and sound: something mobile, three-dimensional, and interactive, for instance, transferring and reproducing motion across a long distance. Today it is impossible to shake hands with a friend who is a mile away.

Aside from peer-to-peer communication, a long-range motion communication system can be useful in a variety of scenarios. In a classroom, when a physics teacher wants to dig into a relatively abstract concept, such as lattice structure or electron concentration in materials, the board and chalk are often the only tools available. It would be far more engaging for both lecturer and learner to see a 3D presentation of the topic that moves and changes on command. Likewise, a handheld controller for an extended robot arm can prove ineffective, since not everyone is acquainted with controller maneuvers. The arm would be much easier to understand and control if the operator could move it in real time, with points of reference on their limb joints that map onto matching points on the machine. Other applications include, but are not limited to, workplace security, drone navigation, and smart homes, all of which can be made simpler with a motion recognition and communication system.

# Solution
We propose a two-terminal system: one terminal reads motion data and sends the encoded information over an RF link to the other terminal, which decodes the data and reproduces the motion in real time through 3D software simulation or a mechanical integration such as a motor. Building on a previous project that cloned movement data from MEMS sensor measurements into 3D animation, we will again work with discrete accelerometer and gyroscope measurements at an appropriate sampling rate to ensure seamless recreation even in a wireless setting.

Two PCBs are needed, one for each terminal. While most components will be mounted directly on the PCBs, the IMUs (motion sensors) will preferably connect to the rest of the system via STEMMA QT cables or long wires for flexibility of placement. Freed from the confines of the circuit board, IMUs in free space can adapt to more situations and capture a wider range of motions; the handheld controllers that accompany most VR headsets are a good example.

# Solution Components
## Power Supply:
Both the RF components and the MEMS sensors require 3.3 V. We will use AA batteries as the power source, with the onboard LDO regulator of the selected microcontroller regulating the output voltage. Both terminals use the same power supply arrangement.

## Motion-capturing Subsystem
This subsystem consists of two or more IMUs in free space. The LSM6DSO32 6-DoF accelerometer and gyroscope IC will fulfill this need. Since we aim to use STEMMA QT connectors, the I2C communication protocol is favored. This subsystem in particular will likely reuse some of the code and concepts developed in a previous project, documented here: https://wiki.illinois.edu/wiki/pages/viewpage.action?pageId=785286420.

## RF Transmitter Subsystem
Measurements from the registers of the LSM6DSO32 will be sent to an Arduino or another SPI- and I2C-enabled microcontroller, which processes the readings and packs them into a 16- to 32-bit code with at least 4 bits for position and 4 bits for rotation on each of the three axes. The code is then passed over the SPI interface to an RFM69HCW transceiver with an external antenna connector and transmitted wirelessly.
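One possible layout for the packed code, as a hedged sketch: the 4-bit-per-field quantization, the field order, and the function name are illustrative, not a fixed protocol.

```cpp
#include <cstdint>

// Hypothetical 24-bit payload inside a uint32_t: 4 bits each for
// x/y/z position and x/y/z rotation, pre-quantized to 0-15.
// Layout, MSB to LSB: px py pz rx ry rz.
uint32_t packMotion(uint8_t px, uint8_t py, uint8_t pz,
                    uint8_t rx, uint8_t ry, uint8_t rz) {
    return (static_cast<uint32_t>(px & 0xF) << 20) |
           (static_cast<uint32_t>(py & 0xF) << 16) |
           (static_cast<uint32_t>(pz & 0xF) << 12) |
           (static_cast<uint32_t>(rx & 0xF) << 8)  |
           (static_cast<uint32_t>(ry & 0xF) << 4)  |
            static_cast<uint32_t>(rz & 0xF);
}
```

At 24 bits the payload fits in three bytes of an RFM69HCW packet, leaving headroom for a sequence number or checksum.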

## RF Receiver Subsystem
Information sent through the transmitter will be recovered by another RFM69HCW module connected to another SPI-enabled microcontroller. The microcontroller will analyze and unpack the code to extract the information and prepare the data for respective motion recreation.
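On the receiving side, unpacking could mirror that layout; again, the 4-bit fields and their order are an assumed format for illustration, not a fixed protocol.

```cpp
#include <cstdint>

// Fields recovered from a hypothetical 24-bit code:
// 4 bits each, MSB to LSB: px py pz rx ry rz.
struct Motion { uint8_t px, py, pz, rx, ry, rz; };

Motion unpackMotion(uint32_t code) {
    return {
        static_cast<uint8_t>((code >> 20) & 0xF),
        static_cast<uint8_t>((code >> 16) & 0xF),
        static_cast<uint8_t>((code >> 12) & 0xF),
        static_cast<uint8_t>((code >> 8)  & 0xF),
        static_cast<uint8_t>((code >> 4)  & 0xF),
        static_cast<uint8_t>( code        & 0xF),
    };
}
```

The recovered fields would then be rescaled to physical units before being handed to the reproduction subsystem.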

## Motion Reproduction Subsystem
The motion reproduction subsystem ideally consists of two major parts: software and hardware.
The software part receives data through the serial port from the microcontroller at the receiving end; the Unity 3D engine decodes the information and animates a 3D model, in a fashion similar to the previous project by Joe Luo mentioned above (https://wiki.illinois.edu/wiki/pages/viewpage.action?pageId=785286420).
The hardware part is a mechanical integration able to recreate simple directional movements, such as 3D-printed structures or pulse-controlled continuous-rotation servo motors (FS90R) that rotate in a 2D plane by an amount dependent on the rotation measured by the MEMS sensors.
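A hedged sketch of how a measured rotation rate could drive the FS90R: the ~1500 µs stop point and ~1000-2000 µs endpoints are nominal values for this servo, and the linear mapping and function name are illustrative; the real servo should be calibrated.

```cpp
#include <algorithm>
#include <cstdint>

// FS90R continuous-rotation servo, nominal timing (calibrate in
// practice): a ~1500 us pulse stops the motor; ~1000 us and ~2000 us
// are roughly full speed in each direction. Map a gyro rate in
// [-maxDps, +maxDps] linearly onto that pulse range, clamping
// anything outside it.
uint16_t rateToPulseUs(float dps, float maxDps = 250.0f) {
    float clamped = std::max(-maxDps, std::min(maxDps, dps));
    return static_cast<uint16_t>(1500.0f + 500.0f * clamped / maxDps);
}
```

On an Arduino the returned value would feed `Servo::writeMicroseconds()` each control cycle.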

# Criterion For Success

- Positional and rotational motions are captured by the MEMS sensors and converted to human-readable data.
- The RF system functions properly and transmits the motion data between two devices at least 0.8-1 mile apart.
- The reproduction system at the receiving end faithfully repeats, in software, the motions performed at the transmitting end; this is the minimum success criterion. If it is met, a 3D-printed and/or motor-controlled hardware system can be built to further explore the project's potential.

# Autonomous Sailboat

Team Members:

- Riley Baker (rileymb3)

- Lorenzo Pérez (lr12)

- Arthur Liang (chianl2)

# Problem

WRSC (World Robotic Sailing Championship) is an autonomous sailing competition that aims to stimulate the development of autonomous marine robotics. To make autonomous sailing more accessible, some scholars have created generic educational designs. However, these models rely on expensive and scarce autopilot systems such as the Pixhawk flight controller.

# Solution

The goal of this project is to make an affordable, user-friendly RC sailboat that can be used to learn autonomous sailing on a smaller scale. The Autonomous Sailboat will have dual-mode capability, allowing the operator to switch from manual to autonomous mode, in which the boat maintains its current compass heading. The boat will transmit its sensor data back to base, where the operator can use it to improve the autonomous mode and keep track of the boat's position in the water. Amateur sailors will also benefit from the "return to base" functionality provided by the autonomous system.
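For the "return to base" feature, the boat needs the bearing from its current GPS fix back to a stored base fix. A minimal sketch using the standard initial-bearing formula follows; the function and parameter names are illustrative, and the base coordinates are assumed to be latched when autonomous mode is armed.

```cpp
#include <cmath>

// Initial great-circle bearing from (lat1, lon1) to (lat2, lon2),
// in degrees clockwise from true north. Inputs in decimal degrees.
double bearingDeg(double lat1, double lon1, double lat2, double lon2) {
    const double deg = std::acos(-1.0) / 180.0;  // radians per degree
    double dLon = (lon2 - lon1) * deg;
    double y = std::sin(dLon) * std::cos(lat2 * deg);
    double x = std::cos(lat1 * deg) * std::sin(lat2 * deg) -
               std::sin(lat1 * deg) * std::cos(lat2 * deg) * std::cos(dLon);
    double b = std::atan2(y, x) / deg;
    return std::fmod(b + 360.0, 360.0);  // normalize to [0, 360)
}
```

The heading-hold logic can then steer toward this bearing instead of the latched compass heading.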

# Solution Components

## On-board

### Sensors

Pixhawk - connects GPS and compass sensors to the microcontroller, allowing for a stable state system within autonomous mode. A shaft encoder serves as a wind vane sensor that we plan to attach to the head of the mast to detect wind direction and speed. A compass/accelerometer sensor and GPS detect the position of the boat and its direction of travel.

### Actuators

2 servos - one winch servo that controls the orientation of the mainsail and one that controls the orientation of the rudder

### Communication devices

5-channel 2.4 GHz receiver - A receiver used to select autonomous or manual mode and to relay commands when in manual mode.

5-channel 2.4 GHz transmitter - A transmitter with the ability to switch between autonomous and manual mode. It will also transmit servo movements when in manual mode.

### Power

LiPo battery

## Ground control

Microcontroller - A microcontroller that records sensor output and servo settings in both radio-control and autonomous modes. Software on the microcontroller processes the sensor input and determines the optimum rudder and sail-winch servo settings needed to maintain a prescribed course for the given wind direction.
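A minimal sketch of what "determine the optimum rudder and sail winch servo settings" could look like, assuming a proportional heading controller and the common sail-trim rule of thumb (sheet the main to roughly half the apparent wind angle); the gain, limits, and angle bounds are illustrative, not measured values.

```cpp
#include <algorithm>
#include <cmath>

// Wrap an angle difference into [-180, 180) so the controller
// always turns the short way round.
double wrap180(double deg) {
    double w = std::fmod(deg + 180.0, 360.0);
    if (w < 0) w += 360.0;
    return w - 180.0;
}

// Proportional rudder command in degrees, clamped to ±30, from the
// error between the prescribed course and the compass heading.
double rudderCmd(double courseDeg, double headingDeg, double kp = 0.5) {
    double cmd = kp * wrap180(courseDeg - headingDeg);
    return std::max(-30.0, std::min(30.0, cmd));
}

// Rule-of-thumb sail trim: roughly half the apparent wind angle,
// bounded between close-hauled (~15 deg) and a run (~90 deg).
double sailCmd(double apparentWindDeg) {
    double s = std::fabs(wrap180(apparentWindDeg)) / 2.0;
    return std::max(15.0, std::min(90.0, s));
}
```

Both commands would then be converted to servo pulse widths each control cycle; a real controller would likely add damping or integral terms once the boat is on the water.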

# Criterion For Success

1. Implement dual mode capability

2. Boat can maintain a given compass heading after being switched to autonomous mode and incorporates a “return to base” feature that returns the sailboat back to its starting position

3. Boat can record and transmit servo, sensor, and position data back to base
