# Personal Carrier Robot

TA: Raman Singh
Team Members:
- Okan Kocabalkanli (okan2)
- Deniz Caglar (dcaglar2)
- Jirawatchara Tanthiptham (jt20)

# Problem

Some individuals are unable to carry objects by themselves. For example, elderly individuals may be unable to carry heavy groceries home.

# Solution

We propose a path-finding robot that follows the user while avoiding obstacles. Obstacle detection will use ultrasonic depth imaging: a series of rotating ultrasonic sensors will image the robot's surroundings. The user will send GPS data to the robot over Bluetooth, and a second GPS chip will be mounted on the robot. From the two GPS fixes, the robot will compute the distance to the user and move in the correct direction using the heading provided by an onboard compass chip. Combining the obstacle map with the goal direction, a path-finding/SLAM algorithm will steer the robot through the terrain.
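
As a sketch of the GPS step above, the distance and initial bearing from the robot's fix to the user's fix can be computed with the haversine formula. This is a minimal Python sketch; the function and parameter names are our own, not part of the design:

```python
import math

def gps_to_target(lat1, lon1, lat2, lon2):
    """Return (distance_m, bearing_deg) from robot (lat1, lon1)
    to user (lat2, lon2). Bearing is 0 deg at north, clockwise."""
    R = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)

    # Haversine great-circle distance
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    distance = 2 * R * math.asin(math.sqrt(a))

    # Initial great-circle bearing, normalised to [0, 360)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return distance, bearing
```

The robot would compare this bearing against the compass heading to pick its drive direction.
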
# Solution Components

## Mechanical
This subsystem will encompass the frame for mounting the other components as well as the propulsion system of the unit. The robot will be rear-wheel driven, with each wheel powered by a separate motor to allow differential steering.

### Components:
- Wooden chassis
- A tank drive system with 4 wheels
- 2 DC motors
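
Because the two driven wheels steer differentially, motor commands reduce to mapping a desired forward speed and turn rate onto left/right wheel speeds. A minimal sketch (the track width value is a placeholder, not a measured dimension):

```python
def differential_drive(v, omega, track_width=0.3):
    """Convert forward speed v (m/s) and turn rate omega (rad/s)
    into (left, right) wheel speeds for the two rear DC motors.

    track_width is the distance between the driven wheels in metres.
    """
    v_left = v - omega * track_width / 2
    v_right = v + omega * track_width / 2
    return v_left, v_right
```

Equal speeds drive straight; opposite speeds spin the robot in place.
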

## Power Management
This subsystem will power the rest of the circuit, including the PCB and the motors.

### Components:
- A LiPo battery
- LiPo battery charging circuit

## PCB
This subsystem is the sensor suite and brain of the system, performing simultaneous localization and mapping (SLAM) and path finding. It generates a PWM signal for the stepper motor, which rotates the Radar Imaging sensors to sweep a full field of view. Obstacles in the system's environment are mapped from the measured ultrasonic sensor data.

The subsystem combines this map with data received from the RPI subsystem over SPI for path finding. When the user is in line of sight, the MCU uses the distance data from the RPI subsystem's camera; when the user is out of line of sight, the MCU uses the user's gyroscope and accelerometer data forwarded by the RPI subsystem. In either case, the user's location is set as the target point, and a Kalman filter predicts the trajectories of the mapped points. From this trajectory information, the subsystem builds a probability grid of fixed-size blocks, each holding a collision probability. A path-finding algorithm such as A* then draws a path between blocks to the target point to find the safest and shortest route, and the system drives the DC motors along it.

### Components:
- A microcontroller
- DC Motor controller
- Step Motor Controller (TB67S128FTG)
- Radar Imaging System Connector
- Programmer Circuit
- SPI Connection circuit to RPI
- Simultaneous localization and mapping (SLAM) Algorithm
- Kalman filter for Obstacle tracking and prediction
- Roadmap/Grid path planning with A*
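
The grid-based A* step described above can be sketched as follows, assuming a row-major grid of collision probabilities. The blocking threshold and the per-cell penalty term are illustrative choices, not values fixed by the design:

```python
import heapq

def a_star(grid, start, goal, p_max=0.5):
    """A* over a grid of collision probabilities.

    Cells with probability >= p_max are treated as blocked; the step
    cost adds the cell's probability so the path prefers safer cells.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan-distance heuristic (admissible: step cost >= 1)
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    g_score = {start: 0.0}
    came_from = {}
    open_set = [(h(start), start)]
    closed = set()
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur in closed:
            continue
        closed.add(cur)
        if cur == goal:  # walk parents back to the start
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] < p_max:
                ng = g_score[cur] + 1 + grid[nr][nc]  # step cost + risk penalty
                if ng < g_score.get(nxt, float("inf")):
                    g_score[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_set, (ng + h(nxt), nxt))
    return None  # no safe path exists
```

Replanning would rerun this each time the Kalman filter updates the probability grid.
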

## RPI
This subsystem obtains and processes the data necessary for simultaneous localization and mapping (SLAM) using a Raspberry Pi. Using a camera, the robot will detect a fixed-size tag; the known size lets us estimate distance from the camera perspective. This distance will be passed to the MCU over SPI. If a person blocks the camera's view, the system will switch to a "search mode" in which the RPI forwards the phone's heading information (accelerometer, gyroscope) to the MCU, which will then follow the same heading as the user while avoiding obstacles until the camera reacquires the user.

### Components:
- Raspberry Pi
- Bluetooth connection
- RPI Camera
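
The fixed-size-tag distance estimate above reduces to the pinhole camera model. A sketch, where the focal length in pixels is a hypothetical value that would come from a one-time camera calibration:

```python
def distance_from_tag(tag_width_m, tag_width_px, focal_length_px):
    """Pinhole-model range estimate:
    distance = focal_length_px * real_width / apparent_width_px.

    focal_length_px comes from calibration (e.g. with OpenCV);
    the values used here are illustrative, not measured.
    """
    return focal_length_px * tag_width_m / tag_width_px
```

For example, a 10 cm tag appearing 50 px wide to a camera with a 600 px focal length is about 1.2 m away.
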

## User
This is the subsystem that will directly interact with our users. It will use a mobile app to send the user's GPS data over Bluetooth. For prototyping, we plan to use an app called "Blynk", which lets users transfer sensor data from a smartphone via Bluetooth.
### Components:
- Smartphone

# Criterion For Success
- The robot should be able to consistently follow the phone holder through flat terrain with solid, straightforward obstacles.
- The person of interest can be 3-10 meters away from the robot.
- The obstacles should have a height of at least 30 cm over ground level.
- The robot should also be able to carry a load of 3 kg over level ground.


[Discussion thread]( https://courses.engr.illinois.edu/ece445/pace/view-topic.asp?id=72004)


# Autonomous Sailboat

Team Members:

- Riley Baker (rileymb3)

- Lorenzo Pérez (lr12)

- Arthur Liang (chianl2)

# Problem

WRSC (World Robotic Sailing Championship) is an autonomous sailing competition that aims to stimulate the development of autonomous marine robotics. To make autonomous sailing more accessible, some scholars have created generic educational designs. However, these models rely on expensive and scarce autopilot systems such as the Pixhawk flight controller.

# Solution

The goal of this project is to make an affordable, user-friendly RC sailboat that can be used as a means of learning autonomous sailing on a smaller scale. The Autonomous Sailboat will have dual-mode capability, allowing the operator to switch from manual to autonomous mode, where the boat will maintain its current compass heading. The boat will transmit its sensor data back to base, where the operator can use it to improve the autonomous mode and keep track of the boat's position in the water. Amateur sailors will benefit from the "return to base" functionality provided by the autonomous system.

# Solution Components

## On-board

### Sensors

Pixhawk - connects GPS and compass sensors to the microcontroller, allowing for a stable state estimate in autonomous mode. A shaft encoder serves as a wind vane sensor, attached to the head of the mast to detect wind direction and speed. A compass/accelerometer sensor and GPS detect the position of the boat and its direction of travel.

### Actuators

2 servos - one winch servo that controls the orientation of the mainsail and one that controls the orientation of the rudder

### Communication devices

5-channel 2.4 GHz receiver - used to select autonomous or manual mode and to relay commands when in manual mode.

5-channel 2.4 GHz transmitter - switches between autonomous and manual mode and transmits servo movements when in manual mode.

### Power

LiPo battery

## Ground control

Microcontroller - records sensor output and servo settings in both radio-control and autonomous modes. Software on the microcontroller processes the sensor input and determines the optimum rudder and sail-winch servo settings needed to maintain a prescribed course for the given wind direction.
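
The heading-hold behaviour described above could be sketched as a proportional controller on the compass error; the gain and rudder travel limit here are placeholders to be tuned on the water:

```python
def heading_error(target_deg, current_deg):
    """Smallest signed angle from current to target, in [-180, 180)."""
    return (target_deg - current_deg + 180) % 360 - 180

def rudder_command(target_deg, current_deg, gain=2.0, limit=45):
    """Proportional heading-hold: map the compass error to a rudder
    angle, clamped to the servo's mechanical travel."""
    cmd = gain * heading_error(target_deg, current_deg)
    return max(-limit, min(limit, cmd))
```

Wrapping the error into [-180, 180) keeps the boat from turning the long way around when the heading crosses north.
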

# Criterion For Success

1. Implement dual mode capability

2. Boat can maintain a given compass heading after being switched to autonomous mode and incorporates a “return to base” feature that returns the sailboat back to its starting position

3. Boat can record and transmit servo, sensor, and position data back to base
