# 36: Personal Carrier Robot

Honorable Mention

TA: Raman Singh
Team Members:
- Okan Kocabalkanli (okan2)
- Deniz Caglar (dcaglar2)
- Jirawatchara Tanthiptham (jt20)

# Problem

Some individuals are unable to carry objects by themselves. For example, elderly individuals may be unable to carry heavy groceries.

# Solution

We propose a path-finding robot that follows the individual while avoiding obstacles. Obstacles will be detected with ultrasonic depth imaging: a series of rotating ultrasonic sensors will image the robot's surroundings. The person of interest will send GPS data to the robot over Bluetooth, and a second GPS chip will be mounted on the robot. The robot will compute the distance between itself and the person from the two GPS fixes and move in the correct direction using the heading provided by an onboard compass chip. Combining the obstacle data with the goal direction, a path-finding/SLAM algorithm will direct and move the robot through the terrain.
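The distance-and-heading computation from two GPS fixes can be sketched with the standard haversine and initial-bearing formulas (this is an illustrative sketch, not the team's implementation):

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg, 0 = north,
    clockwise) from point 1 (robot) to point 2 (user)."""
    R = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)

    # Haversine formula for the distance
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))

    # Initial bearing toward the user
    y = math.sin(dlmb) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlmb)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing
```

The robot would compare the returned bearing against the compass heading to decide which way to turn.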

# Solution Components

## Mechanical
This subsystem will encompass the frame for mounting other components as well as the propulsion system of the unit. The system will be rear-wheel driven with each wheel powered by separate motors to allow for differential steering.

### Components:
- Wooden chassis
- A tank drive system with 4 wheels
- 2 DC motors
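Differential steering with two independently driven motors reduces to a simple wheel-speed mapping. A minimal sketch, assuming a hypothetical 0.30 m track width:

```python
def wheel_speeds(v, omega, track_width=0.30):
    """Convert a desired linear velocity v (m/s) and angular velocity
    omega (rad/s, CCW positive) into left/right wheel speeds for a
    differential-drive robot. track_width (m) is the distance between
    the driven wheels (0.30 m is an assumed placeholder value)."""
    v_left = v - omega * track_width / 2.0
    v_right = v + omega * track_width / 2.0
    return v_left, v_right
```

Equal speeds drive straight; opposite speeds spin the robot in place.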

## Power Management
This subsystem powers the rest of the circuit, including the PCB and the motors.

### Components:
- A LiPo battery
- LiPo battery charging circuit

## PCB
This subsystem is the sensor suite and brain of our system, performing simultaneous localization and mapping (SLAM) and path finding. It generates a PWM signal for the stepper motor, which rotates the ultrasonic imaging sensors to cover a full field of view. From the measured ultrasonic sensor data, obstacles in the system's environment are mapped. The subsystem combines this map with data received from the RPI subsystem over SPI for path finding: when the user is in line of sight, the MCU uses the distance data from the RPI subsystem's camera; when the user is out of line of sight, the MCU uses the user's gyroscope and accelerometer data forwarded by the RPI subsystem. In either case, the user's location is set as the target point, and a Kalman filter predicts the trajectories of the mapped obstacle points. From this trajectory information, the subsystem builds a probability grid of fixed-size blocks, each holding a collision probability. A path-finding algorithm such as A* then draws a path between blocks to the target point to find the safest and shortest route, and the subsystem controls the DC motors accordingly.
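The Kalman prediction step for an obstacle's trajectory can be sketched with a constant-velocity model (the state layout and noise values here are illustrative assumptions, not the team's actual filter):

```python
import numpy as np

def kf_predict(x, P, dt, q=0.1):
    """One Kalman prediction step under a constant-velocity model.
    State x = [px, py, vx, vy]; P is the 4x4 state covariance;
    q scales a simplified process-noise covariance (assumed value)."""
    F = np.array([[1.0, 0.0,  dt, 0.0],
                  [0.0, 1.0, 0.0,  dt],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])
    Q = q * np.eye(4)          # placeholder process noise
    x_pred = F @ x             # propagate position by velocity * dt
    P_pred = F @ P @ F.T + Q   # propagate uncertainty
    return x_pred, P_pred
```

Repeatedly applying the predict step extrapolates where each mapped obstacle will be, which is what feeds the collision probabilities in the grid.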

### Components:
- A microcontroller
- DC Motor controller
- Step Motor Controller (TB67S128FTG)
- Radar Imaging System Connector
- Programmer Circuit
- SPI Connection circuit to RPI
- Simultaneous localization and mapping (SLAM) Algorithm
- Kalman filter for Obstacle tracking and prediction
- Roadmap/Grid path planning with A*
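The grid-based A* search described above can be sketched as follows; the cost weighting and blocking threshold are illustrative assumptions:

```python
import heapq

def a_star(grid, start, goal, p_max=0.5):
    """A* over a 2-D collision-probability grid (values in [0, 1]).
    Cells above p_max are treated as blocked; step cost grows with
    collision probability so the search prefers safer cells.
    Returns the list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])

    def h(a, b):  # Manhattan-distance heuristic (4-connected grid)
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    open_set = [(h(start, goal), 0.0, start)]
    came_from = {}
    g = {start: 0.0}
    while open_set:
        _, g_cur, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        if g_cur > g.get(cur, float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if not (0 <= nxt[0] < rows and 0 <= nxt[1] < cols):
                continue
            if grid[nxt[0]][nxt[1]] > p_max:
                continue  # too likely to collide
            cost = g_cur + 1.0 + 10.0 * grid[nxt[0]][nxt[1]]  # risk-weighted step
            if cost < g.get(nxt, float("inf")):
                g[nxt] = cost
                came_from[nxt] = cur
                heapq.heappush(open_set, (cost + h(nxt, goal), cost, nxt))
    return None
```

Trading off the risk weight against the per-step cost of 1.0 tunes how far the robot will detour to avoid probable collisions.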

## RPI
This subsystem obtains and processes the data necessary for simultaneous localization and mapping (SLAM) using a Raspberry Pi. Using a camera, the robot will detect a fixed-size tag; because the tag's physical size is known, its apparent size in the camera image lets us estimate its distance. This distance will be passed to the MCU over SPI. If a person blocks the camera view, we switch to a "search mode" in which the RPI forwards the phone's heading information (accelerometer, gyroscope) to the MCU, which then follows the same heading as the user while avoiding obstacles until the camera reacquires the user.

### Components:
- Raspberry Pi
- Bluetooth connection
- RPI Camera
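The distance-from-tag-size estimate follows from the pinhole camera model; a minimal sketch, assuming the focal length in pixels is known from camera calibration:

```python
def tag_distance(focal_length_px, tag_size_m, tag_size_px):
    """Estimate camera-to-tag distance via the pinhole model:
    distance = f * real_size / apparent_size.
    focal_length_px comes from camera calibration (assumed known);
    tag_size_m is the printed tag's physical width."""
    return focal_length_px * tag_size_m / tag_size_px
```

For example, a 0.2 m tag appearing 50 px wide through a 500 px focal length implies the tag is 2 m away.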

## User
This is the subsystem that directly interacts with our users. It uses a mobile app to send the user's GPS data over Bluetooth. For prototyping, we plan to use an app called "Blynk", which lets users transfer sensor data from a smartphone via Bluetooth.

### Components:
- Smartphone

# Criterion For Success
- The robot should be able to consistently follow the phone holder over flat terrain containing solid, well-defined obstacles.
- The person of interest can be 3-10 meters away from the robot.
- The obstacles should have a height of at least 30 cm over ground level.
- The robot should also be able to carry a load of 3 kg over level ground.


[Discussion thread](https://courses.engr.illinois.edu/ece445/pace/view-topic.asp?id=72004)


Smart Frisbee

Ryan Moser, Blake Yerkes, James Younce

Featured Project

The idea of this project is to improve upon the 395 project 'Smart Frisbee' done by a group that included James Younce. The improvements would be to create a wristband with low-power, short-range RF capabilities that would transmit a user ID to the frisbee, allowing the frisbee to know which player is holding it. Furthermore, the PCB from the 395 course would be used as a point of reference, but significantly redesigned to introduce the transceiver, a high-accuracy GPS module, and any other parts that could be modified to decrease power consumption. The frisbee's current sensors are a GPS module and an MPU-6050, which houses an accelerometer and gyroscope.

The software of the system on the frisbee would be redesigned and optimized to record various statistics as well as improve gameplay tracking features for teams and individual players. These statistics could be player specific events such as the number of throws, number of catches, longest throw, fastest throw, most goals, etc.

The new hardware would improve the frisbee's ability to properly moderate gameplay and improve "housekeeping", such as ensuring that an interception by the other team in the end zone is not counted as a score. Further improvements would be seen on the software side: the frisbee in its current iteration will score as long as it was thrown over the end zone, and the only way to eliminate false goals is to press a button within a 10-second window after the goal.