# 21 Campus Tour Guide by AI-Powered Autonomous System

Team Members: Bob Jin, Hao Ren, Weiang Wang, Yuntong Gu
TA: Simon Hu

This [link]( contains the HTML version of our project description.

# Team Members

* Hao Ren 3200110807 haor2
* Xuanbo Jin 3200110464 xuanboj2
* Weiang Wang 3200111302 weiangw2
* Yuntong Gu 3200110187 yuntong7

> 💡 Note: this doc provides an overview of the project "Campus Tour Guide by AI-Powered Autonomous System". We start by reiterating the problem. We then present our proposal and solution. We also draft an initial plan to help build the `v0` solution.

# 👀 Problem

Entering a place for the first time, such as a university campus, can be quite challenging. Knowing where you are, how to get to your destination, how to optimize your route, and which factors will influence that route is complicated, so a real-time interactive system that guides people through this process is needed. Such guidance has been demonstrated but has not scaled, because existing systems are not open-sourced, their hardware is neither standardized nor cheap, and their interaction is not versatile enough to adapt to ever-changing applications. A cheap and versatile solution is needed.


# 💭 Proposal

## Solution Overview

Our solution uses an autonomous UAV to guide our clients. A sensor module senses the clients and the environment, such as obstacles and the drone's location, while a control unit orchestrates the resulting series of tasks. Our solution is cheap, open-sourced, and versatile enough to meet the need for a generalized, sustainable long-term solution for our campus and many other applications.

## Solution Components

Our solution contains the following parts: a sensor subsystem, a control subsystem, a mobility subsystem, and an inter-connect module.

### Sensor Subsystem

- Identify obstacles
- Identify the person to lead, excluding other people
- Sense the GPS location

### Control Subsystem

- Deploy routes

### Mobility Subsystem

- A drone

### Inter-connect Module

- Inter-communication between the control unit, peripheral sensors, and the drone
- Power supply to the sensor module and control unit
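
The way these subsystems fit together can be previewed as a single control-loop tick. Everything below is a hypothetical sketch, not a finalized API: the `SensorReading` fields, the command strings, and the 15 m follow threshold are all placeholders.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    # Hypothetical fused reading produced by the sensor subsystem.
    drone_gps: tuple       # (lat, lon) of the drone
    person_gps: tuple      # (lat, lon) of the guided person
    obstacle_ahead: bool   # obstacle detected on the current heading

def control_step(reading: SensorReading, max_follow_m: float = 15.0) -> str:
    """One tick of the control subsystem: decide the drone's next action.

    Returns a command string consumed by the mobility subsystem.
    The distance here uses a crude flat-earth approximation
    (~111 km per degree) purely for illustration.
    """
    dlat = reading.drone_gps[0] - reading.person_gps[0]
    dlon = reading.drone_gps[1] - reading.person_gps[1]
    dist_m = (dlat ** 2 + dlon ** 2) ** 0.5 * 111_000

    if reading.obstacle_ahead:
        return "replan_route"
    if dist_m > max_follow_m:
        return "hover"   # wait for the person to catch up
    return "advance"
```

The point of the sketch is the data flow: sensors produce one fused reading per tick, the control subsystem reduces it to a command, and the mobility subsystem only ever sees commands.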

## Criteria for Success

### Milestone 1

- Drone can be controlled and moved independently
- GPS module can sense the location
- Sensors can be powered
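
Validating the GPS milestone amounts to turning two fixes into a distance. A standard haversine computation (not project-specific code) is enough for that check:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6_371_000  # mean Earth radius in metres
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * R * asin(sqrt(a))
```

For example, two fixes one degree of longitude apart on the equator come out to roughly 111 km, a quick sanity check that the GPS module's readings are being decoded correctly.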

### Milestone 2

- Drone can be controlled by the control subsystem
- Control subsystem can receive signals from the GPS module and sensors
- Routes can be output (not necessarily by moving the drone)

### Milestone 3

- Without obstacles, the system can follow the human
- Without obstacles, the system can fly from A to B and slow down / stop when the human is too far away
- System can identify obstacles and plan a route to avoid them
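
One way to prototype the "plan a route to avoid obstacles" criterion is breadth-first search on a coarse occupancy grid. This is an assumed prototype representation, not the final planner, which would have to work in continuous space with the drone's dynamics:

```python
from collections import deque

def plan_route(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (0 = free, 1 = obstacle).

    Returns a list of (row, col) cells from start to goal, or None if
    the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    q = deque([start])
    while q:
        cell = q.popleft()
        if cell == goal:
            path = []
            while cell is not None:       # walk back along predecessors
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                q.append((nr, nc))
    return None
```

BFS is enough for milestone testing because it guarantees a shortest path on an unweighted grid; a weighted planner (A* with flight costs) could replace it later without changing the interface.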

### Milestone 4

- With obstacles, the system can fly from A to B and slow down / stop when the human is too far away
- The starting point and destination pairs can be selected, e.g. 5 pairs of (A, B) are available
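
The "slow down / stop when the human is too far away" behaviour can be expressed as a speed-scaling rule. The thresholds and cruise speed below are illustrative assumptions to be tuned in field tests, not measured values:

```python
def follow_speed(dist_to_person_m, cruise_mps=3.0,
                 slow_at_m=8.0, stop_at_m=15.0):
    """Scale drone speed by how far behind the guided person is.

    Within slow_at_m: full cruise speed. Between slow_at_m and
    stop_at_m: linear ramp down. Beyond stop_at_m: hover (speed 0).
    """
    if dist_to_person_m >= stop_at_m:
        return 0.0
    if dist_to_person_m <= slow_at_m:
        return cruise_mps
    frac = (stop_at_m - dist_to_person_m) / (stop_at_m - slow_at_m)
    return cruise_mps * frac
```

A linear ramp (rather than a hard stop at one threshold) avoids the drone oscillating between full speed and hover when the person walks near the boundary distance.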

### Milestone 5 [optional]

- A simple web app that sends signals to the system
- System can receive vocal instructions, designate a destination, and lead the clients
- Support an interactive chatting mode to help clients understand the surroundings
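
For the optional web app, the contract between app and system could be as small as one HTTP endpoint. The path `/destination` and the `{"to": ...}` payload below are placeholder choices, sketched with the standard library only:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class GuideHandler(BaseHTTPRequestHandler):
    """Accepts POST requests with a JSON body like {"to": "Library"}.

    In the real system the parsed command would be forwarded to the
    control subsystem; here we simply acknowledge it.
    """
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        cmd = json.loads(self.rfile.read(length) or b"{}")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps({"ack": cmd.get("to")}).encode())
```

Keeping the protocol to plain JSON over HTTP means the same endpoint could later be driven by the voice front end as well as the web app.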

## Alternatives

*SKYCALL* currently provides a similar guided-tour drone for MIT, but that project isn't open-sourced and its hardware is neither cheap nor easy to maintain. Our solution is different in that we provide:

- A cheap solution
- An open-sourced solution (software + hardware), with each component documented
- Generality: unnecessary functionality gives way to a general design
- Versatility to support our campus (which differs drastically from MIT's)


# 🛫 Division of Work

- Xuanbo Jin: Xuanbo excels at software work. He should do the algorithm part of the design and also take part in firmware integration.
- Yuntong Gu: Yuntong's strong background in electrical engineering makes him a great candidate to test the validity of the different hardware and connect it to the drone. He should also help with the communication between components.
- Weiang Wang: Weiang's strong background in electrical engineering means he should actively work on the communication and interfaces between components.
- Hao Ren: Hao can do assorted work. He should actively work on the software and firmware parts, explore the validity of possible directions, and iterate project versions properly. Hao should also organize the roadmap and update it frequently, examining the priority of each part through experimentation and analysis.

# Augmented Reality and Virtual Reality for Electromagnetics Education

Zhanyu Feng, Zhewen Fu, Han Hua, Daosen Sun

Featured Project


Many students find electromagnetics a difficult subject to master, partly because electromagnetic waves cannot be visualized directly with our own eyes. The subject thus becomes a mathematical abstraction that relies heavily on formal derivations.


We focus on using AR/VR technology for large-scale, complex, and interactive visualization of electromagnetic waves. To speed up the calculation, we will compute the field responses and render the fields in real time, likely accelerated by GPU computing, cluster computation, and other advanced numerical algorithms. We also propose to offer public, immersive, and interactive education to users. We plan to use the existing VR equipment in the VR square at laboratory building D220 to present users with a wide field of view and high-resolution, high-quality 3D stereoscopic images, making the virtual environment closely comparable to the real world. Users can work together and interact with each other while maneuvering the virtual objects. This project also lays the groundwork for digital-twin technology for electromagnetics that effectively links the real world with digital space.


1. Numerical computation component: responsible for computing the field lines via Maxwell's equations. We will try to offload the work to the GPU for better performance.

2. Graphic rendering component: receives data from the numerical computation component and uses renderers to visualize it.

3. User interface component: processes users' actions and allows users to interact with objects in the virtual world.

4. Audio component: generates audio based on the electromagnetic fields of charged objects.

5. Haptic component: interacts with the controller to send vibration feedback to users based on the field strength.
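
The simplest instance of what the numerical computation component must do is the electrostatic case: superposing Coulomb contributions of point charges. This is a minimal scalar sketch under that electrostatic assumption; the real solver would evaluate full Maxwell dynamics on a dense grid:

```python
# Electrostatic E-field of point charges by superposition (Coulomb's law),
# the static special case of the Maxwell system the component will solve.
K = 8.9875e9  # Coulomb constant, N*m^2/C^2

def e_field(charges, point):
    """Sum the E contributions of (q, (x, y, z)) charges at a field point.

    Returns the field vector (Ex, Ey, Ez) in V/m.
    """
    ex = ey = ez = 0.0
    px, py, pz = point
    for q, (x, y, z) in charges:
        dx, dy, dz = px - x, py - y, pz - z
        r2 = dx * dx + dy * dy + dz * dz
        r = r2 ** 0.5
        scale = K * q / (r2 * r)  # k*q/r^2 along the unit displacement vector
        ex += scale * dx
        ey += scale * dy
        ez += scale * dz
    return ex, ey, ez
```

For a 1 nC charge at the origin, the field 1 m away along x comes out to about 8.99 V/m, which is the textbook value and a handy unit test for the component.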


We will set up four distinct experiments to illustrate the four Maxwell equations. Students can work together, using controllers to place different types of charged objects and adjust their orientation and position, and can see both static and real-time electromagnetic fields around the charged objects via VR devices. We aim to achieve high frame rates in the virtual world and to accelerate the computation with advanced algorithms so that the rendered electromagnetic fields are smooth.


We will build four distinct scenarios based on the four Maxwell equations, rather than the single Gauss's Law scenario made by the UIUC team. In these scenarios, we will render both electric and magnetic field lines around charged objects, as well as the forces between them.

The experiments allow users to interact with objects simultaneously; in other words, users can cooperate with each other while conducting experiments. While the lab scene made by the UIUC team only allows one user to do the experiment alone, we make the experiments public and allow multiple users to engage in them at once.

We will use different hardware for the computation. Rather than relying on the CPU, we will parallelize the calculation on the GPU to improve performance and support large-scale visualization of the fields for multiple users.
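
The data-parallel structure that makes GPU offloading attractive can be previewed on the CPU with NumPy: every grid cell's field value is independent, so one array expression covers the whole grid. The grid size, extent, and unit charge below are illustrative; swapping the array backend for a GPU library of the same shape (e.g. CuPy) would leave the code structure unchanged:

```python
import numpy as np

def field_magnitude_grid(charge_xy, n=64, extent=1.0):
    """|E| of a unit point charge on an n*n grid, computed array-at-a-time.

    Returns field magnitudes up to the constant k*q; the tiny epsilon
    avoids division by zero at the charge location.
    """
    xs = np.linspace(-extent, extent, n)
    X, Y = np.meshgrid(xs, xs)
    dx, dy = X - charge_xy[0], Y - charge_xy[1]
    r2 = dx ** 2 + dy ** 2 + 1e-12
    return 1.0 / r2   # one expression evaluates all n*n cells at once

mags = field_magnitude_grid((0.0, 0.0))
```

Because there is no per-cell Python loop, the same expression maps onto thousands of GPU threads, which is exactly the parallelism the multi-user rendering budget depends on.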

Compared to the UIUC project, we will not only visualize the fields but also expand the dimensions in which the phenomena can be perceived, i.e., adding haptic feedback and audio feedback to give users a 4D experience.