# 44: Brain-controlled portable programmable embedded system

Research Award

Team Members: Shiyang Liu, Xuanyu Zhong, Yujie Chen

Documents: design_document0.pdf, final_paper0.pdf, presentation0.pdf, proposal0.pdf, video0.mp4
Nowadays, people control modern computing systems and consumer electronics with their hands: we type on keyboards or swipe on tablets with our fingers as a means of input. Many people also take advantage of voice control every day, which is considered one of the more innovative input methods. Following the trend of how technology is developing today, we see the next step in input as making use of our brains.

Imagine needing to look at the next step of a recipe while cooking, with our hands covered in food. Swiping on an iPad then sounds very tedious. Instead, wouldn't it be nice to turn to the next page of the cookbook just by staring at a specific region on the screen? This region blinks at a predefined frequency. When we look at it, our brains also "blink" at the same frequency, and the generated signals can be captured and distinguished from signals at other frequencies, which in turn allows various control options (not just flipping recipe pages).
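The frequency-tagging idea above can be sketched in a few lines: compare the spectral power of the recorded signal at each candidate blink frequency and pick the strongest. Everything below (the 250 Hz sampling rate, the candidate frequencies, the synthetic signal) is an illustrative assumption, not part of our actual design.

```python
import numpy as np

def dominant_target(signal, fs, targets):
    """Return the candidate blink frequency (Hz) whose spectral
    power in `signal` is largest.

    signal  : 1-D array of EEG samples
    fs      : sampling rate in Hz
    targets : candidate blink frequencies in Hz
    """
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    # For each candidate frequency, take the power at the nearest FFT bin.
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in targets]
    return targets[int(np.argmax(powers))]

# Simulated 2-second recording: a 7 Hz response buried in noise.
np.random.seed(0)
fs = 250
t = np.arange(0, 2.0, 1.0 / fs)
eeg = np.sin(2 * np.pi * 7 * t) + 0.5 * np.random.randn(len(t))
print(dominant_target(eeg, fs, [7, 12, 15]))  # prints 7
```

A real system would add windowing, harmonics, and a decision threshold so that "looking nowhere" does not trigger a command, but the core classification step is this simple.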

The goal of this project is to build a prototype of a brain-controlled, portable, programmable embedded system with an LCD screen that supports basic everyday computing tasks through its user interface. Using electroencephalography, our device will be built on top of a microcontroller that reads various signals from our brains, enabling hands-free interaction between the user and the computing system, with the results shown on a built-in LCD display.

A simple diagram illustrating the basic idea of this project can be found here:
http://i1285.photobucket.com/albums/a599/sc21cn/ScreenShot2013-01-31at25324PM_zps11908c3f.png
[Note that our project consists of the microcontroller, the LCD screen, some other hardware components, and the wireless part. The diagram represents what we propose to do within this semester (a prototype of sorts). It could be made more advanced in the future, for example by integrating the screen into the glasses or caps people wear every day, but that is left for future consideration.]

# Cloud-controlled quadcopter

Featured Project

Team Members: Anuraag Vankayala, Amrutha Vasili

Idea:

To build a GPS-assisted, cloud-controlled quadcopter for consumer-friendly aerial photography.

Design/Build:

We will be building a quad from the frame up. Each of the four motors will have an electronic speed controller to balance the craft and handle the control inputs, received from an 8-bit microcontroller (the autopilot, AP), required for flight. The firmware will be tweaked slightly to allow the flight modes our project specifically requires. A companion computer such as the Erle Brain will be connected to the AP and to the cloud (EC2). We will build a codebase for the flight controller to navigate the quad. This involves sending messages per the MAVLink spec for sUAS between the companion computer and the AP to poll sensor data, voltage information, etc. The companion computer will also talk to the cloud via a UDP port to receive requests and process them with our code. Users request media capture via a phone app that talks to the cloud over an internet connection.
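The companion computer's request path (cloud to quad over UDP) can be sketched as below. The plain-text message format (`CAPTURE <lat> <lon>`) and the port number are placeholder assumptions of ours, not part of the MAVLink spec or any EC2 interface.

```python
import socket

def parse_request(datagram: bytes):
    """Parse a placeholder request such as b'CAPTURE 40.11 -88.22'
    into a command dict; return None for anything unrecognized."""
    parts = datagram.decode("ascii", errors="replace").split()
    if len(parts) == 3 and parts[0] == "CAPTURE":
        try:
            return {"cmd": "CAPTURE",
                    "lat": float(parts[1]),
                    "lon": float(parts[2])}
        except ValueError:
            return None
    return None

def serve_once(port=14550):
    """Block until one UDP request arrives, then return the parsed command."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("0.0.0.0", port))
        data, _addr = sock.recvfrom(1024)
        return parse_request(data)
```

In the real system the parsed command would be translated into MAVLink messages for the AP; here the sketch only shows the cloud-facing half of that loop.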

Why is it worth doing:

There is currently no consumer-friendly solution that lets anyone capture aerial photographs of themselves, their family, or a nearby event with a simple tap on a phone. Present-day off-the-shelf alternatives are relatively expensive and require owning and carrying bulky equipment such as the quad and remote. Our idea allows for safe and responsible use of drones: the proposed solution is autonomous, has several safety features, is context-aware (terrain information, no-fly zones, NOTAMs, etc.), and integrates seamlessly with the federal airspace.

End Product:

Quads that are ready for the connected world: capable, from the user's standpoint, of flying autonomously and performing maneuvers safely, with a very simple UI for the common user. Specifically, quads that are deployed on demand, without the hassle of ownership.

Similar products and comparison:

Current solutions include RTF (ready-to-fly) quads such as the DJI Phantom and the Kickstarter project Lily, which are heavily user-dependent or user-centric. The Phantom requires you to carry a bulky remote with multiple antennas, and its flight radius can be reduced by interference from nearby conditions. Lily requires the user to carry a tracking device and cannot shoot a subject other than you. Lily also has a maximum altitude of 15 m above the user, which is below the tree line and prone to crashes.

Our solution differs in several ways. It intends to be location- and/or event-centric. We propose that users need not own quads; a user can capture a moment with just a phone. As long as a user is in the service area and weather conditions permit, safety and the knowledge needed to control the quad are entirely abstracted away. The only question left to the user is what should be in the picture at a given time.
