# Any-Screen to Touch-Screen Device

Team Members:
- Sakhi Yunalfian (sfy2)
- Muthu Arunachalam (muthuga2)
- Zhengjie Fan (zfan11)

# Problem

While touchscreens are becoming increasingly popular, not all screens come equipped with touch capabilities. Upgrading or replacing non-touch displays with touch-enabled ones can be costly and impractical. Users need an affordable and portable solution that can turn any screen into a fully functional touchscreen.

# Solution

The any-screen-to-touch-screen device uses four ultra-wideband (UWB) sensors attached to the corners of a screen to detect the position of a specially designed pen or hand wearable. UWB is a positioning technology that is lower-cost than lidar or camera systems yet more accurate than Bluetooth, Wi-Fi, or RFID. Because UWB is highly accurate, we will use these sensors to track the location of a UWB antenna placed in the pen. In addition to the UWB tag, the pen will feature a touch-sensitive tip to detect contact with the screen, along with a redundant button that simulates screen contact for users who prefer not to touch the screen constantly. The pen will also have a gyroscope and low-profile buttons to track tilt data and offer customizable hotkeys/shortcuts. The pen and sensors communicate wirelessly with the microcontroller, which converts the pen's input data and its location on the screen into touchscreen-like interactions.


# Solution Components
## Location Sensing Subsystem (Hardware)
This subsystem will employ Spark Microsystems' SR1010, a digitally programmable ultra-wideband wireless transceiver. The transceiver will be housed in an enclosure that can be attached to the corners of a screen or monitor. Each sensor unit will also need a Bluetooth module in order to communicate with the microcontroller.
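
As a reference for how a range measurement could be derived, the sketch below converts single-sided two-way-ranging timestamps into a distance. The timestamp source and tick resolution depend on the actual SR1010 driver, which we have not yet confirmed, so the signature here is a placeholder.

```c
#include <stdint.h>

#define SPEED_OF_LIGHT_M_PER_S 299792458.0

/* Single-sided two-way ranging (illustrative; real UWB ranging also
 * needs clock-drift compensation between the two radios). */
double twr_distance_m(uint64_t t_round_ticks, /* initiator: poll TX -> response RX */
                      uint64_t t_reply_ticks, /* responder: poll RX -> response TX */
                      double seconds_per_tick)
{
    /* One-way time of flight is half of (round trip - responder turnaround). */
    double tof_s = ((double)t_round_ticks - (double)t_reply_ticks)
                   * seconds_per_tick / 2.0;
    return tof_s * SPEED_OF_LIGHT_M_PER_S;
}
```
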
## Signal Processing Subsystem (Hardware and Software)
This subsystem is built around an STM32F4-series microcontroller (STM32F407 or STM32F429). Real-time sensor data processing demands considerable computing power, and the STM32F4 series includes DSP instructions that speed up raw data processing and noise reduction. This subsystem will perform trilateration on the measured UWB ranges to accurately estimate the pen's location on the screen, while providing smooth real-time data processing, latency minimization, sensitivity, and noise reduction.
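
A minimal sketch of the position solve, assuming each corner anchor reports a range to the pen: subtracting one range equation from the others linearizes the problem, and a small least-squares solve recovers the 2D position. A production version would add filtering and outlier rejection on top of this.

```c
#include <math.h>

/* 2D multilateration from four corner anchors. ax[i], ay[i] are the corner
 * positions in screen coordinates, d[i] the measured UWB ranges.
 * Returns 0 on success, -1 if the anchor geometry is degenerate. */
int trilaterate_2d(const double ax[4], const double ay[4],
                   const double d[4], double *px, double *py)
{
    double AtA[2][2] = {{0, 0}, {0, 0}};
    double Atb[2] = {0, 0};

    for (int i = 1; i < 4; i++) {
        /* Subtracting the anchor-0 circle equation linearizes the system:
         * 2(xi-x0)x + 2(yi-y0)y = d0^2 - di^2 + xi^2 - x0^2 + yi^2 - y0^2 */
        double A0 = 2.0 * (ax[i] - ax[0]);
        double A1 = 2.0 * (ay[i] - ay[0]);
        double b  = d[0] * d[0] - d[i] * d[i]
                  + ax[i] * ax[i] - ax[0] * ax[0]
                  + ay[i] * ay[i] - ay[0] * ay[0];
        AtA[0][0] += A0 * A0;  AtA[0][1] += A0 * A1;
        AtA[1][0] += A1 * A0;  AtA[1][1] += A1 * A1;
        Atb[0] += A0 * b;      Atb[1] += A1 * b;
    }

    /* Solve the 2x2 normal equations by Cramer's rule. */
    double det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0];
    if (fabs(det) < 1e-9)
        return -1;
    *px = (Atb[0] * AtA[1][1] - Atb[1] * AtA[0][1]) / det;
    *py = (AtA[0][0] * Atb[1] - AtA[1][0] * Atb[0]) / det;
    return 0;
}
```
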
Each sensor also needs a Bluetooth module so that it can send its raw data to the microcontroller; we plan for the links from both the sensors and the pen to the microcontroller to be wireless. One module we are considering is the HC-05 Bluetooth module.
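
Since the HC-05 exposes a plain serial (SPP) link, the sensor data will need some framing. The layout below is only an illustration of what such a frame might look like; the field sizes and IDs are our own assumptions, not a defined protocol.

```c
#include <stdint.h>

/* Illustrative frame for sensor -> microcontroller traffic over the
 * HC-05 serial link. */
typedef struct __attribute__((packed)) {
    uint8_t  start;       /* fixed sync byte, e.g. 0xA5 */
    uint8_t  sensor_id;   /* corner anchor 0-3, or 4 for the pen */
    uint32_t range_ticks; /* raw UWB round-trip tick count */
    uint8_t  checksum;    /* XOR of all preceding bytes */
} sensor_frame_t;

uint8_t frame_checksum(const uint8_t *bytes, int n)
{
    uint8_t c = 0;
    for (int i = 0; i < n; i++)
        c ^= bytes[i];
    return c;
}
```
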
The microcontroller itself will be wired to the host computer via USB 2.0 to transfer the resulting touchscreen interactions.
## Pen/Hand Wearable Subsystem (Hardware)
The pen subsystem will employ a simple spring switch as the pen tip to detect pen-to-screen contact. We will also use a SparkFun DEV-08776 LilyPad button to simulate a press/pen-to-screen contact, both for redundancy and for users who wish to control the pen without touching the screen. The pen will also contain several low-profile buttons and an STMicroelectronics LSM6DSO32TR gyroscope/accelerometer to provide further customizable pen functionality and potentially aid in motion-tracking calculations. The pen will carry a Taoglas UWC.01 ultra-wideband tag so the location sensing subsystem can detect it, and a Bluetooth module for communication with the microcontroller. The unit will be enclosed in a plastic or 3D-printed housing.
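
For the tilt feature, a static estimate can be read straight off the accelerometer's gravity vector, as sketched below. This assumes the sensor's z-axis is mounted along the pen's long axis (our assumption about mounting); once the pen is moving, the gyroscope rate would need to be fused in as well.

```c
#include <math.h>

/* Static pen tilt from the gravity vector. ax/ay/az are accelerations in g
 * along the pen's body axes, with z along the pen's long axis. Returns the
 * angle between the pen axis and vertical, in degrees. */
double pen_tilt_deg(double ax_g, double ay_g, double az_g)
{
    double c = az_g / sqrt(ax_g * ax_g + ay_g * ay_g + az_g * az_g);
    /* Guard acos() against rounding pushing the ratio past [-1, 1]. */
    if (c > 1.0)  c = 1.0;
    if (c < -1.0) c = -1.0;
    return acos(c) * 180.0 / M_PI;
}
```
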
## Touch Screen Emulation Subsystem (Software)
This subsystem uses a microcontroller with embedded HID device functionality to control the cursor of the computer connected to it. We are planning to use the STM32F4's built-in USB HID support to emulate touchscreen behavior. We will also include a simple GUI that lets the user customize the shortcuts mapped to the pen buttons and specify optional parameters such as screen resolution and screen curvature.
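
One way the report could look: the stock STM32Cube HID class ships a *relative* mouse, so an absolute-pointer report descriptor would have to replace it. The layout below and the 0-32767 logical range are our assumptions for illustration, not the final descriptor.

```c
#include <stdint.h>

/* Assumed absolute-pointer HID report for a custom report descriptor. */
typedef struct __attribute__((packed)) {
    uint8_t  buttons; /* bit 0: tip contact, bit 1: barrel button */
    uint16_t x;       /* 0..32767, left edge to right edge */
    uint16_t y;       /* 0..32767, top edge to bottom edge */
} touch_report_t;

static double clamp01(double v) { return v < 0.0 ? 0.0 : (v > 1.0 ? 1.0 : v); }

/* Map a pen position in metres (origin at the screen's top-left corner)
 * to HID logical units. */
void fill_report(touch_report_t *r, double x_m, double y_m,
                 double screen_w_m, double screen_h_m,
                 int tip_contact, int barrel_pressed)
{
    r->buttons = (uint8_t)((tip_contact ? 1 : 0) | (barrel_pressed ? 2 : 0));
    r->x = (uint16_t)(32767.0 * clamp01(x_m / screen_w_m));
    r->y = (uint16_t)(32767.0 * clamp01(y_m / screen_h_m));
    /* The filled report would then be handed to the USB stack, e.g. via
     * USBD_HID_SendReport() in the STM32Cube USB device library. */
}
```
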
## Power Subsystem (Hardware)
The power subsystem is not localized in one area, since our solution consists of multiple wireless devices; however, we specify all power requirements and solutions here for organizational purposes.
For the wireless sensors in our location sensing subsystem, we plan on using battery power. Given that the UWB transceiver has ultra-low power consumption and an internal DC-DC converter, it makes sense to power each sensor unit with a small 3.3 V 650 mAh rechargeable battery (potential option: [https://a.co/d/acFLsSu](https://a.co/d/acFLsSu)). We will include recharging capability via a micro-USB charging port.
For our pen, we also plan on using battery power. The gyroscope module, UWB antenna, and Bluetooth module all have low power consumption, so we plan on using the same rechargeable battery system specified above.
The microcontroller will be wired via USB 2.0 directly to the host computer to transmit mouse data/touchscreen interactions, and will draw 5 V at up to 500 mA (the USB 2.0 limit) through this connection.
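As a rough sanity check on battery sizing (our assumption, to be verified against datasheet duty cycles): if the Bluetooth module dominates consumption with an average draw of around 30 mA, a 650 mAh cell yields roughly 650 mAh / 30 mA ≈ 21 hours of operation per charge.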


# Criterion for Success
## Hardware

- The UWB sensor system is able to track the pen's location on the screen.
- The pen is able to detect clicks, screen contact, and tilt.
- The microcontroller is able to take input from the wireless pen and the wireless sensors.
- Each battery-powered unit is successfully powered and able to be recharged.

## Software

- The pen's input and sensor location data can be converted to mouse clicks and presses.
- The pen's buttons can be mapped to customizable shortcuts/hotkeys.

## Accuracy and Responsiveness

Touch detection and location accuracy is the most crucial criterion for our project's success. We expect our device to achieve 95% touch detection precision. In order to correctly drive a device's embedded HID protocols, the data the microcontroller sends to the device must keep the error low when cursor movements are compared against the wearable's actual location.
Touch recognition and responsiveness is the next most important criterion. Within a given distance threshold, we want the system to detect the pen with a relatively low margin of error, about 1% or less. More specifically, this criterion tests whether our communication protocol between the sensors, the USB HID peripheral, and the microcontroller can transfer data in real time so that the device can interpret it as cursor location updates, scrolls, clicks, and more.
Latency and lag should stay under 60 milliseconds, judged at the output of the DSP pipeline on the STM32F4 microcontroller.

## Reliability and Simplicity

We want our device to be easy to use: starting it and accessing its functionality should be intuitive and straightforward.
We also want our device to be durable, with low chances of battery failure, mechanical failure, and gradual system degradation.

## Integration and Compatibility

We want our device to integrate with screens of different sizes and form factors and with any operating system.

# Decentralized Systems for Ground & Aerial Vehicles (DSGAV)

# Team Members

* Yixiao Sun (yixiaos3)

* Mingda Ma (mingdam2)

* Jialiang Zhang (jz23)

# Problem Statement

Autonomous delivery over drone networks has become one of the new trends that can save a tremendous amount of labor. However, it is very difficult to scale up due to the inefficiency of multi-rotor collaboration, especially when the drones are carrying payload. In order to actually deploy in big cities, we could take advantage of the large ground vehicle network that already exists through rideshare companies like Uber and Lyft. The roof of an automobile has plenty of space to hold regular-size packages with magnets, and the drone network can then optimize for flight time and efficiency while factoring in ground vehicle plans. While dramatically increasing delivery coverage and efficiency, such a strategy raises the challenging problem of docking drones onto moving ground vehicles.

# Solution

Given the scope and time limitations, we aim to tackle one particular component of this project: a decentralized multi-agent control system that synchronizes a ground vehicle and a drone when they are in close proximity. Assumptions such as knowledge of vehicle states will be made, as this project aims at a proof of concept of one core challenge; as we progress, we aim to lift as many of those assumptions as possible. The lab infrastructure, drone, and ground vehicle will be provided by our kind sponsor, Professor Naira Hovakimyan. When the drone approaches the target and gains visual contact with the ground vehicle, it will automatically send a docking request through an RF module. The RF receiver on the vehicle will then automatically turn on assistant devices, such as specific LED light patterns, which aid motion synchronization between the ground and aerial vehicles. The ground vehicle will also periodically send its locally planned path to the drone so the drone can predict the ground vehicle's trajectory a couple of seconds into the future. This prediction helps the drone stay in close proximity to the ground vehicle by optimizing against a reference trajectory.
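
As an illustration of the trajectory prediction step, the sketch below propagates a vehicle state forward with a constant-velocity, constant-turn-rate model. This is only one simple choice of motion model; the actual planner output format is not specified here.

```c
#include <math.h>

/* Ground vehicle state: position (m), heading (rad), speed (m/s),
 * yaw rate (rad/s). */
typedef struct { double x, y, heading, speed, yaw_rate; } vehicle_state_t;

/* Predict the state dt_s seconds ahead under constant speed and turn rate. */
vehicle_state_t predict(vehicle_state_t s, double dt_s)
{
    vehicle_state_t out = s;
    if (fabs(s.yaw_rate) < 1e-6) {
        /* Straight-line motion. */
        out.x += s.speed * cos(s.heading) * dt_s;
        out.y += s.speed * sin(s.heading) * dt_s;
    } else {
        /* Exact integration along a circular arc. */
        out.x += s.speed / s.yaw_rate *
                 (sin(s.heading + s.yaw_rate * dt_s) - sin(s.heading));
        out.y += s.speed / s.yaw_rate *
                 (cos(s.heading) - cos(s.heading + s.yaw_rate * dt_s));
        out.heading += s.yaw_rate * dt_s;
    }
    return out;
}
```
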

### The hardware components include:

Provided by Research Platforms

* A drone

* A ground vehicle

* A camera

Developed by our team

* An LED based docking indicator

* RF communication modules (xbee)

* Onboard compute and communication microprocessor (STM32F4)

* Standalone power source for RF module and processor

# Required Circuit Design

We will integrate the power source, the RF communication module, and the LED tracking assistant together with our microcontroller on our PCB. The circuit will also automatically trigger the tracking assistant when a docking request is received, to facilitate the docking operation. This circuit is designed specifically to demonstrate the drone's ability to precisely track and dock onto the moving ground vehicle.

# Criterion for Success -- Stages

1. When the ground vehicle is moving slowly in a straight line, the drone can autonomously take off from an arbitrary location and end up following it in close proximity.

2. The drone remains in close proximity while the ground vehicle is slowly turning (or navigating arbitrarily at low speed).

3. The drone can dock autonomously onto the ground vehicle while it is moving slowly in a straight line.

4. The drone can dock autonomously onto the ground vehicle while it is slowly turning.

5. Increase the speed of the ground vehicle and successfully perform tracking and/or docking.

6. The drone can pick up packages while flying synchronously with the ground vehicle.

We consider the project complete at stage 3; the stages after that are advanced features, depending on actual progress.
