# Pitched Project: CfA Flying Area Accuracy Determination

TA: Sarath Saroj
Team members:
- Alex Hu (alexxh2)
- Juliana Temple (jtemple4)
- Bella Altenbach (ialten2)

# PROBLEM:
The Intelligent Robotics Lab Facility would like software that analyzes how consistently a drone's position can be tracked throughout the Flying Arena, given the configuration of the motion-tracking setup. The current motion-tracker system (Vicon Tracker 3) achieves millimeter-level position accuracy, but it may be less reliable in areas where the configuration does not allow optimal observation. The goal is to see how accuracy changes as the tracked object moves higher, lower, and farther into the arena, away from the cameras. Ideally, the calibration of the motion-tracking configuration can then be improved based on where we identify these areas of high and low accuracy.

# SOLUTION OVERVIEW:
Our solution must actively track the location and orientation of a test object or flying drone using infrared LEDs. To track accurately, we will build a calibration device using infrared LEDs (active markers) rather than reflective balls (passive markers), because the IR LEDs are triggered by the camera's flash, which allows location to be measured more reliably at greater distances. Additionally, we would like to assign each LED on the PCB its own ID so we can identify each marker individually and see the overall orientation in real time. A reference deck with initial LED placements on all four arms, aligned with the main propeller directions, has already been made, and we aim to design ours similarly. Using the Vicon Tracker 3 motion-capture system and cameras, location data will be continuously recorded and compared with the calibration device's actual location. We plan to design software that measures the relative error in these measurements and assesses where the points of higher and lower accuracy are. Based on this, we will be able to reconfigure the camera locations to collect the most accurate position-tracking data at all points throughout the arena.

Example Reference deck: https://www.bitcraze.io/2019/09/the-active-marker-deck/

## SOLUTION COMPONENTS:
- PCB
- Infrared LEDs
- Controllers to process data and orientation and actively track location
- Vicon motion-capture camera for triggering the LEDs and continuous recording
- Vicon Tracker 3 software
- Drone or test object to carry the PCB

## SUBSYSTEM 1: PCB and MARKERS
- A PCB containing multiple infrared LEDs that can be programmed into different configurations. These are active markers: they activate upon camera flashes. Because the LEDs emit light rather than merely reflecting it, they will be more effective at capturing data throughout the whole volume of the arena.
- Placing the LEDs in various configurations will help produce an accurate 3D location for the object, as they allow the user to differentiate between up, down, left, and right.
- The PCB will be mounted on the drone or test object discussed in Subsystem 2.
- This will allow the calibration to be tested.

## SUBSYSTEM 2: DRONE/TEST OBJECT
- The drone can either be flown around the arena with the PCB attached, or the PCB itself can be carried to different locations throughout the arena. As our group is new to flying drones, the latter may be the safer option, so as not to damage any equipment.

## SUBSYSTEM 3: MOTION TRACKER and CAMERA SYSTEM
- Vicon Tracker 3 software
  - This will allow the position of the object or drone specified in Subsystem 2 to be pinpointed.
- Vicon motion-capture camera
  - The camera's flash will trigger the IR LEDs, and the camera will continuously record data that can be analyzed by the user.

## SUBSYSTEM 4: SOFTWARE
- We will design software that compares the actual location with the location recorded in Subsystem 3 to determine how accurate the data is.
- Based on this, we will analyze where the areas of higher and lower accuracy are, and why.
- Recorded data from the motion-tracker setup will be streamed to the software over local WiFi.
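As a rough sketch of this error analysis (the function names are hypothetical; it assumes recorded and ground-truth positions are available as (x, y, z) tuples in millimetres):

```python
import math

def tracking_errors(recorded, actual):
    """Per-sample Euclidean error (mm) between the recorded and
    ground-truth positions, each an (x, y, z) tuple in millimetres."""
    return [math.dist(r, a) for r, a in zip(recorded, actual)]

def accuracy_map(samples, cell_mm=500):
    """Bucket errors into a coarse 3D grid of the arena so regions of
    high and low accuracy stand out. `samples` pairs each ground-truth
    position with its error; keys of the result are grid-cell indices."""
    cells = {}
    for pos, err in samples:
        key = tuple(int(c // cell_mm) for c in pos)
        cells.setdefault(key, []).append(err)
    # average error per cell
    return {k: sum(v) / len(v) for k, v in cells.items()}
```

Cells with the largest average error would then be the candidate regions for moving or re-aiming cameras.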

# CRITERION FOR SUCCESS:
To succeed in this project we will separate the subsystems and take on each individually, ensuring as we go that each component functions as expected. First, we will need to learn the Vicon software for the motion-capture aspects, as none of us have prior experience with it. Next, we will design a PCB with IR LEDs strategically placed to give the most efficient and readable orientation cues as the test object moves around the arena; it should be programmable so that various LED configurations can be tested to achieve optimal calibration. Once this is done, we will design a software program that determines the accuracy of the motion-tracked locations versus the actual locations. Understanding why accuracy is affected in certain places, and creating a working PCB, will be the key points for the project to succeed. If time permits, repeated testing can be done with the motion-tracking cameras placed at different locations to improve the accuracy.


# Musical Hand

Team Members:

- Ramsey Foote (rgfoote2)

- Michelle Zhang (mz32)

- Thomas MacDonald (tcm5)

# Problem

Musical instruments come in all shapes and sizes; however, transporting instruments often involves bulky and heavy cases. Not only can transporting instruments be a hassle, but the initial purchase and maintenance of an instrument can be very expensive. We would like to solve this problem by creating an instrument that is lightweight, compact, and low maintenance.

# Solution

Our project involves a wearable system on the chest and both hands. The left hand will be used to dictate the pitches of three “strings” using the relative angles between the palm and fingers. For example, starting from a flat horizontal hand, a small dip in one finger produces a low-frequency pitch, while a greater dip corresponds to a higher-frequency pitch. The right hand will modulate the generated sound by adding effects such as vibrato through lateral motion. Finally, the brains of the project will be the central unit, a wearable, chest-mounted subsystem responsible for audio synthesis and output.

Our solution would provide an instrument that is lightweight and easy to transport. We will be utilizing accelerometers instead of flex sensors to limit wear and tear, which would solve the issue of expensive maintenance typical of more physical synthesis methods.

# Solution Components

The overall solution has three subsystems: the left hand, the right hand, and the central unit.

## Subsystem 1 - Left Hand

The left hand subsystem will use four digital accelerometers total: three on the fingers and one on the back of the hand. These sensors will be used to determine the angle between the back of the hand and each of the three fingers (ring, middle, and index) being used for synthesis. Each angle will correspond to an analog signal for pitch with a low frequency corresponding to a completely straight finger and a high frequency corresponding to a completely bent finger. To filter out AC noise, bypass capacitors and possibly resistors will be used when sending the accelerometer signals to the central unit.
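A minimal sketch of this angle-to-pitch mapping, assuming each accelerometer's reading is dominated by gravity while the hand is held fairly steady; the function names and frequency range are illustrative, not part of the design:

```python
import math

def bend_angle(hand_g, finger_g):
    """Angle (degrees) between the gravity vectors measured on the back
    of the hand and on one finger: 0 deg = finger flat, larger = more
    bent. Assumes both accelerometers read mostly gravity (hand still)."""
    dot = sum(h * f for h, f in zip(hand_g, finger_g))
    mag = math.hypot(*hand_g) * math.hypot(*finger_g)
    # clamp to guard against floating-point drift outside [-1, 1]
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

def angle_to_pitch(angle_deg, f_lo=220.0, f_hi=880.0, max_angle=90.0):
    """Linear map from bend angle to frequency: a straight finger gives
    f_lo, a fully bent finger (max_angle) gives f_hi."""
    t = max(0.0, min(1.0, angle_deg / max_angle))
    return f_lo + t * (f_hi - f_lo)
```

The same mapping would run once per finger, yielding the three string frequencies sent to the central unit.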

## Subsystem 2 - Right Hand

The right hand subsystem will use one accelerometer to determine the broad movement of the hand. This information will be used to determine how much vibrato there is in the output sound. This system will need the accelerometer, bypass capacitors (0.1 uF), and possibly some resistors if they are needed for the communication scheme used (SPI or I2C).
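One possible way to turn that broad hand motion into a vibrato amount (a sketch; the scaling constants are illustrative assumptions, and gravity is assumed to already be removed from the samples):

```python
def vibrato_depth(accel_window, scale=0.02, max_depth=0.05):
    """Map a recent window of lateral acceleration samples (one axis,
    gravity removed) to a vibrato depth in [0, max_depth], expressed as
    a fraction of the base pitch. A still hand gives no vibrato;
    vigorous motion saturates at max_depth."""
    energy = sum(a * a for a in accel_window) / len(accel_window)
    return min(max_depth, scale * energy)
```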

## Subsystem 3 - Central Unit

The central subsystem utilizes data from the gloves to determine and generate the correct audio. To do this, two microcontrollers from the STM32F3 series will be used. The left and right hand subunits will be connected to the central unit through cabling. One of the microcontrollers will receive information from the sensors on both gloves and use it to calculate the correct frequencies. The other microcontroller uses these frequencies to generate the actual audio. The use of two separate microcontrollers allows for the logic to take longer, accounting for slower human response time, while meeting needs for quicker audio updates. At the output, there will be a second order multiple feedback filter. This will get rid of any switching noise while also allowing us to set a gain. This will be done using an LM358 Op amp along with the necessary resistors and capacitors to generate the filter and gain. This output will then go to an audio jack that will go to a speaker. In addition, bypass capacitors, pull up resistors, pull down resistors, and the necessary programming circuits will be implemented on this board.
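The synthesis step could be prototyped off-target along these lines (a Python sketch, not the STM32 firmware; vibrato is applied by modulating each string's phase increment, and all parameter values are illustrative):

```python
import math

def synth(freqs, vib_depth, vib_rate=5.0, sr=44100, dur=0.5):
    """Sum several sine 'strings' with a shared vibrato: each string's
    frequency is scaled by (1 + vib_depth * sin(2*pi*vib_rate*t)).
    Returns float samples in [-1, 1]."""
    n = int(sr * dur)
    out = []
    phases = [0.0] * len(freqs)
    for i in range(n):
        t = i / sr
        mod = 1.0 + vib_depth * math.sin(2 * math.pi * vib_rate * t)
        s = 0.0
        for k, f in enumerate(freqs):
            # advance each oscillator by its modulated phase increment
            phases[k] += 2 * math.pi * f * mod / sr
            s += math.sin(phases[k])
        out.append(s / len(freqs))
    return out
```

On the real hardware this loop would be split across the two microcontrollers as described above: one computing the frequencies, the other generating samples at the audio rate.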

# Criterion For Success

The minimum viable product will consist of two wearable gloves and a central unit that will be connected together via cords. The user will be able to adjust three separate notes that will be played simultaneously using the left hand, and will be able to apply a sound effect using the right hand. The output audio should be able to be heard audibly from a speaker.
