#Title: Lens Controller for Biomedical Cameras

#Team Members:

Siddharth Sharma

Kevin Sha

Jihun Kim

#Problem

In many surgical operations, the margin for error is very slim. This is especially true in cancer treatment, where surgical removal of a tumor is often one of the only viable treatments. Resecting a tumor requires a high degree of accuracy, so cameras that aid surgeons in the operating room could significantly reduce the risk of mistakes during tumor removal. According to a study, incomplete tumor removal occurs in 25% of breast cancer patients, 35% of colon cancer patients, and 40% of head and neck cancer patients (Citation needed). These figures show that the problem is significant and warrants a dedicated solution.

#Solution

Our solution is a system in which the camera lens can be adjusted based on input from a user (a surgeon or surgical assistant), helping surgeons identify cancerous tumors and remove them completely. We plan to use an FPGA to move the lens so that it can be controlled remotely. A flexible PCB will provide the interconnection between the FPGA and the lens ports. A finite state machine implemented on the FPGA will control the overall operation of the camera, and users will command lens movement through a Python interface on a host computer; a sketch of that interface is shown below.
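As a rough illustration of the host-side control path, the sketch below assumes the Opal Kelly FrontPanel Python API (the `ok` module that ships with the XEM7310). The wire-in address, command encoding, and bitfile name are all hypothetical placeholders; the real register map will be defined by our SystemVerilog design.

```python
import ok

# Hypothetical register map -- the real addresses and encodings will be
# fixed by our FPGA design, not by this script.
LENS_CMD_WIRE = 0x00                      # wire-in carrying the lens command word
CMD_STOP, CMD_ZOOM_IN, CMD_ZOOM_OUT = 0x0, 0x1, 0x2


def open_device(bitfile="lens_controller.bit"):
    """Open the XEM7310 over USB and load our bitstream."""
    dev = ok.okCFrontPanel()
    if dev.OpenBySerial("") != ok.okCFrontPanel.NoError:
        raise RuntimeError("No Opal Kelly device found")
    if dev.ConfigureFPGA(bitfile) != ok.okCFrontPanel.NoError:
        raise RuntimeError("Failed to configure the FPGA")
    return dev


def send_lens_command(dev, command):
    """Write a command word to the lens-control FSM running on the FPGA."""
    dev.SetWireInValue(LENS_CMD_WIRE, command)
    dev.UpdateWireIns()  # push all wire-in values down to the FPGA


if __name__ == "__main__":
    dev = open_device()
    send_lens_command(dev, CMD_ZOOM_IN)   # start zooming in
    send_lens_command(dev, CMD_STOP)      # stop lens motion
```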

#Solution Components

Lens

Flexible PCB

FPGA

##Subsystem 1: Lens

The lens will be responsible for zooming in and out on the target object according to commands from the user.

##Subsystem 2: Flexible PCB

The flexible PCB will provide the interconnection between the input/output ports of the lens and the FPGA board. Regular wires are not an option because the lens ports are small and are not mapped one-to-one to the FPGA pins, so we need to determine the signal assignment for each port; one way to keep track of that mapping is sketched below.
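One option is to maintain the port assignment in a single table and generate the FPGA pin constraints from it, so the flex-PCB layout and the FPGA design cannot drift apart. The pad names and pin locations below are entirely hypothetical; the real values will come from the lens datasheet and the flexible-PCB layout.

```python
# Hypothetical mapping from lens flex-connector pads to XEM7310 FPGA pins.
LENS_PIN_MAP = {
    "ZOOM_MOTOR_A": "A12",
    "ZOOM_MOTOR_B": "B12",
    "FOCUS_MOTOR_A": "C14",
    "FOCUS_MOTOR_B": "D14",
    "LENS_GND_SENSE": "E15",
}


def to_xdc(pin_map):
    """Emit Vivado XDC constraint lines for each lens signal."""
    lines = []
    for signal, pin in pin_map.items():
        port = signal.lower()
        lines.append(f"set_property PACKAGE_PIN {pin} [get_ports {port}]")
        lines.append(f"set_property IOSTANDARD LVCMOS33 [get_ports {port}]")
    return "\n".join(lines)


if __name__ == "__main__":
    print(to_xdc(LENS_PIN_MAP))
```
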
##Subsystem 3: FPGA

The FPGA board we will use is the Opal Kelly XEM7310, which provides a high-end FPGA with many digital I/O pins for this task. We will use the board to control the lens with a finite state machine written in SystemVerilog.
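Before committing the control logic to SystemVerilog, we could prototype the finite state machine as a small behavioral model. The sketch below is a simplified Python stand-in with hypothetical states and the same placeholder command encoding as the host script above; it is not the eventual hardware description.

```python
from enum import Enum, auto


class LensState(Enum):
    IDLE = auto()
    ZOOMING_IN = auto()
    ZOOMING_OUT = auto()


class LensFSM:
    """Behavioral stand-in for the lens-control FSM we plan to write in SystemVerilog."""

    def __init__(self):
        self.state = LensState.IDLE

    def step(self, command):
        # Command encoding mirrors the hypothetical wire-in values used by the host script.
        if command == 0x1:        # zoom in
            self.state = LensState.ZOOMING_IN
        elif command == 0x2:      # zoom out
            self.state = LensState.ZOOMING_OUT
        elif command == 0x0:      # stop
            self.state = LensState.IDLE
        return self.state


# Quick self-check of the intended transitions.
fsm = LensFSM()
assert fsm.step(0x1) is LensState.ZOOMING_IN
assert fsm.step(0x0) is LensState.IDLE
```
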
##Criterion For Success

The camera zooms and moves as programmed.

The camera responds promptly to user commands.

The port mapping between the FPGA and the lens functions correctly.

Images and videos captured by the camera are correctly displayed on the computer screen.
