ECE 445 Senior Design RFA

A Comprehensive Approach to Tumor Detection using RGB, NIR, and Immersive 3D Visualization

Team Members:
- Zach Mizrachi (zdm3)
- TJ Shapiro (tylers5)
- Yue (Amy) He (yuehe4)

# Problem

The most widely used approach for tumor removal today is traditional surgery, which introduces a host of problems. This method relies solely on the surgeon's visual and tactile feedback, which is subject to human error. The surgeon also operates from a single viewpoint of the tumor, which is often limited when the tumor is not easily visible. Both factors can lead to excess damage: healthy tissue removed to increase tumor visibility, or accidental damage caused by human error.

# Solution

We propose a camera system meant to assist a surgeon in the removal of a tumor. The system performs two main tasks: detecting the tumor by segmenting it from the surrounding biological material, and reconstructing the detected tumor in 3D. The camera system is small and highly mobile, allowing the surgeon to view all areas of the tumor. The presented solution will improve the surgeon's visual capabilities, enabling continuous visualization and informed decision making.

The setup: a fluorescent dye is applied over the area of interest; under illumination, the healthy tissue reflects NIR light while the tumor does not. The tumor-detecting pen system uses this contrast to distinguish the tumor area from the healthy area. This method has been validated in the pilot study.

Specifically, we intend to visualize the operating surface in real time in the Apple Vision Pro, highlighting the tumor in augmented reality based on the NIR signal. This will allow the surgeon to record the area of interest guided by the NIR highlighting, contributing to more accurate photos for the tumor reconstruction. Then, in post-processing, we will generate a 3D model of the tumor, giving the surgeon a more detailed view of the region of surgical interest.
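As an illustration of the NIR-based highlighting step, the sketch below thresholds a co-registered NIR frame and tints the corresponding RGB pixels. The threshold, highlight color, and array shapes are illustrative assumptions, not measured values; per the setup above, healthy tissue reflects NIR while the tumor does not, so low NIR intensity marks the tumor.

```python
import numpy as np

def highlight_tumor(rgb, nir, nir_threshold=0.3, color=(0, 255, 0), alpha=0.4):
    """Overlay a colored highlight on RGB pixels whose NIR intensity falls
    below a threshold (low NIR reflectance marking the tumor).

    rgb: (H, W, 3) uint8 frame; nir: (H, W) float frame in [0, 1], assumed
    co-registered with the RGB frame via the beam splitter.
    """
    mask = nir < nir_threshold                        # candidate tumor pixels
    overlay = rgb.astype(np.float32)
    # Alpha-blend the highlight color into the masked pixels only.
    overlay[mask] = (1 - alpha) * overlay[mask] + alpha * np.array(color, np.float32)
    return overlay.astype(np.uint8), mask
```

The same mask could later restrict SfM feature extraction to the region of interest.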

## Subsystems

### Casing Module
- **Part Name:** 3D Print
- **Part #:** N/A
- **Protocol:** N/A
- **Purpose of Part:** Hold all components rigidly together

### Imaging Module

- **Part Name:** Beam Splitter
  **Part #:** Edmund Optics, Family ID #2185, Visible and NIR Plate Beamsplitters
  **Protocol:** N/A
  **Purpose of Part:** Take in visible light and split the beam into two equal beams

- **Part Name:** NIR Filter
  **Part #:** 49950 - RT – Raman 785nm Laser Longpass Set
  **Protocol:** N/A
  **Purpose of Part:** Filter the beam for NIR light

- **Part Name:** NIR Sensor
  **Part #:** LI-OV5640-MIPI-AF-NIR
  **Protocol:** MIPI
  **Purpose of Part:** Record the NIR signal

- **Part Name:** RGB Filter
  **Part #:** Chroma 27040 - Lum
  **Protocol:** N/A
  **Purpose of Part:** Filter the beam for RGB light

- **Part Name:** RGB Sensor
  **Part #:** Digikey, 2289-LI-IMX185-MIPI-M12-ND
  **Protocol:** MIPI
  **Purpose of Part:** Record the RGB signal

- **Part Name:** Lens
  **Part #:** Edmund Optics, Family ID #1748, Uncoated Double-Convex (DCX) Lenses
  **Protocol:** N/A
  **Purpose of Part:** Focus the light onto the camera sensors

### Processing Module
- **Part Name:** NVIDIA Jetson
- **Part #:** Digikey, 1597-102110417-ND
- **Protocol:** MIPI
- **Purpose of Part:** Image analysis and sensor fusion
- **Additional Notes:**
  - Uses SPI to connect the device to the display
  - The GPU is responsible for generating a 3D representation based on the input data
  - Stores frames from real-time operation for later processing
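The frame-storage note above can be sketched minimally as follows, assuming frames arrive as in-memory objects from the MIPI cameras; the `Recorder` class and its field names are hypothetical, not part of any Jetson API.

```python
import time
from dataclasses import dataclass, field

@dataclass
class FrameRecord:
    t: float      # capture timestamp (seconds, monotonic clock)
    rgb: object   # RGB frame (e.g., an array from the MIPI camera)
    nir: object   # NIR frame captured through the beam splitter
    imu: tuple    # IMU sample nearest to the capture time

@dataclass
class Recorder:
    records: list = field(default_factory=list)

    def capture(self, rgb, nir, imu):
        # Tag every frame with a timestamp so the offline SfM stage
        # can associate frames with IMU pose estimates later.
        self.records.append(FrameRecord(time.monotonic(), rgb, nir, imu))
        return self.records[-1]
```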

### PCB Components
- **Part Name:** IMU
- **Part #:** Digikey LSM6DSO iNEMO™
- **Protocol:** SPI/I2C, and MIPI I3CSM serial interface
- **Purpose of Part:** Record camera motion data to support pose estimation in the 'Structure from Motion' algorithm. See Software Overview.

### Modeling Module
- **Part Name:** Apple Vision Pro
- **Part #:** N/A
- **Protocol:** N/A
- **Purpose of Part:** Communicates with a Mac, which in turn communicates with the Jetson. Projects the 3D reconstruction of the tumor and surrounding biological information through the head-mounted display in augmented reality. This will be done using Apple's proprietary Vision Pro platform as well as the SwiftUI and ARKit frameworks.



## Hardware Components


We look to replicate a hardware setup similar to that of this project's parent study. In doing so, we will work closely with Professor Gruev to ensure a feasible approach to the hardware system.

## Software Overview

For 3D models to be useful in a surgical scenario, the reconstruction needs a high level of detail. We therefore prioritize detail over real-time analysis and look to implement an open-source Structure from Motion (SfM) algorithm. To improve upon existing algorithms, we intend to fuse IMU data to eliminate the need to estimate camera pose from images alone. We believe this will improve the accuracy of our 3D models.
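The fusion idea can be sketched as follows: integrating gyroscope rates into a rotation matrix yields an orientation prior that an SfM solver can use instead of estimating camera orientation from scratch. This is a simplified sketch under stated assumptions (fixed timestep, no bias or noise handling), not the project's actual pipeline.

```python
import numpy as np

def gyro_to_rotation(gyro_samples, dt):
    """Integrate body-frame angular rates (rad/s), sampled at a fixed
    timestep dt, into a single rotation matrix using Rodrigues' formula
    per step. The result can seed camera orientation in an SfM solver."""
    R = np.eye(3)
    for w in gyro_samples:
        w = np.asarray(w, float)
        theta = np.linalg.norm(w) * dt        # rotation angle this step
        if theta < 1e-12:
            continue                          # negligible motion
        k = w / np.linalg.norm(w)             # unit rotation axis
        K = np.array([[0, -k[2], k[1]],
                      [k[2], 0, -k[0]],
                      [-k[1], k[0], 0]])      # skew-symmetric cross matrix
        R_step = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
        R = R @ R_step                        # compose incremental rotations
    return R
```

A real implementation would also handle gyro bias and timestamp jitter, but the core integration step is the same.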

Existing work has shown that incorporating IMU data into SfM is not only feasible but also improves the robustness of 3D models of small objects. We look to follow a similar approach to the existing literature.



# Criterion For Success


This project can be separated into goals for three stages.


**Hardware:**
- 3D print a casing that allows adjustment of the beam splitter's distance to the image sensors
- Assemble all electrical components correctly
- Successfully integrate the IMU with the PCB

**Software:**
- Receive and validate all data on the NVIDIA Jetson:
  - RGB data
  - NIR data
  - IMU data
- Filter RGB images using the NIR region of interest
- Set up and run open SfM software on the NVIDIA Jetson
- Improve the SfM model with the IMU
- Perform optimal frame selection using the IMU

**Augmented/Virtual Reality:**
- Establish communication between the Jetson and the Vision Pro
- Set up pass-through mode on the Vision Pro, with NIR tumor highlighting
- View the 3D SfM point cloud on the Vision Pro
- Interact with the point cloud on the Vision Pro

Each bullet here is a goal that we would like to achieve over the course of the semester. Given the difficulty of the task, we plan on utilizing the IMU to improve SFM as a final step, after the pipeline to the Vision Pro has been completed.
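The optimal-frame-selection goal could, for example, be approximated by keeping only frames captured while the camera rotates slowly, since low angular speed is a cheap proxy for low motion blur. The threshold and the nearest-sample pairing below are illustrative assumptions, not tuned values.

```python
import numpy as np

def select_sharp_frames(frame_times, gyro_times, gyro_rates, max_rate=0.2):
    """Return indices of frames whose nearest-in-time gyro sample has
    angular speed below max_rate (rad/s).

    frame_times: capture timestamps of frames (s)
    gyro_times:  timestamps of IMU samples (s)
    gyro_rates:  (N, 3) angular rates (rad/s) matching gyro_times
    """
    gyro_times = np.asarray(gyro_times, float)
    speeds = np.linalg.norm(np.asarray(gyro_rates, float), axis=1)
    keep = []
    for i, t in enumerate(frame_times):
        j = int(np.argmin(np.abs(gyro_times - t)))  # nearest IMU sample
        if speeds[j] < max_rate:
            keep.append(i)
    return keep
```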









Featured Project

# Electronic Mouse (Cat Toy)

# Team Members:

- Yingyu Zhang (yzhan290)

- Chuangy Zhang (czhan30)

- Jack (John) Casey (jpcasey2)

# Problem

Keeping up with the high energy drive of some cats can be overwhelming for owners, who often choose these pets for their low maintenance compared to other animals. An increasing number of cats are being used as service and emotional support animals, and with this comes the need for an interactive cat toy with greater accessibility. Our design should:

1. Get cats the enrichment they need

1. Get cats to chase the “mouse” around

1. Get cats fascinated by the “mouse”

1. Keep cats busy

1. Fulfill the need for cats’ hunting behaviors

1. Interactive fun between the cat and cat owner

1. Solve the shortcomings of the electronic remote-control mice already on the market

## Comparison with existing products

- Hexbug Mouse Robotic Cat Toy: Battery endurance is very low; For hard floors only

- GiGwi Interactive Cat Toy Mouse: Does not work on the carpet; Not sensitive to cat touch; Battery endurance is very low; Can't control remotely

# Solution

A remote-controlled cat toy is a solution that allows more cat owners to get interactive playtime with their pets. With our design, there will be no need to get low to the ground to adjust it often as it will go over most floor surfaces and in any direction with help from a strong motor and servos that won’t break from wall or cat impact. To prevent damage to household objects it will have IR sensors and accelerometers for use in self-driving modes. The toy will be run and powered by a Bluetooth microcontroller and a strong rechargeable battery to ensure playtime for hours.

## Subsystem 1 - Infrared(IR) Sensors & Accelerometer sensor

- IR sensors both emit and receive infrared radiation. This kind of sensor is widely used to detect nearby objects. We will use IR sensors to detect whether the mouse is surrounded by any obstacles.

- An accelerometer measures the acceleration of an object in its rest frame. This kind of sensor is widely used to capture the intensity of physical activity. We will use it to detect whether a cat is playing with the mouse.
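The two sensing roles above can be sketched as one classification step. The thresholds below are illustrative placeholders, not calibrated values, and the function name is hypothetical.

```python
def mouse_event(ir_distance_cm, accel_magnitude_g,
                obstacle_cm=10.0, touch_g=1.3):
    """Classify one sensor reading into a behavior event.

    An IR range below obstacle_cm means something is in the way; total
    acceleration well above 1 g (gravity alone) suggests the cat has
    batted or grabbed the mouse."""
    if ir_distance_cm < obstacle_cm:
        return "avoid_obstacle"   # back up / change direction
    if accel_magnitude_g > touch_g:
        return "cat_touch"        # go dormant, reactivate after a delay
    return "roam"
```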

## Subsystem 2 - Microcontroller(ESP32)

- ESP32 is a dual-core microcontroller with integrated Wi-Fi and Bluetooth. This MCU has 520 KB of SRAM, 34 programmable GPIOs, 802.11 Wi-Fi, Bluetooth v4.2, and much more. This powerful microcontroller enables us to develop more capable software and hardware, and provides far more flexibility than an ATmega-series MCU.

Components(TBD):

- Product: https://www.digikey.com/en/products/detail/espressif-systems/ESP32-WROOM-32/8544298

- Datasheet: http://esp32.net

## Subsystem 3 - App

- We will develop an App that can remotely control the mouse.

1. Control the mouse to either move forward, backward, left, or right.

1. Turn on / off / flashing the LED eyes of the mouse

1. Keep the cat owner informed about the battery level of the mouse

1. Change "modes": (a) run randomly without stopping; (b) cat-activated: the cat's touch activates the mouse; (c) run intermittently in cycles (runs, stops, runs, stops...) so the mouse's hesitation piques the cat's curiosity; (d) turn off completely
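A minimal sketch of the mode logic above; the class and method names are hypothetical, and the intermittent duty cycle (5 control ticks on, 5 off) is an arbitrary placeholder.

```python
class MouseModes:
    """Four remote-selectable modes mirroring the list above."""
    MODES = ("random", "cat_activated", "intermittent", "off")

    def __init__(self):
        self.mode = "off"
        self._tick = 0

    def set_mode(self, mode):
        if mode not in self.MODES:
            raise ValueError(mode)
        self.mode = mode

    def drive_command(self, cat_touched=False):
        """Return whether the motors should run on this control tick."""
        self._tick += 1
        if self.mode == "off":
            return False
        if self.mode == "random":
            return True                 # keep running (heading chosen elsewhere)
        if self.mode == "cat_activated":
            return cat_touched          # move only when the cat engages
        # intermittent: run for 5 ticks, pause for 5, repeat
        return (self._tick // 5) % 2 == 0
```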

## Subsystem 4 - Motors and Servo

- To enable maneuverability in all directions, we plan to use one servo and two motors to drive the robotic mouse. The servo controls the steering direction of the mouse, and the wheels are mounted directly onto the motors via hubs.

Components(TBD):

- Metal Gear Motors: https://www.adafruit.com/product/3802

- L9110H H-Bridge Motor Driver: https://www.adafruit.com/product/4489
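One way to map a remote-control command onto this servo-plus-motors layout is sketched below; the servo center, steering range, and sign conventions are assumptions, not measured from the parts above.

```python
def drive_outputs(speed, turn, servo_center_deg=90.0, servo_range_deg=45.0):
    """Map a remote-control command to actuator outputs: the servo sets
    the steering angle, and both drive motors share one duty cycle.

    speed: forward/backward in [-1, 1]; turn: left/right in [-1, 1].
    Returns (servo_angle_deg, motor_duty) with duty in [-1, 1]."""
    speed = max(-1.0, min(1.0, speed))   # clamp out-of-range commands
    turn = max(-1.0, min(1.0, turn))
    servo_angle = servo_center_deg + turn * servo_range_deg
    return servo_angle, speed
```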

## Subsystem 5 - Power Management

- We are planning to use a high-capacity (5 Ah - 10 Ah), 3.7 V lithium polymer battery to enable long-lasting use of the robotic mouse. We will charge the battery with a USB lithium polymer ion charging circuit.

Components(TBD):

- Lithium Polymer Ion Battery: https://www.adafruit.com/product/5035

- USB Lithium Polymer Ion Charger: https://www.adafruit.com/product/259
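A rough battery-life estimate follows directly from capacity and average draw. The 80% usable-capacity factor is a conservative assumption for a LiPo cell, and the real current draw must be measured on hardware.

```python
def runtime_hours(capacity_ah, avg_current_a, usable_fraction=0.8):
    """Estimate runtime as usable capacity divided by average current."""
    return capacity_ah * usable_fraction / avg_current_a
```

For example, a 5 Ah pack at a 1 A average draw gives roughly 4 hours of runtime, comfortably above the 10-minute criterion below.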

# Criterion for Success

1. Can go on tile, wood, AND carpet, and alternate between them

1. Has a charge that lasts more than 10 min

1. Is maneuverable in all directions (not just forward and backward)

1. Can be controlled via remote (App)

1. Has a "cat-attractor" (feathers, string, ribbon, inner catnip, etc.) either attached to it or dragged behind it (an appearance attractive to cats)

1. Retains signal from at least 15 ft away

1. Eyes flash

1. Goes dormant when caught/touched by the cats (or when it bumps into something), reactivates (and changes direction) after a certain amount of time

1. All the "modes" work as intended
