# Project 32: Sensing your heartbeat (and others)

Team Members: Qiyang Wu, Xin Chen, Xuanqi Wang, Yukai Han

TA: Howard Yang

Documents: design_document1.pdf, final_paper1.pdf, final_paper2.pdf, proposal1.pdf
# Problem
Traditional human activity monitoring systems often rely on cameras, wearable sensors, or specialized hardware, which can be intrusive, expensive, or inconvenient. However, WiFi signals, which are already ubiquitous in indoor environments, can be repurposed for non-contact human sensing. The challenge lies in accurately extracting and interpreting fine-grained Channel State Information (CSI) to detect subtle human activities, such as breathing, gestures, and potentially even heartbeats, while mitigating environmental interference.

# Solution Overview
Our solution uses WiFi as a radar by leveraging Channel State Information (CSI) to sense human activity. We extract fine-grained CSI from WiFi devices and apply signal processing techniques to interpret movement patterns. The system consists of a WiFi transmitter and receiver that continuously capture CSI variations caused by human motion. Algorithms then distinguish activities and physiological signals, such as body gestures and heartbeats, by analyzing phase shifts and amplitude changes in the wireless signals. This approach enables non-contact human activity sensing, making it suitable for applications in health monitoring and human-computer interaction.
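
As a rough illustration of how amplitude and phase could be pulled out of raw CSI, here is a minimal Python/NumPy sketch. It assumes the CSI has already been exported as a complex matrix of packets × subcarriers, and the linear-fit phase correction shown is just one common way to suppress sampling-time offsets, not necessarily the method we will settle on.

```python
import numpy as np

def csi_features(csi):
    """Split a complex CSI matrix into amplitude and sanitized phase.

    csi: complex np.ndarray of shape (n_packets, n_subcarriers).
    """
    amplitude = np.abs(csi)

    # Unwrap the raw phase along the subcarrier axis to remove 2*pi jumps.
    phase = np.unwrap(np.angle(csi), axis=1)

    # Remove the linear trend across subcarriers (sampling-time offset)
    # and the per-packet mean (carrier-frequency offset) as a first pass.
    k = np.arange(csi.shape[1])
    slope = (phase[:, -1] - phase[:, 0]) / (k[-1] - k[0])
    phase = phase - slope[:, None] * k[None, :] - phase.mean(axis=1, keepdims=True)

    return amplitude, phase
```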

# Solution Components
## Subsystem1: WiFi Signal Transmission System
The WiFi signal transmission system consists of Intel AX200 or AX210 network cards and external antennas to ensure stable and high-quality signal transmission. These components work together to provide a robust wireless communication setup necessary for collecting Channel State Information (CSI).
## Subsystem2: CSI Extraction Tool/Software
The CSI extraction subsystem captures WiFi CSI data using the PicoScenes software on Ubuntu 22.04 LTS, enabling real-time analysis of fine-grained variations in the wireless channel.
## Subsystem3: Human Action Recognition System
The human action recognition system leverages CSI data to detect human movements by analyzing signal variations. Using MATLAB, Python and specialized CSI analysis toolboxes, it processes amplitude and phase changes to detect different human activities accurately.
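
To make this processing chain concrete, here is a minimal sketch (Python with NumPy/SciPy; the function name, band limits, and the choice of a single subcarrier are illustrative assumptions rather than the final design) that band-pass filters one subcarrier's amplitude around typical breathing frequencies and reads the respiration rate off the dominant spectral peak:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_breathing_rate(amplitude, fs, low=0.1, high=0.5):
    """Estimate breaths per minute from one subcarrier's CSI amplitude.

    amplitude: 1-D array of CSI amplitude samples over time.
    fs: CSI sampling rate in Hz.
    low, high: band of interest in Hz (~6-30 breaths per minute).
    """
    # Remove the static (DC) component, then band-pass around respiration.
    x = amplitude - np.mean(amplitude)
    b, a = butter(4, [low, high], btype="bandpass", fs=fs)
    x = filtfilt(b, a, x)

    # The dominant frequency inside the band gives the breathing rate.
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= low) & (freqs <= high)
    f_peak = freqs[band][np.argmax(spectrum[band])]
    return f_peak * 60.0  # breaths per minute
```
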
# Criterion for Success
Accurate Respiration Detection: The system must reliably detect human breathing patterns using CSI data by analyzing amplitude and phase variations in WiFi signals.
Robust Interference Mitigation: The system should effectively filter out environmental noise and external disturbances, such as movement from non-human objects or signal fluctuations caused by multipath effects.
Detection of Heartbeat and Other Physiological Signals (if feasible): The system should capture and differentiate finer physiological signals, such as heartbeats, using advanced signal processing techniques.
# Distribution of Work
Xin Chen [ECE] – Signal Processing
Develops signal processing algorithms to analyze CSI data, extracting key features such as amplitude and phase variations for human activity recognition. Implements filtering and denoising techniques to improve signal quality and enhance detection accuracy. Works closely with system integration to ensure seamless data flow and efficient processing of CSI signals.

Qiyang Wu [EE] – System Integration and Data Transmission
Manages real-time data transmission between WiFi hardware and processing units, ensuring minimal latency and packet loss. Develops communication protocols to synchronize CSI data collection with processing algorithms. Optimizes data handling and storage to support continuous CSI analysis and facilitate system scalability.

Xuanqi Wang [EE] – Hardware Setup and Optimization
Configures WiFi devices, antennas, and receivers to ensure stable and high-quality CSI signal collection. Optimizes antenna placement to maximize sensitivity to movement and reduce interference. Works on power management and circuit adjustments to ensure system reliability and efficiency in different environments.

Yukai Han [ME] – Mechanical Design
Designs mounting structures and enclosures to securely position WiFi devices for optimal signal reception. Ensures stability and repeatability of the setup to maintain consistency in experiments. Assists in planning and executing test scenarios, considering environmental factors that may impact CSI signal variations.

# TEAM MEMBERS

Qianzhong Chen (qc19)

Hanwen Liu (hanwenl4)

Haina Lou (hainal2)

Yike Zhou (yikez3)

# TITLE OF THE PROJECT

An Intelligent Assistant Using Sign Language

# PROBLEM & SOLUTION OVERVIEW

Recently, smart home accessories have become more and more common in people's homes. A central hub, usually a speaker with a voice user interface, is needed to control private smart home accessories. But a voice-interactive speaker may not be ideal for people who have difficulty speaking or hearing. Therefore, we aim to develop an intelligent assistant that uses sign language: it can understand sign language, interact with people, and act as a real assistant.

# SOLUTION COMPONENTS

## Subsystem1: 12-Degree-of-Freedom Bionic Hand System

- Two movable joints on every finger, driven by 5-V servo motors

- The main parts of the hand manufactured with 3D printing

- The bionic hand is fixed on a 2-DOF electrical platform

- All of the servo motors are controlled by PWM signals from an STM32 microcontroller
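
As a small worked example of the joint-angle-to-PWM mapping (assuming standard hobby-servo timing of a 20 ms period with 0.5-2.5 ms pulses; the actual pulse range will depend on the servos we select), the sketch below computes the pulse width and duty cycle the STM32 timer would need to generate:

```python
def servo_pulse(angle_deg, min_us=500.0, max_us=2500.0, period_us=20000.0):
    """Map a joint angle (0-180 deg) to a servo pulse width and PWM duty cycle.

    Assumes a standard hobby servo: 20 ms PWM period, 0.5 ms pulse at 0 deg
    and 2.5 ms pulse at 180 deg. Returns (pulse width in us, duty cycle).
    """
    angle = max(0.0, min(180.0, angle_deg))
    pulse_us = min_us + (max_us - min_us) * angle / 180.0
    return pulse_us, pulse_us / period_us

# Example: a half-closed finger joint at 90 degrees.
pulse, duty = servo_pulse(90)  # 1500 us pulse, 7.5% duty cycle
```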

## Subsystem2: The Control System

- The control system consists of embedded modules: a microcontroller, a high-performance edge computing platform that runs the dynamic gesture recognition model, and more than 20 motors that drive the delicate movements of our bionic hand. It also requires a high-precision camera to capture the user's hand gestures.
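
One plausible way to bridge the edge computing platform and the microcontroller is a simple serial link. The sketch below (Python with pyserial; the port name, baud rate, and frame format are assumptions, not a finalized protocol) sends one frame of target joint angles that the STM32 firmware would translate into PWM duty cycles:

```python
import struct
import serial  # pyserial

def send_joint_angles(port, angles_deg):
    """Send one frame of joint angles (degrees) to the hand controller.

    Assumed frame layout: 0xAA start byte, count byte, one byte per angle
    (clamped to 0-180), and a simple modulo-256 checksum byte.
    """
    payload = bytes(int(max(0, min(180, a))) for a in angles_deg)
    checksum = sum(payload) & 0xFF
    frame = struct.pack("BB", 0xAA, len(payload)) + payload + bytes([checksum])
    with serial.Serial(port, baudrate=115200, timeout=1) as link:
        link.write(frame)

# Example: 12 joint targets (10 finger joints + 2 platform axes).
send_joint_angles("/dev/ttyUSB0", [90] * 12)
```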

## Subsystem3: Dynamic Gesture Recognition System

- An external camera capturing the shape, appearance, and motion of the target hands

- A pre-trained model that helps the other subsystems interpret the sign language. Specifically, for hand detection we intend to adopt the YOLO algorithm as well as MediaPipe, a machine learning framework developed by Google, to recognize different signs efficiently. Given the dynamic nature of gestures, we also plan to build our models with 3D-CNNs and RNNs to better capture spatio-temporal features.
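
As a starting point for this pipeline, the sketch below shows per-frame hand landmark extraction with MediaPipe Hands (Python; the camera index and single-hand setting are assumptions). The resulting 21 landmarks per frame could be stacked into a sequence and fed to the 3D-CNN/RNN classifier described above, which is not shown here:

```python
import cv2
import mediapipe as mp

# Per-frame landmark extraction; each frame yields 21 (x, y, z) landmarks.
hands = mp.solutions.hands.Hands(static_image_mode=False, max_num_hands=1)
cap = cv2.VideoCapture(0)  # external camera index is an assumption

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        lm = result.multi_hand_landmarks[0].landmark
        features = [(p.x, p.y, p.z) for p in lm]  # 21 normalized landmarks
        # ...append `features` to a sliding window for gesture classification

cap.release()
hands.close()
```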

# CRITERION OF SUCCESS

- The bionic hand can move freely and fluently as designed, with all 12 DOFs realized. The movement of a single finger joint neither interrupts nor is interrupted by other movements. The hand is durable and reliable.

- The control system needs to be reliable and output stable PWM signals to the motors. The edge computing platform we choose should have high performance when running the dynamic gesture recognition model.

- Our machine can recognize different signs immediately and react with the corresponding gestures without obvious delay.

# DISTRIBUTION OF WORK

- Qianzhong Chen (ME): Mechanical design and manufacture of the bionic hand; tune the linkage between motors and mechanical parts; work with Haina to program the STM32 to generate PWM signals and drive the motors.

- Hanwen Liu (CompE): Record gesture clips to collect enough data; test camera modules; draft reports; make schedules.

- Haina Lou (EE): Implement the embedded control system; program the microcontroller and the AI edge computing module; implement serial communication.

- Yike Zhou (EE): Build the object detection subsystem; build and train the machine learning models.