# Project 15: Vision-Based Sign Language Recognition System for Smart Furniture Control

**Team Members:** Chongying Yue, Licheng Xu, Mingzhi Gu, Zihan Xu

**Documents:** proposal1.pdf

**Sponsor:** Yushi Cheng
## Problem
Current smart home systems rely primarily on voice control or mobile apps for operation. However, these interaction methods are not user-friendly for the hearing impaired, and controlling furniture devices via mobile apps requires additional steps, resulting in low interaction efficiency. Therefore, this project aims to develop a system that can directly control furniture devices through visual gesture recognition, providing a more intuitive and accessible interaction method for smart homes.
## Solution Overview
Our solution is a smart furniture control system driven by vision-based sign language recognition. The system uses a camera to capture the user's hand movements in real time, detects hand keypoints and gestures with computer vision, and converts them into furniture control commands, *such as turning on the lights*. The gesture recognition results are sent to the main control unit, which parses them into control commands and generates the corresponding control signals to drive the furniture devices.
## Solution Components
### Software Component
- **Real-time Gesture Recognition**: Runs on the vision processing unit. The system acquires hand images through a camera and uses MediaPipe to extract hand-landmark features; a lightweight machine learning model then classifies these features into the user's control gestures.
- **Control Logic**: The main controller receives gesture recognition results from the vision recognition module and parses them into specific control commands. The system generates PWM or GPIO control signals based on different commands to drive physical devices.
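As a concrete illustration of the recognition step above, the sketch below classifies a hand pose from MediaPipe-style landmarks and maps it to a command label. The finger-extension heuristic, the gesture set, and the command names are assumptions for illustration, not the team's actual model:

```python
# Illustrative rule-based classifier over MediaPipe-style hand landmarks.
# Assumes 21 (x, y) landmarks in MediaPipe's ordering; the finger-counting
# heuristic and the gesture-to-command mapping are placeholders for the
# team's trained lightweight model.

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertip indices
FINGER_PIPS = [6, 10, 14, 18]   # corresponding PIP joints

def count_extended_fingers(landmarks):
    """landmarks: list of 21 (x, y) tuples, y grows downward (image coords)."""
    count = 0
    for tip, pip in zip(FINGER_TIPS, FINGER_PIPS):
        if landmarks[tip][1] < landmarks[pip][1]:  # tip above joint => extended
            count += 1
    return count

def classify_gesture(landmarks):
    """Map a landmark set to a control-command label (illustrative names)."""
    fingers = count_extended_fingers(landmarks)
    return {0: "LIGHT_OFF", 1: "LIGHT_ON", 2: "FAN_ON"}.get(fingers, "UNKNOWN")
```

In the real system the trained model would replace this heuristic, with the dictionary acting only as the final label-to-command lookup.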
### Hardware Component
- **Vision Processing Unit**: Includes a camera module and a vision processing board *(e.g., K230)*, which acquires images of the user's hands and runs the gesture recognition algorithms.
- **Main Control Unit**: An STM32 microcontroller used to receive recognition results and generate corresponding control signals.
- **Execution Drive Module**: Motor drive circuits and relay modules control the actual furniture devices, *e.g., smart lighting systems*.
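One way to carry recognition results from the vision board to the STM32 is a small fixed-size frame over a serial link such as UART. The 3-byte layout below (header byte, gesture ID, XOR checksum) is an assumed example for illustration, not the team's defined protocol:

```python
# Illustrative framing for the vision-board -> STM32 link (e.g. over UART).
# The frame layout (header, gesture ID, XOR checksum) is an assumption,
# not the project's actual protocol.

HEADER = 0xAA

def encode_frame(gesture_id: int) -> bytes:
    """Pack a gesture ID into a 3-byte frame: header, payload, checksum."""
    checksum = HEADER ^ gesture_id
    return bytes([HEADER, gesture_id, checksum])

def decode_frame(frame: bytes):
    """Return the gesture ID, or None if the frame is malformed."""
    if len(frame) != 3 or frame[0] != HEADER:
        return None
    if frame[0] ^ frame[1] != frame[2]:
        return None
    return frame[1]
```

On the STM32 side the same check would be a few lines of C in the UART receive handler; the checksum lets corrupted frames be dropped instead of triggering a wrong device action.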
## Criteria of Success
- The system can stably recognize at least 5 predefined gestures with an accuracy rate of over 70%.
- The system latency from user gesture input to furniture device response is less than 1 second.
- The system can successfully control at least two types of furniture devices.
## Distribution of Work
- **Zihan Xu:** Develops the visual recognition module and tests the accuracy of gesture recognition under different environments.
- **Licheng Xu:** Designs the STM32 control programs, parses gesture commands, and generates PWM/GPIO control signals.
- **Chongying Yue:** Responsible for hardware circuit design and implementation, including motor drive circuits and power management.
- **Mingzhi Gu:** Responsible for system architecture design and overall integration, including the design and debugging of the furniture control interface and system stability testing.

# 3D Scanner (Featured Project)

# Team Members

Yifei Song (yifeis7)

Peiyuan Liu (peiyuan6)

Jiayi Luo (jiayi13)

Chenchen Yu (cy32)


# Problem

Our problem is to design an algorithm that reconstructs a 3D model of an object from multiple 2D photos taken by a mobile phone at various positions. Alongside the algorithm, we will design a mechanical rotating device that rotates the phone 360 degrees around the object and moves it up and down on a bracket.

# Solution Overview

Our solution for reconstructing a 3D topology of an object is to build a mechanical rotating device and develop an image processing algorithm. The mechanical rotating device contains a reliable holder that can steadily hold a phone of a regular size, and an electrical motor, which is fixed in the center of the whole system and can rotate the holder 360 degrees at a constant angular velocity.
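A core operation behind this kind of reconstruction is triangulating a 3D point from its projections in two camera positions. The sketch below is a minimal linear (DLT) triangulation in numpy; it assumes the 3x4 camera projection matrices are known, e.g., from calibrating the phone at two rotation angles of the holder:

```python
import numpy as np

# Minimal linear (DLT) triangulation of one 3D point from two views.
# Assumes known 3x4 projection matrices, e.g. from calibrated phone
# poses at two angles of the rotating holder.

def triangulate(P1, P2, x1, x2):
    """P1, P2: 3x4 projection matrices; x1, x2: (u, v) image coordinates.
    Returns the 3D point as a length-3 array."""
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the null vector of A (smallest singular vector).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

Repeating this over all matched feature points across the photo sequence yields the point cloud from which the 3D topology is built.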

# Solution Components

## Image processing algorithms

- This algorithm should perform feature detection: it should accurately identify and extract relevant features of an object from multiple 2D images, including edges, corners, and key points.

- This algorithm should be designed to minimize the memory requirement and energy consumption, because mobile phones have limited memory and battery.
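As an illustration of the corner-detection requirement, the sketch below computes a Harris-style corner response in pure numpy. A phone implementation would use an optimized library, but the structure-tensor math is the same; the 3x3 box window and k = 0.05 are conventional example choices, not tuned parameters:

```python
import numpy as np

# Harris-style corner response in pure numpy, as a sketch of the
# feature-detection step. Window size and k are example choices.

def harris_response(img, k=0.05):
    """img: 2D float array. Returns a per-pixel corner response
    (positive at corners, negative along edges, ~0 on flat regions)."""
    Iy, Ix = np.gradient(img.astype(float))

    def box3(a):
        # 3x3 box-window sum via zero-padding and shifted slices.
        p = np.pad(a, 1)
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3))

    # Smoothed structure-tensor entries.
    Sxx, Syy, Sxy = box3(Ix * Ix), box3(Iy * Iy), box3(Ix * Iy)
    det = Sxx * Syy - Sxy ** 2
    trace = Sxx + Syy
    return det - k * trace ** 2
```

Thresholding this response (plus non-maximum suppression) yields the corner key points that are matched across the photos taken at different rotation angles.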

## Mechanical rotating system

- Phone holder that can adjust its size and orientation to hold a phone steadily

- Base of the holder with wheels that allows the holder to move smoothly on a surface

- Electrical motor for rotating the holder at a constant angular velocity

- Central platform to place the object

A remote-control device can be used to adjust the height of the central platform. Different types of motors and mechanisms can be used for the up-and-down motion, such as stepper motors, servo motors, DC motors, and AC motors.
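The constant-angular-velocity requirement reduces to simple step-timing math if a stepper motor drives the holder. The steps-per-revolution and capture-count values below are assumed examples, not final design parameters:

```python
# Step-timing sketch for rotating the holder at constant angular velocity.
# A 1.8-degree stepper with 1/16 microstepping is assumed for illustration.

STEPS_PER_REV = 200 * 16  # full steps per revolution * microstep factor

def step_interval(rev_per_min: float) -> float:
    """Seconds between step pulses for the given rotation speed."""
    steps_per_sec = STEPS_PER_REV * rev_per_min / 60.0
    return 1.0 / steps_per_sec

def capture_angles(n_photos: int):
    """Evenly spaced capture angles (degrees) over one full revolution."""
    return [360.0 * i / n_photos for i in range(n_photos)]
```

Issuing step pulses at this fixed interval (e.g., from a hardware timer) gives the smooth, constant-speed rotation the success criteria call for, and `capture_angles` defines where the phone takes each photo.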

# Criterion for Success

- Accuracy: The app should be able to produce a 3D model that is as accurate as possible to the real object, with minimal distortion, errors or noise.

- Speed: The app should be able to capture and process the 3D data quickly, without requiring too much time or processing power from the user's device.

- Output quality: The app should be able to produce high-quality 3D models that can be easily exported and used in other software applications or workflows.

- Compatibility: Any regular phone can be placed and fixed on the phone holder at a set angle without coming loose

- Flexibility: The holder with a phone mounted must be able to rotate 360 degrees smoothly at a constant angular velocity, without severe vibration

# Distribution of Work

Yifei Song:

Design a mobile app and deploy a modeling algorithm to it that enables image acquisition and 3D modeling output on mobile devices.

Peiyuan Liu:

Design an algorithm for reconstructing 3D models from multi-view 2D images.

Jiayi Luo:

Design the remote-control device, using electrical motors to control the central platform of the mechanical rotating system.

Chenchen Yu:

Design the mechanical part. Build, test, and improve the mechanical rotating system to ensure the whole device works together.