
# Project 27: An Intelligent Assistant Using Sign Language (Best Integrated)
Team Members: Haina Lou, Howie Liu, Qianzhong Chen, Yike Zhou
TA: Xiaoyue Li
Documents: design_document1.pdf, final_paper1.pdf, proposal2.pdf
Sponsor: Liangjing Yang
# TEAM MEMBERS
Qianzhong Chen (qc19)
Hanwen Liu (hanwenl4)
Haina Lou (hainal2)
Yike Zhou (yikez3)

# TITLE OF THE PROJECT
An Intelligent Assistant Using Sign Language

# PROBLEM & SOLUTION OVERVIEW
Smart home accessories have become increasingly common in people's homes. A hub, usually a speaker with a voice user interface, is needed to control these private smart home accessories. But an interactive speaker may not be ideal for people who have difficulty speaking or hearing. Therefore, we aim to develop an intelligent assistant that uses sign language: it can understand signs, interact with people, and act as a real assistant.

# SOLUTION COMPONENTS
## Subsystem 1: 12-Degree-of-Freedom Bionic Hand System
- Two movable joints per finger, each driven by a 5 V servo motor
- The main parts of the hand manufactured with 3D printing
- The bionic hand mounted on a 2-DOF electrical platform
- All servo motors controlled by PWM signals from an STM32 microcontroller
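As a rough illustration of the PWM bullet above, the following sketch maps a joint angle to an STM32 timer compare value. The numbers are assumptions, not from the proposal: a 50 Hz PWM frame, a 1 MHz timer tick after prescaling, and hobby servos expecting 1.0-2.0 ms pulses over a 0-180 degree range.

```python
# Hypothetical timing constants; adjust for the actual servos and timer clock.
PWM_FREQ_HZ = 50           # standard hobby-servo frame rate
TIMER_TICK_HZ = 1_000_000  # timer clock after prescaler (1 tick = 1 us)
ARR = TIMER_TICK_HZ // PWM_FREQ_HZ - 1  # auto-reload value: 20 ms period

MIN_PULSE_US = 1000  # pulse width at 0 degrees
MAX_PULSE_US = 2000  # pulse width at 180 degrees

def angle_to_compare(angle_deg: float) -> int:
    """Convert a joint angle to the timer's capture/compare register value."""
    angle_deg = max(0.0, min(180.0, angle_deg))  # clamp to the servo's range
    pulse_us = MIN_PULSE_US + (MAX_PULSE_US - MIN_PULSE_US) * angle_deg / 180.0
    return round(pulse_us)  # with a 1 us tick, CCR equals the pulse width
```

On the MCU the returned value would be written to the timer's CCR register for that servo's channel; clamping keeps a bad angle from commanding a pulse outside the servo's mechanical range.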


## Subsystem 2: The Control System
- The control system consists of embedded modules: the microcontroller; a high-performance edge computing platform that runs the dynamic gesture recognition model; and more than 20 motors that control the delicate movements of the bionic hand. It also requires a high-precision camera to capture the user's hand gestures.
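The edge platform and the microcontroller must exchange joint commands over a serial link. The sketch below shows one hypothetical framing convention (a start byte, 12 angle bytes, and an XOR checksum); the actual protocol is a design choice not specified in the proposal.

```python
# Hypothetical serial frame: [START_BYTE, angle_0 .. angle_11, xor_checksum]
START_BYTE = 0xAA
NUM_JOINTS = 12  # one angle byte per DOF of the bionic hand

def pack_frame(angles_deg):
    """Pack 12 joint angles (degrees) into a checksummed byte frame."""
    if len(angles_deg) != NUM_JOINTS:
        raise ValueError(f"expected {NUM_JOINTS} angles")
    payload = bytes(max(0, min(180, int(a))) for a in angles_deg)
    checksum = 0
    for b in payload:
        checksum ^= b
    return bytes([START_BYTE]) + payload + bytes([checksum])

def unpack_frame(frame):
    """Validate a frame on the MCU side and return the 12 angles."""
    if len(frame) != NUM_JOINTS + 2 or frame[0] != START_BYTE:
        raise ValueError("malformed frame")
    payload = frame[1:-1]
    checksum = 0
    for b in payload:
        checksum ^= b
    if checksum != frame[-1]:
        raise ValueError("checksum mismatch")
    return list(payload)
```

The checksum lets the microcontroller drop a corrupted command instead of driving a motor to a garbage angle.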


## Subsystem 3: Dynamic Gesture Recognition System
- An external camera capturing the shape, appearance, and motion of the target hands
- A pre-trained model that helps the other subsystems interpret the meaning behind the sign language. Specifically, for object detection we intend to adopt the YOLO algorithm as well as MediaPipe, a machine learning framework developed by Google, to recognize different signs efficiently. Given the dynamic nature of gestures, we also plan to build our models with 3D-CNNs and RNNs to better capture spatio-temporal features.
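Because the gestures are dynamic, a 3D-CNN or RNN consumes short clips rather than single frames. The sketch below buffers per-frame hand landmarks into fixed-length clips; the 21-landmark layout matches MediaPipe Hands, while the 16-frame window length is an assumption for illustration.

```python
from collections import deque

NUM_LANDMARKS = 21   # MediaPipe Hands reports 21 (x, y, z) points per hand
WINDOW_FRAMES = 16   # clip length fed to the recognition model (assumed)

class GestureWindow:
    """Sliding window turning a landmark stream into model-ready clips."""

    def __init__(self):
        self._frames = deque(maxlen=WINDOW_FRAMES)  # oldest frame drops out

    def push(self, landmarks):
        """Add one frame of (x, y, z) landmarks; return a clip once full."""
        if len(landmarks) != NUM_LANDMARKS:
            raise ValueError(f"expected {NUM_LANDMARKS} landmarks")
        self._frames.append(list(landmarks))
        if len(self._frames) == WINDOW_FRAMES:
            return [frame for frame in self._frames]  # 16 x 21 x 3 clip
        return None
```

Each time the window is full it emits a clip, so the model sees overlapping 16-frame sequences and can react without waiting for a gesture boundary.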

# CRITERION OF SUCCESS

- The bionic hand moves freely and fluently as designed, with all 12 DOFs realized. Moving a single finger joint neither interrupts nor is interrupted by other movements. The hand is durable and reliable.
- The control system is reliable and outputs stable PWM signals to the motors. The edge computing platform we choose performs well when running the dynamic gesture recognition model.
- Our machine recognizes different signs immediately and reacts with the corresponding gestures without noticeable delay.


# DISTRIBUTION OF WORK
- Qianzhong Chen (ME): Mechanical design and manufacture of the bionic hand; tune the linkage between motors and mechanical parts; work with Haina to program the STM32 to generate PWM signals and drive the motors.
- Hanwen Liu (CompE): Record gesture clips to collect enough data; test camera modules; draft reports; make schedules.
- Haina Lou (EE): Implement the embedded control system; program the microcontroller and the AI edge computing module; implement serial communication.
- Yike Zhou (EE): Accomplish the object detection subsystem; build and train the machine learning models.

# Intelligent Texas Hold 'Em Robot (Featured Project)

Xuming Chen, Jingshu Li, Yiwei Wang, Tong Xu

## Problem

Due to the severe COVID-19 pandemic, people around the world have had to keep a safe social distance and avoid large gatherings. As one of the most famous poker games in the Western world, Texas Hold'em has also been affected and has tended to move to online platforms, which, unfortunately, offer players much less real excitement and fun. We hope to develop a product that frees poker players from the limits of time and space, letting them enjoy card games just as they did before the pandemic.

## Solution Overview

Our solution is to develop an intelligent Texas Hold'em robot that can make decisions in real poker games. The robot is expected to play as an independent real player: it should be capable of reading the public cards and its hole cards and making the best possible betting decisions to win as many chips as possible.

## Solution Components

- A decision model based on a multilayer neural network
- A Texas Hold'em simulation model, based on traditional probabilistic models, used to generate training data for the decision model
- A computer vision module enabling the game AI to recognize the faces and suits of cards and identify the game situation on the table
- A robotic hand able to pick, hold, and rotate cards
- Several cameras helping to track the movement of the robot hand and the location of cards
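To make the simulation model concrete, here is a minimal sketch of the kind of probabilistic betting rule it could use to label training data for the neural-network decision model: compare estimated win probability against pot odds. The thresholds and the `raise_margin` parameter are illustrative assumptions, not the team's actual strategy.

```python
def decide(win_prob: float, pot: float, to_call: float,
           raise_margin: float = 0.15) -> str:
    """Return 'fold', 'call', or 'raise' from estimated equity and pot odds."""
    if to_call <= 0:
        # Nothing to call: check (treated as 'call') or raise with strong equity.
        return "raise" if win_prob > 0.5 + raise_margin else "call"
    pot_odds = to_call / (pot + to_call)  # break-even calling equity
    if win_prob < pot_odds:
        return "fold"   # calling loses chips in expectation
    if win_prob > pot_odds + raise_margin:
        return "raise"  # comfortably ahead of the price being offered
    return "call"
```

For example, facing a 50-chip bet into a 100-chip pot, the break-even equity is 50 / 150 = 1/3, so a 10% hand folds while a 60% hand raises. Running this rule inside the simulator over many dealt hands yields (state, action) pairs for supervised pre-training before reinforcement learning refines the policy.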

## Criterion for Success

- Training a decision model for betting using deep learning techniques (mainly reinforcement learning).

- Using computer vision to transform the public cards, hole cards, and other players' chip counts into valid input for the decision-making model.

- Using speech recognition technology to recognize other players’ actions for betting as valid input to the decision model.

- Using a PTZ (pan-tilt-zoom) mount to move the cameras that capture information about the cards and chips.

- Finishing the mechanical design of an interactive robot whose actions include drawing cards, moving cards to the camera, and moving chips, and using an MCU to control the robot.
