Project 26: Teaching Heat to Student

# Sponsor
Wee-Liat Ong

# Team Members
- Kaihua Hu, kaihua2
- Tianyu Feng, tianyuf2
- Yongxin Xie, yjie3
- Ziang Liu, ziangl4

# Title
Teaching Heat to Student

# Problem
Elementary and middle school students need an effective and engaging educational tool that introduces the fundamental concepts of heat transfer and thermal energy conversion.

# Solution Overview
We propose the design and manufacture of an integrated thermal experiment platform that provides a safe and hands-on environment for students. The platform will include visual demonstrations of heat conduction and convection, a coating for thermal radiation visualization, and an introduction to thermoelectricity.

# Solution Components
## Subsystem 1: Conduction and Convection
Heat one end of a solid metal rod and use temperature sensors placed at fixed positions along its length to measure the temperature profile, then visualize the readings on a computer. A second, hollow metal rod filled with fluid is instrumented in the same way to show the influence of convection.
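As an illustration of what the sensor visualization should show, the sketch below simulates the temperatures at evenly spaced sensor positions along the heated rod with a 1D explicit finite-difference conduction model. The diffusivity, sensor spacing, time step, and boundary temperatures are placeholder values for illustration, not measurements from our platform.

```python
# Sketch: simulated sensor readings along a heated rod (1D explicit
# finite-difference conduction). All constants are illustrative placeholders.

def simulate_rod(n_sensors=10, alpha=1e-4, dx=0.02, dt=0.5, steps=2000,
                 t_hot=100.0, t_ambient=25.0):
    """Return temperatures at n_sensors points along the rod after `steps`
    time steps. The heated end is held at t_hot, the far end at t_ambient."""
    temps = [t_ambient] * n_sensors
    temps[0] = t_hot
    r = alpha * dt / dx**2           # explicit scheme is stable for r <= 0.5
    for _ in range(steps):
        new = temps[:]
        for i in range(1, n_sensors - 1):
            new[i] = temps[i] + r * (temps[i - 1] - 2 * temps[i] + temps[i + 1])
        new[-1] = t_ambient          # far end held at ambient
        temps = new
    return temps

profile = simulate_rod()
# Temperatures fall monotonically from the heated end toward ambient,
# which is the profile the sensor display should reproduce.
assert all(a >= b for a, b in zip(profile, profile[1:]))
```

A plotting layer (e.g. a live line chart of the sensor array) would sit on top of readings like these in the actual demonstration.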

## Subsystem 2: Thermoelectric Generation
Demonstrate the thermoelectric effect with a material that generates current when a person places a hand on it, lighting up LEDs that display 'ZJUI'.
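The demonstration rests on the Seebeck effect: a thermoelectric module's open-circuit voltage is roughly V = S * (T_hot - T_cold). The sketch below uses an illustrative effective Seebeck coefficient, not a value from any particular module's datasheet.

```python
def seebeck_voltage(s_volts_per_kelvin, t_hot, t_cold):
    """Open-circuit voltage of a thermoelectric module: V = S * (T_hot - T_cold)."""
    return s_volts_per_kelvin * (t_hot - t_cold)

# Illustrative numbers: a module with an effective Seebeck coefficient of
# 0.05 V/K, a hand at 36 C on one face, and a 26 C heat sink on the other.
v = seebeck_voltage(0.05, 36.0, 26.0)
assert abs(v - 0.5) < 1e-9
```

About 0.5 V from body heat alone would not drive an LED directly, which is why a small boost converter typically sits between the module and the LEDs.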

## Subsystem 3: Radiative Cooling Coating
Design a coating material that reflects certain wavelengths from a light source while allowing the electromagnetic (thermal) radiation from the human body to pass through, demonstrating that a person covered by this material can feel cool even while receiving radiation from a heat source.
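The coating can work because the heat source's radiation and the body's radiation occupy nearly disjoint spectral bands, which Wien's displacement law (lambda_max = b / T) makes concrete. The temperatures below are illustrative choices for a skin-temperature body and an incandescent-style heat lamp.

```python
# Wien's displacement law: peak wavelength of blackbody emission,
# lambda_max = b / T, with b ~ 2.898e-3 m*K.
WIEN_B = 2.898e-3  # m*K

def peak_wavelength_um(temp_kelvin):
    """Peak emission wavelength in micrometres for a blackbody at temp_kelvin."""
    return WIEN_B / temp_kelvin * 1e6

body = peak_wavelength_um(310.0)   # skin at ~37 C: peaks near 9.3 um (long-wave IR)
lamp = peak_wavelength_um(2800.0)  # hot lamp filament: peaks near 1.0 um (near IR)
assert 9.0 < body < 10.0
assert 0.9 < lamp < 1.2
```

A coating reflective below a few micrometres but transparent around 8-13 um would therefore bounce the lamp's energy away while letting the body's own radiation escape.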

# Criterion for Success
- An engaging and safe educational experience.
- Students come away with a clear understanding of the heat transfer concepts demonstrated.
- Successful demonstrations of conduction, convection, thermal radiation, and thermoelectricity.

# Distribution of Work
- Kaihua Hu: Design and Manufacturing
- Tianyu Feng: Design and Manufacturing
- Yongxin Xie: Control and Electrical circuit
- Ziang Liu: Control and Electrical circuit

# Title
An Intelligent Assistant Using Sign Language

Featured Project

# Team Members
- Qianzhong Chen, qc19
- Hanwen Liu, hanwenl4
- Haina Lou, hainal2
- Yike Zhou, yikez3

# Problem
Smart home accessories are increasingly common in people's homes. A hub, usually a speaker with a voice user interface, is needed to control them, but an interactive speaker may not be ideal for people who have difficulty speaking or hearing. We therefore aim to develop an intelligent assistant that understands sign language, interacts with people, and acts as a real assistant.

# Solution Components


## Subsystem 1: 12-Degree-of-Freedom Bionic Hand System

- Two movable joints on every finger, each driven by a 5 V servo motor

- The main parts of the hand manufactured by 3D printing

- The bionic hand mounted on a 2-DOF electrical platform

- All servo motors controlled by PWM signals from an STM32 microcontroller
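As a sketch of how a target joint angle could map to the PWM signals mentioned above, the snippet below assumes the common hobby-servo convention of a 50 Hz frame with a 500-2500 us pulse spanning 0-180 degrees; the actual timing must come from the chosen servos' datasheets, and on the STM32 the duty cycle would be written to a timer compare register rather than computed in Python.

```python
# Sketch: mapping a finger-joint angle to a servo PWM pulse width.
# Timing values assume a typical hobby-servo convention, not a specific part.
PERIOD_US = 20_000                   # 50 Hz PWM frame
MIN_PULSE_US, MAX_PULSE_US = 500, 2500

def pulse_width_us(angle_deg):
    """Pulse width for a target angle, clamped to [0, 180] degrees."""
    angle = max(0.0, min(180.0, angle_deg))
    return MIN_PULSE_US + (MAX_PULSE_US - MIN_PULSE_US) * angle / 180.0

def duty_cycle(angle_deg):
    """Fraction of the 20 ms frame spent high, e.g. for a timer compare value."""
    return pulse_width_us(angle_deg) / PERIOD_US

assert pulse_width_us(90) == 1500.0          # neutral position
assert abs(duty_cycle(90) - 0.075) < 1e-9    # 1.5 ms of a 20 ms frame
```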

## Subsystem 2: The Control System

- The control system consists of embedded modules: the microcontroller, a high-performance edge-computing platform that runs the dynamic gesture recognition model, and more than 20 motors that produce the delicate movements of the bionic hand. It also requires a high-precision camera to capture the user's hand gestures.

## Subsystem 3: Dynamic Gesture Recognition System

- An external camera captures the shape, appearance, and motion of the signing hands

- A pre-trained model helps the other subsystems interpret the sign language. Specifically, for object detection we intend to adopt the YOLO algorithm together with MediaPipe, a machine learning framework developed by Google, to recognize different signs efficiently. Given the dynamic nature of gestures, we also plan to build 3D-CNN and RNN models to better capture the spatio-temporal features.
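As a minimal, self-contained illustration of the pipeline's final classification step, the sketch below matches a flattened hand-landmark vector against labelled templates with a nearest-neighbour rule. The template vectors, labels, and tiny dimensionality are invented placeholders; the real system would use the YOLO/MediaPipe detectors and trained 3D-CNN/RNN models described above, not 1-NN.

```python
import math

# Sketch of the last stage of recognition: matching a flattened landmark
# vector (e.g. 21 (x, y) points from a hand tracker) against labelled
# templates. All data here are synthetic placeholders.

def classify(landmarks, templates):
    """Return the label of the template closest to `landmarks` (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda label: dist(landmarks, templates[label]))

templates = {
    "hello":  [0.1, 0.2, 0.8, 0.9],   # placeholder 2-landmark vectors
    "thanks": [0.7, 0.1, 0.2, 0.8],
}
query = [0.12, 0.22, 0.78, 0.88]
assert classify(query, templates) == "hello"
```

The learned models replace both the template set and the distance rule, but the overall flow (landmarks in, sign label out) is the same.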


# Criterion for Success

- The bionic hand moves freely and fluidly as designed, with all 12 DOFs functional. The movement of a single finger joint neither disturbs nor is disturbed by the other movements. The hand is durable and reliable.

- The control system is reliable and outputs stable PWM signals to the motors. The edge-computing platform we choose has enough performance to run the dynamic gesture recognition model.

- The machine recognizes different signs immediately and reacts with the corresponding gestures without noticeable delay.


# Distribution of Work

- Qianzhong Chen (ME): mechanical design and manufacture of the bionic hand; tune the linkage between the motors and mechanical parts; work with Haina to program the STM32 to generate PWM signals and drive the motors.

- Hanwen Liu (CompE): record gesture clips to collect enough data; test camera modules; draft reports; make schedules.

- Haina Lou (EE): implement the embedded control system; program the microcontroller and the AI edge-computing module; implement serial communication.

- Yike Zhou (EE): accomplish the object detection subsystem; build and train the machine learning models.