# 49: Smart Autochasing Lamp

TA: Luoyan Li
# **Team Members**

Feiyang Liu (feiyang5)

Yiyan Zhang (yiyanz3)

Jincheng Yu (jy54)


# **Problem**

When performing precise tasks at a desk, such as soldering or assembling LEGO, the position of the lamp is often a source of frustration. Shadows cast by the hands can obscure the parts being searched for, and tiny components held in the hand may not be sufficiently illuminated, leading to discomfort and inefficiency. Furthermore, since my ceiling light broke last week, I have had to rely solely on a desk lamp for illumination; in such a dark environment, the lamp's brightness is overwhelming and strains the eyes. There is a need for a desk lamp that adjusts its brightness and color temperature according to ambient light conditions. Additionally, traditional ways of controlling a desk lamp are inconvenient and often interrupt the workflow.

# **Solution**
We propose a smart desk lamp equipped with a camera and several servo motors forming a mechanical arm. The lamp captures images and communicates with a computer for image processing. It can identify human hands and, as they move, reposition the lamp head closer to them and at an angle that minimizes large shadows on the desk. Through a photoresistor, it responds to changes in ambient light. The camera can also detect specific hand gestures, such as spreading the thumb and forefinger apart to increase brightness or pinching them together to decrease it. These gestures can also control the computer, for example to play music, which we believe is simpler than voice input.
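As a sketch of the pinch-to-brightness idea above, the thumb-forefinger distance reported by a hand-tracking model could be mapped linearly to a brightness level. The function name, distance range, and thresholds below are illustrative assumptions, not the project's actual code:

```python
def brightness_from_pinch(thumb_index_dist: float,
                          closed: float = 0.03,
                          open_: float = 0.25) -> int:
    """Map a normalized thumb-forefinger distance (e.g. from a
    hand-landmark model) to a 0-100 brightness percentage.
    A pinched hand (small distance) dims the lamp; a spread
    hand brightens it."""
    # Clamp the distance into the expected gesture range.
    d = max(closed, min(open_, thumb_index_dist))
    # Linear interpolation between the closed and open positions.
    return round(100 * (d - closed) / (open_ - closed))
```

In practice the output would be smoothed over several frames before being sent to the lamp, so that tracking jitter does not cause flicker.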

# **Subsystem**
## Mechanical Arm Subsystem:
Three servo motors and linear potentiometers provide the basic movement of the mechanical arm, together with the supporting circuitry for these components. To avoid interference between light sources, a small aperture for the light-sensitive element will be located on the mechanical arm. Its readings are communicated to the central control subsystem.
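As an illustration of how servo targets for the arm might be computed, here is a standard planar two-link inverse-kinematics sketch. The two-link simplification, link lengths, and elbow-down convention are assumptions for illustration; the real arm uses three servos and may be geometrically different:

```python
import math

def two_link_ik(x: float, y: float, l1: float, l2: float):
    """Planar inverse kinematics: given a target (x, y) for the lamp
    head and link lengths l1, l2, return (shoulder, elbow) angles in
    radians (elbow-down solution). Raises ValueError if the target
    is out of reach."""
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(c2)
    # Shoulder angle = angle to target minus the offset caused by link 2.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

The linear potentiometers would then provide feedback that the servos have actually reached the commanded angles.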

## Lighting and Camera Subsystem:
The bulb, adjustable in both color temperature and brightness, receives instructions from the central control subsystem. A camera is positioned near the bulb for better target tracking; captured frames are sent to the central controller.

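One simple way to realize an adjustable color temperature is to blend warm and cool LED channels. The sketch below linearly splits a target correlated color temperature between an assumed 2700 K warm channel and 6500 K cool channel; a real fixture would need calibration rather than this linear approximation:

```python
def led_mix(cct: int, warm_k: int = 2700, cool_k: int = 6500):
    """Split a target correlated color temperature (in kelvin) into
    (warm_duty, cool_duty) fractions in 0.0-1.0 by linear blending
    between the warm and cool LED endpoints."""
    # Clamp the request to the achievable range.
    cct = max(warm_k, min(cool_k, cct))
    cool = (cct - warm_k) / (cool_k - warm_k)
    return 1.0 - cool, cool
```

The two duty fractions would be scaled by the overall brightness setting before being written to the LED drivers' PWM channels.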
## Central Control Subsystem:
This subsystem integrates the ESP32 module and the necessary I/O modules. It processes images captured by the camera, determines how far each motor in the mechanical arm should move so that the bulb tracks the hand, and recognizes specific gestures to adjust the bulb's parameters. It can also communicate with a computer to control specific programs remotely.
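For the ambient-light response, the photoresistor reading could be mapped to a lamp duty cycle roughly as follows. The ADC endpoints and divider orientation (darker room reads higher) are assumptions for illustration:

```python
def lamp_duty(adc_ambient: int, adc_bright: int = 500,
              adc_dark: int = 3800, max_duty: int = 100) -> int:
    """Map a photoresistor ADC reading (ESP32 ADC range 0-4095;
    with the assumed divider a darker room gives a higher reading)
    to a lamp PWM duty in percent: the darker the room, the
    brighter the lamp."""
    # Clamp to the calibrated bright/dark endpoints.
    a = max(adc_bright, min(adc_dark, adc_ambient))
    return round(max_duty * (a - adc_bright) / (adc_dark - adc_bright))
```

The endpoints `adc_bright` and `adc_dark` would be calibrated once for the actual photoresistor and divider resistor values.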

# **Standard of Success**
- When tracking mode is activated, the bulb follows the hand's movement to an appropriate position.

- As the ambient light changes, the bulb adjusts to an appropriate brightness and color temperature.

- The lamp's switch and brightness can be adjusted through gestures.

- Specific programs (like Spotify) can be opened on the computer through hand gestures.
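The last two criteria can be served by a small dispatch table that maps classified gestures to lamp or computer actions. The gesture labels and action names below are hypothetical placeholders, not the project's actual vocabulary:

```python
# Hypothetical gesture-to-action mapping for illustration only.
GESTURE_ACTIONS = {
    "pinch_open": "brightness_up",
    "pinch_close": "brightness_down",
    "fist": "lamp_toggle",
    "two_fingers": "open_spotify",
}

def dispatch(gesture: str) -> str:
    """Return the action name for a recognized gesture, or 'noop'
    for anything unrecognized, so stray detections do nothing."""
    return GESTURE_ACTIONS.get(gesture, "noop")
```

On the computer side, an action like `open_spotify` would then be translated into launching the corresponding program.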
