# Project 71: Automatic Puzzle Solver

TA: Angquan Yu
# Automatic Puzzle Solver for Accessibility and User Convenience


Team Members:
- Eric Chen (egchen2)
- Alex Kim (alexk4)
- Conor Devlin (conorbd2)

# Problem

Jigsaw puzzles remain a popular pastime, offering enjoyment and cognitive benefits. However, manual assembly can be challenging for individuals with motor skill limitations, visual impairments, or limited attention spans. Existing automated solutions are often expensive, complex, or limited in the puzzle sizes and complexities they can handle.

This project addresses the need for an accessible and user-friendly automatic jigsaw puzzle solver. Our solution aims to empower individuals of all abilities to enjoy the benefits of puzzle solving while reducing frustration and increasing user satisfaction.

# Solution

This project will deliver an accessible and user-friendly solution to enhance the puzzle-solving experience for individuals of all abilities. We propose an Automatic Jigsaw Puzzle Solver that pairs a precision-controlled robotic arm with a computer vision system.

# Solution Components

## 3D Movement System

Function: Precisely position the robotic arm above puzzle pieces.

Components:
- Stepper motors (e.g., NEMA 17 series) with high torque and speed for accurate movement.
- Belt/pulley system or leadscrew system for linear motion on the X and Y axes.
- End-stop switches to provide a repeatable homing reference on each axis.
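To make the positioning concrete, the sketch below converts a target XY coordinate into stepper counts after homing against the end stops. The belt pitch, pulley size, and microstepping values are illustrative assumptions, not measured hardware parameters.

```python
# Assumed drivetrain (hypothetical values, to be replaced with the
# real hardware's): NEMA 17 at 200 full steps/rev, 16x microstepping,
# 2 mm pitch belt on a 20-tooth pulley -> 80 microsteps per mm.
FULL_STEPS_PER_REV = 200
MICROSTEPS = 16
BELT_PITCH_MM = 2.0
PULLEY_TEETH = 20

STEPS_PER_MM = (FULL_STEPS_PER_REV * MICROSTEPS) / (BELT_PITCH_MM * PULLEY_TEETH)

def mm_to_steps(distance_mm):
    """Round a linear distance to the nearest whole microstep."""
    return round(distance_mm * STEPS_PER_MM)

def move_plan(x_mm, y_mm):
    """Step counts for an absolute move from the homed (0, 0) origin."""
    return mm_to_steps(x_mm), mm_to_steps(y_mm)
```

With these assumed values one millimeter of travel corresponds to 80 microsteps, so positioning resolution is well below the size of a puzzle tab.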

## Rotation System

Function: Rotate puzzle pieces for proper orientation before pickup.

Components:
- Servo motor (e.g., MG996) with sufficient torque to turn the rotation platform.
- Gears/belt system for rotating a platform holding the puzzle piece.
- Limit switch to establish a reference orientation for accurate positioning at specific angles.

## Piece Picking System

Function: Securely lift and place puzzle pieces without damage.

Components:
- Vacuum suction cup(s) with size and material suitable for puzzle pieces (e.g., foam or silicone).
- Venturi vacuum generator with sufficient flow rate and pressure for suction.
- Compressed air supply with regulator for controlling suction strength.
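The picking subsystem follows a fixed sequence of actuator steps per piece. The sketch below encodes that sequence as an ordered command list; the command names (`move`, `vacuum_on`, etc.) are hypothetical placeholders for the real firmware interface.

```python
def pick_and_place(pick_xy, place_xy, rotation_deg=0.0):
    """Ordered actuator commands for moving one piece.
    Command names are illustrative, not a real firmware API."""
    cmds = [
        ("move", pick_xy),     # position arm over the piece
        ("lower", None),
        ("vacuum_on", None),   # Venturi generator applies suction
        ("raise", None),
    ]
    if rotation_deg:
        cmds.append(("rotate", rotation_deg))  # orient on the platform
    cmds += [
        ("move", place_xy),    # travel to the target location
        ("lower", None),
        ("vacuum_off", None),  # release the piece
        ("raise", None),
    ]
    return cmds
```

Keeping the sequence as data rather than inline motor calls makes it easy to log, replay, or dry-run a full solve without powering the arm.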

## Computer Vision System

Function: Identify each puzzle piece and locate its position within the image of the completed puzzle.

Components:
- Camera sensor (e.g., ArduCam OV5642 or Olimex OV7670) with high resolution and auto-focus capability.
- Microcontroller (e.g., Raspberry Pi Zero W, Raspberry Pi 3, STMicroelectronics STM32F103C8T6) for initial image processing and communication.
- Processing Unit (e.g., dedicated AI accelerator or cloud-based processing) for intensive image analysis (optional).
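In the real pipeline, piece detection would use OpenCV (e.g., thresholding followed by contour extraction). The dependency-free sketch below illustrates the core step — finding each piece blob in a binary mask and reporting its centroid for the arm to target — using a simple flood fill in place of OpenCV's contour routines.

```python
def locate_pieces(mask):
    """Return (row, col) centroids of 4-connected blobs in a 0/1 mask.
    A pure-Python stand-in for OpenCV contour/moment analysis."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # Flood-fill one connected component.
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # Centroid = mean pixel position of the blob.
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cy, cx))
    return centroids
```

The centroids, converted from pixels to millimeters via the camera calibration, become the pickup targets for the movement system.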

## Control Software

Function: Orchestrate the entire system, interpret vision data, and control robotic movements.

Environment: Python for overall control, with open-source libraries such as OpenCV for image processing.

Modularity: Designed for easy maintenance and future improvements.
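At the top level, the control software iterates over the pieces reported by the vision system and dispatches one pick-and-place per piece. The sketch below shows that loop with a single retry on failure; the data shapes and the one-retry policy are simplifying assumptions for illustration.

```python
def solve(pieces, move_fn):
    """Drive the arm through every detected piece in detection order.

    pieces:  dict mapping piece id -> (current_xy, target_xy, rotation_deg)
    move_fn: callable performing one pick-and-place, returning True on
             success (in the real system this wraps the motor/vacuum I/O).
    Failed pieces get one retry, then are reported for manual recovery.
    """
    placed, failed = [], []
    for pid, (cur, tgt, rot) in pieces.items():
        # Short-circuit: the retry only runs if the first attempt fails.
        ok = move_fn(cur, tgt, rot) or move_fn(cur, tgt, rot)
        (placed if ok else failed).append(pid)
    return placed, failed
```

Separating the loop from the hardware callable keeps the modules independent, so the vision and motion subsystems can be tested with stubs before integration.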

# Criterion For Success

- Camera Accuracy: 95% of puzzle pieces correctly identified and oriented within the complete image.
- Arm Performance: 90% success rate in accurately picking and placing puzzle pieces.
- Puzzle Completion Time: Solve a 100-piece puzzle of moderate complexity within 60 minutes.
