Project
| # | Title | Team Members | TA | Documents | Sponsor |
|---|---|---|---|---|---|
| 30 | American Sign Language Robot Hand Interpreter | Ankur Prasad, Matthew Uthayopas, Tunc Gozubuyuk | | | |

**American Sign Language Robot Hand Interpreter**

**Team Members**:
- Ankur Prasad (ankurp3) - Experienced in control systems, machine learning, and some embedded programming. Has done projects training models in Python, has worked with programming and communicating with sensors, and additionally has experience building mechanical systems.
- Tunc Gozubuyuk (tuncg2) - Some experience in PCB design and experience in control systems.
- Matthew Uthayopas (mnu2) - Experienced in circuit design and signal processing. Has done internships focused on AI/ML models and has some experience with PCB design and programming MCUs.

**Problem**

An estimated 500,000 to 1,000,000 people worldwide use American Sign Language (ASL) to convey their ideas. Every idea matters, and we want every idea to be expressed, understood, and communicated between individuals without communication barriers. We therefore want to engineer a cost-efficient ASL robot hand interpreter to be used as a teaching tool for anyone who wants to learn ASL.

- Voices of the Unheard: Conversational Challenges Between Signers and Non-Signers and Design Interventions for Adaptive SLT Systems: https://dl.acm.org/doi/10.1145/3706599.3720201
- Students With Disabilities: https://nces.ed.gov/programs/coe/pdf/2024/CGG_508c.pdf

**Solution**

Our solution is a programmable robotic hand that can perform every letter of the ASL alphabet. The hand will be trainable through multiple sensors attached to a separate glove, so we can potentially teach it to sign whole words. We will focus on having the hand display ASL letters and words; if time permits, we will add features that allow interaction with the hand.

If time permits: the robotic hand will be able to teach American Sign Language without the need for a teacher or interpreter. This can be done by adding audio recognition so that the hand signs the words it picks up.

**Solution Components**

**Subsystem 1: Robotic Hand and Actuation Controls**

This subsystem bends and restores the joints of the robotic hand. It functions like tendons, curling and extending the fingers.

Mechanical structure:
- Fingers made of popsicle sticks that will be cut, sanded down, and connected with screws and nuts. Popsicle sticks: https://www.hobbylobby.com/crafts-hobbies/wood-crafts ($0.99)
- The palm will be made of cardboard, layered and glued together, plus cut wood to mount the servo motors. Cardboard: https://a.co/d/1botWA0 ($5)
- For the tendons, we plan to use nylon string routed through the fingers via small screws/holes in the finger segments. Winches and spools mounted on the servo horns will wind the string that controls the fingers. Elastic cords will provide a restoring force that returns each finger to its original state. Elastic cords: https://www.amazon.com/Elastic-Bracelets-Bracelet-Stretchy-Necklaces ($7)
- We may also add springs to ensure the fingers apply enough force to hold a specific hand position.

Motor system:
- Nine servo motors (https://www.adafruit.com/product/1143, $10) provide the torque to pull the tendons: one per finger, three for the thumb, and two for the wrist to allow movement in both directions.
- An ATmega328P microcontroller (Arduino Nano V3.0: https://a.co/d/bsRC3nZ, $16) will determine the resistance each finger's flex sensor should read for each letter. The microcontroller will be hooked up to flex sensors attached to each finger (https://www.pcb-hero.com/products/2-2-resistive-flex-sensor, $2.15), and the microcontroller and motor system will be housed inside a recyclable water bottle. A control sketch follows below.
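As a rough illustration of how the actuation could work in firmware, here is a minimal Arduino sketch that drives the nine servos to stored per-letter poses. The pin assignments and every angle in the pose table are hypothetical placeholders; the real values would come from calibrating each finger against its flex sensor.

```cpp
#include <Servo.h>

// Hypothetical pin assignment: four finger servos, three thumb servos,
// two wrist servos. Final wiring will depend on the board layout.
const uint8_t NUM_SERVOS = 9;
const uint8_t SERVO_PINS[NUM_SERVOS] = {2, 3, 4, 5, 6, 7, 8, 9, 10};

Servo servos[NUM_SERVOS];

// Pose table: one servo angle (0-180 degrees) per joint for each letter.
// The values below are illustrative placeholders; real angles come from
// calibration. Kept in flash (PROGMEM) to spare the ATmega328P's 2 KB SRAM.
const uint8_t LETTER_POSES[26][NUM_SERVOS] PROGMEM = {
  {170, 165, 165, 160, 40, 90, 20, 90, 90},  // 'A': fingers curled, thumb alongside
  {10, 5, 5, 10, 120, 60, 150, 90, 90},      // 'B': fingers extended, thumb across palm
  // ... remaining 24 letters filled in after calibration ...
};

// Drive every joint to the stored pose for one letter 'A'..'Z'.
void signLetter(char letter) {
  if (letter < 'A' || letter > 'Z') return;
  for (uint8_t i = 0; i < NUM_SERVOS; i++) {
    servos[i].write(pgm_read_byte(&LETTER_POSES[letter - 'A'][i]));
  }
}

void setup() {
  for (uint8_t i = 0; i < NUM_SERVOS; i++) {
    servos[i].attach(SERVO_PINS[i]);
  }
}

void loop() {
  // Fingerspell a demo word, holding each letter long enough to read.
  for (const char* c = "HELLO"; *c; c++) {
    signLetter(*c);
    delay(1500);
  }
  delay(3000);
}
```

Keeping the pose table in flash matters on the ATmega328P, which has only 2 KB of SRAM; the Arduino Servo library comfortably handles nine servos on this chip (its limit is twelve).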
Power system:
- The system will eventually be powered by a portable power module connected to the microcontroller, which will then distribute power to the other components.
- Bench power: AC-DC adapter (12 V or 6-8 V, depending on the motors).
- Portable power: Turnigy 3300 mAh 3S 11.1 V Shorty LiPo battery ($20): https://hobbyking.com/en_us/turnigy-3300mah-3s-11-1v-30c-shorty.html?wrh_pdp=2&countrycode=US

**Subsystem 2: Interaction and Teaching**

This subsystem is responsible for training and programming the robotic hand.

Sensor glove:
- Base: a standard cloth winter glove.
- Nine flex sensors gauge the movements of specific joints and fingers.
- An Arduino Nano mounted on the glove reads all of the flex sensor data.
- An HC-05 Bluetooth module sends the glove's sensor data to the main robot hand controller (see the sketches after this list).
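To make the glove's data path concrete, here is a minimal sketch of the sampling-and-transmit loop, assuming each flex sensor sits in a simple voltage divider on an analog input and the HC-05 is driven through SoftwareSerial at its default 9600 baud. One caveat worth noting: the Nano exposes only eight analog inputs (A0-A7), so a ninth flex sensor would need an external analog multiplexer; this sketch reads eight. Pin choices and the CSV frame format are placeholder assumptions.

```cpp
#include <SoftwareSerial.h>

// HC-05 on digital pins (placeholder choice): Nano RX pin 10 <- HC-05 TX,
// Nano TX pin 11 -> HC-05 RX (level-shifted, since the module is 3.3 V logic).
SoftwareSerial bt(10, 11);  // RX, TX

// The Nano exposes eight analog inputs (A0-A7); a ninth flex sensor would
// need an external analog multiplexer, so this sketch reads eight.
const uint8_t NUM_FLEX = 8;
const uint8_t FLEX_PINS[NUM_FLEX] = {A0, A1, A2, A3, A4, A5, A6, A7};

void setup() {
  bt.begin(9600);  // HC-05 factory-default baud rate
}

void loop() {
  // Emit one comma-separated frame of raw 10-bit readings per cycle,
  // e.g. "512,487,640,...\n", for the hand controller to parse.
  for (uint8_t i = 0; i < NUM_FLEX; i++) {
    bt.print(analogRead(FLEX_PINS[i]));
    bt.print(i + 1 < NUM_FLEX ? ',' : '\n');
  }
  delay(50);  // roughly 20 frames per second is plenty for hand poses
}
```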
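On the hand-controller side, replicating a glove pose then reduces to parsing those frames and mapping each raw reading onto a servo angle. Below is a sketch of that mapping, assuming a second HC-05 on the hand's hardware UART and per-sensor calibration endpoints captured beforehand; the flat/bent values here are placeholders.

```cpp
#include <Servo.h>

const uint8_t NUM_CH = 8;
const uint8_t SERVO_PINS[NUM_CH] = {2, 3, 4, 5, 6, 7, 8, 9};
Servo servos[NUM_CH];

// Per-sensor calibration endpoints: the raw ADC reading with the finger
// flat versus fully bent. Placeholder numbers; real ones would be captured
// by a calibration routine.
const int FLAT[NUM_CH] = {480, 480, 480, 480, 480, 480, 480, 480};
const int BENT[NUM_CH] = {760, 760, 760, 760, 760, 760, 760, 760};

void setup() {
  Serial.begin(9600);  // second HC-05 wired to the hardware UART
  for (uint8_t i = 0; i < NUM_CH; i++) {
    servos[i].attach(SERVO_PINS[i]);
  }
}

void loop() {
  if (!Serial.available()) return;
  // Read one newline-terminated CSV frame, e.g. "512,487,640,...".
  String frame = Serial.readStringUntil('\n');
  int start = 0;
  for (uint8_t ch = 0; ch < NUM_CH && start < (int)frame.length(); ch++) {
    int comma = frame.indexOf(',', start);
    if (comma < 0) comma = frame.length();
    int raw = frame.substring(start, comma).toInt();
    // Map the calibrated flex range linearly onto 0-180 degrees and clamp.
    int angle = constrain(map(raw, FLAT[ch], BENT[ch], 0, 180), 0, 180);
    servos[ch].write(angle);
    start = comma + 1;
  }
}
```

A real implementation would add frame validation and smoothing, but this captures the glove-to-hand replication path that the latency criterion below measures.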
**Criterion For Success**

Sign language accuracy:
- The robotic hand must correctly sign each letter of the ASL alphabet when programmed to do so.
- Any letter or word signed must be recognizable by at least 3 testers.
- The device must spell out a 6-letter word, understandable by 3 testers, within a reasonable amount of time.

Machine learning feedback:
- The robotic hand must replicate signs performed on the glove with 85% accuracy.
- The robotic hand must replicate glove movements within 2-3 seconds.

Battery life and power supply:
- The robotic hand must have at least 2 hours of battery life.
- The device must perform at least 26 different hand signs before losing functionality.

Time-permitting features:
- The robotic hand must replicate spoken words with 75% accuracy.
- The camera must be able to detect a person doing sign language against a single-color background.