Name            NetID    Course
Tejus Kurdukar  tkurdu2  ECE 110
Taylor Xia      tyxia2   ECE 110


Purpose: To construct a synthetic arm that responds to and interacts with the user. Our initial designs incorporate either computer vision and machine learning algorithms or wireless communication. We are considering accomplishing this using flex sensors attached to the user via a glove or strap of some sort, or, if we take the computer vision approach, a set of cameras positioned to register and recognize different gestures.

  1. Introduction

    1. Background Research

      We came up with the idea for this project because we wanted to gain experience with wireless communication systems, Arduino, and fabrication. The robotic hand is meant to be a lighthearted, creative project that can interact with the user in multiple ways and allows us to learn and practice these skills. 
      We've found many previous projects that mimic the motion of a user, but our project aims to go further. The first step is having the prosthetic simply mimic the user's hand motion. Then, once we are confident in its ability to recognize motion, we can expand its capabilities to interacting and responding rather than just mimicking. For example, the hand could greet the user with a handshake, a wave, or even a game of rock-paper-scissors. 
       
      The first article we found laid the foundation for our general design: flex sensors detect movement in a glove worn by the user and send wireless signals to an Arduino, which then relays them to a servo-based pulley system that drives the fingers [1]. This would allow the hand to detect the user's motions and react in a variety of ways. The article also describes how to physically build the hand itself using 3D-printed hinges. We also found an article that details gesture recognition using computer vision algorithms, such as Canny edge detection and background subtraction [3]. Paired with a video we found about tracking objects using an endoscope camera, we may also incorporate some form of computer vision in our project. If we make enough progress, we may even be able to replace the flex sensors altogether, though we may instead use vision only to react to some visual stimulus [2]. 
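To make the flex-sensor-to-servo idea concrete, here is a minimal sketch of the mapping step in plain C++. The function name and the ADC calibration endpoints (250 and 750) are our own illustration, not values from [1]; on the real glove they would come from calibrating each finger with a 10-bit analogRead.

```cpp
#include <algorithm>
#include <cassert>

// Map a raw flex-sensor ADC reading (10-bit, 0-1023) to a servo angle
// (0-180 degrees). adcMin/adcMax are illustrative readings for a flat and
// a fully bent finger; the real values would come from glove calibration.
int flexToServoAngle(int adcValue, int adcMin = 250, int adcMax = 750) {
    // Clamp so noisy or out-of-range readings cannot over-drive the servo.
    adcValue = std::max(adcMin, std::min(adcValue, adcMax));
    // Linear interpolation, same idea as Arduino's map() function.
    return (adcValue - adcMin) * 180 / (adcMax - adcMin);
}
```

On the Arduino itself the equivalent one-liner would be `map(constrain(reading, 250, 750), 250, 750, 0, 180)` fed into `Servo::write()`.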



  2. Design Details

    1. Block Diagram / Flow Chart



    2. System Overview

      The above block diagram shows the connections between the essential parts of the system, though we may add more sensors, such as a camera, if time allows. The most important sensors, however, will be our flex sensors; these will be responsible for communicating with the Arduino NANO and, by extension, the Arduino UNO. First, an ultrasonic sensor checks whether a person is close enough before activating the prosthetic's NRF for receiving instructions. This both saves power and ensures the user is within range for the NRFs to communicate with each other. Once that requirement is fulfilled, the Arduino NANO attached to the user sends instructions to the NRF transceiver on the UNO attached to the prosthetic. The UNO then interprets these instructions to drive the various servos on the prosthetic, moving the fingers.
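The ultrasonic gating step above can be sketched as two small helper functions. This is our own illustration, not code from the sources: the 100 cm wake-up threshold is an assumed value, and the math assumes an HC-SR04-style sensor whose echo pulse width covers the round trip at roughly 0.0343 cm/us.

```cpp
#include <cassert>

// Convert an HC-SR04-style echo pulse width (microseconds) into a distance
// in centimeters. Sound travels ~0.0343 cm/us, and the echo pulse covers the
// round trip to the target and back, hence the division by 2.
double echoToDistanceCm(unsigned long echoMicros) {
    return echoMicros * 0.0343 / 2.0;
}

// Decide whether a person is close enough to wake the prosthetic's NRF
// receiver. The 100 cm threshold is an assumption for illustration; the
// real value would be tuned to the NRF24L01+'s reliable range.
bool shouldWakeReceiver(unsigned long echoMicros, double thresholdCm = 100.0) {
    return echoToDistanceCm(echoMicros) <= thresholdCm;
}
```

On the UNO, `echoMicros` would come from `pulseIn(echoPin, HIGH)` after triggering the sensor, and a true result would power up the NRF24L01+ out of its low-power state.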

    3. Parts

    Part Name | Price | Retailer + Link
    Arduino NANO | $19.78 ($12.30) | Cheaper price from Amazon: https://www.amazon.com/Arduino-A000005-ARDUINO-Nano/dp/B0097AU5OU/ref=pd_sbs_23_t_1/136-3287983-0498652?_encoding=UTF8&pd_rd_i=B0097AU5OU&pd_rd_r=ac9a69e0-4869-4408-b4b9-68463e7f24b1&pd_rd_w=aEPYW&pd_rd_wg=DsG0x&pf_rd_p=5cfcfe89-300f-47d2-b1ad-a4e27203a02a&pf_rd_r=8DB9Q8306AFR1ZHRF2JP&psc=1&refRID=8DB9Q8306AFR1ZHRF2JP
    NRF24L01+ wireless communication module | $5.99 | Amazon: https://www.amazon.com/Aideepen-Wireless-Transceiver-NRF24L01-Antenna/dp/B01ICU18XC/ref=asc_df_B01ICU18XC/?tag=hyprod-20&linkCode=df0&hvadid=309707619534&hvpos=1o2&hvnetw=g&hvrand=3799246570259578545&hvpone=&hvptwo=&hvqmt=&hvdev=c&hvdvcmdl=&hvlocint=&hvlocphy=9022196&hvtargid=pla-634201318517&psc=1
    Flex Sensor | $7.95 x5 | Adafruit: https://www.adafruit.com/product/1070?gclid=CjwKCAiAhJTyBRAvEiwAln2qB8TXdQ3gdMrh-lT_GEgkQqfAr6RlNV1-X6x2AjejFo7IAKDcdPsHgRoCJxQQAvD_BwE
    Arduino UNO | $18 | Amazon: https://www.amazon.com/Arduino-A000066-ARDUINO-UNO-R3/dp/B008GRTSV6/ref=asc_df_B008GRTSV6/?tag=hyprod-20&linkCode=df0&hvadid=309751315916&hvpos=&hvnetw=g&hvrand=9613496399990754865&hvpone=&hvptwo=&hvqmt=&hvdev=c&hvdvcmdl=&hvlocint=&hvlocphy=9022196&hvtargid=pla-457497319401&psc=1&tag=&ref=&adgrpid=67183599252&hvpone=&hvptwo=&hvadid=309751315916&hvpos=&hvnetw=g&hvrand=9613496399990754865&hvqmt=&hvdev=c&hvdvcmdl=&hvlocint=&hvlocphy=9022196&hvtargid=pla-457497319401
    Ultrasonic sensor | $0 | Already have on hand
    Micro-Servo | $5.95 x5 | Adafruit: https://www.adafruit.com/product/169?gclid=CjwKCAiAp5nyBRABEiwApTwjXjSQO2Ph4f1LMo3_a30Z_6MgPD3Cgn1CA9H-GvNEv47vDXSB5xEH3BoCmC4QAvD_BwE
    Resistors, batteries, jumper wires, breadboards | $0 | Already have on hand

    Total | $108.77


    Note: We can get the Arduinos considerably cheaper if we go with a cheaper brand, saving some money.

    We may purchase an endoscope camera later in the semester.

  3. Possible Challenges

    Potential challenges include integrating the electronics properly, writing the code for the transceivers, assembling the arm, wiring the servo-pulley system correctly, and getting the circuit design right. If we decide to pursue computer vision later, that will be a substantial challenge of its own.

  4. References

    [1] M. Kilic, "How to Make Wireless / Gesture Control Robotic Hand," Hackster.io, 2018. [Online]. Available: https://www.hackster.io/mertarduino/how-to-make-wireless-gesture-control-robotic-hand-cc7d07. [Accessed: 15-Feb-2020].

    [2] C. Macintosh, Computer Vision with Processing, Arduino and Robotic Arm, 2017. [Online]. Available: https://www.youtube.com/watch?v=usZoV4OlgfU. [Accessed: 15-Feb-2020].

    [3] Z. Xian and J. Yeo, "Hand Recognition and Gesture Control Using a Laptop Web-camera," Stanford University. [Online]. Available: https://web.stanford.edu/class/cs231a/prev_projects_2016/CS231A_Project_Final.pdf. [Accessed: 15-Feb-2020].


    Final report and Demo Video


    Video: https://www.wevideo.com/view/1691759625

Comments:

Are you guys planning on designing and 3D printing the parts needed to build the hand, or are there design files already available online that you can use?

Overall, the project seems very doable, keep up the good work (smile)

Posted by jamesw10 at Feb 15, 2020 19:30

Cool idea! If you plan to design the hand from scratch, I would suggest that you use flex sensors since it would take you time to calibrate it and use its data to accurately control the servos.

Posted by yuchenc2 at Feb 16, 2020 03:03

This seems like a great project idea! I am not super sure but I believe there is a 'hand' already in the lab that you could get started with. As Jonny said flex sensors don't have a high range of resistance so maybe you could use an amplifier. 


Posted by akhatua2 at Feb 16, 2020 22:32

I have nothing extra to add! Get a concise idea of the design of the hand ASAP as those types of things tend to drag out and need multiple revisions. Good work!

Posted by dbycul2 at Feb 16, 2020 22:38