Intelligent Prosthetic Arm & Hand


Project Description:

The overall goal of this project is to use a person’s EEG and EMG signals to control a prosthetic arm and hand, allowing amputees to complete basic tasks. To accomplish this goal, a database of sampled EEG and EMG signals will be compiled. This database will hold the raw data relating to arm and hand movements/gestures. A microcontroller will receive signals from electrodes placed on the user’s body. Once the EEG/EMG signals are received, the microcontroller will match them against the database to determine the intended movement, then drive the prosthetic through that motion as smoothly as possible. The prosthetic is powered by servo motors, so every joint is movable just like a real arm and hand. The client goal for this product is to be as user friendly as possible; this means attach and go.
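The signal path implied by this description can be summarized as a short control-loop skeleton. The sketch below only shows the intended flow; every function in it is a hypothetical placeholder for hardware and software that the later milestones will actually build, not existing project code.

```python
# Skeleton of the intended signal path: electrodes -> microcontroller ->
# database lookup -> servo motion. All three helpers are hypothetical
# placeholders that only illustrate the flow.

def read_signal_window():
    """Placeholder: return one window of amplified EEG/EMG samples."""
    return []

def match_gesture(window, database):
    """Placeholder: find which stored movement pattern the window resembles."""
    return "rest"

def drive_servos(gesture):
    """Placeholder: move the arm and hand joints for the recognized gesture."""
    pass

def control_loop(database, cycles=10):
    """Repeat acquire -> classify -> actuate for a fixed number of cycles."""
    for _ in range(cycles):
        window = read_signal_window()
        gesture = match_gesture(window, database)
        drive_servos(gesture)
```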

Deliverables:

1.) Obtain smooth movement of the prosthetic arm and hand.

2.) Read EEG/EMG signals and find patterns that pertain to movement.

3.) Recognize EEG/EMG signals with a microcontroller.

4.) Use recognized signals to move the prosthetic arm and hand.

Milestones:

1.) Obtain smooth movement of the prosthetic arm and hand.

i.) With basic gestures.

ii.) With complex gestures.

2.) Read EEG/EMG signals and find patterns that pertain to movement.

i.) Read EEG/EMG signals using PowerLab 8/35 and LabChart 8 software.

ii.) Understand patterns and how to locate them.

iii.) Map out patterns and convert to raw data.

3.) Recognize EEG/EMG signals with a microcontroller.

i.) Build amplifier circuit.

ii.) Create database from collected raw data.

iii.) Create algorithm to process EEG/EMG signals.

4.) Use recognized signals to move the prosthetic arm and hand.

i.) Create algorithm to control servos based on desired arm and hand movement.

ii.) Intelligent corrections (if needed, return to deliverable 2).

Work Breakdown Structure:

1.) Obtain smooth movement of the prosthetic arm and hand.

1.1. [Green] Use microcontroller to control individual servos.

1.2. [Green] Control multiple servos to obtain basic gestures.

1.3. [Yellow] Smooth out movement by trying different algorithms (see the servo sketch below).

1.4. [Green] Test complex gestures.
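As a sketch of items 1.1–1.3, the snippet below drives one hobby servo from a Raspberry Pi and eases it toward a target angle instead of jumping there. The pin number, the 50 Hz / 1–2 ms pulse timing, and the step size are illustrative assumptions, not the project’s calibrated values, and it assumes the RPi.GPIO library on an actual Pi.

```python
# Minimal sketch: drive one hobby servo and ease it toward a target angle.
# Assumes a standard 50 Hz servo that maps 0-180 degrees onto ~1.0-2.0 ms pulses.
import time
import RPi.GPIO as GPIO

SERVO_PIN = 18                    # assumption: servo signal wire on BCM pin 18
GPIO.setmode(GPIO.BCM)
GPIO.setup(SERVO_PIN, GPIO.OUT)
pwm = GPIO.PWM(SERVO_PIN, 50)     # 50 Hz -> 20 ms period
pwm.start(0)

def angle_to_duty(angle_deg):
    """Map 0-180 degrees to a duty cycle for a ~1.0-2.0 ms pulse."""
    pulse_ms = 1.0 + (angle_deg / 180.0) * 1.0
    return pulse_ms / 20.0 * 100.0

def move_smooth(current_deg, target_deg, step_deg=2.0, dwell_s=0.02):
    """Step toward the target in small increments so the joint does not snap."""
    step = step_deg if target_deg > current_deg else -step_deg
    angle = current_deg
    while abs(target_deg - angle) > abs(step):
        angle += step
        pwm.ChangeDutyCycle(angle_to_duty(angle))
        time.sleep(dwell_s)
    pwm.ChangeDutyCycle(angle_to_duty(target_deg))
    return target_deg

try:
    pos = move_smooth(0.0, 90.0)   # basic gesture: sweep the joint to 90 degrees
finally:
    pwm.stop()
    GPIO.cleanup()
```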

2.) Read EEG/EMG signals and find patterns that pertain to movement.

2.1. [Green] Learn to use PowerLab 8/35 and LabChart 8.

2.2. [Yellow] Experiment with placement of electrodes.

2.3. [Red] Map out patterns based on gestures.

2.4. [Red] Convert patterns to raw data (see the feature sketch below).

(End of fall semester)
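The sketch below shows one way items 2.3–2.4 could look in code: slicing a recorded EMG trace (for example, samples exported from LabChart) into windows and reducing each window to mean-absolute-value and RMS features that can be stored as raw data for the database. The sample rate, window length, and the fake trace are assumptions for illustration.

```python
# Turn a recorded EMG trace (a list of voltage samples) into per-window
# features that can be stored as "raw data" for the gesture database.
import math

SAMPLE_RATE_HZ = 1000          # assumption: 1 kHz EMG sampling
WINDOW_S = 0.2                 # assumption: 200 ms analysis windows

def window_features(samples, sample_rate=SAMPLE_RATE_HZ, window_s=WINDOW_S):
    """Return (mean absolute value, RMS) for each non-overlapping window."""
    n = max(1, int(sample_rate * window_s))
    features = []
    for start in range(0, len(samples) - n + 1, n):
        win = samples[start:start + n]
        mav = sum(abs(x) for x in win) / n
        rms = math.sqrt(sum(x * x for x in win) / n)
        features.append((mav, rms))
    return features

# Example: a fake burst of activity in the middle of an otherwise quiet trace.
quiet = [0.01] * 200
burst = [0.4, -0.35, 0.5, -0.45] * 50
print(window_features(quiet + burst + quiet))
```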

3.) Recognize EEG/EMG signals with a microcontroller.

3.1. Build amplifier circuit.

3.2. Create database with raw data (account for gain of amplifier circuit).

3.3. Develop algorithm to process data (see the classification sketch below).
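A sketch of items 3.2–3.3 follows: amplified readings are referred back to electrode-level millivolts by dividing out the amplifier gain, and a new feature vector is matched against database entries with a simple nearest-neighbour rule. The gain value and the gesture entries are placeholders, not measured project data.

```python
# Account for the amplifier gain, then classify a feature vector against the
# gesture database by nearest neighbour. All values are placeholders.
import math

AMP_GAIN = 1000.0   # assumption: gain of the amplifier circuit

def to_electrode_mv(amplified_volts):
    """Refer an amplified reading back to the electrode, in millivolts."""
    return amplified_volts / AMP_GAIN * 1000.0

# Hypothetical database: gesture name -> per-channel feature vector (mV).
GESTURE_DB = {
    "hand_open":  [0.50, 0.10],
    "hand_close": [0.10, 0.60],
    "rest":       [0.02, 0.02],
}

def classify(features):
    """Return the database gesture whose feature vector is closest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(GESTURE_DB, key=lambda name: dist(features, GESTURE_DB[name]))

# 0.48 V and 0.12 V at the amplifier output map back to 0.48 mV and 0.12 mV.
print(classify([to_electrode_mv(0.48), to_electrode_mv(0.12)]))   # -> hand_open
```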

4.) Use recognized signals to move the prosthetic arm and hand.

4.1. Develop algorithm to control movement (see the sketch below).

4.2. Apply intelligent corrections.
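As a sketch of 4.1, the snippet below maps a recognized gesture to a set of joint-angle targets and hands each target to whatever servo routine the arm uses (the easing helper under item 1 would fit here). Joint names and angles are illustrative placeholders, not calibrated poses.

```python
# Translate a recognised gesture into joint targets and dispatch each one to a
# servo callback. Poses here are placeholders for illustration only.
GESTURE_POSES = {
    "hand_open":  {"thumb": 10, "index": 10, "middle": 10, "wrist": 90},
    "hand_close": {"thumb": 80, "index": 85, "middle": 85, "wrist": 90},
    "rest":       {"thumb": 45, "index": 45, "middle": 45, "wrist": 90},
}

def apply_gesture(gesture, set_joint_angle):
    """Send every joint in the pose to its target via the supplied servo callback."""
    pose = GESTURE_POSES.get(gesture, GESTURE_POSES["rest"])  # fall back to rest
    for joint, angle in pose.items():
        set_joint_angle(joint, angle)

# Usage with a stand-in callback; on the arm this would command the servos.
apply_gesture("hand_close", lambda joint, angle: print(f"{joint} -> {angle} deg"))
```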

Extension/Changes:

To complete the link between the EEG/EMG signals and the Raspberry Pi, a Microsoft Kinect will be used to bridge the gap. The Kinect application tracks the user’s arm movement and position in space, which will help create the database for the brain signals. The ultimate goal is to eliminate the Kinect, which serves only as a calibration tool.
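A sketch of that calibration idea, assuming the Kinect and EMG streams can be polled together: each EMG feature window is labelled with the pose the Kinect reports at the same moment, and the labelled pairs seed the gesture database. Both reader functions are hypothetical stand-ins, not Kinect SDK calls.

```python
# Pair synchronized (Kinect pose, EMG features) samples to seed the database.
# The two reader functions are hypothetical stubs standing in for the Kinect
# tracking code and the EMG front end.
import time

def read_kinect_pose():
    """Hypothetical stub: return the current arm pose label from Kinect tracking."""
    return "hand_open"

def read_emg_features():
    """Hypothetical stub: return the feature vector for the latest EMG window."""
    return [0.48, 0.12]

def record_calibration(duration_s=2.0, period_s=0.2):
    """Collect labelled (pose, features) pairs for the stated duration."""
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        samples.append((read_kinect_pose(), read_emg_features()))
        time.sleep(period_s)
    return samples

print(record_calibration()[:3])
```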

Final Project Report:

Project Management Concepts: