
EEG-Controlled Prosthetic Arm for Micromechanical Tasks

Publication Type : Conference Proceedings

Publisher : Springer Singapore

Source : Proceedings of the Second International Conference on Computational Intelligence and Informatics, Advances in Intelligent Systems and Computing, vol. 712, Springer Singapore, Singapore (2018)

ISBN : 9789811082283

Campus : Amritapuri

School : School of Engineering

Department : Electronics and Communication

Year : 2018

Abstract : Brain-controlled prosthetics has become one of the significant areas in brain–computer interface (BCI) research. A novel approach is introduced in this paper to extract eyeblink signals from EEG to control a prosthetic arm. The coded eyeblinks are extracted and used as major task commands for control of prosthetic arm movement. The prosthetic arm is built using 3D printing technology. The major task is converted to micromechanical tasks by the microcontroller. In order to classify the commands, features are extracted in the time and spectral domains of the EEG signals using machine learning methods. The two classification techniques used are Linear Discriminant Analysis (LDA) and K-Nearest Neighbor (KNN). EEG data was obtained from 10 healthy subjects, and the performance of the system was evaluated using accuracy, precision, and recall measures. LDA gave accuracy, precision, and recall of 97.7%, 96%, and 95.3%, respectively, while KNN gave 70.7%, 67.3%, and 68%.
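The following is a minimal sketch, not the authors' code, of the classification and evaluation stage described in the abstract: EEG-derived features are classified with LDA and KNN and scored by accuracy, precision, and recall. The feature matrix, label coding, and parameter choices (e.g. the number of neighbors) are placeholder assumptions for illustration only.

```python
# Hypothetical sketch of LDA vs. KNN classification of EEG features,
# evaluated with accuracy, precision, and recall (macro-averaged).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score

rng = np.random.default_rng(0)

# Placeholder data: one row per EEG epoch, columns standing in for
# time-domain and spectral features; labels are coded-eyeblink commands.
n_epochs, n_features, n_commands = 300, 8, 3
X = rng.normal(size=(n_epochs, n_features))
y = rng.integers(0, n_commands, size=n_epochs)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    clf.fit(X_train, y_train)
    y_pred = clf.predict(X_test)
    print(f"{name}: accuracy={accuracy_score(y_test, y_pred):.3f}, "
          f"precision={precision_score(y_test, y_pred, average='macro'):.3f}, "
          f"recall={recall_score(y_test, y_pred, average='macro'):.3f}")
```

With real EEG features in place of the synthetic data, the same loop reproduces the kind of per-classifier comparison reported in the abstract.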

Cite this Research Publication : G. Gayathri, G. Udupa, G. J. Nair, and S. S. Poorna, "EEG-Controlled Prosthetic Arm for Micromechanical Tasks", Proceedings of the Second International Conference on Computational Intelligence and Informatics, Advances in Intelligent Systems and Computing, vol. 712, Springer Singapore, Singapore, 2018.
