Extracting Motor Imagery Features to Control Two Robotic Hands

2018 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT)

Abstract
Brain-Machine Interface (BMI) technology has the potential to restore physical movement of the arm or leg by acquiring electroencephalogram (EEG) signals from the human brain, detecting changes associated with arm or leg movements, and generating control signals for assistive devices in real time. This project was designed to understand motor imagery tasks associated with human hand movement during visual stimulation, record EEG signals for actual and imagery tasks, and train artificial neural network algorithms using three different methods: Scaled Conjugate Gradient, Levenberg-Marquardt, and Bayesian Regularization. Hjorth parameters were calculated prior to training the neural network in order to distinguish four classes: rest, right hand, left hand, and both hands. The experiment used a 16-channel wired EEG system from g.tec to acquire real-time signals from the human scalp in Simulink at a sampling rate of 512 samples/second. Eight human subjects between the ages of 18 and 52 were recruited to perform both studies associated with human hand movements. Motor imagery signals from C3, FCz, and C4 were used for a feedforward pattern-recognition neural network. Sixteen features were calculated during EEG signal recording, achieving an overall accuracy of 95 percent in detecting the four classes. A successful BMI model was developed to control two robotic hands in real time using the Arduino-Simulink library with the trained artificial neural networks.
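The abstract describes computing Hjorth parameters per channel as features before classification. As a rough illustrative sketch only (the epoch length, channel set, and `hjorth_parameters` helper below are assumptions, not the authors' Simulink implementation), the standard Hjorth activity, mobility, and complexity measures can be computed from a recorded EEG segment like this:

```python
import numpy as np

def hjorth_parameters(x):
    """Compute the three Hjorth parameters of a 1-D EEG segment.

    Activity   = var(x)
    Mobility   = sqrt(var(dx) / var(x))
    Complexity = mobility(dx) / mobility(x)
    """
    dx = np.diff(x)            # first difference (discrete derivative)
    ddx = np.diff(dx)          # second difference
    var_x, var_dx, var_ddx = np.var(x), np.var(dx), np.var(ddx)

    activity = var_x
    mobility = np.sqrt(var_dx / var_x)
    complexity = np.sqrt(var_ddx / var_dx) / mobility
    return activity, mobility, complexity

# Hypothetical usage: one 2-second epoch from channels C3, FCz, C4 at 512 Hz,
# concatenated into a single feature vector for a pattern-recognition network.
fs = 512
epoch = np.random.randn(3, 2 * fs)   # placeholder for a recorded epoch
features = np.hstack([hjorth_parameters(ch) for ch in epoch])
print(features.shape)                 # (9,) feature vector per epoch
```

The paper reports sixteen features in total; how the additional features were derived is not specified in the abstract, so the vector above only illustrates the per-channel Hjorth computation.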
Keywords
Electroencephalography, Feature extraction, Neural networks, Robots, Training, Real-time systems, Bayes methods