Shared Three-Dimensional Robotic Arm Control Based on Asynchronous BCI and Computer Vision

IEEE Trans Neural Syst Rehabil Eng. 2023;31:3163-3175. doi: 10.1109/TNSRE.2023.3299350. Epub 2023 Aug 7.

Abstract

Objective: A brain-computer interface (BCI) can be used to translate neuronal activity into commands for controlling external devices. However, using a noninvasive BCI to control a robotic arm for movements in three-dimensional (3D) environments and to accomplish complicated daily tasks, such as grasping and drinking, remains challenging.

Approach: In this study, a shared robotic arm control system based on a hybrid asynchronous BCI and computer vision is presented. The BCI model, which combines steady-state visual evoked potentials (SSVEPs) and blink-related electrooculography (EOG) signals, allows users to freely select among fifteen commands in asynchronous mode, each corresponding to a robot action in the 3D workspace, so that targets over a wide movement range can be reached. Computer vision identifies objects and assists the robotic arm in completing more precise tasks, such as grasping a target automatically.
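The approach described above amounts to an asynchronous decode-confirm-execute loop with a vision-assisted grasp mode. The Python sketch below illustrates one way such a shared-control loop could be organized; the fifteen-command layout, the confidence gate, and all function names (decode_ssvep, detect_blink, locate_target, DummyRobot) are hypothetical placeholders for illustration, not the authors' implementation.

```python
"""Minimal sketch of a shared-control loop combining an asynchronous hybrid
BCI (SSVEP + blink EOG) with vision-assisted grasping. All names, thresholds,
and the command layout are assumptions made for illustration."""
import random

# Assumed 15-command set: 6 translation jogs, 6 orientation jogs, 3 task commands.
COMMANDS = [
    "x+", "x-", "y+", "y-", "z+", "z-",                      # Cartesian jogs
    "roll+", "roll-", "pitch+", "pitch-", "yaw+", "yaw-",     # orientation jogs
    "open_gripper", "close_gripper", "auto_grasp",            # task-level commands
]

SSVEP_CONFIDENCE_THRESHOLD = 0.8  # assumed asynchronous idle-vs-control gate


def decode_ssvep(eeg_window):
    """Hypothetical SSVEP classifier: returns (command_index, confidence).
    A real system might run filter-bank CCA on the EEG window; here a random
    stub is returned so the sketch runs end to end."""
    return random.randrange(len(COMMANDS)), random.random()


def detect_blink(eog_window):
    """Hypothetical EOG blink detector used to confirm the SSVEP selection."""
    return random.random() > 0.5


def locate_target(camera_frame):
    """Hypothetical vision module: returns a 3D grasp position or None."""
    return (0.42, -0.10, 0.15) if random.random() > 0.3 else None


def shared_control_step(eeg_window, eog_window, camera_frame, robot):
    """One iteration of the asynchronous shared-control loop."""
    cmd_idx, confidence = decode_ssvep(eeg_window)

    # Asynchronous operation: below the gate the system stays idle, so the
    # user is not forced to issue a command on every analysis window.
    if confidence < SSVEP_CONFIDENCE_THRESHOLD:
        return "idle"

    command = COMMANDS[cmd_idx]

    # Blink-related EOG acts as a confirmation signal for the SSVEP choice.
    if not detect_blink(eog_window):
        return "rejected"

    if command == "auto_grasp":
        # Shared control: once the user hands over, computer vision localizes
        # the object and the arm completes the grasp autonomously.
        pose = locate_target(camera_frame)
        if pose is not None:
            robot.grasp_at(pose)
            return f"auto_grasp at {pose}"
        return "no target found"

    robot.execute(command)  # direct BCI control of coarse 3D motion
    return command


class DummyRobot:
    """Stand-in robot interface so the sketch runs without hardware."""
    def execute(self, command):
        print(f"robot: {command}")

    def grasp_at(self, pose):
        print(f"robot: grasping at {pose}")


if __name__ == "__main__":
    robot = DummyRobot()
    for _ in range(5):
        print(shared_control_step(eeg_window=None, eog_window=None,
                                  camera_frame=None, robot=robot))
```

One design point this sketch tries to capture is the asynchronous gate: when SSVEP confidence is low the loop emits no command at all, which is what lets the user rest or observe without the robot moving, while the blink confirmation reduces accidental activations.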

Results: Ten subjects participated in the experiments and achieved an average accuracy of more than 92% and high trajectory efficiency for robot movement. All subjects successfully performed the reach-grasp-drink tasks using the proposed shared control method, with fewer erroneous commands and a shorter completion time than with direct BCI control.

Significance: Our results demonstrate the feasibility and efficiency of achieving practical, intuitive multidimensional control of a robotic arm by merging a hybrid asynchronous BCI with computer vision-based recognition.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Brain-Computer Interfaces*
  • Computers
  • Electroencephalography / methods
  • Evoked Potentials, Visual
  • Humans
  • Movement / physiology
  • Robotic Surgical Procedures*