Prediction of auditory and visual P300 brain-computer interface aptitude

PLoS One. 2013;8(2):e53513. doi: 10.1371/journal.pone.0053513. Epub 2013 Feb 14.

Abstract

Objective: Brain-computer interfaces (BCIs) provide a non-muscular communication channel for patients with late-stage motoneuron disease (e.g., amyotrophic lateral sclerosis (ALS)) or otherwise motor-impaired people, and are also used for motor rehabilitation in chronic stroke. The ability to use a BCI varies from person to person and from session to session. A reliable predictor of aptitude would allow for the selection of suitable BCI paradigms. For this reason, we investigated whether P300 BCI aptitude could be predicted from a short experiment with a standard auditory oddball.

Methods: Forty healthy participants performed an electroencephalography (EEG)-based visual and auditory P300-BCI spelling task in a single session. In addition, a standard auditory oddball paradigm was presented prior to each session. Features extracted from the auditory oddball were analyzed with respect to their predictive power for BCI aptitude; a rough illustration of this kind of feature extraction is sketched below.
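As a rough illustration of the kind of feature extraction described here (not the authors' actual pipeline), the sketch below averages single-channel oddball epochs into an ERP and reads out mean amplitudes in fixed latency windows. The channel, sampling rate, window boundaries, and placeholder data are assumptions for the example.

```python
import numpy as np

# Hypothetical sketch: extract ERP component amplitudes from auditory-oddball
# target epochs. Assumes `epochs` has shape (n_trials, n_samples) for a single
# EEG channel (e.g., Cz), is baseline-corrected, is sampled at `fs` Hz, and has
# time zero at stimulus onset. All numeric values are illustrative.
fs = 256                               # sampling rate in Hz (assumed)
epochs = np.random.randn(100, fs)      # placeholder data: 100 trials of 1 s

erp = epochs.mean(axis=0)              # average over target trials -> ERP waveform
t = np.arange(erp.size) / fs           # time axis in seconds

def mean_amplitude(erp, t, t_start, t_end):
    """Mean ERP amplitude within a latency window given in seconds."""
    mask = (t >= t_start) & (t < t_end)
    return erp[mask].mean()

# Example component windows (assumed, not taken from the paper):
n2_amp   = mean_amplitude(erp, t, 0.18, 0.28)   # N2
p3_amp   = mean_amplitude(erp, t, 0.30, 0.40)   # P3
late_amp = mean_amplitude(erp, t, 0.40, 0.60)   # late component (400-600 ms)
```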

Results: Correlating the auditory oddball response with P300 BCI accuracy revealed a strong relationship between accuracy and both the N2 amplitude and the amplitude of a late ERP component between 400 and 600 ms. Interestingly, the P3 amplitude of the auditory oddball response was not correlated with accuracy.
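A minimal sketch of such a correlation analysis, assuming one ERP feature value (e.g., N2 amplitude) and one spelling-accuracy value per participant. The placeholder data below are random and do not reproduce the study's results.

```python
import numpy as np
from scipy import stats

# Hypothetical sketch: Pearson correlation between an ERP feature measured in
# the auditory oddball and P300-BCI spelling accuracy across 40 participants.
rng = np.random.default_rng(0)
n2_amps = rng.normal(-2.0, 1.0, size=40)       # one feature value per participant (µV)
accuracies = rng.uniform(0.5, 1.0, size=40)    # spelling accuracy per participant (0-1)

r, p = stats.pearsonr(n2_amps, accuracies)     # strength and significance of the relationship
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```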

Conclusions: Event-related potentials recorded during a standard auditory oddball session moderately predict aptitude for an auditory P300 BCI and strongly predict aptitude for a visual P300 BCI. The predictor will allow for faster paradigm selection.

Significance: Our method will reduce strain on patients because unsuccessful training may be avoided, provided the results can be generalized to the patient population.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Acoustic Stimulation
  • Adolescent
  • Adult
  • Aptitude*
  • Brain-Computer Interfaces*
  • Electroencephalography / methods*
  • Event-Related Potentials, P300*
  • Female
  • Humans
  • Male
  • Middle Aged
  • Models, Biological
  • Photic Stimulation
  • Young Adult

Grants and funding

Funded by Deutsche Forschungsgemeinschaft (DFG) KU 1453/3-1 and BI 195/58-1. This work is supported by the European ICT Programme Project FP7-224631, SFB 550/B5 and C6, BMBF (Bundesministerium für Bildung und Forschung) Bernstein Center for Neurocomputation (Nr 01GQ0831) and the European Research Council Grant (ERC 227632-BCCI). The research leading to these results has also received funding from the European Community's Seventh Framework Programme FP7/2007–2013, BackHome project grant agreement number 288566. This publication was funded by the German Research Foundation (DFG) and the University of Würzburg in the funding programme Open Access Publishing. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.