Objective: Current brain-computer interface (BCI) studies demonstrate that neural signals obtained from structured, trial-based tasks can be decoded to drive actuators with high performance within those tasks. Ideally, to maximize utility, such systems would generalize to a wide range of behavioral settings or contexts. We therefore explore the potential to augment such systems with the ability to decode abstract behavioral contextual states from neural activity.
Approach: To demonstrate the feasibility of such context decoding, we used electrocorticography (ECoG) and stereo-electroencephalography (sEEG) data recorded continuously over multiple days from three subjects, from the cortical surface and deeper brain structures, respectively. During this time, the subjects engaged in a range of naturalistic behaviors in a hospital environment. Behavioral contexts were labeled manually from video and audio recordings; four states were considered: engaging in dialogue, rest, using electronics, and watching television. We decoded these behaviors using a factor analysis and support vector machine (SVM) approach.
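The decoding approach described above can be sketched as a two-stage pipeline: factor analysis for dimensionality reduction of the neural features, followed by an SVM classifier over the behavioral context labels. The sketch below uses synthetic data; the feature dimensions, number of factors, kernel, and all preprocessing choices are assumptions for illustration, not the paper's actual parameters.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for neural features: 400 time windows x 64 channels,
# each window labeled with one of four behavioral contexts
# (0=dialogue, 1=rest, 2=electronics, 3=television).
n_windows, n_channels = 400, 64
labels = rng.integers(0, 4, size=n_windows)
# Give each context a distinct mean activity pattern so decoding is possible.
context_means = rng.normal(0.0, 1.0, size=(4, n_channels))
features = context_means[labels] + rng.normal(0.0, 1.0, size=(n_windows, n_channels))

# Stage 1: factor analysis reduces the channel space to a small set of
# latent factors. Stage 2: an SVM classifies context from those factors.
decoder = make_pipeline(
    StandardScaler(),
    FactorAnalysis(n_components=10, random_state=0),
    SVC(kernel="rbf", C=1.0),
)

# Cross-validated accuracy; chance level for four balanced classes is 25%.
scores = cross_val_score(decoder, features, labels, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```

Fitting the factor analysis inside the pipeline ensures the latent factors are estimated only on each training fold, avoiding leakage into the cross-validation estimate.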
Main results: We demonstrate that these general behaviors can be decoded with high accuracy: 73% for a four-class classifier in one subject, and 71% and 62% for a three-class classifier in the other two subjects.
Significance: To our knowledge, this is the first demonstration of the potential to disambiguate abstract naturalistic behavioral contexts from neural activity recorded throughout the day from implanted electrodes. This work motivates further study of context decoding for BCI applications using continuously recorded naturalistic activity in the clinical setting.