Brain-machine interfaces for car drivers


Applications of brain-computer interfaces (BCIs) have traditionally focused on substituting or restoring the communication and motor capabilities of severely disabled individuals. To a lesser extent, applications for able-bodied users have also been proposed, particularly in specific contexts such as space applications (e.g. tackling situational disabilities), military applications (e.g. target recognition from satellite images), or games. Here, we propose applying BCI technology to ease the interaction with intelligent cars and thereby enhance the driving experience. From parking and lane-changing assistance to fully autonomous navigation, today's cars can autonomously assess and perform a variety of driving maneuvers. In our approach, instead of removing the human from the loop, a BCI monitors the driver's cognitive state, and that information is used to modulate the assistance provided by the intelligent car. Such systems take into account the external context (as perceived by in-car sensors) and the user's intention (as decoded from electroencephalographic, EEG, signals) to provide suitable and timely assistance, interacting seamlessly with the driver (Chavarriaga et al., 2018).

Extending previous work on decoding the driver's level of attention and emergency braking from brain activity (Haufe et al., 2011; Lin et al., 2009; Kohlmorgen et al., 2007), we studied EEG correlates of cognitive states in order to predict upcoming actions and to evaluate whether the decisions of the intelligent system are coherent with the user's intentions. This work builds upon previous results on decoding cognition-related signals in realistic environments, as well as on blending human and machine intelligence through shared control. Our results show the feasibility of single-trial recognition of anticipation- and error-related potentials elicited while driving in a car simulator. This information (i.e. the internal context), combined with external environmental cues captured by in-car sensors, allows the system to evaluate the instantaneous needs of the user. Consider a case where the car approaches a crossroad: the car's sensors can perceive a red traffic light, while the driver's brain signals can tell us whether he or she is aware of it and is getting ready to stop. Since this information appears as early as 500 ms before any noticeable action or muscular activity, assistance can be provided only if the driver was unprepared to act.
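The shared-control logic described above can be sketched in a few lines: assistance fires only when the external context signals a critical situation and the decoded EEG shows no anticipatory activity from the driver. This is a minimal illustrative sketch, not the actual system; all names, fields, and thresholds (e.g. `p_preparation`, the 0.6 preparation threshold) are assumptions chosen for the example.

```python
from dataclasses import dataclass


@dataclass
class ExternalContext:
    """External context from in-car sensors (illustrative fields)."""
    red_light_ahead: bool   # e.g. detected by an on-board camera
    time_to_stop_s: float   # estimated time left to brake safely


@dataclass
class InternalContext:
    """Internal context decoded from the driver's EEG (illustrative fields)."""
    p_preparation: float    # decoded probability of movement preparation


def assistance_needed(ext: ExternalContext, internal: InternalContext,
                      prep_threshold: float = 0.6,
                      critical_time_s: float = 1.5) -> bool:
    """Trigger braking assistance only if the situation is critical
    and the driver shows no anticipatory brain activity."""
    situation_critical = ext.red_light_ahead and ext.time_to_stop_s < critical_time_s
    driver_prepared = internal.p_preparation >= prep_threshold
    return situation_critical and not driver_prepared


# Red light close ahead, no movement preparation decoded -> assist the driver.
print(assistance_needed(ExternalContext(True, 1.0), InternalContext(0.2)))  # True
# Same situation, but the driver is already preparing to brake -> stay silent.
print(assistance_needed(ExternalContext(True, 1.0), InternalContext(0.9)))  # False
```

The key design point mirrored here is that the BCI output does not drive the car; it only gates when the intelligent vehicle's own assistance is offered.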

Based on experiments in a realistic car-driving simulator and in a real vehicle, where steering angle, brake, and acceleration pedal signals were recorded synchronously with EEG, EOG, and EMG, we found EEG correlates of anticipation and movement preparation that are consistent with signals previously reported in much simpler setups.

The outcome of our research was adopted by Nissan Motor Co., which unveiled a first prototype at the Consumer Electronics Show (CES) in January 2018 and continues in-house development towards integration into its vehicles. Several journal articles have been published as a result of this work, and the research was selected among the top 10 submissions to the International BCI Award in 2015.

This work was supported by Nissan Motor Corporation.

Media coverage

Selected publications