Moving Auditory EEGs out of the Lab
It's now time to take EEG out of the lab and into the real world. Our speaker on February 13 will talk about his efforts to build a mobile EEG recording device and validate its performance with an auditory task. Maarten De Vos received his PhD from KU Leuven (Belgium) and is visiting Stanford for a few months before heading to a faculty position at Oxford.
His validation experiments are interesting because he frames them as a means of decoding attention: he uses attention-modulated P300 correlates to measure what people are attending to.
Who: Maarten De Vos (University of Oldenburg, Oxford, and visiting Stanford)
What: How much brain is in the ears? Towards a fully unobtrusive mobile EEG system
When: February 13, 2015 at 11AM
Where: CCRMA Seminar Room
Why: Because we want to measure humans in the real world
How much brain is in the ears? Towards a fully unobtrusive mobile EEG system
All non-invasive technologies for the study of human brain activity share the limitation that only artificial, movement-constrained behaviour is possible during recording. Reducing “normal” behaviour to a minimum, however, limits the ecological validity of the results. To overcome these limitations we began developing a truly mobile EEG system suitable for field recordings and natural situations. Our first study confirmed that it is possible to identify single-trial P300 event-related potentials in response to auditory target stimuli while subjects walk outdoors. In a second study we controlled for the effect of stimulus probability on P300 generation and found that task relevance alone can evoke P300 responses reliable enough to drive a brain-computer interface (BCI). We will also demonstrate that the signal quality of our mobile EEG system is equivalent to that of a standard laboratory amplifier in a traditional BCI experiment. Besides mobility and robustness to motion, a mobile EEG system should ideally interfere as little as possible with the participant's behaviour. Towards this goal we present a near-invisible EEG solution that could enable hearing-aid users to control the device by thought alone. We will also discuss prerequisites for online decoding of the attended speech stream in a natural multi-speaker setting.
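To make the single-trial P300 idea concrete, here is a minimal sketch of the generic pipeline such work rests on: epoch the EEG around stimulus onsets, baseline-correct, and take the mean amplitude in a post-stimulus window (roughly 250–500 ms) as a per-trial feature. All function names, parameters, and the synthetic data are illustrative assumptions, not the speaker's actual method or code.

```python
import numpy as np

def epoch(eeg, events, fs, tmin=-0.1, tmax=0.6):
    """Cut fixed-length epochs around stimulus onsets (sample indices)."""
    pre, post = int(-tmin * fs), int(tmax * fs)
    epochs = np.stack([eeg[e - pre:e + post] for e in events])
    # Baseline-correct each epoch using its pre-stimulus interval.
    return epochs - epochs[:, :pre].mean(axis=1, keepdims=True)

def p300_score(epochs, fs, tmin=-0.1, window=(0.25, 0.5)):
    """Mean amplitude in the P300 window per epoch (a single-trial feature)."""
    i0 = int((window[0] - tmin) * fs)
    i1 = int((window[1] - tmin) * fs)
    return epochs[:, i0:i1].mean(axis=1)

# Synthetic demo: targets carry a positive deflection ~300 ms post-stimulus.
rng = np.random.default_rng(0)
fs = 250
eeg = rng.normal(0.0, 1.0, fs * 60)               # one channel, 60 s of noise
targets = np.arange(2 * fs, 50 * fs, 5 * fs)      # target stimulus onsets
nontargets = targets + fs                         # non-target onsets, 1 s later
t = np.arange(int(0.7 * fs)) / fs
bump = 3 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # Gaussian "P300"
for e in targets:
    eeg[e:e + len(bump)] += bump                  # inject P300 only on targets

tgt = p300_score(epoch(eeg, targets, fs), fs)
non = p300_score(epoch(eeg, nontargets, fs), fs)
print(tgt.mean() > non.mean())  # targets show a larger P300-window amplitude
```

In a real BCI, the per-trial scores would feed a classifier (e.g. LDA) rather than a simple mean comparison, and artifact handling would dominate the effort in mobile recordings; this sketch only shows the epoching and windowed-amplitude idea.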