TY - GEN
T1 - EEG-based decoding of auditory attention to a target instrument in polyphonic music
AU - Cantisani, Giorgia
AU - Essid, Slim
AU - Richard, Gaël
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/10/1
Y1 - 2019/10/1
N2 - Auditory attention decoding aims at determining which sound source a subject is "focusing on". In this work, we address the problem of EEG-based decoding of auditory attention to a target instrument in realistic polyphonic music. To this end, we exploit a stimulus reconstruction model which has been shown to successfully decode attention to speech in multi-speaker environments. To our knowledge, this model has never been applied to musical stimuli for attention decoding. The task we consider here is quite complex, as the stimuli are polyphonic, including duets and trios, and are reproduced using loudspeakers instead of headphones. We consider the decoding of three different audio representations and investigate how multiple characteristics of the musical stimuli influence the decoding performance, such as the number and type of instruments in the mixture, the spatial rendering, the music genre, and the melodic/rhythmic pattern being played. We obtain promising results, comparable to those obtained on speech data in previous works, and confirm that it is possible to correlate human brain activity with musically relevant features of the attended source.
AB - Auditory attention decoding aims at determining which sound source a subject is "focusing on". In this work, we address the problem of EEG-based decoding of auditory attention to a target instrument in realistic polyphonic music. To this end, we exploit a stimulus reconstruction model which has been shown to successfully decode attention to speech in multi-speaker environments. To our knowledge, this model has never been applied to musical stimuli for attention decoding. The task we consider here is quite complex, as the stimuli are polyphonic, including duets and trios, and are reproduced using loudspeakers instead of headphones. We consider the decoding of three different audio representations and investigate how multiple characteristics of the musical stimuli influence the decoding performance, such as the number and type of instruments in the mixture, the spatial rendering, the music genre, and the melodic/rhythmic pattern being played. We obtain promising results, comparable to those obtained on speech data in previous works, and confirm that it is possible to correlate human brain activity with musically relevant features of the attended source.
KW - Auditory attention decoding
KW - EEG
KW - Polyphonic music
KW - Stimulus reconstruction model
U2 - 10.1109/WASPAA.2019.8937219
DO - 10.1109/WASPAA.2019.8937219
M3 - Conference contribution
AN - SCOPUS:85078003716
T3 - IEEE Workshop on Applications of Signal Processing to Audio and Acoustics
SP - 80
EP - 84
BT - 2019 IEEE Workshop on Applications of Signal Processing to Audio and Acoustics, WASPAA 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2019 IEEE Workshop on Applications of Signal Processing to Audio and Acoustics, WASPAA 2019
Y2 - 20 October 2019 through 23 October 2019
ER -