TY - JOUR
T1 - Mindchords
T2 - 9th International Conference on Information Technology and Quantitative Management, ITQM 2022
AU - Díaz M., Hernán A.
AU - Córdova, Felisa
AU - Ozimisa, Gina
AU - Fuentes, Hernán Díaz
N1 - Funding Information:
The present study was conducted as part of the thesis research program supported by the Neuromathlab, in the Department of Mathematics and Computer Science, Faculty of Science, University of Santiago de Chile.
Publisher Copyright:
© 2022 The Authors. Published by Elsevier B.V.
PY - 2022
Y1 - 2022
N2 - In this paper we report the transformation of an EEG signal into a MIDI music representation. The subsequent analysis of the melodic and harmonic structure of this musical representation, generated from the EEG, gives an image of the brain's differential use of communication channels, which can be characterized by its pattern of harmonic frequency resonance. The musical model has previously proved useful and informative for revealing hidden functional structures that are hard to detect by inspecting the same data in a table, a set of points, or a bar plot. Here we combine audible access to the EEG with visual tools that render this multidimensional experience as a 2D map. EEG data from 11 subjects were transformed into music using the two frontal electrodes (AF3 and AF4) to build a stereo musical piece, with the left and right channels constructed from the corresponding EEG signals of the frontal cortex. Results showed high intra- and inter-individual differences when comparing the predominant resonant frequency structures. We call these resonant frequency patterns "mindchords" because we use a musical-chord representation to detect and label specific patterns of brain functional dynamics. The tool allows an easy characterization of the predominant resonant structures present in the brains of the sampled subjects during a basal, eyes-closed, resting condition.
AB - In this paper we report the transformation of an EEG signal into a MIDI music representation. The subsequent analysis of the melodic and harmonic structure of this musical representation, generated from the EEG, gives an image of the brain's differential use of communication channels, which can be characterized by its pattern of harmonic frequency resonance. The musical model has previously proved useful and informative for revealing hidden functional structures that are hard to detect by inspecting the same data in a table, a set of points, or a bar plot. Here we combine audible access to the EEG with visual tools that render this multidimensional experience as a 2D map. EEG data from 11 subjects were transformed into music using the two frontal electrodes (AF3 and AF4) to build a stereo musical piece, with the left and right channels constructed from the corresponding EEG signals of the frontal cortex. Results showed high intra- and inter-individual differences when comparing the predominant resonant frequency structures. We call these resonant frequency patterns "mindchords" because we use a musical-chord representation to detect and label specific patterns of brain functional dynamics. The tool allows an easy characterization of the predominant resonant structures present in the brains of the sampled subjects during a basal, eyes-closed, resting condition.
KW - Brain Functional Asymmetry
KW - EEG
KW - EEG Functional Dynamics
KW - EEG-Music Representation
UR - http://www.scopus.com/inward/record.url?scp=85146117674&partnerID=8YFLogxK
U2 - 10.1016/j.procs.2022.11.234
DO - 10.1016/j.procs.2022.11.234
M3 - Conference article
AN - SCOPUS:85146117674
SN - 1877-0509
VL - 214
SP - 720
EP - 726
JO - Procedia Computer Science
JF - Procedia Computer Science
IS - C
Y2 - 9 December 2022 through 11 December 2022
ER -