Brains on Beats

Authors: Umut Güçlü, Jordy Thielen, Michael Hanke, Marcel van Gerven

NeurIPS 2016

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We developed task-optimized deep neural networks (DNNs) that achieved state-of-the-art performance in different evaluation scenarios for automatic music tagging. These DNNs were subsequently used to probe the neural representations of music. Representational similarity analysis revealed the existence of a representational gradient across the superior temporal gyrus (STG).
Researcher Affiliation | Academia | Umut Güçlü, Radboud University, Donders Institute for Brain, Cognition and Behaviour, Nijmegen, the Netherlands (u.guclu@donders.ru.nl); Jordy Thielen, Radboud University, Donders Institute for Brain, Cognition and Behaviour, Nijmegen, the Netherlands (j.thielen@psych.ru.nl); Michael Hanke, Otto-von-Guericke University Magdeburg, Center for Behavioral Brain Sciences, Magdeburg, Germany (michael.hanke@ovgu.de); Marcel A. J. van Gerven, Radboud University, Donders Institute for Brain, Cognition and Behaviour, Nijmegen, the Netherlands (m.vangerven@donders.ru.nl)
Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. Figure 1 shows a model architecture diagram, not pseudocode.
Open Source Code | No | The paper mentions that 'The DNN models were implemented in Keras [30]' with a link to the Keras library. However, it does not provide concrete access to the specific source code for the methodology described in this paper, nor does it state that the code is open-sourced.
Open Datasets | Yes | We used the MagnaTagATune dataset [22] for DNN estimation. We used the existing studyforrest dataset [23] for representational similarity analysis.
Dataset Splits | Yes | Parts 1-12 were used for training, part 13 was used for validation, and parts 14-16 were used for testing.
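The 16-part split quoted above can be sketched as a simple lookup; the function name `split_for_part` and the 1-indexed part numbering are assumptions for illustration, not code from the paper.

```python
def split_for_part(part: int) -> str:
    """Map a MagnaTagATune part number (1-16) to its split,
    following the reported 12/1/3 train/validation/test division."""
    if 1 <= part <= 12:
        return "train"
    if part == 13:
        return "validation"
    if 14 <= part <= 16:
        return "test"
    raise ValueError(f"unknown part: {part}")
```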
Hardware Specification | No | The paper mentions 'Ultra-high-field (7 Tesla) fMRI images were collected using a Siemens MAGNETOM scanner' for data acquisition, but it does not provide specific hardware details (such as GPU/CPU models, processors, or memory) used for running the deep neural network experiments.
Software Dependencies | No | The paper states, 'The DNN models were implemented in Keras [30].' Reference [30] is 'F. Chollet, Keras. https://github.com/fchollet/keras, 2015.' While it names Keras and its publication year, it does not provide specific version numbers for Keras or any other ancillary software components (e.g., 'Keras 1.2.0', 'Python 3.x').
Experiment Setup | Yes | We used Adam [28] with parameters α = 0.0002, β1 = 0.5, β2 = 0.999, ϵ = 1e-8 and a mini-batch size of 36 to train the models by minimizing the binary cross-entropy loss function. Initial model parameters were drawn from a uniform distribution as described in [29]. Songs in each training mini-batch were randomly cropped to six seconds (96000 samples). The epoch in which the validation performance was the highest was taken as the final model (53, 12 and 12 for T, F and TF models, respectively).
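The reported setup can be sketched as follows: the Adam hyperparameters and batch size are taken verbatim from the quote above, while the random six-second crop is a minimal NumPy reconstruction (the 96000-sample length implies 16 kHz audio; the helper name and dict layout are assumptions, not the authors' code).

```python
import numpy as np

# Adam hyperparameters and batch size as reported in the paper
ADAM_KWARGS = dict(learning_rate=2e-4, beta_1=0.5, beta_2=0.999, epsilon=1e-8)
BATCH_SIZE = 36

def random_six_second_crop(waveform, crop_len=96000, rng=None):
    """Randomly crop a mono waveform to six seconds (96000 samples).

    Waveforms shorter than the crop length are returned unchanged;
    the paper does not specify how shorter clips were handled.
    """
    rng = rng or np.random.default_rng()
    if len(waveform) <= crop_len:
        return waveform
    start = rng.integers(0, len(waveform) - crop_len + 1)
    return waveform[start:start + crop_len]
```

In a Keras training loop, each mini-batch of 36 songs would be cropped with a helper like this before being passed to the model compiled with binary cross-entropy and these Adam settings.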