DeepBach: a Steerable Model for Bach Chorales Generation
Authors: Gaëtan Hadjeres, François Pachet, Frank Nielsen
ICML 2017
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We discuss in Sect. 3 the results of an experimental study we conducted to assess the quality of our model. |
| Researcher Affiliation | Collaboration | 1LIP6, Université Pierre et Marie Curie 2Sony CSL, Paris 3Sony CSL, Japan. |
| Pseudocode | Yes | Algorithm 1 Pseudo-Gibbs sampling |
| Open Source Code | Yes | All examples can be heard on the accompanying web page3 and the code of our implementation is available on GitHub4. [...] 4https://github.com/Ghadjeres/DeepBach |
| Open Datasets | Yes | We used the database of chorale harmonizations by J.S. Bach included in the music21 toolkit (Cuthbert & Ariza, 2010). |
| Dataset Splits | Yes | This gives us a corpus of 2503 chorales, which we split between a training set (80%) and a validation set (20%). |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory specifications) used for running its experiments. |
| Software Dependencies | No | We implemented DeepBach using Keras (Chollet, 2015) with the Tensorflow (Abadi et al., 2015) backend. (Only citation years are given; no version numbers for Keras or TensorFlow.) |
| Experiment Setup | Yes | The reported results in Sect. 3 and examples in Sect. 4.3 were obtained with ∆t = 16. We chose as the neural network brick in Fig. 4 a neural network with one hidden layer of size 200 and ReLU (Nair & Hinton, 2010) nonlinearity and as the Deep RNN brick two stacked LSTMs (...), each one being of size 200 (...). There are 20% dropout on input and 50% dropout after each layer. |
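The table above quotes the paper's Algorithm 1 (pseudo-Gibbs sampling): start from a random sequence, then repeatedly pick a position uniformly at random and resample it from a model of the conditional distribution given the rest of the sequence. As a hedged illustration of that loop only, here is a minimal NumPy sketch; the conditional model below is a toy placeholder standing in for DeepBach's neural networks, and all names (`conditional_probs`, `pseudo_gibbs`) are hypothetical, not from the paper.

```python
import numpy as np

def conditional_probs(seq, i, n_pitches):
    # Placeholder for the learned conditional p(note_i | rest of sequence).
    # Purely illustrative (NOT the paper's model): favor pitches near the
    # mean of the two neighbouring notes, so resampling smooths the line.
    left = seq[i - 1] if i > 0 else seq[i]
    right = seq[i + 1] if i < len(seq) - 1 else seq[i]
    target = (left + right) / 2.0
    logits = -0.5 * (np.arange(n_pitches) - target) ** 2
    probs = np.exp(logits - logits.max())
    return probs / probs.sum()

def pseudo_gibbs(length=32, n_pitches=16, n_iters=2000, seed=0):
    # Pseudo-Gibbs sampling loop (shape of Algorithm 1): random init,
    # then repeated single-position resampling from the conditional model.
    rng = np.random.default_rng(seed)
    seq = rng.integers(0, n_pitches, size=length)
    for _ in range(n_iters):
        i = rng.integers(0, length)
        seq[i] = rng.choice(n_pitches, p=conditional_probs(seq, i, n_pitches))
    return seq
```

Steering, as described in the paper, fits naturally into this scheme: user-constrained positions are simply never selected for resampling, so the sampler fills in the free positions around them.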