About the Cover of Making Things Think

The cover image of Making Things Think is a portion of a song generated entirely by a machine learning model trained to write piano music.


The code for generating the music is by Ian Simon, Anna Huang, Jesse Engel, and Curtis “Fjord” Hawthorne, published as open source in a Colab notebook.

As they explain, the models were trained on over 10,000 hours of piano recordings from YouTube, transcribed using Onsets and Frames and represented using the event vocabulary from Performance RNN. The generator itself is a Transformer for piano music, based on the Music Transformer model introduced by Huang et al. in 2018.
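To give a sense of what that event vocabulary looks like, here is a minimal sketch (not the authors' code) of a Performance RNN-style encoding: each note becomes NOTE_ON/NOTE_OFF events interleaved with TIME_SHIFT and VELOCITY events, so an entire performance flattens into one token sequence a Transformer can model. The function name, time step, and maximum shift below are illustrative assumptions, not the exact values used by the Magenta implementation.

```python
def encode_note(pitch, velocity, start, end, step=0.01, max_shift=1.0):
    """Encode one note as a list of (event_type, value) tokens.

    Times are in seconds. TIME_SHIFT is quantized to `step` and split
    into chunks of at most `max_shift` seconds (illustrative values).
    """
    events = []

    def shift(seconds):
        # Emit TIME_SHIFT events covering `seconds`, each <= max_shift.
        remaining = round(seconds / step) * step
        while remaining > 1e-9:
            chunk = min(remaining, max_shift)
            events.append(("TIME_SHIFT", round(chunk, 2)))
            remaining -= chunk

    shift(start)                           # advance to the note's onset
    events.append(("VELOCITY", velocity))  # how hard the key is struck
    events.append(("NOTE_ON", pitch))
    shift(end - start)                     # hold for the note's duration
    events.append(("NOTE_OFF", pitch))
    return events

# Middle C (MIDI pitch 60) struck at t=0.5s and released at t=1.5s:
tokens = encode_note(pitch=60, velocity=80, start=0.5, end=1.5)
```

A song is then just the concatenation of such token streams for every note, sorted in time, which is what makes an autoregressive model like the Music Transformer applicable to piano performances.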

All credit for the engineering goes to these developers and researchers.

We (Holloway) used the generated MIDI file and rendered it with MuseScore (and GarageBand for the MP3).

The AI, of course, did not name the song. But since the file was named “unconditional” (after the type of Transformer model), it feels apt to call it Unconditional.
