Humans as Curators of Artificial Creativity: The Implications of Algorithm-Guided Music Creation

Developments in machine learning and AI will make it possible for bedroom producers to use algorithms to guide music creation. Not 10 years from now, but in the next 2 or 3 years — at the most.

We’re slowly going to shift from a remix culture of sampling to a more generative culture. Instead of sampling that perfect bassline and risking a takedown, you can have AI generate one at the click of a button and end up with something decent.

To many, this sounds like a terrible prospect. One argument is that music with less human input is ‘worse’. I think it depends on how the technology is applied and how you look at music.

Music as a means to communicate ideas

Most bedroom producers’ music will not be heard by more than a few thousand people. They engage in online communities where music is created and spread like memes. Their music exists to communicate ideas. Sampling and remixing are tolerated practices, because they are legitimate means of reinterpreting someone else’s idea and taking it in a different direction. Algorithms will speed up the creation process and the communication of ideas. Humans will become curators of the AI’s output: the curator will take the ‘artificial creative output’, filter it, and remix it.

It further democratizes music creation for people who don’t know how to ‘make music’. Communicating a remix idea like “this song would work really well as dubstep” no longer has to be limited to the imagination: you can go home (or just grab your mobile device), generate a bunch of remixes until something good comes out, and send it to your friend.

Participative culture paradox

Through advanced music creation algorithms, like the dubstep example above, it will be easier for more people to participate in our culture. Instead of being a listener, people will be able to touch and taste the creative process, at least as a type of curator or director.

While AI could put some people in the music creation process out of work, it can ultimately make music more inclusive, involving more people in a participative way.

Examples
  • Reflexive Looper: interactive AI assisted music jamming
  • Ditty: a ‘text to song’ engine + messenger for iOS & Android
  • MusicMixer: a computer-aided DJ system
  • #NeuralBeats: generative techno with recurrent neural networks
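To make the generate-then-curate loop concrete, here is a toy sketch in Python. It is not how any of the tools above work; it just illustrates the pattern of having an algorithm produce many candidates while a human-stand-in filter picks the keeper. The scale, the scoring function, and all names are invented for illustration.

```python
import random

# C minor pentatonic pitches as MIDI note numbers -- an arbitrary choice
SCALE = [48, 51, 53, 55, 58, 60]

def generate_bassline(length=8, rng=random):
    """Generate one random candidate bassline as a list of MIDI pitches."""
    return [rng.choice(SCALE) for _ in range(length)]

def smoothness(line):
    """Score a line higher when successive notes are close together.
    A crude stand-in for the human curator's taste."""
    jumps = [abs(a - b) for a, b in zip(line, line[1:])]
    return -sum(jumps)

def curate(n_candidates=100, seed=42):
    """Generate many candidates, then 'curate' by keeping the best-scoring one."""
    rng = random.Random(seed)
    candidates = [generate_bassline(rng=rng) for _ in range(n_candidates)]
    return max(candidates, key=smoothness)

best = curate()
print(best)
```

In real tools the generator would be a trained model rather than a random walk, and the final filter would be a person auditioning the output, but the division of labor is the same: the algorithm proposes, the human disposes.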

Many more examples over at CreativeAI. (Thanks for the tip, Jelle — who’s about to launch a great newsletter on UI programming)


Written for my weekly newsletter MUSIC x TECH x FUTURE. If you enjoyed reading this, please consider sharing and subscribing.
