Dream: the future of massive interactive live events for music

Back in August, Bas argued that two of the key elements for the longevity of virtual concerts are interactivity and for the audience to have magical powers. One place to find these elements is in MILEs [massive interactive live events, a term I first came across with Jacob Navok, CEO of Genvid]. Such events are cloud-based and use a single simulation to bring together large numbers of players who all interact with and influence the game in real time. Most of these events so far have been game-based and serendipitous: Twitch Plays Pokémon, Reddit's Place, and more recently Facebook's Rival Peak. Basically, organising a MILE means tapping into gaming culture and gamifying what you do if it's not already a game. The closest we've come so far in music is Travis Scott's performance in Fortnite. Inside the game the performance was limited to 50 people per 'room', but millions watched the show via Twitch or YouTube streams, either live or afterwards. What this virtual concert lacked was the ability for the audience to affect what happened. That ability is exactly what we need in order to unleash true interactivity and give viewers magical powers.

Lord, what fools these mortals be

Those of you who know your Shakespeare will recognise this quote from A Midsummer Night's Dream. It's not just fitting because within a MILE we can behave godlike, but also because the Royal Shakespeare Company (RSC) will perform a new play based on this wonderful story as an immersive event in March. It's a great achievement that brings together a variety of disciplines, from gaming, theatre and music to XR, motion capture and live performance. The RSC leads this multidisciplinary group as part of a UK government programme called Audience of the Future. This programme supports storytellers like the RSC in working with immersive technologies and, hopefully, tapping into new audiences. Another supporter of this project is Epic Games, through its MegaGrants. The idea for Dream goes back to before the pandemic, but as with so many things, lockdowns and social distancing have emphasised its importance as an experiment.

If we shadows have offended

In Dream, the RSC welcomes the audience to interact with the performance in various ways. First of all, there are the fireflies. All the actors perform in motion-capture suits, and viewers influence their movements through a virtual forest rendered in Epic's Unreal Engine. With whichever input device they use – a touchscreen or mouse, for example – viewers can guide Puck (if you're new to A Midsummer Night's Dream, Puck is the sprite, or fairy, who sets the story in motion through his magic) on his moonlit travels through the forest.

Photo credit: Stuart Martin, RSC.

Each performance will last 50 minutes and will be a unique experience – unique, simply because each audience will behave differently. What differentiates Dream from MILEs based on or within games is the element of live performance: humans in motion-capture suits reacting to the viewers' input. It's this combination of interaction and live performance that makes Dream an example the music industry should take note of.

Photo credit: Stuart Martin, RSC.

While these visions did appear

It’s not just a visual spectacle though, the music is interactive too. The Philharmonia Orchestra performs the score composed by their music director Esa Pekka Salonen [who has developed a knack for pandemic-related performances that excite] and Jesper Nordin, whose Gestrument music software also powers the interactive soundtrack. The orchestra actually recorded the music pre-pandemic as this latest RSC performance of Dream should have been staged last year in the spring. Not only, then, do the viewers influence the movement of the fireflies as they guide Puck through the forest. Since the actors perform with Gestrument, those movements also influence the score in real time. Dream is thus a truly interactive music experience captured within a dynamic soundtrack.

I must go seek some dewdrops here

Dream's success starts with the story. Simply transplanting the tech involved onto a concert experience would leave little room for play for the viewers. Travis Scott's Fortnite performance worked because he took his viewers on a trip and told them a story within the gameplay, supported by his songs. In another piece, written with Matthew Ball, Navok argues that cloud gaming, of which MILEs are an example, spawns completely different experiences than, say, console-based games. It requires a different set of expectations, ones that involve interaction and participatory influence. In their words: "social experiences, not technical capability, drive engagement." The same holds true for virtual concerts: we will need to come up with a narrative that allows large groups of viewers to actively engage with the music and with each other while they collectively adapt that music as the story unfolds. This experience will contrast with the shared reality of enjoying a live gig with dozens, hundreds, or thousands of others at the same time. Hosting a music-driven MILE will convince people to stick to their screens for a unique experience unlike anything else in the real world.