How did Kid A Mnesia, the interactive experience, make its start? Were you involved at the beginning of production? Was there a clear vision or vision holder(s) that helped align the creativity of development with the ultimate results?

Matthew Davis, Producer: The idea originated as an ‘IRL’ exhibition with a vaguely similar concept - showcasing the massive amount of artwork, both audio and visual, created during the band’s Kid A / Amnesiac era. Due to various constraints, not least of all being the pandemic (!), the idea of doing this virtually gradually gained steam throughout 2020. After a few meetings with the band, Nigel, and Sean, it became clear that doing this in Unreal (& Wwise) actually unlocked a whole realm of pure possibilities that would let this concept flourish.

There was absolutely a clear vision from the start - Sean, Nigel, Thom, & Dan are all visionaries and were enormously helpful in keeping us on the rails. To quote Sean, “This was to be an exhibition of the output from that Kid A / Amnesiac era, explored in a forgotten alien ruin. It was to be a museum that combined a labyrinth with the Library of Babel. We wanted to instill a feeling of being lost without feeling hopeless. At times, the player was to feel overwhelmed. The design was to have no one correct path, and contain no dead ends.”

What was the process for translating the music of Radiohead to the spatial audio representation in Kid A Mnesia? Who was involved? What materials did you have access to? How were ideas communicated? What was the approval process once ideas materialized in the experience?

Matthew Davis, Producer: The first brief I heard was this notion of Exploded Songs - that there was so much material, both in the records and on the floor, that blasting everything open into its component parts and laying them out in some way was not only true to the spirit of the material, but essential to building an experience off of. Nigel had a very concrete idea of how this could work - a lot of the audio design was to be more like a gallery, as opposed to hitting you over the head. This let us map out a continuous experience that had peaks, valleys, ebbs, & flows.

We went back and forth a lot at the beginning over how to use spatialization and reflections. These were some of my favorite conversations and experiments - riding the lines between placing the viewer in an environment with consequential attributes, while maintaining the original integrity and nuance of the source material. Where was the line between diegetic & soundtrack, between spatialized and listener-locked?

Since this all went down during the pandemic and everyone was remote, we would meet on Zoom to discuss creative direction. Then I would go head-down and mock up different arrangements and mixes inside of Ableton, using videos of our latest project and the original album stems that Nigel had prepped for the exhibition. Nigel did a ton of work remixing, re-arranging, and creating new versions of the material - he really made this special. So we’d make these roughly scored videos to agree on the layout, loop & trigger logic, rough mix, vibe, etc.

Did you run into any technical limitations that required unique solutions?

Braeger Moore, Senior Sound Designer & Systems Engineer: Most of the “limitations” we came up against simply required using multiple Wwise features to supplement each other: elevation-based and asymmetrical falloff, strategic placement and toggling of emitters to create more controlled diffraction paths, room-progression-based mixing and triggering…there were two “real” limitations that I had to work around, though.
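The "elevation-based and asymmetrical falloff" mentioned above can be pictured as a distance attenuation curve that rolls off at different rates depending on whether the emitter sits above or below the listener. The sketch below is purely illustrative - the function name, constants, and linear-falloff shape are assumptions for the sake of example, not the actual implementation (in practice this kind of behavior is authored with Wwise attenuation curves and game-side parameters):

```cpp
#include <algorithm>
#include <cmath>

// Illustrative sketch of elevation-based, asymmetrical falloff:
// sound below the listener fades out faster than sound above it.
// All names and numbers here are hypothetical, not from the shipped project.
float AsymmetricalFalloffGain(float horizontalDist, float elevation,
                              float maxDist = 50.0f,
                              float aboveRate = 1.0f,   // gentle rolloff above the listener
                              float belowRate = 2.5f) { // steeper rolloff below the listener
    // Weight the vertical component differently above vs. below the listener.
    float rate = (elevation >= 0.0f) ? aboveRate : belowRate;
    float weighted = std::sqrt(horizontalDist * horizontalDist +
                               rate * rate * elevation * elevation);
    // Simple linear falloff over maxDist, clamped to [0, 1].
    return std::clamp(1.0f - weighted / maxDist, 0.0f, 1.0f);
}
```

With these example rates, an emitter 10 units above the listener reads louder than the same emitter 10 units below it at the same horizontal distance, which is the asymmetry the interview describes.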