Real & Virtual Worlds — Review Week #11

This week focused on the topic of sound, with the following content:

  • Chapter 5 from “New Directions in Mobile Media and Performance” by Camille Baker (2016)
  • Voices of VR #607: Principles of Immersive Sound Design with Sally Kellaway (Podcast, 2017)

The chapter offered an interesting overview, drawing on some sources I had previously read as part of the unit while also bringing new experiences to my attention.

I most enjoyed the section on AR's use in combination with live performance, as I feel it adds a new and exciting layer to existing art mediums.

I found the concept behind the second project, P(AR)ticipate, particularly intriguing, as it involves tag-based AR in which users interact with the environment by scanning images linked to the displayed media. This is an aspect I wish to explore in my own work as a way of presenting my portfolio pieces from a new perspective. The referenced image appears to be set in a vast, empty space, which gave me the idea of using this technique within the UCA end-of-year showcase, if it is held in person. This evolved into Ginsolv’s next project, “S.A.R.A” (Synesthetic Augmented Reality Application), which introduced sound and projected visuals onto dancers within a live performance.

From the information in the text and my pre-existing knowledge of performances such as plays, I believe the introduction of AR could be the next step in modernising theatre. Traditional plays have continuously evolved by increasing audience interaction and participation, for example by involving audience members directly in the performance. Incorporating interaction via AR on the audience’s mobile devices should open up more dynamic and engaging narratives and content, as well as providing a new way of connecting the actors to the participants. This, in turn, should heighten empathy and emotional response in the users, as they may feel partly responsible for the characters’ actions.

This was an interview with Sally Kellaway from OSSIC on immersive sound design. I found this discussion to be interesting as she not only covered her own background and how she discovered that she wanted to pursue sound design, but also delved deeper into the technology and processes behind what we hear in games and experiences.

Sally discusses how the sound used in BioShock inspired her to pursue a career in sound design. Her journey led her to designing for virtual reality experiences, as she wanted to create more dynamic, less structured audio than traditional sound design for film or video.

The most interesting part, however, concerned the different plugins available for Unity and Unreal, as well as the standalone audio engines for designing spatialised sound. Despite the podcast being broadcast in 2017, I was surprised at the vast number available; among those mentioned were FMOD and Wwise. It was also interesting to hear a brief history of sound and how the new 360 sound originated from a combination of ambisonic and binaural audio techniques. This prompts questions about the future of audio within virtual reality, and whether the methods employed could be applied to other senses such as smell and touch.
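To get a feel for what the ambisonic side of this involves, the sketch below encodes a mono signal into first-order ambisonics (B-format, FuMa convention: W, X, Y, Z channels) given a source direction. This is only a minimal illustration of the underlying maths, not how FMOD, Wwise, or any plugin from the podcast actually implements spatialisation; the function name and parameters are my own.

```python
import numpy as np

def encode_foa(mono, azimuth_deg, elevation_deg):
    """Encode a mono signal into first-order ambisonics (FuMa B-format)."""
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    w = mono * (1.0 / np.sqrt(2.0))      # W: omnidirectional component
    x = mono * np.cos(az) * np.cos(el)   # X: front-back
    y = mono * np.sin(az) * np.cos(el)   # Y: left-right
    z = mono * np.sin(el)                # Z: up-down
    return np.stack([w, x, y, z])

# A one-second 440 Hz tone placed 90 degrees to the left, at ear level
sr = 48000
t = np.linspace(0, 1, sr, endpoint=False)
tone = np.sin(2 * np.pi * 440 * t)
bformat = encode_foa(tone, azimuth_deg=90, elevation_deg=0)
print(bformat.shape)  # (4, 48000)
```

The resulting four-channel signal encodes the full sphere of directions; a separate decoding stage (for instance a binaural renderer using head-related transfer functions) would then turn it into the 360 sound heard over headphones.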

It also supports the notion of spatialised or 360 sound contributing to a new layer of emotional engagement, as I suggested in the previous review.


