Phase 2 Reflections

27 Jul 2017

 

 

In Phase 2 we were able to have a more in-depth discussion with audience members about their interaction with the work presented. Post Phase 2 we talked about the relationships between the sound, the sensors and the audience. We spent some time looking at the steps we have taken so far and pairing them with the outcomes we would like to achieve. Narrowing down the aims of such an open project has been quite difficult, particularly as we are developing the work from a concept rather than from a physical stimulus.

 

Interaction

Whilst the objective has been to create an interactive work, it has been difficult to shed the image of deterministic ‘on/off’ interaction that the term conjures. Balancing levels of control between the artists and the audience has consumed a lot of our thought and has heavily influenced the decisions we have made, particularly around how the sound is structured. Coupled with the quite open-ended nature of R&D, this has made me quite hesitant to commit to a particular route until now (when time is of the essence!).

 

Today we devised a model where the body is less of a composer in the space and more of an influencer in the playing of a pre-set composition. The relationship between the sound and the algorithm has been difficult to devise; it is very much a chicken-and-egg conundrum (Joanne needs sounds to work out a structure, Aron needs a structure to develop a group of sounds).

 

Thinking about how and when different types of sound might appear has helped us conceptualise the following model.

 

 

Ultimately, audience presence in the space will affect the tonality/texture, order and intensity of the composition.
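As a rough illustration of the kind of mapping we have in mind (the parameter names, thresholds and weights below are placeholders for the sake of the sketch, not the final implementation):

```python
# Illustrative sketch only: how the number of bodies detected in the space
# might steer the composition. All names, ranges and thresholds are
# placeholder assumptions.

def composition_state(num_bodies, max_bodies=20):
    """Map audience presence to texture, section order and intensity."""
    presence = min(num_bodies / max_bodies, 1.0)  # normalise to 0..1

    # Intensity scales with how full the space is.
    intensity = presence

    # Texture moves from sparse to dense as more people arrive.
    texture = "sparse" if presence < 0.3 else "layered" if presence < 0.7 else "dense"

    # Section order is nudged rather than dictated: a fuller room makes the
    # more energetic sections more likely to be chosen next.
    section_weights = {
        "ambient": 1.0 - presence,
        "rhythmic": 0.5 + 0.5 * presence,
        "climactic": presence,
    }
    return {"intensity": intensity, "texture": texture, "section_weights": section_weights}
```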

 

The “mystery area” will comprise interactions that are less easy to recognise. Input will have a more cumulative effect on the work.
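One way to think about this cumulative influence is as a slowly decaying accumulator: no single gesture produces an obvious response, but sustained activity gradually shifts the work. A minimal sketch, with the gain and decay values as assumptions:

```python
# Sketch of a cumulative (rather than one-to-one) interaction: a leaky
# accumulator. Individual detections barely register on their own, but
# sustained activity gradually pushes the level up. Gain/decay values
# are illustrative assumptions.

class CumulativeInfluence:
    def __init__(self, gain=0.02, decay=0.995):
        self.level = 0.0     # 0..1, drives some aspect of the work
        self.gain = gain     # how much each detection contributes
        self.decay = decay   # how quickly the level fades between updates

    def update(self, detected: bool) -> float:
        self.level *= self.decay
        if detected:
            self.level = min(1.0, self.level + self.gain)
        return self.level
```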

 

In previous showings we have found that telling the audience the work is interactive can leave them quite frustrated if they are not able to figure out the direct impact they are having on the space.

 

The language we use to contextualise the work needs to avoid setting up this kind of expectation. We need to develop the idea that Ultiverse:

  • reflects the bodies in the space

  • evolves in response to the audience that is present

 

In this way audience members may be more inclined to look for more general relationships, and can be positioned to think of their collective interaction. Ultimately, Ultiverse is a sensory, immersive, emotive experience, and I would not like to lose this sense of being to a deterministic engagement with the work.

 

In the tester area we will set up three gesture interactions:

  • XYZ sensor (left-right pan / pitch / loudness)

  • XZ (loudness / left-right pan)

  • Z (distance)

 

Each sensor will link to a separate movement/effect on the light projection; most likely the gesture sensor will follow your hand.
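As a rough sketch of how these three mappings might look in code (the coordinate ranges, the exact axis-to-parameter assignments and the output units are placeholder assumptions at this stage):

```python
# Rough sketch of the three tester-area mappings. Coordinate inputs are
# assumed to be normalised 0..1; ranges and units are placeholders.

def map_xyz(x, y, z):
    """XYZ interaction: x -> left-right pan, y -> pitch, z -> loudness."""
    return {
        "pan": x * 2 - 1,         # -1 (left) .. +1 (right)
        "pitch": 200 + y * 800,   # e.g. 200..1000 Hz
        "loudness": z,            # 0..1
    }

def map_xz(x, z):
    """XZ interaction: z -> loudness, x -> left-right pan."""
    return {"loudness": z, "pan": x * 2 - 1}

def map_z(z):
    """Z interaction: distance of the hand from the sensor."""
    return {"distance": z}
```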

 

 
