Today Joanne and I were looking at the setup for the installation at the Corn Exchange next week. So far we have 6 arpeggio morphs that will be triggered by 6 sensors. We will also pair the sensors with visual reactions, in the form of an animated cube reflected in the foil.
This session was spent working out a setup that would balance mystery and insight. Whilst we want to test the limits of how complex the arrangement can be, we also want to keep it simple enough to collect clear feedback about the different parameters we are exploring in the light and sound.
We will divide the room into thirds, with each third containing 2 sensors. Each sensor is linked to a visual and an audio response. The thirds will explore different parameters: in the light these will be speed, transparency and colour; in the sound, the different textures created by the morphs on the arpeggio. We are hoping to get feedback about the effects of the parameters as they respond to the audience's presence, and about the ways visitors would like the installation to interact with them.
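As a rough sketch, the zone layout described above could be modelled as a simple mapping from zones to sensors and parameters. All the names here (`build_zones`, the morph and cube labels) are hypothetical, for illustration only, and not the actual installation code:

```python
# Hypothetical sketch of the layout: three zones, two sensors per zone,
# each zone exploring one light parameter, each sensor paired with one
# arpeggio morph (audio) and one animated-cube response (visual).

LIGHT_PARAMETERS = ["speed", "transparency", "colour"]

def build_zones(num_zones=3, sensors_per_zone=2):
    """Map each zone to its sensors and the light parameter it explores."""
    zones = []
    sensor_id = 0
    for i in range(num_zones):
        zone = {
            "zone": i + 1,
            "light_parameter": LIGHT_PARAMETERS[i],
            "sensors": [],
        }
        for _ in range(sensors_per_zone):
            # each sensor triggers one audio morph and one visual reaction
            zone["sensors"].append({
                "sensor_id": sensor_id,
                "arpeggio_morph": f"morph_{sensor_id}",
                "visual": f"cube_animation_{sensor_id}",
            })
            sensor_id += 1
        zones.append(zone)
    return zones

zones = build_zones()
for z in zones:
    print(z["zone"], z["light_parameter"],
          [s["sensor_id"] for s in z["sensors"]])
```

Keeping one parameter per zone, as this layout does, should make it easier to attribute visitor feedback to a specific parameter rather than to the arrangement as a whole.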
The ideology/concept of the work is slowly developing as we discover what the technological options are. I’ve always wanted the work to feel alive – to exist outside linear, push-button interaction, e.g. press here and it lights up. Since its inception I have regarded Ultiverse as a living being, and Joanne’s introduction of genetic algorithms will really open up the potential for this. The context for audiences has to be that the work is conversational – framed outside the ‘responsive’ interactive realm and brought more into the AI/bio-technic/SOUL interactive. Watching the (red box code) Joanne hacked today, it was a visualisation of everything I’d hoped for – like a surface of water with various ripples interacting and affecting each other – complex, like our own response to stimulus. I’m trying to think of it in waves, cycles, complex conversation – I’m really lacking the vocabulary to describe it at the moment – but that programme summed it up.