Akeelah, Aron & Joanne
Quite a tech-spec day! The first part was spent connecting Aron's synth to Joanne's computer. It's taken the first three meetings to get our heads round the possibilities of the programming inputs and outputs.
Joanne is coding in software called SuperCollider, which connects to Aron's synth via MIDI. The morphs on the synth can then be controlled from SuperCollider. The sensors send data to SuperCollider, which uses this information to decide which morphs to play.
There are numerous ways we can arrange the flow of information from the sensors, via SuperCollider, to the synth's output. Our foundation process is to use the sound parameters of the synth as responses to movement detected by the ultrasonic sensors.
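As a rough illustration of that foundation process, here is a small Python sketch of one stage of the chain: turning an ultrasonic distance reading into a MIDI control value that could drive a morph parameter. The real patch lives in SuperCollider, and the distance range and 0–127 scaling here are assumptions, not the group's actual settings.

```python
def distance_to_cc(distance_cm, min_cm=2.0, max_cm=400.0):
    """Map an ultrasonic distance reading (cm) to a MIDI CC value (0-127).

    The 2-400 cm range is a hypothetical sensor range; readings outside
    it are clamped so the control value stays valid.
    """
    clamped = max(min_cm, min(max_cm, distance_cm))
    normalized = (clamped - min_cm) / (max_cm - min_cm)
    return round(normalized * 127)
```

So a visitor standing close to a sensor would produce a low control value and someone far away a high one; inverting or reshaping that curve is a musical choice, not a technical one.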
Over the session we looked at and talked about the following options:
- Sensors taking delayed snapshots of input at intervals (e.g. every 30 seconds) that set off a wave sequence in the sound and light
- Using sensors as a time stamp with distance to measure speed – matching the arpeggio pace of the morphs to the visitor's movement in the space
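The second option above can be sketched in a few lines of Python: two timestamped distance readings give a speed, and that speed is scaled to an arpeggio tempo. The tempo range and maximum speed below are illustrative guesses, not values from the session.

```python
def speed_from_samples(d1_cm, t1_s, d2_cm, t2_s):
    """Speed of a visitor relative to a sensor, in cm/s (direction dropped)."""
    return abs(d2_cm - d1_cm) / (t2_s - t1_s)

def speed_to_bpm(speed_cm_s, min_bpm=60, max_bpm=180, max_speed=150.0):
    """Map movement speed to an arpeggio tempo.

    The 60-180 BPM window and 150 cm/s ceiling are hypothetical; faster
    movement simply pins the tempo at max_bpm.
    """
    frac = min(speed_cm_s, max_speed) / max_speed
    return min_bpm + frac * (max_bpm - min_bpm)
```

A visitor walking briskly past a sensor would push the arpeggio toward the fast end; someone standing still would let it settle back to the slow end.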
At this stage the audience's movement will be interpreted as an effect dial on the synth (a morph). A pre-composed arpeggio will undergo different morphs in response to audience movement.
Morphs will be gradual in their changes, so subtle, progressive differences will occur as the audience moves.
By the end of the session we had each morph option linked to a sensor: as you move around, a different morph is activated on the base arpeggio (blank/1/2/3/4/5), one option per sensor across the six sensors.
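A minimal sketch of that sensor-to-morph mapping, in Python: whichever of the six sensors reads the nearest visitor selects its morph option, and if nobody is close to any sensor the base arpeggio plays unmorphed. The trigger distance and the "nearest sensor wins" rule are assumptions for illustration, not the group's final logic.

```python
# One morph option per sensor: "blank" (no morph) plus morphs 1-5.
MORPH_OPTIONS = [None, 1, 2, 3, 4, 5]

def active_morph(readings_cm, trigger_cm=150.0):
    """Choose a morph from six ultrasonic distance readings.

    readings_cm: one distance (cm) per sensor, in sensor order.
    Returns the morph option of the sensor with the nearest visitor,
    or None (blank/base arpeggio) if no reading is within trigger_cm.
    """
    nearest = min(range(len(readings_cm)), key=lambda i: readings_cm[i])
    if readings_cm[nearest] > trigger_cm:
        return None
    return MORPH_OPTIONS[nearest]
```

Moving from one sensor's zone to another would then hand the arpeggio from one morph to the next, which matches the behaviour described above.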